2,895 research outputs found

    Effective use of data analytics and its impact on business performance within small-to-medium-sized businesses

    Business use of data analytics and its potential impact on firm performance have become topics of deep interest within both the business practitioner and academic communities. While previous research has demonstrated relationships between data analytics and firm performance in larger firms, there is limited research on whether and how data analytics is used within, and impacts, Small-to-Medium-sized Business (SMB) settings. Given the preponderance of SMBs within the US economy, and their contribution to employment and economic activity, it is important for SMB owners to understand which management practices lead to effective use of data analytics that in turn impacts SMB performance. Drawing upon the Resource-Based View (RBV) of the firm and prior empirical research on practices within large firms, this dissertation identifies the resources that are needed to form a Data Analytics Capability (DAC) and examines the relationship between the maturity of DACs and the extent of business value realized. The research model was tested using Partial Least Squares-Structural Equation Modelling (PLS-SEM) analysis of survey data gathered from a sample of 300 SMB firms in the US, complemented with qualitative interviews of SMB owners. The results provide evidence that a more developed DAC can lead to higher Data Analytics Business Value across business functions.
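    The quantitative step described above is a PLS-SEM test of a structural model on survey data. The sketch below is only a minimal illustration of the general idea, using scikit-learn's PLSRegression as a simplified stand-in for full PLS-SEM; the construct names, item counts and simulated data are assumptions, not taken from the dissertation.

```python
# Minimal, illustrative sketch only: PLSRegression is used as a simplified
# stand-in for a full PLS-SEM estimation of a DAC -> business value relationship.
# Indicator names and data are hypothetical, not from the study.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n = 300  # sample size matching the study's survey of 300 SMB firms

# Hypothetical indicator items (e.g., Likert-scale survey responses)
X = rng.normal(size=(n, 6))  # items measuring DAC resources/maturity
Y = 0.6 * X[:, :2].sum(axis=1, keepdims=True) + rng.normal(scale=1.0, size=(n, 1))
# Y: items measuring perceived Data Analytics Business Value

pls = PLSRegression(n_components=2)
pls.fit(X, Y)
print("R^2 of DAC indicators predicting business value:", round(pls.score(X, Y), 3))
```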

    The Internet of Everything

    In the era before IoT, the internet, the World Wide Web, Web 2.0 and social media made people's lives more comfortable by providing web services and enabling access to personal data irrespective of location. Further, to save time and improve efficiency, there is a need for machine-to-machine communication, automation, smart computing and ubiquitous access to personal devices. This need gave birth to the phenomenon of the Internet of Things (IoT) and, further, to the concept of the Internet of Everything (IoE).

    ERP implementation methodologies and frameworks: a literature review

    Enterprise Resource Planning (ERP) implementation is a complex and dynamic process, one that involves a combination of technological and organizational interactions. Often an ERP implementation project is the single largest IT project that an organization has ever launched, and it requires a mutual fit of system and organization. Moreover, the notion of an ERP implementation supporting business processes across many different departments is not a generic, rigid and uniform concept; it depends on a variety of factors. As a result, the issues surrounding the ERP implementation process have been among the major concerns in industry, and ERP implementation receives attention from both practitioners and scholars; the business as well as the academic literature is abundant, but not always conclusive or coherent. Research on ERP systems so far has focused mainly on diffusion, use and impact issues. Less attention has been given to the methods used during the configuration and implementation of ERP systems; even though these methods are commonly used in practice, they remain largely unexplored and undocumented in Information Systems research. The academic relevance of this research is therefore its contribution to the existing body of scientific knowledge. An annotated brief literature review is conducted in order to evaluate the current state of the academic literature. The purpose is to present a systematic overview of relevant ERP implementation methodologies and frameworks, with the aim of achieving a better taxonomy of ERP implementation methodologies. This paper is useful to researchers interested in ERP implementation methodologies and frameworks, and the results will serve as input for a classification of the existing ERP implementation methodologies and frameworks. The paper is also aimed at the professional ERP community involved in the process of ERP implementation, promoting a better understanding of ERP implementation methodologies and frameworks, their variety and their history.

    Evaluation methodology for visual analytics software

    The challenge of Visual Analytics (VA) is to produce visualizations that help users focus on the most relevant or most interesting aspect of the data presented. Today's society faces a rapidly increasing amount of data, so information users in every domain end up with more information than they can handle. VA software should support intuitive interactions so that analysts can concentrate on the information they are manipulating rather than on the manipulation technique itself. VA environments should seek to minimize the overall cognitive workload of their users: if we have to think less about the interactions themselves, we have more time to think about the analysis itself. Given the benefits that VA applications can bring and the confusion that still exists in identifying such applications on the market, this work proposes a new heuristic-based evaluation methodology. Our methodology is intended to evaluate applications through usability tests that consider the functionalities and characteristics desirable in VA systems. Moreover, due to its quantitative nature, it can naturally be used for other purposes, such as comparing VA applications within the same context to support a choice between them. In addition, its criteria can serve as a source of information for designers and developers to make appropriate choices during the design and development of VA systems.
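    As a minimal sketch of how a quantitative, heuristic-based evaluation score might be aggregated for a VA application: the heuristics, weights and ratings below are illustrative placeholders, not the criteria defined in this work.

```python
# Hypothetical aggregation of a heuristic-based usability score for a Visual
# Analytics application. Heuristics, weights and ratings are illustrative only.

# Each heuristic: (weight, rating on a 0-4 scale from usability testing)
ratings = {
    "intuitive interaction":      (0.30, 3),
    "focus on relevant data":     (0.25, 4),
    "low cognitive workload":     (0.25, 2),
    "support for large datasets": (0.20, 3),
}

def va_score(ratings):
    """Weighted average rating, normalised to a 0-100 scale."""
    total_weight = sum(w for w, _ in ratings.values())
    weighted = sum(w * r for w, r in ratings.values())
    return 100 * weighted / (4 * total_weight)

print(f"Overall VA usability score: {va_score(ratings):.1f}/100")
```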

    Personalized data analytics for internet-of-things-based health monitoring

    The Internet-of-Things (IoT) has great potential to fundamentally alter the delivery of modern healthcare, enabling healthcare solutions outside the limits of conventional clinical settings. It can offer ubiquitous monitoring to at-risk population groups and allow diagnostic care, preventive care, and early intervention in everyday life. These services can have profound impacts on many aspects of health and well-being. However, this field is still in its infancy, and the use of IoT-based systems in real-world healthcare applications introduces new challenges. Healthcare applications necessitate satisfactory quality attributes such as reliability and accuracy due to their mission-critical nature, while at the same time, IoT-based systems mostly operate over constrained shared sensing, communication, and computing resources. There is a need to investigate the synergy between IoT technologies and healthcare applications from a user-centered perspective. Such a study should examine the role and requirements of IoT-based systems in real-world health monitoring applications. Moreover, conventional computing architectures and data analytic approaches introduced for IoT systems are insufficient when used to target health and well-being purposes, as they are unable to overcome the limitations of IoT systems while fulfilling the needs of healthcare applications. This thesis aims to address these issues by proposing an intelligent use of data and computing resources in IoT-based systems, which can lead to high performance and satisfy stringent requirements. For this purpose, the thesis first delves into the state-of-the-art IoT-enabled healthcare systems proposed for in-home and in-hospital monitoring. The findings are analyzed and categorized into different domains from a user-centered perspective. The selection of home-based applications focuses on the monitoring of elderly people, who require more remote care and support than other groups. In contrast, the hospital-based applications include the role of existing IoT in patient monitoring and hospital management systems. Then, the objectives and requirements of each domain are investigated and discussed. The thesis proposes personalized data analytic approaches to fulfill the requirements and meet the objectives of IoT-based healthcare systems. In this regard, a new computing architecture is introduced, using computing resources in different layers of IoT to provide a high level of availability and accuracy for healthcare services. This architecture allows the hierarchical partitioning of machine learning algorithms in these systems and enables an adaptive system behavior with respect to the user's condition. In addition, personalized data fusion and modeling techniques are presented, exploiting multivariate and longitudinal data in IoT systems to improve the quality attributes of healthcare applications. First, a real-time, missing-data-resilient decision-making technique is proposed for health monitoring systems. The technique tailors various data resources in IoT systems to accurately estimate health decisions despite missing data during monitoring. Second, a personalized model is presented, enabling the detection of variations and events in long-term monitoring; the model evaluates the sleep quality of users according to their own historical data. Finally, the performance of the computing architecture and of the techniques is evaluated using two case studies. The first case study consists of real-time arrhythmia detection in electrocardiography signals collected from patients suffering from cardiovascular diseases. The second case study is continuous maternal health monitoring during pregnancy and postpartum, including a real human-subject trial carried out with twenty pregnant women over seven months.
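    The thesis describes, among its contributions, a real-time, missing-data-resilient decision-making technique. The sketch below is only a generic illustration of that idea, not the proposed technique: missing sensor channels are imputed from the remaining channels before a classifier produces a decision. The vital-sign features, the toy label and the imputer/classifier choices are assumptions for illustration.

```python
# Illustrative sketch (not the thesis technique): make a health decision when a
# sensor channel is missing by imputing it from the remaining channels first.
import numpy as np
from sklearn.impute import KNNImputer
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
# Hypothetical training windows: [heart_rate, resp_rate, activity, spo2]
X_train = rng.normal(loc=[75, 16, 0.3, 97], scale=[10, 3, 0.2, 1.5], size=(500, 4))
y_train = (X_train[:, 0] > 85).astype(int)  # toy label: elevated heart rate

imputer = KNNImputer(n_neighbors=5).fit(X_train)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

# A monitoring window arrives with the SpO2 channel missing
window = np.array([[92.0, 18.0, 0.1, np.nan]])
window_filled = imputer.transform(window)
print("Decision despite missing data:", model.predict(window_filled)[0])
```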

    Affective Computing for Emotion Detection using Vision and Wearable Sensors

    The research explores the opportunities, challenges and limitations of, and presents advancements in, computing that relates to, arises from, or deliberately influences emotions (Picard, 1997). The field is referred to as Affective Computing (AC) and is expected to play a major role in the engineering and development of computationally and cognitively intelligent systems, processors and applications in the future. Today the field of AC is bolstered by the emergence of multiple sources of affective data and is fuelled by developments under various Internet of Things (IoT) projects and the fusion potential of multiple sensory affective data streams. The core focus of this thesis is to investigate whether the sensitivity and specificity (predictive performance) of AC, based on the fusion of multi-sensor data streams, is fit for purpose: can such AC-powered technologies and techniques truly deliver increasingly accurate emotion predictions of subjects in the real world? The thesis begins by presenting a number of research justifications and AC research questions that are used to formulate the original thesis hypothesis and thesis objectives. As part of the research conducted, a detailed state-of-the-art investigation explored many aspects of AC from both a scientific and a technological perspective. The complexity of AC as a multi-sensor, multi-modality data fusion problem unfolded during the state-of-the-art research, and this ultimately led to novel thinking in the form of an AC conceptual architecture that acts as a practical and theoretical foundation for the engineering of future AC platforms and solutions. The AC conceptual architecture developed as a result of this research was applied to the engineering of a series of software artifacts that were combined to create a prototypical AC multi-sensor platform, known as the Emotion Fusion Server (EFS), used in the AC experimentation phases of the research. The thesis research used the EFS platform to conduct a detailed series of AC experiments to investigate whether the fusion of multiple sensory sources of affective data can significantly increase the accuracy of emotion prediction by computationally intelligent means. The research involved numerous controlled experiments along with statistical analysis of the performance of sensors for the purposes of AC, the findings of which serve to assess the feasibility of AC in various domains and point to future directions for the field. The data investigations conducted in relation to the thesis hypothesis used applied statistical methods and techniques, and the results, analytics and evaluations are presented throughout the two thesis research volumes. The thesis concludes by providing a detailed set of formal findings, conclusions and decisions in relation to the overarching research hypothesis on the sensitivity and specificity of the fusion of vision and wearable sensor modalities, and offers foresight and guidance on the many problems, challenges and projections for the AC field into the future.
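    The hypothesis above turns on the sensitivity and specificity of emotion prediction from fused vision and wearable streams. The following sketch illustrates only the general pattern (feature-level fusion of two modality feature sets and computation of sensitivity/specificity from a confusion matrix) on synthetic data with a simple classifier; it is not the EFS platform or the thesis's experimental setup.

```python
# Hypothetical feature-level ("early") fusion of vision and wearable features
# for binary emotion prediction, reporting sensitivity and specificity.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(2)
n = 400
vision = rng.normal(size=(n, 5))    # e.g., facial action-unit intensities (assumed)
wearable = rng.normal(size=(n, 3))  # e.g., heart rate, EDA, skin temperature (assumed)
y = (vision[:, 0] + wearable[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)

X = np.hstack([vision, wearable])   # concatenate modality features
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
print("sensitivity:", round(tp / (tp + fn), 3), "specificity:", round(tn / (tn + fp), 3))
```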

    Wi-Fi Long Distance Maritime Communications Data Analytics

    Nowadays, wireless communications are becoming more and more important to the development of society, not only on land but also at sea. In maritime environments the scenario is different and harder because of several factors, such as movement on the surface of the sea, the characteristics of radio propagation, and possible intermittent obstructions that decrease the efficiency of signal propagation. Several wireless communication solutions are already used in the maritime environment: HF/VHF, which does not support high data rates; satellite communications, an expensive technology that is not affordable for most users; and mobile communications (GSM, 3G and LTE), which only ensure a connection near the coast. The main purpose of this dissertation is to contribute to the characterization of the propagation channel and of the problems associated with the use of Wi-Fi technology at different frequencies in this kind of environment.
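    As a rough point of reference for the propagation problem described above, the free-space path loss formula gives a lower bound on attenuation at Wi-Fi frequencies over sea-scale distances; the real maritime channel is worse because of sea-surface reflection, movement and intermittent obstruction, which is what the dissertation characterises. The distances and frequencies below are illustrative, not the dissertation's measurement points.

```python
# Free-space path loss (FSPL) as a lower bound on attenuation for long-distance
# Wi-Fi links over the sea. Illustrative values only.
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """FSPL in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

for d_km in (1, 5, 10, 20):
    for f_mhz in (2437.0, 5800.0):  # Wi-Fi channels at 2.4 GHz and 5.8 GHz
        print(f"{d_km:>3} km @ {f_mhz / 1000:.1f} GHz: {fspl_db(d_km, f_mhz):.1f} dB")
```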

    Using large-scale syndromic datasets to support epidemiology and surveillance

    Healthcare and the healthcare industry have traditionally produced huge amounts of data and information; patient care necessitates accurate record keeping, records of attendances and often details of the reason for contact with healthcare and outcomes.7 During the past decade, there has been a dramatic shift to digitize healthcare-related information, with a view both to increasing efficiencies in these areas and to generating new insights.8 These rich, but often unstructured, data sources can present both opportunities and challenges to data scientists and epidemiologists. Syndromic surveillance (SS) is the real-time (or near real-time) collection, analysis, interpretation and dissemination of health-related data to enable the early identification of the impact (or absence of impact) of potential human or veterinary public-health threats which require effective public-health action.9 In England, Public Health England (PHE) coordinates a suite of national real-time syndromic surveillance systems. Underpinning their operation is the collation, analysis and interpretation of large-scale datasets (“big data”). This PhD by Published Works describes work which has evaluated, developed or utilised a number of these large healthcare datasets for both the surveillance and the epidemiology of public health events. The thesis is divided into four themes covering critical aspects of SS: firstly, developing SS systems using novel data sources, something which is currently under-reported in the literature; secondly, using syndromic data systems for non-infectious disease epidemiology, and understanding how these systems can inform public health insight and action outside their original remit; thirdly, determining their utility in identifying outbreaks, one of the original envisioned purposes of SS, using gastrointestinal illness (GI) as a case study; and finally, understanding how SS is used in the context of mass gatherings, again a key original aspect of syndromic surveillance. The thesis collates a portfolio of indexed works, all of which use (combined with other data sources) large health-related datasets collated and operated by the PHE Real-Time Syndromic Surveillance Team (ReSST) and employ a range of different methodologies to translate data into public health action, including the description of a novel system's development, observational studies and time series analysis. Key findings from the papers include: learning how to develop these systems; demonstrating their utility in non-infectious disease epidemiology, leading to new insights into the socio-demographic distribution and causes of presentations to healthcare with Allergic Rhinitis; understanding the challenges and limitations of syndromic surveillance in identifying outbreaks of GI disease; and understanding how these systems can be used during mass gatherings. Using diverse methodologies and data as a collective, the papers have led to significant public health impacts, both in terms of how these systems are used in England currently and in how they have influenced the global development of this small but growing speciality.
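    One theme above is detecting outbreaks (using GI illness as a case study) from large-scale syndromic counts. The sketch below illustrates only the general family of approach, a moving-baseline statistical exceedance detector on daily counts; it is not the specific method used in the published works, and the counts, window length and threshold are hypothetical.

```python
# Illustrative moving-baseline exceedance detector for daily syndromic counts.
# Generic example only, not the algorithm used by PHE/ReSST systems.
import numpy as np

def exceedance_flags(counts, baseline_days=7, threshold_sd=3.0):
    """Flag days whose count exceeds baseline mean + threshold_sd * SD,
    where the baseline is the preceding `baseline_days` of counts."""
    counts = np.asarray(counts, dtype=float)
    flags = np.zeros(len(counts), dtype=bool)
    for day in range(baseline_days, len(counts)):
        baseline = counts[day - baseline_days:day]
        limit = baseline.mean() + threshold_sd * baseline.std(ddof=1)
        flags[day] = counts[day] > limit
    return flags

# Hypothetical daily GI consultation counts with a spike on the last day
daily_counts = [52, 48, 55, 50, 47, 53, 49, 51, 50, 95]
print(exceedance_flags(daily_counts))
```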