
    Arduous implementation: Does the Normalisation Process Model explain why it's so difficult to embed decision support technologies for patients in routine clinical practice?

    Background: Decision support technologies (DSTs, also known as decision aids) help patients and professionals take part in collaborative decision-making processes. Trials have shown favorable impacts on patient knowledge, satisfaction, decisional conflict and confidence. However, they have not become routinely embedded in health care settings. Few studies have approached this issue using a theoretical framework. We explained problems of implementing DSTs using the Normalization Process Model, a conceptual model that focuses attention on how complex interventions become routinely embedded in practice.
    Methods: The Normalization Process Model was used as the basis of a conceptual analysis of the outcomes of previous primary research and reviews. Using a virtual working environment, we applied the model and its main concepts to examine: the 'workability' of DSTs in professional-patient interactions; how DSTs affect knowledge relations between their users; how DSTs impact on users' skills and performance; and the impact of DSTs on the allocation of organizational resources.
    Results: Conceptual analysis using the Normalization Process Model provided insight into implementation problems for DSTs in routine settings. Current research focuses mainly on the interactional workability of these technologies, but factors related to divisions of labor in health care, and the organizational contexts in which DSTs are used, are poorly described and understood.
    Conclusion: The model successfully provided a framework for identifying factors that promote and inhibit the implementation of DSTs in health care, and gave us insights into factors influencing the introduction of new technologies into contexts where negotiations are characterized by asymmetries of power and knowledge. Future research and development on the deployment of DSTs needs to take a more holistic approach and give emphasis to the structural conditions and social norms in which these technologies are enacted.

    Research in Business Process Management: A bibliometric analysis

    Business Process Management (BPM) contains several growing subtopics such as process mining, process flexibility and process compliance. BPM is also highly relevant for numerous related fields, such as Business Intelligence, ERP systems and Knowledge Management. The growing number of publications and the variety of topics in BPM make it useful to apply bibliometric methods to this scientific field. With bibliometric methods, topical clusters, essential authors and the relationships between them can be discovered. In this work, the BibTechMon software from the Austrian Institute of Technology is used to perform the bibliometric analyses. As a novelty for work with BibTechMon, data from Google Scholar is used as the basis of the analyses. The nature of Google Scholar data differs significantly from that of other scientific databases, and these differences change how the bibliometric analyses can be performed. After assessing these changes, several bibliometric analyses of the BPM field and related fields are performed. As a result of these analyses, diverse topical clusters in BPM and its related fields could be discovered. Additionally, important authors for each cluster and for the BPM field as a whole were determined. In order to evaluate the results of the bibliometric analyses, I conducted an interview on BPM with Professor Reichert, who is an active researcher in the field. Subsequently, his statements are compared with the results of the bibliometric analyses and the degree of agreement between them is assessed.
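    BibTechMon itself is proprietary software, so the abstract does not spell out the mechanics of the analysis. Purely as an illustration of the general idea behind such bibliometric work (co-occurrence networks over bibliographic records plus community detection), a minimal Python sketch might look as follows; the record structure, author names and keyword fields are hypothetical, and networkx's greedy modularity clustering merely stands in for BibTechMon's own methods.

```python
# Illustrative sketch only (not BibTechMon): build a co-authorship network
# from bibliographic records and detect clusters of related authors.
from itertools import combinations
from collections import Counter

import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Hypothetical records, e.g. parsed from Google Scholar exports.
records = [
    {"authors": ["A. Author", "B. Author"], "keywords": ["process mining", "BPM"]},
    {"authors": ["B. Author", "C. Author"], "keywords": ["process flexibility", "BPM"]},
    {"authors": ["C. Author", "D. Author"], "keywords": ["knowledge management", "ERP"]},
]

# Co-authorship graph: authors are nodes, edges weighted by joint papers.
graph = nx.Graph()
for record in records:
    for a, b in combinations(sorted(set(record["authors"])), 2):
        weight = graph.get_edge_data(a, b, {}).get("weight", 0) + 1
        graph.add_edge(a, b, weight=weight)

# Community detection approximates the topical/author clusters a bibliometric
# tool would surface; "essential" authors are ranked here by weighted degree.
clusters = greedy_modularity_communities(graph, weight="weight")
central = Counter(dict(graph.degree(weight="weight"))).most_common(5)

print("clusters:", [sorted(c) for c in clusters])
print("most connected authors:", central)
```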

    Applied business analytics approach to IT projects – Methodological framework

    The design and implementation of a big data project differs from that of a typical business intelligence project that might be running concurrently within the same organization. A big data initiative typically triggers a large-scale IT project that is expected to deliver the desired outcomes. The industry has identified two major methodologies for running a data-centric project, in particular SEMMA (Sample, Explore, Modify, Model and Assess) and CRISP-DM (Cross Industry Standard Process for Data Mining). More generally, the professional organizations PMI (Project Management Institute) and IIBA (International Institute of Business Analysis) have defined their methods for project management and business analysis based on current industry best practices. However, big data projects pose new challenges that are not addressed by the existing methodologies. Building an end-to-end big data analytical solution for optimization of the supply chain, pricing and promotion, product launch, shop potential and customer value faces both business and technical challenges. The most common business challenges are unclear and/or poorly defined business cases; irrelevant data; poor data quality; overlooked data granularity; improper contextualization of data; unprepared or badly prepared data; non-meaningful results; and lack of the necessary skill set. Some of the technical challenges relate to lack of resources and technology limitations; availability of data sources; storage difficulties; security issues; performance problems; limited flexibility; and ineffective DevOps. This paper discusses an applied business analytics approach to IT projects and addresses the aspects described above. The authors present their work on the research and development of a new methodological framework and analytical instruments applicable in both business endeavors and educational initiatives targeting big data. The proposed framework is based on a proprietary methodology and advanced analytics tools. It is focused on the development and implementation of practical solutions for project managers, business analysts, IT practitioners and Business/Data Analytics students. Also discussed are the necessary skills and knowledge for a successful big data business analyst, and some of the main organizational and operational aspects of big data projects, including continuous model deployment.
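    The authors' framework is proprietary and not described in code, so the following is only a rough sketch of how the phased, CRISP-DM-style flow mentioned above might be expressed as a pipeline skeleton. The ProjectContext structure, stage functions and field names are hypothetical and do not reflect the paper's actual framework.

```python
# Minimal sketch of a CRISP-DM-style pipeline skeleton (data understanding ->
# data preparation -> modeling), with each phase handing its context forward.
# Illustrative only; names and structures are assumptions, not the authors' method.
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class ProjectContext:
    business_case: str                 # guards against an unclear business case
    data: list = field(default_factory=list)
    artifacts: dict = field(default_factory=dict)


def data_understanding(ctx: ProjectContext) -> ProjectContext:
    # Profile sources early to expose irrelevant data and granularity issues.
    ctx.artifacts["row_count"] = len(ctx.data)
    return ctx


def data_preparation(ctx: ProjectContext) -> ProjectContext:
    # Drop records with missing values as a stand-in for data-quality handling.
    ctx.data = [row for row in ctx.data if None not in row.values()]
    return ctx


def modeling(ctx: ProjectContext) -> ProjectContext:
    # Placeholder for model training (e.g. pricing or promotion-uplift models).
    ctx.artifacts["model"] = "trained-model"
    return ctx


def run_pipeline(ctx: ProjectContext, stages: List[Callable]) -> ProjectContext:
    for stage in stages:
        ctx = stage(ctx)
    return ctx


ctx = ProjectContext(business_case="promotion uplift",
                     data=[{"sku": 1, "price": 9.99}, {"sku": 2, "price": None}])
ctx = run_pipeline(ctx, [data_understanding, data_preparation, modeling])
print(ctx.artifacts)
```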

    Automation and Adaptation: Information Technology, Work Practices, and Labor Demand at Three Firms

    The use of information and communication technology to automate routine tasks involves two types of innovation: technological and organizational. Together, improvements in technological capabilities and complementary changes made by firms in the way they organize work and implement work practices constitute the conditions under which machines substitute for or complement human workers. Building on the prevailing model of routine-biased technical change and recent insights into organizational complementarities, I conduct three qualitative case studies in health care and real estate to assess the relationship between technology and firm-level labor demand. Unique combinations of technological innovation, organizational complementarity, and decision-making at each firm produce differential impacts on labor demand, with even similar technologies exhibiting quite different patterns of substitution for workers of all skill types. In addition, studying firm-level complementarities illuminates how and why the scope of routine tasks may be growing, with particularly important implications for relatively higher-skill workers.