
    Business Process Redesign in the Perioperative Process: A Case Perspective for Digital Transformation

    This case study investigates business process redesign within the perioperative process as a method to achieve digital transformation. Specific perioperative sub-processes are targeted for redesign and digitalization, yielding improvement. Based on a 184-month longitudinal study of a large, 1,157 registered-bed academic medical center, the observed effects are viewed through a lens of information technology (IT) impact on core capabilities and core strategy to yield a digital transformation framework that supports patient-centric improvement across perioperative sub-processes. This research identifies existing limitations, potential capabilities, and the contextual understanding needed to minimize perioperative process complexity, target opportunities for improvement, and ultimately yield improved capabilities. Dynamic technological activities of analysis, evaluation, and synthesis, applied to specific perioperative patient-centric data collected within integrated hospital information systems, yield the organizational resource for process management and control. Conclusions include theoretical and practical implications as well as study limitations.

    Modeling Clinicians’ Cognitive and Collaborative Work in Post-Operative Hospital Care

    Clinicians confront formidable challenges with information management and coordination activities. When not properly integrated into clinical workflow, technologies can further burden clinicians’ cognitive resources, which is associated with medical errors and risks to patient safety. An understanding of workflow is necessary to redesign information technologies (IT) that better support clinical processes. This is particularly important in surgical care, which is among the most resource-intensive clinical settings in healthcare and is associated with a high rate of adverse events. There are a growing number of tools to study workflow; however, few produce the kinds of in-depth analyses needed to understand health IT-mediated workflow. The goals of this research are to: (1) investigate and model workflow and communication processes across technologies and care team members in post-operative hospital care; (2) introduce a mixed-method framework; and (3) demonstrate the framework by examining two health IT-mediated tasks. This research draws on distributed cognition and cognitive engineering theories to develop a micro-analytic strategy in which workflow is broken down into constituent people, artifacts, and information, and the interactions between them. It models the interactions that enable information flow across people and artifacts, and identifies dependencies between them. This research found that clinicians manage information in particular ways to facilitate planned and emergent decision-making and coordination processes. Barriers to information flow include frequent information transfers, clinical reasoning absent from documents, conflicting and redundant data across documents and applications, and the burden placed on clinicians as information managers. This research also shows there is enormous variation in how clinicians interact with electronic health records (EHRs) to complete routine tasks. Variation is best evidenced by patterns that occur for only one patient case and patterns that contain repeated events. Variation is associated with users’ experience (EHR and clinical), patient case complexity, and a lack of cognitive support provided by the system to help the user find and synthesize information. The methodology is used to assess how health IT can be improved to better support clinicians’ information management and coordination processes (e.g., context-sensitive design), and to inform how resources can best be allocated for clinician observation and training. Doctoral dissertation, Biomedical Informatics.
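    To make the micro-analytic strategy concrete, here is a minimal illustrative sketch in Python, not the dissertation’s actual tooling: workflow is represented as a directed graph whose nodes are clinicians and artifacts and whose edges are observed information transfers, so that counting transfers of an information item surfaces the hand-off burden identified above as a barrier. All class, function, and variable names are hypothetical.

    from collections import defaultdict

    class WorkflowModel:
        # Directed graph: nodes are clinicians or artifacts (e.g., an EHR
        # note); edges are observed information transfers between them.

        def __init__(self):
            self.edges = defaultdict(list)  # source -> [(target, info_item)]

        def add_interaction(self, source, target, info_item):
            self.edges[source].append((target, info_item))

        def transfer_count(self, info_item):
            # Frequent transfers of the same item were identified above as
            # a barrier to information flow.
            return sum(1 for targets in self.edges.values()
                       for _, item in targets if item == info_item)

    # Hypothetical observations: a drain-output value moves from the surgeon
    # into a progress note, then from the note to a nurse reading it.
    model = WorkflowModel()
    model.add_interaction("surgeon", "progress_note", "drain output")
    model.add_interaction("progress_note", "nurse", "drain output")
    print(model.transfer_count("drain output"))  # -> 2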

    Addendum to Informatics for Health 2017: Advancing both science and practice

    This article presents presentation and poster abstracts that were mistakenly omitted from the original publication.

    Targeting Perioperative Performance Aligned to Hospital Strategy via Digital Transformation

    This study examines the digital transformation of a U.S. hospital’s perioperative process, which yields targeted alignment of performance to strategy. Based on a 184-month longitudinal study of a large, 1,157 registered-bed academic medical center, the observed effects are viewed through a lens of information technology (IT) impact on core capabilities and core strategy. The results offer a framework that supports patient-centric improvement and targets alignment of perioperative sub-process efforts to overall hospital strategy. This research identifies existing limitations, potential capabilities, and the contextual understanding needed to minimize perioperative process complexity, target and measure improvement, and ultimately yield alignment between process management and hospital strategy. Dynamic activities of analysis, evaluation, and synthesis, applied to specific perioperative patient-centric data collected within integrated hospital information systems, provide the organizational resource for management and control. Conclusions include theoretical and practical implications as well as study limitations.

    Considering Human Aspects on Strategies for Designing and Managing Distributed Human Computation

    A human computation system can be viewed as a distributed system in which the processors are humans, called workers. Such systems harness the cognitive power of a group of workers connected to the Internet to execute relatively simple tasks whose solutions, once grouped, solve a problem that systems equipped with only machines could not solve satisfactorily. Examples of such systems are Amazon Mechanical Turk and the Zooniverse platform. A human computation application comprises a group of tasks, each of which can be performed by one worker. Tasks might have dependencies among each other. In this study, we propose a theoretical framework to analyze this type of application from a distributed systems point of view. Our framework is established on three dimensions that represent different perspectives from which human computation applications can be approached: quality-of-service requirements, design and management strategies, and human aspects. Using this framework, we review human computation from the perspective of programmers seeking to improve the design of human computation applications and of managers seeking to increase the effectiveness of human computation infrastructures in running such applications. In doing so, besides integrating and organizing what has been done in this direction, we also put into perspective the fact that the human aspects of the workers in such systems introduce new challenges in terms of, for example, task assignment, dependency management, and fault prevention and tolerance. We discuss how these challenges are related to distributed systems and other areas of knowledge.
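    As a rough illustration of the kind of application described above, the Python sketch below models tasks with dependencies and computes which tasks are ready to be assigned to a worker; the names (Task, ready_tasks) and the three-task example are hypothetical, not taken from the paper.

    from dataclasses import dataclass, field

    @dataclass
    class Task:
        name: str
        depends_on: set = field(default_factory=set)
        done: bool = False

    def ready_tasks(tasks):
        # A task can be handed to a worker only when every task it depends
        # on has been completed (dependency management).
        by_name = {t.name: t for t in tasks}
        return [t for t in tasks
                if not t.done and all(by_name[d].done for d in t.depends_on)]

    tasks = [Task("transcribe"),
             Task("verify", {"transcribe"}),
             Task("aggregate", {"verify"})]
    print([t.name for t in ready_tasks(tasks)])  # -> ['transcribe']
    tasks[0].done = True
    print([t.name for t in ready_tasks(tasks)])  # -> ['verify']

    A scheduler built on such a structure is also where the human aspects enter: the same ready set can be assigned with worker skill, availability, or error history in mind, which is what makes task assignment and fault tolerance non-trivial in these systems.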

    Advances in mass spectrometry-based cancer research and analysis: from cancer proteomics to clinical diagnostics

    Introduction: The last 20 years have seen significant improvements in the analytical capabilities of biological mass spectrometry. Studies using advanced mass spectrometry (MS) have resulted in new insights into cell biology and the aetiology of diseases, as well as in the use of MS in clinical applications. Areas Covered: This review discusses recent developments in MS-based technologies and their cancer-related applications, with a focus on proteomics. It also discusses the issues around translating the research findings to the clinic and outlines where the field is moving. Expert Opinion: Proteomics has proven difficult to adapt to the clinical setting. However, MS-based techniques continue to demonstrate potential in novel clinical uses beyond classical cancer proteomics.

    High-Performance Cloud Computing: A View of Scientific Applications

    Scientific computing often requires the availability of a massive number of computers for performing large-scale experiments. Traditionally, these needs have been addressed with high-performance computing solutions and installed facilities such as clusters and supercomputers, which are difficult to set up, maintain, and operate. Cloud computing provides scientists with a completely new model of utilizing the computing infrastructure. Compute resources, storage resources, and applications can be dynamically provisioned (and integrated within the existing infrastructure) on a pay-per-use basis, and released when they are no longer needed. Such services are often offered within the context of a Service Level Agreement (SLA), which ensures the desired Quality of Service (QoS). Aneka, an enterprise Cloud computing solution, harnesses the power of compute resources by relying on private and public Clouds and delivers to users the desired QoS. Its flexible, service-based infrastructure supports multiple programming paradigms that let Aneka address a variety of scenarios: from finance applications to computational science. As examples of scientific computing in the Cloud, we present a preliminary case study on using Aneka for the classification of gene expression data and the execution of an fMRI brain-imaging workflow.
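    The provisioning model described above can be illustrated with a small, vendor-neutral Python sketch: acquire a pool of workers for the lifetime of an experiment, fan independent tasks out to it, and release the resources when the work is done. It uses only the standard library, not Aneka’s actual API, and the classification step is a hypothetical stand-in.

    from concurrent.futures import ProcessPoolExecutor

    def classify(sample):
        # Hypothetical stand-in for a real gene-expression classifier.
        return sum(sample) > 0

    if __name__ == "__main__":
        samples = [[0.3, -0.1], [-0.5, 0.2], [0.8, 0.4]]
        # "Provision" workers only for the duration of the experiment; the
        # pool is torn down on exit, mirroring the release of pay-per-use
        # resources when they are no longer needed.
        with ProcessPoolExecutor(max_workers=3) as pool:
            results = list(pool.map(classify, samples))
        print(results)  # -> [True, False, True]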