
    Improving Quality and Achieving Equity: A Guide for Hospital Leaders

    Outlines the need to address racial/ethnic disparities in health care, highlights model practices, and makes step-by-step recommendations on creating a committee, collecting data, setting quality measures, evaluating, and implementing new strategies

    Prescriptions for Excellence in Health Care, Summer 2008 (Issue #4)

    Big Data and the Internet of Things

    Advances in sensing and computing capabilities are making it possible to embed increasing computing power in small devices. This has enabled sensing devices not just to passively capture data at very high resolution but also to take sophisticated actions in response. Combined with advances in communication, this is resulting in an ecosystem of highly interconnected devices referred to as the Internet of Things (IoT). In conjunction, advances in machine learning have made it possible to build models on these ever-increasing amounts of data. Consequently, devices ranging from heavy assets such as aircraft engines to wearables such as health monitors can now not only generate massive amounts of data but also draw on aggregate analytics to "improve" their performance over time. Big data analytics has been identified as a key enabler for the IoT. In this chapter, we discuss various avenues of the IoT where big data analytics either is already making a significant impact or is on the cusp of doing so. We also discuss social implications and areas of concern. Comment: 33 pages; draft of an upcoming book chapter in Japkowicz and Stefanowski (eds.), Big Data Analysis: New Algorithms for a New Society, Springer Series on Studies in Big Data, to appear.
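    The feedback loop the abstract describes can be sketched in a few lines: devices capture readings locally, push compact summaries upstream, and pull a fleet-wide statistic back down to adjust their own behaviour. This is a minimal illustrative sketch, not the chapter's implementation; the class and function names (`SensorDevice`, `fleet_mean`) are assumptions.

```python
# Minimal sketch of the IoT analytics loop: local capture, compact
# summaries upstream, aggregate statistic pushed back to each device.

class SensorDevice:
    def __init__(self):
        self.readings = []
        self.threshold = 0.0  # local parameter tuned from aggregate analytics

    def capture(self, value):
        self.readings.append(value)

    def summary(self):
        # Ship only a compact summary upstream, not the raw stream.
        return sum(self.readings) / len(self.readings)

    def update_from_fleet(self, fleet_mean):
        # "Draw on aggregate analytics": adapt local behaviour using a
        # statistic computed across the whole device fleet.
        self.threshold = fleet_mean

# Three devices each capture two readings.
devices = [SensorDevice() for _ in range(3)]
for i, d in enumerate(devices):
    for v in (i + 1.0, i + 2.0):
        d.capture(v)

# The aggregator averages the per-device summaries and pushes the
# result back down to every device.
fleet_mean = sum(d.summary() for d in devices) / len(devices)
for d in devices:
    d.update_from_fleet(fleet_mean)
```

    In practice the summaries would travel over a network protocol and the aggregate would be a learned model rather than a mean, but the data flow has the same shape.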

    Committed to Safety: Ten Case Studies on Reducing Harm to Patients

    Presents case studies of healthcare organizations, clinical teams, and learning collaborations to illustrate successful innovations for improving patient safety nationwide. Includes actions taken, results achieved, lessons learned, and recommendations

    E-infrastructures fostering multi-centre collaborative research into the intensive care management of patients with brain injury

    Clinical research is becoming ever more collaborative, with multi-centre trials now a common practice. With this in mind, never has it been more important to have secure access to data and, in so doing, to tackle the challenges of inter-organisational data access and usage. This is especially the case for research conducted within the brain injury domain, due to the complicated multi-trauma nature of the disease and its associated complex collation of time-series data of varying resolution and quality. It is now widely accepted that advances in treatment within this group of patients will only be delivered if the technical infrastructures underpinning the collection and validation of multi-centre research data for clinical trials are improved. In recognition of this need, IT-based multi-centre e-Infrastructures such as the Brain Monitoring with Information Technology group (BrainIT - www.brainit.org) and the Cooperative Study on Brain Injury Depolarisations (COSBID - www.cosbid.de) have been formed. A serious impediment to the effective implementation of these networks is access to the know-how and experience needed to install, deploy and manage security-oriented middleware systems that provide secure access to distributed hospital-based datasets, and especially the linkage of these datasets across sites. The recently funded EU Framework VII ICT project Advanced Arterial Hypotension Adverse Event prediction through a Novel Bayesian Neural Network (AVERT-IT) is focused upon tackling these challenges. This chapter describes the problems inherent to data collection within the brain injury medical domain, the current IT-based solutions designed to address these problems, and how they perform in practice. We outline how the authors have collaborated towards developing Grid solutions to address the major technical issues, and describe a prototype solution which ultimately formed the basis for the AVERT-IT project. We also describe the design of the underlying Grid infrastructure for AVERT-IT and how it will be used to produce novel approaches to data collection, data validation and clinical trial design.

    Utility of Fear Severity and Individual Resilience Scoring as a Surge Capacity, Triage Management Tool during Large-Scale, Bio-event Disasters

    Threats of bioterrorism and emerging infectious disease pandemics may result in fear-related consequences. Fear-based signs and symptoms, if left undetected and untreated, may be extremely debilitating and lead to chronic problems, with risk of permanent damage to the brain’s locus coeruleus stress response circuits. The triage management of susceptible, exposed, and infectious victims seeking care must be sensitive and specific enough to identify individuals with excessive levels of fear, in order to address the nuances of fear-based symptoms at the initial point of contact. These acute conditions, which include hyper-vigilant fear, are best managed by timely and effective information, rapid evaluation, and possibly medication that uniquely addresses locus coeruleus-driven noradrenaline overactivation. This article recommends that a fear and resilience (FR) checklist be included as an essential triage tool to identify those most at risk. This checklist has the utility of rapid usage and the capacity to respond to limitations brought about by surge capacity requirements. Whereas the utility of such a checklist is evident, predictive validity studies will be required in the future. It is important to note that a unique feature of the FR Checklist is that, in addition to identifying individuals who are emotionally, medically, and socially hypo-resilient, it simultaneously identifies hyper-resilient individuals who can be asked to volunteer and thus rapidly expand the surge capacity.

    Automated Measurement of Adherence to Traumatic Brain Injury (TBI) Guidelines using Neurological ICU Data

    Using a combination of physiological and treatment information from neurological ICU data-sets, adherence to traumatic brain injury (TBI) guidelines on hypotension, intracranial pressure (ICP) and cerebral perfusion pressure (CPP) is calculated automatically. The ICU output is evaluated to capture pressure events and actions taken by clinical staff for patient management; these are then re-expressed as simplified process models. The official TBI guidelines from the Brain Trauma Foundation are similarly evaluated, so the two structures can be compared and a quantifiable distance between the two calculated (the measure of adherence). The methods used include: the compilation of physiological and treatment information into event logs and subsequently process models; the expression of the BTF guidelines in process models within the real-time context of the ICU; and a calculation of distance between the two processes using two algorithms (“Direct” and “Weighted”) building on work conducted in the business process domain. Results are presented across two categories, each with clinical utility (minute-by-minute and single patient stays), using a real ICU data-set. Results of two sample patients using the weighted algorithm show non-adherence of 6.25% for 42 minutes and 56.25% for 708 minutes, and non-adherence of 18.75% for 17 minutes and 56.25% for 483 minutes. Expressed as two combinatorial metrics (duration/non-adherence (A) and duration * non-adherence (B)), which together indicate the clinical importance of the non-adherence, one patient has a mean of A=4.63 and B=10014.16 and the other a mean of A=0.43 and B=500.0.
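    The two combinatorial metrics can be sketched directly from their definitions: for each non-adherence episode, A divides duration by the non-adherence fraction and B multiplies them. This is a hedged illustration of the metric definitions only; the function name, the input representation (duration in minutes, non-adherence as a fraction), and the averaging are assumptions, and the sketch is not expected to reproduce the per-patient means reported in the abstract, which come from the authors' full minute-by-minute pipeline.

```python
# Illustrative sketch: combining per-episode non-adherence scores into
# the two combinatorial metrics A = duration / non-adherence and
# B = duration * non-adherence. Input format is an assumption.

def adherence_metrics(episodes):
    """episodes: list of (duration_minutes, non_adherence_fraction).

    Returns the mean of A and the mean of B across episodes.
    A grows for long, mildly non-adherent periods; B grows for long,
    severely non-adherent periods.
    """
    a_values = [d / na for d, na in episodes if na > 0]
    b_values = [d * na for d, na in episodes]

    def mean(xs):
        return sum(xs) / len(xs) if xs else 0.0

    return mean(a_values), mean(b_values)

# One patient's two episodes from the abstract:
# 6.25% non-adherence for 42 minutes, 56.25% for 708 minutes.
a, b = adherence_metrics([(42, 0.0625), (708, 0.5625)])
```

    Together the two metrics separate prolonged borderline deviations from shorter but severe ones, which is why the authors report both.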