488 research outputs found

    Improving follow-up of abnormal cancer screens using electronic health records: trust but verify test result communication

    BACKGROUND: Early detection of colorectal cancer through timely follow-up of positive Fecal Occult Blood Tests (FOBTs) remains a challenge. In our previous work, we found that 40% of positive FOBT results eligible for colonoscopy had no documented response by a treating clinician at two weeks, despite procedures for electronic result notification. We determined whether technical and/or workflow-related aspects of automated communication in the electronic health record could lead to this lack of response. METHODS: Using both qualitative and quantitative methods, we evaluated positive FOBT communication in the electronic health record of a large, urban facility between May 2008 and March 2009. We identified the source of test result communication breakdown and developed an intervention to fix the problem. Explicit medical record reviews measured timely follow-up (defined as response within 30 days of a positive FOBT) pre- and post-intervention. RESULTS: Data from 11 interviews and tracking information from 490 FOBT alerts revealed that the software intended to alert primary care practitioners (PCPs) of positive FOBT results was not configured correctly, and over a third of positive FOBTs were not transmitted to PCPs. Upon correction of the technical problem, lack of timely follow-up decreased immediately from 29.9% to 5.4% (p < 0.01) and was sustained at month 4 following the intervention. CONCLUSION: Electronic communication of positive FOBT results should be monitored to avoid limiting colorectal cancer screening benefits. Robust quality assurance and oversight systems are needed to achieve this. Our methods may be useful for others seeking to improve follow-up of FOBTs in their systems.
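    The reported pre/post comparison (29.9% vs. 5.4% lacking timely follow-up, p < 0.01) can be sanity-checked with a standard two-proportion z-test. The sketch below is illustrative only: the cohorts of 200 positive FOBTs per period are assumed, since the abstract does not report the denominators behind this comparison.

```python
# Hypothetical sketch: two-proportion z-test of the kind that could underlie
# the reported drop in lack of timely follow-up (29.9% -> 5.4%, p < 0.01).
# Group sizes below are illustrative assumptions, not figures from the study.
from math import sqrt, erf

def two_proportion_z(x1, n1, x2, n2):
    """Return (z, two-sided p) for H0: p1 == p2, using a pooled estimate."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p from the standard normal CDF, Phi(x) = (1 + erf(x/sqrt(2)))/2
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Assumed cohorts of 200 positive FOBTs each, pre- and post-intervention.
z, p = two_proportion_z(round(0.299 * 200), 200, round(0.054 * 200), 200)
print(f"z = {z:.2f}, p = {p:.2e}")
```

Under these assumed denominators the drop is significant well beyond the p < 0.01 threshold the study reports.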

    Improving outpatient safety through effective electronic communication: a study protocol

    Background: Health information technology and electronic medical records (EMRs) are potentially powerful systems-based interventions to facilitate diagnosis and treatment because they ensure the delivery of key new findings and other health-related information to the practitioner. However, effective communication involves more than just information transfer; despite a state-of-the-art EMR system, communication breakdowns can still occur [1-3]. In this project, we will adapt a model developed by the Systems Engineering Initiative for Patient Safety (SEIPS) to understand and improve the relationship between work systems and processes of care involved with electronic communication in EMRs. We plan to study three communication activities in the Veterans Health Administration's (VA) EMR: electronic communication of abnormal imaging and laboratory test results via automated notifications (i.e., alerts); electronic referral requests; and provider-to-pharmacy communication via computerized provider order entry (CPOE). Aim: Our specific aim is to propose a protocol to evaluate the systems and processes affecting outcomes of electronic communication in the computerized patient record system (related to diagnostic test results, electronic referral requests, and CPOE prescriptions) using a human factors engineering approach, and hence guide the development of interventions for work system redesign. Design: This research will consist of multiple qualitative methods of task analysis to identify potential sources of error related to diagnostic test result alerts, electronic referral requests, and CPOE; this will be followed by a series of focus groups to identify barriers, facilitators, and suggestions for improving the electronic communication system. Transcripts from all task analyses and focus groups will be analyzed using methods adapted from grounded theory and content analysis.

    Understanding the management of electronic test result notifications in the outpatient setting

    Background: Notifying clinicians about abnormal test results through electronic health record (EHR)-based "alert" notifications may not always lead to timely follow-up of patients. We sought to understand barriers, facilitators, and potential interventions for safe and effective management of abnormal test result delivery via electronic alerts. Methods: We conducted a qualitative study consisting of six 6-8 member focus groups (N = 44) at two large, geographically dispersed Veterans Affairs facilities. Participants included full-time primary care providers and personnel representing diagnostic services (radiology, laboratory) and information technology. We asked participants to discuss barriers, facilitators, and suggestions for improving timely management and follow-up of abnormal test result notifications, and encouraged them to consider technological issues as well as broader, human-factors-related aspects of EHR use such as organizational, personnel, and workflow factors. Results: Providers reported receiving a large number of alerts containing information unrelated to abnormal test results, many of which were believed to be unnecessary. Some providers also reported lacking proficiency in the use of certain EHR features that would enable them to manage alerts more efficiently. Suggestions for improvement included improving display and tracking processes for critical alerts in the EHR, redesigning clinical workflow, and streamlining policies and procedures related to test result notification. Conclusion: Providers perceive several challenges to fail-safe electronic communication and tracking of abnormal test results. A multi-dimensional approach that addresses technology as well as the many non-technological factors we elicited is essential to design interventions to reduce missed test results in EHRs.

    Efficient Project Delivery Using Lean Principles - An Indian Case Study

    The construction industry in India is growing at a rapid pace. Along with this growth, the industry faces numerous challenges that make project delivery inefficient. Experts believe that capacity constraints in the industry need to be addressed immediately. The government has recommended the ‘introduction of efficient technologies and modern management techniques’ to increase the productivity of the industry. In this context, lean principles can act as a lever to make project delivery more efficient and provide the much-needed impetus to the Indian construction sector. Around the globe, lean principles are showing positive results on projects. Project teams report improvements in construction time, cost, and quality, along with the softer benefits of enhanced collaboration, coordination, and trust within project teams. Can adoption of lean principles provide similar benefits in the Indian construction sector? This research was conducted to answer that question. Using an action research approach, a key lean construction tool called the Last Planner System (LPS) was tested on a large Indian construction project. This work investigates the improvements in project delivery achieved by adopting LPS in the Indian construction sector. Comparison of pre- and post-implementation data demonstrates increased certainty of work-flow and improved schedule compliance, measured through a simple LPS metric called percent plan complete. Explicit improvements in schedule performance are seen during the 8-week LPS implementation, along with implicit improvements in coordination, collaboration, and trust in the project team. This work reports the findings of the LPS implementation on the case study project, outlining the barriers and drivers to adoption, the strategies needed to ensure successful implementation, and a roadmap for implementation. Based on the findings, the authors envision that lean construction can make project delivery more efficient in India.
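    The percent plan complete metric mentioned above has a simple definition: the share of assignments planned for a period that were actually completed. A minimal sketch, with hypothetical weekly task lists (the study's own task data are not reproduced in the abstract):

```python
# Minimal sketch of the Last Planner System metric cited in the abstract:
# percent plan complete (PPC) = completed assignments / planned assignments.
# The weekly task lists below are hypothetical examples.

def percent_plan_complete(planned, completed):
    """PPC for one planning period, as a percentage of planned tasks done."""
    done = sum(1 for task in planned if task in completed)
    return 100.0 * done / len(planned)

planned_week = ["excavate block B2", "fix rebar for slab", "pour slab", "cure slab"]
completed_week = {"excavate block B2", "fix rebar for slab", "pour slab"}
print(f"PPC = {percent_plan_complete(planned_week, completed_week):.0f}%")
# -> PPC = 75%
```

Tracking PPC week over week is what lets a team quantify the "increased certainty of work-flow" the study reports.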

    Effects of combined renin-angiotensin-aldosterone system inhibitor and beta-blocker treatment on outcomes in heart failure with reduced ejection fraction: insights from BIOSTAT-CHF and ASIAN-HF registries

    Background: Angiotensin‐converting enzyme inhibitors (ACEi)/angiotensin receptor blockers (ARB) and β‐blockers are guideline‐recommended first‐line therapies in heart failure (HF) with reduced ejection fraction (HFrEF). Previous studies showed that individual drug classes were under‐dosed in many parts of Europe and Asia. In this study, we investigated the association of combined up‐titration of ACEi/ARBs and β‐blockers with all‐cause mortality and its combination with hospitalization for HF. Methods and results: A total of 6787 HFrEF patients (mean age 62.6 ± 13.2 years, 77.7% men, mean left ventricular ejection fraction 27.7 ± 7.2%) were enrolled in the prospective multinational European (BIOSTAT‐CHF; n = 2100) and Asian (ASIAN‐HF; n = 4687) studies. Outcomes were analysed according to achieved percentage of guideline‐recommended target doses (GRTD) of combination ACEi/ARB and β‐blocker therapy, adjusted for indication bias. Only 14% (n = 981) of patients achieved ≥50% GRTD for both ACEi/ARB and β‐blocker. The best outcomes were observed in patients who achieved 100% GRTD of both ACEi/ARB and β‐blocker [hazard ratio (HR) 0.32, 95% confidence interval (CI) 0.26–0.39 vs. none]. Lower doses of combined therapy were associated with better outcomes than 100% GRTD of either monotherapy. Up‐titrating β‐blockers was associated with a consistent and greater reduction in the hazard of all‐cause mortality (HR for 100% GRTD: 0.40, 95% CI 0.25–0.63) than corresponding ACEi/ARB up‐titration (HR 0.75, 95% CI 0.53–1.07). Conclusion: This study shows that the best outcomes were observed in patients attaining GRTD for both ACEi/ARB and β‐blockers; unfortunately, this was rarely achieved. Achieving >50% GRTD of both drug classes was associated with better outcomes than target dose of monotherapy. Up‐titrating β‐blockers to target dose was associated with greater mortality reduction than up‐titrating ACEi/ARB.

    Global, regional, and national burden of chronic kidney disease, 1990–2017: a systematic analysis for the Global Burden of Disease Study 2017

    Background Health system planning requires careful assessment of chronic kidney disease (CKD) epidemiology, but data for morbidity and mortality of this disease are scarce or non-existent in many countries. We estimated the global, regional, and national burden of CKD, as well as the burden of cardiovascular disease and gout attributable to impaired kidney function, for the Global Burden of Diseases, Injuries, and Risk Factors Study 2017. We use the term CKD to refer to the morbidity and mortality that can be directly attributed to all stages of CKD, and we use the term impaired kidney function to refer to the additional risk of CKD from cardiovascular disease and gout. Methods The main data sources we used were published literature, vital registration systems, end-stage kidney disease registries, and household surveys. Estimates of CKD burden were produced using a Cause of Death Ensemble model and a Bayesian meta-regression analytical tool, and included incidence, prevalence, years lived with disability, mortality, years of life lost, and disability-adjusted life-years (DALYs). A comparative risk assessment approach was used to estimate the proportion of cardiovascular diseases and gout burden attributable to impaired kidney function. Findings Globally, in 2017, 1·2 million (95% uncertainty interval [UI] 1·2 to 1·3) people died from CKD. The global all-age mortality rate from CKD increased 41·5% (95% UI 35·2 to 46·5) between 1990 and 2017, although there was no significant change in the age-standardised mortality rate (2·8%, −1·5 to 6·3). In 2017, 697·5 million (95% UI 649·2 to 752·0) cases of all-stage CKD were recorded, for a global prevalence of 9·1% (8·5 to 9·8). The global all-age prevalence of CKD increased 29·3% (95% UI 26·4 to 32·6) since 1990, whereas the age-standardised prevalence remained stable (1·2%, −1·1 to 3·5). CKD resulted in 35·8 million (95% UI 33·7 to 38·0) DALYs in 2017, with diabetic nephropathy accounting for almost a third of DALYs. 
Most of the burden of CKD was concentrated in the three lowest quintiles of Socio-demographic Index (SDI). In several regions, particularly Oceania, sub-Saharan Africa, and Latin America, the burden of CKD was much higher than expected for the level of development, whereas the disease burden in western, eastern, and central sub-Saharan Africa, east Asia, south Asia, central and eastern Europe, Australasia, and western Europe was lower than expected. 1·4 million (95% UI 1·2 to 1·6) cardiovascular disease-related deaths and 25·3 million (22·2 to 28·9) cardiovascular disease DALYs were attributable to impaired kidney function. Interpretation Kidney disease has a major effect on global health, both as a direct cause of global morbidity and mortality and as an important risk factor for cardiovascular disease. CKD is largely preventable and treatable and deserves greater attention in global health policy decision making, particularly in locations with low and middle SDI.

    Global, regional, and national comparative risk assessment of 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks, 1990-2015: a systematic analysis for the Global Burden of Disease Study 2015

    Background The Global Burden of Diseases, Injuries, and Risk Factors Study 2015 provides an up-to-date synthesis of the evidence for risk factor exposure and the attributable burden of disease. By providing national and subnational assessments spanning the past 25 years, this study can inform debates on the importance of addressing risks in context. Methods We used the comparative risk assessment framework developed for previous iterations of the Global Burden of Disease Study to estimate attributable deaths, disability-adjusted life-years (DALYs), and trends in exposure by age group, sex, year, and geography for 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks from 1990 to 2015. This study included 388 risk-outcome pairs that met World Cancer Research Fund-defined criteria for convincing or probable evidence. We extracted relative risk and exposure estimates from randomised controlled trials, cohorts, pooled cohorts, household surveys, census data, satellite data, and other sources. We used statistical models to pool data, adjust for bias, and incorporate covariates. We developed a metric that allows comparisons of exposure across risk factors—the summary exposure value. Using the counterfactual scenario of theoretical minimum risk level, we estimated the portion of deaths and DALYs that could be attributed to a given risk. We decomposed trends in attributable burden into contributions from population growth, population age structure, risk exposure, and risk-deleted cause-specific DALY rates. We characterised risk exposure in relation to a Socio-demographic Index (SDI). Findings Between 1990 and 2015, global exposure to unsafe sanitation, household air pollution, childhood underweight, childhood stunting, and smoking each decreased by more than 25%. Global exposure for several occupational risks, high body-mass index (BMI), and drug use increased by more than 25% over the same period.
All risks jointly evaluated in 2015 accounted for 57·8% (95% CI 56·6–58·8) of global deaths and 41·2% (39·8–42·8) of DALYs. In 2015, the ten largest contributors to global DALYs among Level 3 risks were high systolic blood pressure (211·8 million [192·7 million to 231·1 million] global DALYs), smoking (148·6 million [134·2 million to 163·1 million]), high fasting plasma glucose (143·1 million [125·1 million to 163·5 million]), high BMI (120·1 million [83·8 million to 158·4 million]), childhood undernutrition (113·3 million [103·9 million to 123·4 million]), ambient particulate matter (103·1 million [90·8 million to 115·1 million]), high total cholesterol (88·7 million [74·6 million to 105·7 million]), household air pollution (85·6 million [66·7 million to 106·1 million]), alcohol use (85·0 million [77·2 million to 93·0 million]), and diets high in sodium (83·0 million [49·3 million to 127·5 million]). From 1990 to 2015, attributable DALYs declined for micronutrient deficiencies, childhood undernutrition, unsafe sanitation and water, and household air pollution; reductions in risk-deleted DALY rates rather than reductions in exposure drove these declines. Rising exposure contributed to notable increases in attributable DALYs from high BMI, high fasting plasma glucose, occupational carcinogens, and drug use. Environmental risks and childhood undernutrition declined steadily with SDI; low physical activity, high BMI, and high fasting plasma glucose increased with SDI. In 119 countries, metabolic risks, such as high BMI and fasting plasma glucose, contributed the most attributable DALYs in 2015. Regionally, smoking still ranked among the leading five risk factors for attributable DALYs in 109 countries; childhood underweight and unsafe sex remained primary drivers of early death and disability in much of sub-Saharan Africa. Interpretation Declines in some key environmental risks have contributed to declines in critical infectious diseases. 
Some risks appear to be invariant to SDI. Increasing risks, including high BMI, high fasting plasma glucose, drug use, and some occupational exposures, contribute to rising burden from some conditions, but also provide opportunities for intervention. Some highly preventable risks, such as smoking, remain major causes of attributable DALYs, even as exposure is declining. Public policy makers need to pay attention to the risks that are increasingly major contributors to global burden. Funding Bill & Melinda Gates Foundation.
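    Both GBD abstracts above rest on the same comparative risk assessment idea: compare the burden under the observed exposure distribution against a counterfactual at the theoretical minimum risk level, and attribute the difference to the risk factor. A minimal sketch of that population attributable fraction (PAF) calculation, with made-up exposure levels and relative risks purely for illustration:

```python
# Sketch of a population attributable fraction (PAF) under a counterfactual
# exposure distribution, in the spirit of the comparative risk assessment
# framework described above. All numbers below are illustrative assumptions,
# not GBD estimates.

def paf(observed, counterfactual, rr):
    """PAF = (sum p*RR - sum p'*RR) / (sum p*RR) over discrete exposure levels."""
    burden_obs = sum(p * r for p, r in zip(observed, rr))
    burden_cf = sum(p * r for p, r in zip(counterfactual, rr))
    return (burden_obs - burden_cf) / burden_obs

# Three exposure levels: none, moderate, high.
observed = [0.5, 0.3, 0.2]  # observed population exposure distribution
tmrel = [1.0, 0.0, 0.0]     # theoretical minimum risk level: no one exposed
rr = [1.0, 1.5, 3.0]        # relative risk of the outcome at each level
print(f"PAF = {paf(observed, tmrel, rr):.2f}")
```

Multiplying such a fraction by total deaths or DALYs for the outcome gives the attributable burden reported in the Findings sections above.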