
    Patient Safety Workshop: A Graduate Medical Education Interprofessional Simulation Half-Day

    Introduction: As per the National Academy of Medicine, patient safety is considered indistinguishable from the delivery of quality health care and is referred to as the foundation upon which all other aspects of quality care are built. Over the years, graduate medical education (GME) across the world has evolved to ensure that the training of future medical professionals includes exposure to many of the elements that compose patient safety, such as implementing root cause analysis, systems thinking, and disclosing adverse events. The University of Texas Rio Grande Valley (UTRGV) is the sponsoring institution for 19 GME programs across different specialties. As part of the orientation for their respective residencies, the GME office designed a workshop to introduce patient safety concepts and skills. We conducted a quality improvement project to assess the workshop and identify strengths and areas for improvement. Methods: The GME office, together with chief residents, program leaders, and pharmacy faculty, developed and delivered a 4-hour multidisciplinary simulation workshop involving internal medicine, family medicine, general surgery, psychiatry, and pharmacy residents during the resident orientation period, June 21-23, 2022. We created groups of 4-6 learners with mixed disciplines. Interventions focused on: (1) root cause analysis; (2) disclosure of patient safety events; (3) identifying patient safety hazards in the inpatient setting; and (4) interdisciplinary communication skills. The workshop included small-group discussion, mannequins, reflection, and role modeling. Participants completed an anonymous pre- and post-workshop survey to determine the effect of the workshop and to seek improvements in GME at UTRGV. Discussion: Patient safety training and education of health care professionals have not kept pace with advances in patient safety or workforce requirements. Internal and national surveys show that residencies struggle to meet the competencies in patient safety, quality improvement, and accountability required by the ACGME. This inaugural UTRGV GME Office patient safety interprofessional simulation workshop attempts to enhance knowledge of and confidence in patient safety concepts. Results will include pre- and post-workshop survey evaluations and will identify next steps.

    Hyperoxemia and excess oxygen use in early acute respiratory distress syndrome: Insights from the LUNG SAFE study

    Background: Concerns exist regarding the prevalence and impact of unnecessary oxygen use in patients with acute respiratory distress syndrome (ARDS). We examined this issue in patients with ARDS enrolled in the Large observational study to UNderstand the Global impact of Severe Acute respiratory FailurE (LUNG SAFE) study. Methods: In this secondary analysis of the LUNG SAFE study, we wished to determine the prevalence and the outcomes associated with hyperoxemia on day 1, sustained hyperoxemia, and excessive oxygen use in patients with early ARDS. Patients who fulfilled the criteria for ARDS on day 1 and day 2 of acute hypoxemic respiratory failure were categorized based on the presence of hyperoxemia (PaO2 > 100 mmHg) on day 1, sustained (i.e., present on day 1 and day 2) hyperoxemia, or excessive oxygen use (FIO2 ≥ 0.60 during hyperoxemia). Results: Of the 2005 patients that met the inclusion criteria, 131 (6.5%) were hypoxemic (PaO2 < 55 mmHg), 607 (30%) had hyperoxemia on day 1, and 250 (12%) had sustained hyperoxemia. Excess FIO2 use occurred in 400 (66%) of the 607 patients with hyperoxemia. Excess FIO2 use decreased from day 1 to day 2 of ARDS, with most hyperoxemic patients on day 2 receiving relatively low FIO2. Multivariate analyses found no independent relationship between day 1 hyperoxemia, sustained hyperoxemia, or excess FIO2 use and adverse clinical outcomes. Mortality was 42% in patients with excess FIO2 use, compared to 39% in a propensity-matched sample of normoxemic (PaO2 55-100 mmHg) patients (P = 0.47). Conclusions: Hyperoxemia and excess oxygen use are both prevalent in early ARDS but are most often non-sustained. No relationship was found between hyperoxemia or excessive oxygen use and patient outcome in this cohort. Trial registration: LUNG SAFE is registered with ClinicalTrials.gov, NCT02010073.
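    The oxygenation categories above reduce to simple threshold rules on PaO2 and FIO2. The sketch below is purely illustrative of those definitions; the function and argument names are hypothetical and are not from the LUNG SAFE analysis code.

```python
# Illustrative sketch of the oxygenation categories defined in the abstract above.
# Thresholds come from the abstract; function and argument names are hypothetical.

def categorize_oxygenation(pao2_day1, pao2_day2, fio2_day1):
    """Classify day-1/day-2 oxygenation status for a patient with early ARDS.

    pao2_day1, pao2_day2 : arterial PaO2 (mmHg) on days 1 and 2
    fio2_day1            : fraction of inspired oxygen on day 1 (0.21-1.0)
    """
    labels = []
    if pao2_day1 < 55:
        labels.append("hypoxemia (day 1)")
    elif pao2_day1 > 100:
        labels.append("hyperoxemia (day 1)")
        if fio2_day1 >= 0.60:
            labels.append("excess oxygen use")      # high FIO2 despite hyperoxemia
        if pao2_day2 > 100:
            labels.append("sustained hyperoxemia")  # hyperoxemic on both days
    else:
        labels.append("normoxemia (PaO2 55-100 mmHg)")
    return labels

print(categorize_oxygenation(pao2_day1=120, pao2_day2=105, fio2_day1=0.70))
# -> ['hyperoxemia (day 1)', 'excess oxygen use', 'sustained hyperoxemia']
```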

    Geochemical evolution of the Quaternary Chachimbiro Volcanic Complex (frontal volcanic arc of Ecuador)

    The Chachimbiro volcanic complex (CVC) is a composite volcano located in the frontal part of the Ecuadorian Quaternary Arc. Four periods of activity can be differentiated in Chachimbiro's ~400 ka-long volcanic history, named CH1 (405.7 ± 20.0–298.6 ± 32.9 ka), CH2 (121.75 ± 23.2–36.08 ± 2.8 ka), CH3 (36080 ± 280–22730 ± 120 ybp) and CH4 (5760 ± 30 ybp), respectively. The magmatic suite ranges in composition from andesites to rhyodacites, and displays systematic increases of SiO2, Sr/Y, La/Yb, and incompatible elements (e.g., LREE, MREE, LILE) and depletion of HREE with time. These geochemical features, coupled with radiogenic isotope data and textural observations (e.g., several resorption and overgrowth cycles in plagioclase phenocrysts), suggest an open-system evolution for the CVC including typical crustal magmatic processes such as recharge, assimilation, and fractional crystallisation. To better constrain the magmatic crustal processes of the CVC, we use a multi-model approach including three open-system models: RFC (Recharge, Fractional Crystallisation), AFC (Assimilation, Fractional Crystallisation), and EC-RAFC (Energy-Constrained Recharge, Assimilation, Fractional Crystallisation). Different simulations have been developed to best fit the CVC's geochemistry (including trace elements, Sr/Y ratios, and Sr isotopes), which suggest predominant fractional crystallisation and variable assimilation as the main processes explaining the bulk of the geochemical data of the CVC. This is consistent with an overall thermal decline of the magmatic system through time. The genetic model presented here involves the input of a magmatic recharge component matured in the low-to-mid crust through dominant fractionation of amphibole (±garnet) into intermediate crustal reservoir(s), where the magma undergoes extensive fractional crystallisation (FC).
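    For readers unfamiliar with the modelling vocabulary above: AFC-type trace-element models are commonly built on the DePaolo (1981) formulation, in which the melt concentration evolves with the remaining melt fraction F, the bulk partition coefficient D, and the ratio r of assimilation to crystallisation rates. The sketch below illustrates that general equation with arbitrary parameter values; it is not the model configuration fitted to the CVC data.

```python
# Minimal sketch of the DePaolo (1981) AFC trace-element equation:
#   C_m = C_m0 * F**(-z) + (r / (r - 1)) * (C_a / z) * (1 - F**(-z)),
#   with z = (r + D - 1) / (r - 1).
# Parameter values below are arbitrary examples, not those fitted for Chachimbiro.

def afc_concentration(c_m0, c_a, D, r, F):
    """Trace-element concentration in a melt undergoing assimilation-fractional crystallisation.

    c_m0 : initial concentration in the melt
    c_a  : concentration in the assimilant
    D    : bulk solid/melt partition coefficient
    r    : assimilation rate / crystallisation rate (r != 1)
    F    : fraction of melt remaining (0 < F <= 1)
    """
    z = (r + D - 1.0) / (r - 1.0)
    return c_m0 * F ** (-z) + (r / (r - 1.0)) * (c_a / z) * (1.0 - F ** (-z))

# Example: an incompatible element (D = 0.1) with modest assimilation (r = 0.3)
for F in (1.0, 0.8, 0.6, 0.4):
    print(f"F = {F:.1f}  C_m = {afc_concentration(20.0, 60.0, 0.1, 0.3, F):.1f} ppm")
```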

    Heme oxygenase-1 and carbon monoxide suppress autoimmune neuroinflammation

    Heme oxygenase-1 (HO-1, encoded by HMOX1) dampens inflammatory reactions via the catabolism of heme into CO, Fe, and biliverdin. We report that expression of HO-1 dictates the pathologic outcome of experimental autoimmune encephalomyelitis (EAE), a model of multiple sclerosis (MS). Induction of EAE in Hmox1(–/–) C57BL/6 mice led to enhanced CNS demyelination, paralysis, and mortality, as compared with Hmox1(+/+) mice. Induction of HO-1 by cobalt protoporphyrin IX (CoPPIX) administration after EAE onset reversed paralysis in C57BL/6 and SJL/J mice and disease relapse in SJL/J mice. These effects were not observed using zinc protoporphyrin IX, which does not induce HO-1. CoPPIX protection was abrogated in Hmox1(–/–) C57BL/6 mice, indicating that CoPPIX acts via HO-1 to suppress EAE progression. The protective effect of HO-1 was associated with inhibition of MHC class II expression by APCs and inhibition of Th and CD8 T cell accumulation, proliferation, and effector function within the CNS. Exogenous CO mimicked these effects, suggesting that CO contributes to the protective action of HO-1. In conclusion, HO-1 or exposure to its end product CO counters autoimmune neuroinflammation and thus might be used therapeutically to treat MS.

    Immunocompromised patients with acute respiratory distress syndrome: Secondary analysis of the LUNG SAFE database

    Background: The aim of this study was to describe data on epidemiology, ventilatory management, and outcome of acute respiratory distress syndrome (ARDS) in immunocompromised patients. Methods: We performed a post hoc analysis on the cohort of immunocompromised patients enrolled in the Large Observational Study to Understand the Global Impact of Severe Acute Respiratory Failure (LUNG SAFE) study. The LUNG SAFE study was an international, prospective study including hypoxemic patients in 459 ICUs from 50 countries across 5 continents. Results: Of 2813 patients with ARDS, 584 (20.8%) were immunocompromised, 38.9% of whom had an unspecified cause. Pneumonia, nonpulmonary sepsis, and noncardiogenic shock were their most common risk factors for ARDS. Hospital mortality was higher in immunocompromised than in immunocompetent patients (52.4% vs 36.2%; p < 0.0001), despite similar severity of ARDS. Decisions regarding limiting life-sustaining measures were significantly more frequent in immunocompromised patients (27.1% vs 18.6%; p < 0.0001). Use of noninvasive ventilation (NIV) as first-line treatment was higher in immunocompromised patients (20.9% vs 15.9%; p = 0.0048), and immunodeficiency remained independently associated with the use of NIV after adjustment for confounders. Forty-eight percent of the patients treated with NIV were intubated, and their mortality was not different from that of the patients invasively ventilated ab initio. Conclusions: Immunosuppression is frequent in patients with ARDS, and infections are the main risk factors for ARDS in these immunocompromised patients. Their management differs from that of immunocompetent patients, particularly the greater use of NIV as first-line ventilation strategy. Compared with immunocompetent subjects, they have higher mortality regardless of ARDS severity as well as a higher frequency of limitation of life-sustaining measures. Nonetheless, nearly half of these patients survive to hospital discharge. Trial registration: ClinicalTrials.gov, NCT02010073. Registered on 12 December 2013.

    Weaning from mechanical ventilation in intensive care units across 50 countries (WEAN SAFE): a multicentre, prospective, observational cohort study

    Background: Current management practices and outcomes in weaning from invasive mechanical ventilation are poorly understood. We aimed to describe the epidemiology, management, timings, risk for failure, and outcomes of weaning in patients requiring at least 2 days of invasive mechanical ventilation. Methods: WEAN SAFE was an international, multicentre, prospective, observational cohort study done in 481 intensive care units in 50 countries. Eligible participants were older than 16 years, admitted to a participating intensive care unit, and receiving mechanical ventilation for 2 calendar days or longer. We defined weaning initiation as the first attempt to separate a patient from the ventilator, successful weaning as no reintubation or death within 7 days of extubation, and weaning eligibility criteria based on positive end-expiratory pressure, fractional concentration of oxygen in inspired air, and vasopressors. The primary outcome was the proportion of patients successfully weaned at 90 days. Key secondary outcomes included weaning duration, timing of weaning events, factors associated with weaning delay and weaning failure, and hospital outcomes. This study is registered with ClinicalTrials.gov, NCT03255109. Findings: Between Oct 4, 2017, and June 25, 2018, 10 232 patients were screened for eligibility, of whom 5869 were enrolled. 4523 (77·1%) patients underwent at least one separation attempt and 3817 (65·0%) patients were successfully weaned from ventilation at day 90. 237 (4·0%) patients were transferred before any separation attempt, 153 (2·6%) were transferred after at least one separation attempt and not successfully weaned, and 1662 (28·3%) died while invasively ventilated. The median time from fulfilling weaning eligibility criteria to first separation attempt was 1 day (IQR 0-4), and 1013 (22·4%) patients had a delay in initiating first separation of 5 or more days. Of the 4523 (77·1%) patients with separation attempts, 2927 (64·7%) had a short wean (≤1 day), 457 (10·1%) had intermediate weaning (2-6 days), 433 (9·6%) required prolonged weaning (≥7 days), and 706 (15·6%) had weaning failure. Higher sedation scores were independently associated with delayed initiation of weaning. Delayed initiation of weaning and higher sedation scores were independently associated with weaning failure. 1742 (31·8%) of 5479 patients died in the intensive care unit and 2095 (38·3%) of 5465 patients died in hospital. Interpretation: In critically ill patients receiving at least 2 days of invasive mechanical ventilation, only 65% were weaned at 90 days. A better understanding of factors that delay the weaning process, such as delays in weaning initiation or excessive sedation levels, might improve weaning success rates. Funding: European Society of Intensive Care Medicine, European Respiratory Society
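    The weaning outcome categories used above follow directly from the stated definitions. The sketch below is a hypothetical illustration of that categorisation; how duration is counted (from the first separation attempt to final liberation) is an assumption here, and the names are not from the WEAN SAFE analysis.

```python
# Illustrative sketch of the weaning outcome categories described in the abstract.
# Field and function names are hypothetical; duration is assumed to run from the
# first separation attempt to final liberation from the ventilator.

def classify_weaning(duration_days, reintubated_or_died_within_7d_of_extubation):
    """Categorise a weaning episode using the abstract's definitions."""
    if reintubated_or_died_within_7d_of_extubation:
        return "weaning failure"
    if duration_days <= 1:
        return "short wean (<=1 day)"
    if duration_days <= 6:
        return "intermediate weaning (2-6 days)"
    return "prolonged weaning (>=7 days)"

print(classify_weaning(0, False))   # short wean (<=1 day)
print(classify_weaning(4, False))   # intermediate weaning (2-6 days)
print(classify_weaning(9, False))   # prolonged weaning (>=7 days)
print(classify_weaning(2, True))    # weaning failure
```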

    Validation and utility of ARDS subphenotypes identified by machine-learning models using clinical data: an observational, multicohort, retrospective analysis

    Background: Two acute respiratory distress syndrome (ARDS) subphenotypes (hyperinflammatory and hypoinflammatory) with distinct clinical and biological features and differential treatment responses have been identified using latent class analysis (LCA) in seven individual cohorts. To facilitate bedside identification of subphenotypes, clinical classifier models using readily available clinical variables have been described in four randomised controlled trials. We aimed to assess the performance of these models in observational cohorts of ARDS. Methods: In this observational, multicohort, retrospective study, we validated two machine-learning clinical classifier models for assigning ARDS subphenotypes in two observational cohorts of patients with ARDS: Early Assessment of Renal and Lung Injury (EARLI; n=335) and Validating Acute Lung Injury Markers for Diagnosis (VALID; n=452), with LCA-derived subphenotypes as the gold standard. The primary model comprised only vital signs and laboratory variables, and the secondary model comprised all predictors in the primary model, with the addition of ventilatory variables and demographics. Model performance was assessed by calculating the area under the receiver operating characteristic curve (AUC) and calibration plots, and by assigning subphenotypes using a probability cutoff value of 0·5 to determine the sensitivity, specificity, and accuracy of the assignments. We also assessed the performance of the primary model in EARLI using data automatically extracted from an electronic health record (EHR; EHR-derived EARLI cohort). In the Large Observational Study to Understand the Global Impact of Severe Acute Respiratory Failure (LUNG SAFE; n=2813), a multinational, observational ARDS cohort, we applied a custom classifier model (with fewer variables than the primary model) to determine the prognostic value of the subphenotypes and tested their interaction with the positive end-expiratory pressure (PEEP) strategy, with 90-day mortality as the dependent variable. Findings: The primary clinical classifier model had an AUC of 0·92 (95% CI 0·90–0·95) in EARLI and 0·88 (0·84–0·91) in VALID. Performance of the primary model was similar when using exclusively EHR-derived predictors compared with manually curated predictors (AUC 0·88 [95% CI 0·81–0·94] vs 0·92 [0·88–0·97]). In LUNG SAFE, 90-day mortality was higher in patients assigned the hyperinflammatory subphenotype than in those with the hypoinflammatory subphenotype (414 [57%] of 725 vs 694 [33%] of 2088; p<0·0001). There was a significant treatment interaction between PEEP strategy and ARDS subphenotype (p=0·041), with lower 90-day mortality in the high-PEEP group of patients with the hyperinflammatory subphenotype (hyperinflammatory subphenotype: 169 [54%] of 313 patients in the high-PEEP group vs 127 [62%] of 205 patients in the low-PEEP group; hypoinflammatory subphenotype: 231 [34%] of 675 patients in the high-PEEP group vs 233 [32%] of 734 patients in the low-PEEP group). Interpretation: Classifier models using clinical variables alone can accurately assign ARDS subphenotypes in observational cohorts. Application of these models can provide valuable prognostic information and, once prospectively validated, could inform management strategies for personalised treatment, including application of PEEP. Funding: US National Institutes of Health and European Society of Intensive Care Medicine.
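    As a rough illustration of the evaluation described above (assigning a subphenotype with a 0·5 probability cutoff and scoring the model by AUC against LCA-derived labels), the sketch below uses a generic logistic regression on synthetic data; it is not the published classifier, and the predictors and data are placeholders.

```python
# Illustrative sketch of classifier-based subphenotype assignment and evaluation,
# as described in the abstract. The model, features, and data are placeholders,
# not the published EARLI/VALID classifier models.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, accuracy_score

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 6))                  # hypothetical vital-sign/lab predictors
# Stand-in for LCA-derived labels: 1 = hyperinflammatory, 0 = hypoinflammatory.
y = (X[:, :3].sum(axis=1) + rng.normal(scale=1.0, size=n) > 1.0).astype(int)

model = LogisticRegression().fit(X, y)
prob_hyper = model.predict_proba(X)[:, 1]    # predicted probability of hyperinflammatory

assigned = (prob_hyper >= 0.5).astype(int)   # 0.5 probability cutoff, as in the study

print("AUC:", round(roc_auc_score(y, prob_hyper), 3))
print("Accuracy at 0.5 cutoff:", round(accuracy_score(y, assigned), 3))
```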