32 research outputs found

    Hyperoxemia and excess oxygen use in early acute respiratory distress syndrome: Insights from the LUNG SAFE study

    Background: Concerns exist regarding the prevalence and impact of unnecessary oxygen use in patients with acute respiratory distress syndrome (ARDS). We examined this issue in patients with ARDS enrolled in the Large observational study to UNderstand the Global impact of Severe Acute respiratory FailurE (LUNG SAFE) study. Methods: In this secondary analysis of the LUNG SAFE study, we wished to determine the prevalence and the outcomes associated with hyperoxemia on day 1, sustained hyperoxemia, and excessive oxygen use in patients with early ARDS. Patients who fulfilled criteria of ARDS on day 1 and day 2 of acute hypoxemic respiratory failure were categorized based on the presence of hyperoxemia (PaO2 > 100 mmHg) on day 1, sustained (i.e., present on day 1 and day 2) hyperoxemia, or excessive oxygen use (FIO2 ≥ 0.60 during hyperoxemia). Results: Of 2005 patients who met the inclusion criteria, 131 (6.5%) were hypoxemic (PaO2 < 55 mmHg), 607 (30%) had hyperoxemia on day 1, and 250 (12%) had sustained hyperoxemia. Excess FIO2 use occurred in 400 (66%) of 607 patients with hyperoxemia. Excess FIO2 use decreased from day 1 to day 2 of ARDS, with most hyperoxemic patients on day 2 receiving relatively low FIO2. Multivariate analyses found no independent relationship between day 1 hyperoxemia, sustained hyperoxemia, or excess FIO2 use and adverse clinical outcomes. Mortality was 42% in patients with excess FIO2 use, compared to 39% in a propensity-matched sample of normoxemic (PaO2 55-100 mmHg) patients (P = 0.47). Conclusions: Hyperoxemia and excess oxygen use are both prevalent in early ARDS but are most often non-sustained. No relationship was found between hyperoxemia or excessive oxygen use and patient outcome in this cohort. Trial registration: LUNG-SAFE is registered with ClinicalTrials.gov, NCT02010073.
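The oxygenation categories in this abstract follow explicit numeric thresholds, which can be sketched as a small classification routine. This is an illustrative encoding of the definitions stated above (hypoxemia PaO2 < 55 mmHg, normoxemia 55-100 mmHg, hyperoxemia > 100 mmHg, excess oxygen use FIO2 ≥ 0.60 during hyperoxemia); the function and argument names are not taken from the study's analysis code.

```python
def classify_oxygenation(pao2_mmhg, fio2):
    """Classify a patient-day per the study's stated thresholds.

    Returns the oxygenation category and whether the patient-day
    qualifies as excess oxygen use (hyperoxemia with FIO2 >= 0.60).
    """
    if pao2_mmhg < 55:
        category = "hypoxemia"
    elif pao2_mmhg <= 100:
        category = "normoxemia"
    else:
        category = "hyperoxemia"
    excess_o2_use = category == "hyperoxemia" and fio2 >= 0.60
    return category, excess_o2_use

# A hyperoxemic patient-day on a high inspired-oxygen fraction
print(classify_oxygenation(120, 0.7))  # ('hyperoxemia', True)
```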

    Prevalence, associated factors and outcomes of pressure injuries in adult intensive care unit patients: the DecubICUs study

    Funding: European Society of Intensive Care Medicine (doi: http://dx.doi.org/10.13039/501100013347); Flemish Society for Critical Care Nurses. Purpose: Intensive care unit (ICU) patients are particularly susceptible to developing pressure injuries, yet epidemiologic data are lacking. We aimed to provide an international picture of the extent of pressure injuries and of the factors associated with ICU-acquired pressure injuries in adult ICU patients. Methods: International 1-day point-prevalence study; follow-up for outcome assessment until hospital discharge (maximum 12 weeks). Factors associated with ICU-acquired pressure injury and hospital mortality were assessed by generalised linear mixed-effects regression analysis. Results: Data from 13,254 patients in 1117 ICUs (90 countries) revealed 6747 pressure injuries; 3997 (59.2%) were ICU-acquired. Overall prevalence was 26.6% (95% confidence interval [CI] 25.9–27.3). ICU-acquired prevalence was 16.2% (95% CI 15.6–16.8). Sacrum (37%) and heels (19.5%) were most affected. Factors independently associated with ICU-acquired pressure injuries were older age, male sex, being underweight, emergency surgery, higher Simplified Acute Physiology Score II, Braden score < 19, ICU stay > 3 days, comorbidities (chronic obstructive pulmonary disease, immunodeficiency), organ support (renal replacement, mechanical ventilation on ICU admission), and being in a low- or lower-middle-income economy. Gradually increasing associations with mortality were identified for increasing severity of pressure injury: stage I (odds ratio [OR] 1.5; 95% CI 1.2–1.8), stage II (OR 1.6; 95% CI 1.4–1.9), and stage III or worse (OR 2.8; 95% CI 2.3–3.3). Conclusion: Pressure injuries are common in adult ICU patients. ICU-acquired pressure injuries are associated with mainly intrinsic factors and with mortality. Optimal care standards, increased awareness, appropriate resource allocation, and further research into optimal prevention are pivotal to tackling this important patient safety threat.
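The reported prevalence interval can be sanity-checked with simple binomial arithmetic. The study itself used mixed-effects models, so this normal-approximation confidence interval is only a plausibility check under the assumption of a simple random sample, not a reproduction of the published analysis.

```python
import math

# Overall prevalence reported above: 26.6% of n = 13,254 patients,
# 95% CI 25.9-27.3. Normal (Wald) approximation: p +/- 1.96 * SE.
n = 13254
p = 0.266
se = math.sqrt(p * (1 - p) / n)
lo, hi = p - 1.96 * se, p + 1.96 * se
print(f"{lo:.3f}-{hi:.3f}")  # ~0.258-0.274, close to the reported interval
```

The small discrepancy at the boundaries is expected: the published interval comes from a model accounting for clustering by ICU and country.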

    Let’s Talk About Sepsis


    Nociception Control of Bilateral Single-Shot Erector Spinae Plane Block Compared to No Block in Open Heart Surgery—A Post Hoc Analysis of the NESP Randomized Controlled Clinical Trial

    Background and Objectives: The erector spinae plane block (ESPB) is an analgesic adjunct demonstrated to reduce intraoperative opioid consumption within a Nociception Level (NOL) index-directed anesthetic protocol. We aimed to examine the ESPB effect on the quality of intraoperative nociception control evaluated with the NOL index. Materials and Methods: This is a post hoc analysis of the NESP (Nociception Level Index-Directed Erector Spinae Plane Block in Open Heart Surgery) randomized controlled trial. Eighty-five adult patients undergoing on-pump cardiac surgery were allocated to group 1 (Control, n = 43) and group 2 (ESPB, n = 42). Both groups received general anesthesia. Preoperatively, group 2 received bilateral single-shot ESPB (1.5 mg/kg/side 0.5% ropivacaine mixed with dexamethasone 8 mg/20 mL). Until cardiopulmonary bypass (CPB) was initiated, fentanyl administration was individualized using the NOL index. The NOL index was compared at five time points: pre-incision (T1), post-incision (T2), pre-sternotomy (T3), post-sternotomy (T4), and pre-CPB (T5). On a scale from 0 (no nociception) to 100 (extreme nociception), a NOL index > 25 was considered an inadequate response to noxious stimuli. Results: The average NOL index across the five time points in group 2 vs. group 1 was 12.78 ± 0.8 vs. 24.18 ± 0.79 (p < 0.001). Significantly fewer group 2 patients had a NOL index > 25 at T2 (4.7% vs. 79%), T3 (0% vs. 37.2%), and T4 (7.1% vs. 79%) (p < 0.001). Conclusions: The addition of bilateral single-shot ESPB to general anesthesia during cardiac surgery improved the quality of intraoperative nociception control according to a NOL index-based evaluation.
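The abstract's per-time-point analysis reduces to flagging measurements above the stated threshold. A minimal sketch of that rule, using the definition above (NOL index > 25 on a 0-100 scale means an inadequate response to noxious stimuli); the example values are invented for illustration.

```python
NOL_THRESHOLD = 25  # per the abstract: > 25 = inadequate nociception control

def inadequate_timepoints(nol_by_timepoint):
    """Return the time points where the NOL index exceeded the threshold."""
    return [t for t, nol in nol_by_timepoint.items() if nol > NOL_THRESHOLD]

# Hypothetical single-patient readings at the five study time points
readings = {"T1": 10, "T2": 31, "T3": 18, "T4": 40, "T5": 22}
print(inadequate_timepoints(readings))  # ['T2', 'T4']
```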

    Nociception Level Index-Directed Erector Spinae Plane Block in Open Heart Surgery: A Randomized Controlled Clinical Trial

    Background and Objectives: The erector spinae plane block (ESPB) is a multimodal opioid-sparing component, providing chest-wall analgesia of variable extent, duration, and intensity. The objective was to examine the ESPB effect on perioperative opioid usage and postoperative rehabilitation when used within a Nociception Level (NOL) index-directed anesthetic protocol. Materials and Methods: This prospective, randomized, controlled, open-label study was performed in adult patients undergoing on-pump cardiac surgery in a single tertiary hospital. Eighty-three adult patients who met eligibility criteria were randomly allocated to group 1 (Control, n = 43) and group 2 (ESPB, n = 40) and received general anesthesia with NOL index-directed fentanyl dosing. Preoperatively, group 2 also received bilateral single-shot ultrasound-guided ESPB (1.5 mg/kg/side 0.5% ropivacaine mixed with dexamethasone 8 mg/20 mL). Postoperatively, both groups received intravenous paracetamol (1 g every 6 h). Morphine (0.03 mg/kg) was administered for numeric rating scale (NRS) scores ≥ 4. Results: The median (IQR, 25th–75th percentiles) intraoperative fentanyl and 48 h morphine dose in group 2 vs. group 1 were 1.2 (1.1–1.5) vs. 4.5 (3.8–5.5) µg·kg⁻¹·h⁻¹ (p < 0.001) and 22.1 (0–40.4) vs. 60.6 (40–95.7) µg/kg (p < 0.001). The median (IQR) time to extubation in group 2 vs. group 1 was 90 (60–105) vs. 360 (285–510) min (p < 0.001). Two hours after ICU admission, 87.5% of ESPB patients were extubated compared to 0% of controls (p < 0.001), and 87.5% were weaned off norepinephrine compared to 46.5% of controls (p < 0.001). The median NRS scores at 0, 6, 12, 24, and 48 h after extubation were significantly decreased in group 2. There was no difference in opioid-related adverse events and length of stay.
Conclusions: NOL index-directed ESPB reduced intraoperative fentanyl by 73.3% and 48 h morphine by 63.5%. It also hastened extubation and liberation from vasopressor support and improved postoperative analgesia.
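The percentage reductions quoted in the conclusion follow directly from the median doses in the results, and the arithmetic can be checked. This is illustrative arithmetic over the reported medians only, not a re-analysis of trial data.

```python
# Median doses reported in the Results section
fentanyl_espb, fentanyl_ctrl = 1.2, 4.5    # intraoperative fentanyl, ug/kg/h
morphine_espb, morphine_ctrl = 22.1, 60.6  # 48 h morphine, ug/kg

# Relative reduction = (control - ESPB) / control
fentanyl_reduction = (fentanyl_ctrl - fentanyl_espb) / fentanyl_ctrl * 100
morphine_reduction = (morphine_ctrl - morphine_espb) / morphine_ctrl * 100

print(round(fentanyl_reduction, 1), round(morphine_reduction, 1))  # 73.3 63.5
```

Both values match the 73.3% and 63.5% reductions stated in the conclusion.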

    Immunocompromised patients with acute respiratory distress syndrome: Secondary analysis of the LUNG SAFE database

    Background: The aim of this study was to describe data on epidemiology, ventilatory management, and outcome of acute respiratory distress syndrome (ARDS) in immunocompromised patients. Methods: We performed a post hoc analysis on the cohort of immunocompromised patients enrolled in the Large Observational Study to Understand the Global Impact of Severe Acute Respiratory Failure (LUNG SAFE) study. The LUNG SAFE study was an international, prospective study including hypoxemic patients in 459 ICUs from 50 countries across 5 continents. Results: Of 2813 patients with ARDS, 584 (20.8%) were immunocompromised, 38.9% of whom had an unspecified cause. Pneumonia, nonpulmonary sepsis, and noncardiogenic shock were their most common risk factors for ARDS. Hospital mortality was higher in immunocompromised than in immunocompetent patients (52.4% vs 36.2%; p < 0.0001), despite similar severity of ARDS. Decisions regarding limiting life-sustaining measures were significantly more frequent in immunocompromised patients (27.1% vs 18.6%; p < 0.0001). Use of noninvasive ventilation (NIV) as first-line treatment was higher in immunocompromised patients (20.9% vs 15.9%; p = 0.0048), and immunodeficiency remained independently associated with the use of NIV after adjustment for confounders. Forty-eight percent of the patients treated with NIV were intubated, and their mortality was not different from that of the patients invasively ventilated ab initio. Conclusions: Immunosuppression is frequent in patients with ARDS, and infections are the main risk factors for ARDS in these immunocompromised patients. Their management differs from that of immunocompetent patients, particularly in the greater use of NIV as first-line ventilation strategy. Compared with immunocompetent subjects, they have higher mortality regardless of ARDS severity as well as a higher frequency of limitation of life-sustaining measures. Nonetheless, nearly half of these patients survive to hospital discharge.
Trial registration: ClinicalTrials.gov, NCT02010073. Registered on 12 December 2013.

    Validation and utility of ARDS subphenotypes identified by machine-learning models using clinical data: an observational, multicohort, retrospective analysis

    Background: Two acute respiratory distress syndrome (ARDS) subphenotypes (hyperinflammatory and hypoinflammatory) with distinct clinical and biological features and differential treatment responses have been identified using latent class analysis (LCA) in seven individual cohorts. To facilitate bedside identification of subphenotypes, clinical classifier models using readily available clinical variables have been described in four randomised controlled trials. We aimed to assess the performance of these models in observational cohorts of ARDS. Methods: In this observational, multicohort, retrospective study, we validated two machine-learning clinical classifier models for assigning ARDS subphenotypes in two observational cohorts of patients with ARDS: Early Assessment of Renal and Lung Injury (EARLI; n=335) and Validating Acute Lung Injury Markers for Diagnosis (VALID; n=452), with LCA-derived subphenotypes as the gold standard. The primary model comprised only vital signs and laboratory variables, and the secondary model comprised all predictors in the primary model, with the addition of ventilatory variables and demographics. Model performance was assessed by calculating the area under the receiver operating characteristic curve (AUC) and calibration plots, and assigning subphenotypes using a probability cutoff value of 0·5 to determine sensitivity, specificity, and accuracy of the assignments. We also assessed the performance of the primary model in EARLI using data automatically extracted from an electronic health record (EHR; EHR-derived EARLI cohort).
In the Large Observational Study to Understand the Global Impact of Severe Acute Respiratory Failure (LUNG SAFE; n=2813), a multinational, observational ARDS cohort, we applied a custom classifier model (with fewer variables than the primary model) to determine the prognostic value of the subphenotypes and tested their interaction with the positive end-expiratory pressure (PEEP) strategy, with 90-day mortality as the dependent variable. Findings: The primary clinical classifier model had an AUC of 0·92 (95% CI 0·90–0·95) in EARLI and 0·88 (0·84–0·91) in VALID. Performance of the primary model was similar when using exclusively EHR-derived predictors compared with manually curated predictors (AUC=0·88 [95% CI 0·81–0·94] vs 0·92 [0·88–0·97]). In LUNG SAFE, 90-day mortality was higher in patients assigned the hyperinflammatory subphenotype than in those with the hypoinflammatory subphenotype (414 [57%] of 725 vs 694 [33%] of 2088; p<0·0001). There was a significant treatment interaction between PEEP strategy and ARDS subphenotype (p=0·041), with lower 90-day mortality in the high PEEP group of patients with the hyperinflammatory subphenotype (hyperinflammatory subphenotype: 169 [54%] of 313 patients in the high PEEP group vs 127 [62%] of 205 patients in the low PEEP group; hypoinflammatory subphenotype: 231 [34%] of 675 patients in the high PEEP group vs 233 [32%] of 734 patients in the low PEEP group). Interpretation: Classifier models using clinical variables alone can accurately assign ARDS subphenotypes in observational cohorts. Application of these models can provide valuable prognostic information and could inform management strategies for personalised treatment, including application of PEEP, once prospectively validated. Funding: US National Institutes of Health and European Society of Intensive Care Medicine
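The subphenotype assignment step described in the Methods reduces to thresholding a classifier's predicted probability at 0·5. A minimal sketch of that final step; the fitted model itself (over vital signs and laboratory variables) is not reproduced here, so `prob_hyperinflammatory` is a hypothetical stand-in for its output.

```python
def assign_subphenotype(prob_hyperinflammatory, cutoff=0.5):
    """Assign an ARDS subphenotype from a classifier's predicted
    probability of the hyperinflammatory class, using the 0.5 cutoff
    described in the study's Methods."""
    if prob_hyperinflammatory >= cutoff:
        return "hyperinflammatory"
    return "hypoinflammatory"

# Hypothetical predicted probabilities for three patients
for prob in (0.72, 0.50, 0.18):
    print(prob, assign_subphenotype(prob))
```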