Hyperoxemia and excess oxygen use in early acute respiratory distress syndrome: Insights from the LUNG SAFE study
Background: Concerns exist regarding the prevalence and impact of unnecessary oxygen use in patients with acute respiratory distress syndrome (ARDS). We examined this issue in patients with ARDS enrolled in the Large observational study to UNderstand the Global impact of Severe Acute respiratory FailurE (LUNG SAFE) study. Methods: In this secondary analysis of the LUNG SAFE study, we wished to determine the prevalence of, and the outcomes associated with, hyperoxemia on day 1, sustained hyperoxemia, and excessive oxygen use in patients with early ARDS. Patients who fulfilled criteria for ARDS on day 1 and day 2 of acute hypoxemic respiratory failure were categorized based on the presence of hyperoxemia (PaO2 > 100 mmHg) on day 1, sustained (i.e., present on day 1 and day 2) hyperoxemia, or excessive oxygen use (FIO2 ≥ 0.60 during hyperoxemia). Results: Of 2005 patients who met the inclusion criteria, 131 (6.5%) were hypoxemic (PaO2 < 55 mmHg), 607 (30%) had hyperoxemia on day 1, and 250 (12%) had sustained hyperoxemia. Excess FIO2 use occurred in 400 (66%) of the 607 patients with hyperoxemia. Excess FIO2 use decreased from day 1 to day 2 of ARDS, with most hyperoxemic patients on day 2 receiving relatively low FIO2. Multivariate analyses found no independent relationship between day 1 hyperoxemia, sustained hyperoxemia, or excess FIO2 use and adverse clinical outcomes. Mortality was 42% in patients with excess FIO2 use, compared to 39% in a propensity-matched sample of normoxemic (PaO2 55-100 mmHg) patients (P = 0.47). Conclusions: Hyperoxemia and excess oxygen use are both prevalent in early ARDS but are most often non-sustained. No relationship was found between hyperoxemia or excessive oxygen use and patient outcome in this cohort. Trial registration: LUNG SAFE is registered with ClinicalTrials.gov, NCT02010073.
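A minimal illustrative sketch (not taken from the study's analysis code) of the day-1 oxygenation categories and excess-oxygen rule defined above; the function and variable names are assumptions, with PaO2 in mmHg and FIO2 as a fraction:

```python
# Illustrative sketch only: apply the day-1 oxygenation categories and the
# excess-oxygen rule stated in the abstract. Function and variable names are
# hypothetical; PaO2 is in mmHg and FIO2 is a fraction (0-1).

def classify_day1_oxygenation(pao2: float, fio2: float) -> dict:
    """Return the day-1 oxygenation category and an excess-FIO2 flag."""
    if pao2 < 55:
        category = "hypoxemia"      # PaO2 < 55 mmHg
    elif pao2 > 100:
        category = "hyperoxemia"    # PaO2 > 100 mmHg
    else:
        category = "normoxemia"     # PaO2 55-100 mmHg
    # Excessive oxygen use: FIO2 >= 0.60 while the patient is hyperoxemic.
    excess_fio2 = category == "hyperoxemia" and fio2 >= 0.60
    return {"category": category, "excess_fio2": excess_fio2}

# Example: PaO2 120 mmHg on FIO2 0.70 -> hyperoxemia with excess oxygen use.
print(classify_day1_oxygenation(120, 0.70))
```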
Defining the causes of sporadic Parkinson's disease in the global Parkinson's genetics program (GP2)
The Global Parkinson's Genetics Program (GP2) will genotype over 150,000 participants from around the world and integrate genetic and clinical data for use in large-scale analyses to dramatically expand our understanding of the genetic architecture of PD. This report details the workflow for cohort integration into the complex arm of GP2 and, together with our outline of the monogenic hub in a companion paper, provides a generalizable blueprint for establishing large-scale collaborative research consortia.
Multi-ancestry genome-wide association meta-analysis of Parkinson's disease
Although over 90 independent risk variants have been identified for Parkinson's disease (PD) using genome-wide association studies, most studies have been performed in just one population at a time. Here we performed a large-scale multi-ancestry meta-analysis of Parkinson's disease with 49,049 cases, 18,785 proxy cases and 2,458,063 controls including individuals of European, East Asian, Latin American and African ancestry. In this meta-analysis, we identified 78 independent genome-wide significant loci, including 12 potentially novel loci (MTF2, PIK3CA, ADD1, SYBU, IRS2, USP8, PIGL, FASN, MYLK2, USP25, EP300 and PPP6R2) and fine-mapped 6 putative causal variants at 6 known PD loci. By combining our results with publicly available eQTL data, we identified 25 putative risk genes in these novel loci whose expression is associated with PD risk. This work lays the groundwork for future efforts aimed at identifying PD loci in non-European populations.
Bacteriological and Physicochemical Assessment of Water from Student Hostels of Osun State University, Main Campus, Osogbo, Southwest Nigeria
This study was conducted to investigate the potability of 15 samples of water: three each from boreholes (BH1-BH3) and hand-dug wells (HD1-HD3), and nine brands of sachet water (SW1-SW9) regularly patronized by students of Osun State University, main campus, residing in private hostels in the Osogbo metropolis. The objective of the study was to determine the quality of these water samples. Borehole and well water samples from selected areas, and samples of sachet water regularly vended by different manufacturers and vendors, were collected and subjected to physical, chemical and bacteriological analyses. For the physical and chemical analyses, a Wagtech photometer plus chemical reagents were used. Total heterotrophic bacteria, total coliforms, and faecal coliforms in the water samples were enumerated using, respectively, the pour plate method, membrane filtration, and growth on MacConkey agar as well as Eosin Methylene Blue agar. The results showed that all of the sachet water samples exhibited values of physical/organoleptic parameters, inorganic constituents, and mean coliform and E. coli counts per 100 ml below the WHO/SON maximum permissible levels, and they were therefore considered safe for drinking. There were slightly elevated levels of iron in water samples from the borehole category, BH2 and BH3, with no known health impacts, because iron is an essential element in human nutrition: taste is not usually noticeable at iron concentrations below 0.3 mg/l, and although iron concentrations of 1-3 mg/l can be acceptable for people drinking well water, no health-based guideline value for iron has been proposed. However, there were slightly elevated nitrate levels in samples from hand-dug wells HD1 and HD3, suggesting that these water sources were not safe for consumption by infants under three months old. In addition, evidence of faecal coliforms in water samples from hand-dug wells HD1 and HD2 suggests that these were not safe for drinking. It is recommended that water from hand-dug wells be boiled before consumption to ensure public health and safety.
Perceptions and Use of Antimicrobials Among Staff of a University Community in Southwestern Nigeria
Public attitude toward and knowledge of antibiotics are determinants of the rational use of antibiotics and of the prevention of antimicrobial drug resistance. This study assessed the perception and use of antimicrobials among staff members of a university in Southwestern Nigeria. A descriptive cross-sectional study was carried out among 450 staff members of Osun State University in Southwestern Nigeria using a multistage sampling method. Semi-structured, self-administered and pre-tested questionnaires were used for data collection. Data were analyzed using SPSS software, version 17.0. Binary logistic regression models for the outcome variables of composite knowledge and attitude scores toward antimicrobials and their possible predictors were fitted (see the illustrative sketch below), and the level of significance was set at p ≤ .05 with 95% confidence intervals for all inferential analyses. The mean age of respondents was 26.8 (±11.1) years, and 331 (73.6%) had up to tertiary-level education. One hundred and eighty-three (40.7%) and 267 (59.3%) had good and poor knowledge scores, respectively; 175 (38.9%) had a positive attitude whereas 275 (61.1%) had a negative attitude toward the use of antibiotics. A total of 279 (62.0%) were informed about the judicious use of antibiotics, and 398 (88.4%) had used antibiotics in the past year, with ampicillin and cloxacillin combinations being the most commonly used. Eighty-eight (22.1%) used antibiotics for more than 10 days at their last use. Predictors of good knowledge and attitude included age, educational status, and ever having used antibiotics. Inadequate knowledge of and attitudes toward antibiotics were observed, necessitating sustained health education campaigns directed at stakeholders on the rational use of antibiotics, especially toward the prevention of antimicrobial resistance.
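A minimal sketch of the kind of binary logistic regression described above, written in Python (statsmodels) rather than SPSS; the data file and column names (good_knowledge, age, tertiary_education, ever_used_antibiotics) are illustrative assumptions, not the study's actual variables:

```python
# Hypothetical sketch: binary logistic regression of a dichotomized composite
# knowledge score on candidate predictors, mirroring the analysis described
# above. All file and column names are assumptions, not the study's data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey_responses.csv")  # hypothetical survey dataset

# Outcome: good (1) vs poor (0) composite knowledge score.
model = smf.logit(
    "good_knowledge ~ age + C(tertiary_education) + C(ever_used_antibiotics)",
    data=df,
).fit()

print(model.summary())                      # coefficients and p-values (alpha = 0.05)
print(np.exp(model.params))                 # odds ratios for each predictor
print(np.exp(model.conf_int(alpha=0.05)))   # 95% confidence intervals for the odds ratios
```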
Immunocompromised patients with acute respiratory distress syndrome: Secondary analysis of the LUNG SAFE database
Background: The aim of this study was to describe data on epidemiology, ventilatory management, and outcome of acute respiratory distress syndrome (ARDS) in immunocompromised patients. Methods: We performed a post hoc analysis on the cohort of immunocompromised patients enrolled in the Large Observational Study to Understand the Global Impact of Severe Acute Respiratory Failure (LUNG SAFE) study. The LUNG SAFE study was an international, prospective study including hypoxemic patients in 459 ICUs from 50 countries across 5 continents. Results: Of 2813 patients with ARDS, 584 (20.8%) were immunocompromised, 38.9% of whom had an unspecified cause. Pneumonia, nonpulmonary sepsis, and noncardiogenic shock were their most common risk factors for ARDS. Hospital mortality was higher in immunocompromised than in immunocompetent patients (52.4% vs 36.2%; p < 0.0001), despite similar severity of ARDS. Decisions regarding limiting life-sustaining measures were significantly more frequent in immunocompromised patients (27.1% vs 18.6%; p < 0.0001). Use of noninvasive ventilation (NIV) as first-line treatment was higher in immunocompromised patients (20.9% vs 15.9%; p = 0.0048), and immunodeficiency remained independently associated with the use of NIV after adjustment for confounders. Forty-eight percent of the patients treated with NIV were intubated, and their mortality was not different from that of the patients invasively ventilated ab initio. Conclusions: Immunosuppression is frequent in patients with ARDS, and infections are the main risk factors for ARDS in these immunocompromised patients. Their management differs from that of immunocompetent patients, particularly the greater use of NIV as first-line ventilation strategy. Compared with immunocompetent subjects, they have higher mortality regardless of ARDS severity as well as a higher frequency of limitation of life-sustaining measures. Nonetheless, nearly half of these patients survive to hospital discharge. Trial registration: ClinicalTrials.gov, NCT02010073. Registered on 12 December 2013
Elucidating causative gene variants in hereditary Parkinson's disease in the Global Parkinson's Genetics Program (GP2) (vol 9, 100, 2023)
Validation and utility of ARDS subphenotypes identified by machine-learning models using clinical data: an observational, multicohort, retrospective analysis
Background: Two acute respiratory distress syndrome (ARDS) subphenotypes (hyperinflammatory and hypoinflammatory) with distinct clinical and biological features and differential treatment responses have been identified using latent class analysis (LCA) in seven individual cohorts. To facilitate bedside identification of subphenotypes, clinical classifier models using readily available clinical variables have been described in four randomised controlled trials. We aimed to assess the performance of these models in observational cohorts of ARDS. Methods: In this observational, multicohort, retrospective study, we validated two machine-learning clinical classifier models for assigning ARDS subphenotypes in two observational cohorts of patients with ARDS: Early Assessment of Renal and Lung Injury (EARLI; n=335) and Validating Acute Lung Injury Markers for Diagnosis (VALID; n=452), with LCA-derived subphenotypes as the gold standard. The primary model comprised only vital signs and laboratory variables, and the secondary model comprised all predictors in the primary model, with the addition of ventilatory variables and demographics. Model performance was assessed by calculating the area under the receiver operating characteristic curve (AUC) and calibration plots, and by assigning subphenotypes using a probability cutoff value of 0·5 to determine the sensitivity, specificity, and accuracy of the assignments. We also assessed the performance of the primary model in EARLI using data automatically extracted from an electronic health record (EHR; EHR-derived EARLI cohort). In the Large Observational Study to Understand the Global Impact of Severe Acute Respiratory Failure (LUNG SAFE; n=2813), a multinational, observational ARDS cohort, we applied a custom classifier model (with fewer variables than the primary model) to determine the prognostic value of the subphenotypes and tested their interaction with the positive end-expiratory pressure (PEEP) strategy, with 90-day mortality as the dependent variable. Findings: The primary clinical classifier model had an AUC of 0·92 (95% CI 0·90–0·95) in EARLI and 0·88 (0·84–0·91) in VALID. Performance of the primary model was similar when using exclusively EHR-derived predictors compared with manually curated predictors (AUC=0·88 [95% CI 0·81–0·94] vs 0·92 [0·88–0·97]). In LUNG SAFE, 90-day mortality was higher in patients assigned the hyperinflammatory subphenotype than in those with the hypoinflammatory subphenotype (414 [57%] of 725 vs 694 [33%] of 2088; p<0·0001). There was a significant treatment interaction between PEEP strategy and ARDS subphenotype (p=0·041), with lower 90-day mortality in the high PEEP group of patients with the hyperinflammatory subphenotype (hyperinflammatory subphenotype: 169 [54%] of 313 patients in the high PEEP group vs 127 [62%] of 205 patients in the low PEEP group; hypoinflammatory subphenotype: 231 [34%] of 675 patients in the high PEEP group vs 233 [32%] of 734 patients in the low PEEP group). Interpretation: Classifier models using clinical variables alone can accurately assign ARDS subphenotypes in observational cohorts. Application of these models can provide valuable prognostic information and could inform management strategies for personalised treatment, including application of PEEP, once prospectively validated. Funding: US National Institutes of Health and European Society of Intensive Care Medicine.
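As a hedged illustration of the evaluation approach described above (not the authors' code), cutoff-based subphenotype assignment and the associated performance metrics might be computed as follows; the label and probability arrays are made-up values for demonstration:

```python
# Illustrative sketch only: evaluate a classifier that outputs the probability
# of the hyperinflammatory subphenotype against LCA-derived gold-standard
# labels (1 = hyperinflammatory, 0 = hypoinflammatory). Values are made up.
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix

y_true = np.array([1, 0, 0, 1, 0, 1, 0, 0])                      # LCA labels
y_prob = np.array([0.83, 0.12, 0.40, 0.91, 0.05, 0.62, 0.35, 0.58])  # model output

auc = roc_auc_score(y_true, y_prob)          # discrimination (AUC)

# Assign subphenotypes using the 0.5 probability cutoff described above.
y_pred = (y_prob >= 0.5).astype(int)
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
accuracy = (tp + tn) / len(y_true)

print(f"AUC={auc:.2f} sens={sensitivity:.2f} spec={specificity:.2f} acc={accuracy:.2f}")
```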
Death in hospital following ICU discharge: insights from the LUNG SAFE study
Background: To determine the frequency of, and factors associated with, death in hospital following ICU discharge to the ward.
Methods: The Large observational study to UNderstand the Global impact of Severe Acute respiratory FailurE (LUNG SAFE) study was an international, multicenter, prospective cohort study of patients with severe respiratory failure, conducted across 459 ICUs in 50 countries globally. This analysis aimed to understand the frequency of, and factors associated with, death in hospital in patients who survived their ICU stay. We examined outcomes in the subpopulation discharged without limitations of life-sustaining treatments ('treatment limitations') and in the subpopulation with treatment limitations.
Results: Of patients discharged from the ICU with no treatment limitations, 2186 (94%) survived while 142 (6%) died in hospital; of patients with treatment limitations, 118 (61%) survived while 77 (39%) died in hospital. Patients without treatment limitations who died in hospital after ICU discharge were older, more likely to have COPD, immunocompromise or chronic renal failure, and less likely to have trauma as a risk factor for ARDS. Patients who died after ICU discharge were less likely to have received neuromuscular blockade or any adjunctive measure, and had a higher non-pulmonary SOFA score before ICU discharge. A similar pattern was seen in patients with treatment limitations who died in hospital following ICU discharge.
Conclusions: A significant proportion of patients die in hospital following discharge from the ICU, with higher mortality among patients who have limitations of life-sustaining treatments in place. Non-survivors had higher systemic illness severity scores at ICU discharge than survivors.