26 research outputs found

    Hyperoxemia and excess oxygen use in early acute respiratory distress syndrome: Insights from the LUNG SAFE study

    Get PDF
    Background: Concerns exist regarding the prevalence and impact of unnecessary oxygen use in patients with acute respiratory distress syndrome (ARDS). We examined this issue in patients with ARDS enrolled in the Large observational study to UNderstand the Global impact of Severe Acute respiratory FailurE (LUNG SAFE) study. Methods: In this secondary analysis of the LUNG SAFE study, we determined the prevalence of, and the outcomes associated with, hyperoxemia on day 1, sustained hyperoxemia, and excessive oxygen use in patients with early ARDS. Patients who fulfilled ARDS criteria on day 1 and day 2 of acute hypoxemic respiratory failure were categorized based on the presence of hyperoxemia (PaO2 > 100 mmHg) on day 1, sustained (i.e., present on day 1 and day 2) hyperoxemia, or excessive oxygen use (FIO2 ≥ 0.60 during hyperoxemia). Results: Of the 2005 patients who met the inclusion criteria, 131 (6.5%) were hypoxemic (PaO2 < 55 mmHg), 607 (30%) had hyperoxemia on day 1, and 250 (12%) had sustained hyperoxemia. Excess FIO2 use occurred in 400 (66%) of the 607 patients with hyperoxemia. Excess FIO2 use decreased from day 1 to day 2 of ARDS, with most hyperoxemic patients on day 2 receiving relatively low FIO2. Multivariate analyses found no independent relationship between day 1 hyperoxemia, sustained hyperoxemia, or excess FIO2 use and adverse clinical outcomes. Mortality was 42% in patients with excess FIO2 use, compared to 39% in a propensity-matched sample of normoxemic (PaO2 55-100 mmHg) patients (P = 0.47). Conclusions: Hyperoxemia and excess oxygen use are both prevalent in early ARDS but are most often not sustained. No relationship was found between hyperoxemia or excessive oxygen use and patient outcome in this cohort. Trial registration: LUNG SAFE is registered with ClinicalTrials.gov, NCT02010073.
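
    The propensity-matched comparison above pairs each patient with excess FIO2 use to a normoxemic patient with similar baseline characteristics. Below is a minimal sketch of one common approach (logistic-regression propensity scores with greedy 1:1 nearest-neighbor matching); the DataFrame layout, column names, and use of scikit-learn are illustrative assumptions, not the authors' analysis code:

        # Illustrative propensity-score matching sketch (assumed setup, not the
        # LUNG SAFE analysis). `df` holds one row per patient, with a binary
        # exposure column and baseline covariate columns.
        import pandas as pd
        from sklearn.linear_model import LogisticRegression

        def propensity_match(df: pd.DataFrame, exposure: str, covariates: list[str]):
            # Estimate each patient's probability of exposure from baseline covariates.
            model = LogisticRegression(max_iter=1000).fit(df[covariates], df[exposure])
            df = df.assign(ps=model.predict_proba(df[covariates])[:, 1])
            treated = df[df[exposure] == 1]
            controls = df[df[exposure] == 0].copy()
            pairs = []
            # Greedy 1:1 matching without replacement on the propensity score.
            for idx, row in treated.iterrows():
                if controls.empty:
                    break
                j = (controls["ps"] - row["ps"]).abs().idxmin()
                pairs.append((idx, j))
                controls = controls.drop(j)
            return pairs  # outcomes are then compared within the matched pairs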

    Recommendations of the Colombian consensus committee for the management of traumatic brain injury in prehospital, emergency department, surgery, and intensive care (beyond one option for treatment of traumatic brain injury: a stratified protocol [BOOTSTRAP])

    No full text
    Traumatic brain injury (TBI) is a global public health problem. In Colombia, it is estimated that 70% of deaths from violence and 90% of deaths from road traffic accidents are TBI related. In 2014, the Ministry of Health of Colombia funded the development of a clinical practice guideline (CPG) for the diagnosis and treatment of adult patients with severe TBI. A critical barrier to its widespread implementation was identified: the lack of a specific protocol spanning the various levels of resources and complexity across the four treatment phases. The objective of this article is to present the process and the recommendations for the management of patients with TBI in various resource environments, across the treatment phases of prehospital care, emergency department (ED), surgery, and intensive care unit. Using the Delphi methodology, a nationwide consensus of 20 experts in emergency medicine, neurosurgery, prehospital care, and intensive care developed recommendations based on 13 questions for the management of patients with TBI in Colombia. An estimated 80% of the global population live in developing economies, where access to the resources required for optimal treatment is limited, and CPG recommendations are difficult to apply in settings where resources for integrated care are scarce or absent. A mixed-methods consensus, combining evidence review with expert judgments of good clinical practice, can fill such gaps in the application of CPGs. BOOTStraP (Beyond One Option for Treatment of Traumatic Brain Injury: A Stratified Protocol) is intended to be a practical handbook that care providers can use to treat TBI patients with whatever resources are available. Stratifying the recommended interventions according to the resources available at each stage of integrated care is our proposed method for filling gaps in the current evidence and for organizing treatment strategies for different real-life scenarios. We developed 10 management algorithms for building TBI protocols, based on expert consensus, to articulate treatment options in prehospital care, EDs, neurological surgery, and intensive care, independent of the level of resources available for care.

    CNL and aCML should be considered as a single entity based on molecular profiles and outcomes

    Get PDF
    Chronic neutrophilic leukemia (CNL) and atypical chronic myeloid leukemia (aCML) are rare myeloid disorders that are challenging with regard to diagnosis and clinical management. To study the similarities and differences between these disorders, we undertook a multicenter international study of one of the largest case series to date (CNL, n=24; aCML, n=37), focusing on the clinical and mutational profiles (n=53 with molecular data) of these diseases. We found no differences in clinical presentation or outcomes between the two entities. As previously described, both CNL and aCML share a complex mutational profile, with mutations in genes involved in epigenetic regulation, splicing, and signaling pathways. Apart from CSF3R, only EZH2 and TET2 were differentially mutated between them. The molecular profiles support the notion that CNL and aCML are a continuum of the same disease, which may fit best within the myelodysplastic/myeloproliferative neoplasms (MDS/MPN). By multivariate analysis, we identified four high-risk mutated genes: CEBPA (β=2.26, HR=9.54, p=0.003), EZH2 (β=1.12, HR=3.062, p=0.009), NRAS (β=1.29, HR=3.63, p=0.048), and U2AF1 (β=1.75, HR=5.74, p=0.013). Our findings underscore the relevance of molecular risk classification in CNL/aCML, as well as the importance of CSF3R mutations in these diseases.
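
    In a multivariate Cox model of this kind, each hazard ratio is the exponential of its regression coefficient; a worked check (standard Cox relationship, assumed here rather than stated in the abstract):

        \mathrm{HR} = e^{\beta}\,,\qquad \text{e.g. } e^{1.12}\approx 3.06 \text{ (EZH2)},\quad e^{1.75}\approx 5.75 \text{ (U2AF1)}

    The paired β and HR values reported above are consistent with this relationship up to rounding of β.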

    Immunocompromised patients with acute respiratory distress syndrome: Secondary analysis of the LUNG SAFE database

    Get PDF
    Background: The aim of this study was to describe data on epidemiology, ventilatory management, and outcome of acute respiratory distress syndrome (ARDS) in immunocompromised patients. Methods: We performed a post hoc analysis on the cohort of immunocompromised patients enrolled in the Large Observational Study to Understand the Global Impact of Severe Acute Respiratory Failure (LUNG SAFE) study. The LUNG SAFE study was an international, prospective study including hypoxemic patients in 459 ICUs from 50 countries across 5 continents. Results: Of 2813 patients with ARDS, 584 (20.8%) were immunocompromised, 38.9% of whom had an unspecified cause. Pneumonia, nonpulmonary sepsis, and noncardiogenic shock were their most common risk factors for ARDS. Hospital mortality was higher in immunocompromised than in immunocompetent patients (52.4% vs 36.2%; p < 0.0001), despite similar severity of ARDS. Decisions regarding limiting life-sustaining measures were significantly more frequent in immunocompromised patients (27.1% vs 18.6%; p < 0.0001). Use of noninvasive ventilation (NIV) as first-line treatment was higher in immunocompromised patients (20.9% vs 15.9%; p = 0.0048), and immunodeficiency remained independently associated with the use of NIV after adjustment for confounders. Forty-eight percent of the patients treated with NIV were intubated, and their mortality was not different from that of the patients invasively ventilated ab initio. Conclusions: Immunosuppression is frequent in patients with ARDS, and infections are the main risk factors for ARDS in these immunocompromised patients. Their management differs from that of immunocompetent patients, particularly the greater use of NIV as first-line ventilation strategy. Compared with immunocompetent subjects, they have higher mortality regardless of ARDS severity as well as a higher frequency of limitation of life-sustaining measures. Nonetheless, nearly half of these patients survive to hospital discharge. Trial registration: ClinicalTrials.gov, NCT02010073. Registered on 12 December 2013.

    Genetic landscape of 6089 inherited retinal dystrophies affected cases in Spain and their therapeutic and extended epidemiological implications

    No full text
    Inherited retinal diseases (IRDs), defined by dysfunction or progressive loss of photoreceptors, are disorders characterized by marked heterogeneity at both the clinical and genetic levels. Our main goal was to characterize the genetic landscape of IRD in the largest cohort of Spanish patients reported to date. A retrospective hospital-based cross-sectional study was carried out on 6089 IRD-affected individuals (from 4403 unrelated families) referred for genetic testing from all the Spanish autonomous communities. Clinical, demographic, and familial data were collected for each patient, including family pedigree, age of onset of visual symptoms, presence of any systemic findings, and geographical origin. Genetic studies were performed in the 3951 families with available DNA using different molecular techniques. Overall, 53.2% (2100/3951) of the studied families were genetically characterized, and 1549 different likely causative variants in 142 genes were identified. The most common phenotype encountered was retinitis pigmentosa (RP) (55.6% of families, 2447/4403). The most recurrently mutated genes were PRPH2, ABCA4, and RS1 in autosomal dominant (AD), autosomal recessive (AR), and X-linked (XL) non-RP cases, respectively; RHO, USH2A, and RPGR in AD, AR, and XL non-syndromic RP; and USH2A and MYO7A in syndromic IRD. The pathogenic variants c.3386G>T (p.Arg1129Leu) in ABCA4 and c.2276G>T (p.Cys759Phe) in USH2A were the most frequently identified. Our study provides the general landscape of IRD in Spain, reporting the largest cohort presented to date. Our results have important implications for genetic diagnosis, counselling, and new therapeutic strategies, both for the Spanish population and for other related populations.

    Anticoagulant selection in relation to the SAMe-TT2R2 score in patients with atrial fibrillation: The GLORIA-AF registry

    No full text
    Aim: The SAMe-TT2R2 score helps identify patients with atrial fibrillation (AF) likely to have poor anticoagulation control during treatment with vitamin K antagonists (VKAs), and patients with scores >2 might be better managed with a non-vitamin K antagonist oral anticoagulant (NOAC). We hypothesized that in clinical practice, VKAs may be prescribed less frequently to patients with AF and SAMe-TT2R2 scores >2 than to patients with lower scores. Methods and results: We analyzed the Phase III dataset of the Global Registry on Long-Term Oral Antithrombotic Treatment in Patients with Atrial Fibrillation (GLORIA-AF), a large, prospective, global registry of patients with newly diagnosed AF and ≥1 stroke risk factor. We compared baseline clinical characteristics and antithrombotic prescriptions to determine the probability of VKA prescription among anticoagulated patients with baseline SAMe-TT2R2 scores >2 versus ≤2. Among 17,465 anticoagulated patients with AF, 4,828 (27.6%) were prescribed a VKA and 12,637 (72.4%) an NOAC; 11,884 (68.0%) patients had SAMe-TT2R2 scores 0-2 and 5,581 (32.0%) had scores >2. The proportion of patients prescribed a VKA was 28.0% among patients with SAMe-TT2R2 scores >2 and 27.5% among those with scores ≤2. Conclusions: The lack of a clear association between the SAMe-TT2R2 score and anticoagulant selection may be attributed to the relative efficacy and safety profiles of NOACs versus VKAs, as well as to the absence of trial evidence that a SAMe-TT2R2-guided strategy for selecting the type of anticoagulation in NVAF patients has an impact on clinical outcomes of efficacy and safety. The latter hypothesis is currently being tested in a randomized controlled trial. Clinical trial registration: URL: https://www.clinicaltrials.gov/. Unique identifiers: NCT01937377, NCT01468701, and NCT01671007.
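
    For orientation, the score tallies clinical features that predict poor time in therapeutic range on a VKA. Below is a minimal sketch using the commonly published SAMe-TT2R2 components (sex, age, medical history, treatment, tobacco, race); the function and argument names are illustrative, and the exact scheme should be verified against the original score publication:

        # Hedged sketch of the commonly published SAMe-TT2R2 scheme; not code
        # from the GLORIA-AF analysis.
        def same_tt2r2(female: bool, age: int, comorbidities: int,
                       interacting_drugs: bool, tobacco_within_2y: bool,
                       non_caucasian: bool) -> int:
            score = 0
            score += 1 if female else 0              # S:  female sex
            score += 1 if age < 60 else 0            # A:  age < 60 years
            score += 1 if comorbidities >= 2 else 0  # Me: >= 2 listed comorbidities
            score += 1 if interacting_drugs else 0   # T:  interacting drug treatment
            score += 2 if tobacco_within_2y else 0   # T2: tobacco use within 2 years
            score += 2 if non_caucasian else 0       # R2: non-Caucasian race
            return score                             # > 2 suggests poor VKA control

        # Example: a 55-year-old female smoker scores 1 + 1 + 2 = 4 (> 2).
        assert same_tt2r2(True, 55, 0, False, True, False) == 4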

    Noninvasive Ventilation of Patients with Acute Respiratory Distress Syndrome: Insights from the LUNG SAFE Study

    No full text
    Rationale: Noninvasive ventilation (NIV) is increasingly used in patients with acute respiratory distress syndrome (ARDS), but the evidence supporting its use in these patients remains relatively sparse. Objectives: To determine whether, during NIV, the categorization of ARDS severity based on the Berlin PaO2/FIO2 criteria is useful. Methods: The LUNG SAFE (Large Observational Study to Understand the Global Impact of Severe Acute Respiratory Failure) study described the management of patients with ARDS. This substudy examines the current practice of NIV use in ARDS, the utility of the PaO2/FIO2 ratio in classifying patients receiving NIV, and the impact of NIV on outcome. Measurements and Main Results: Of 2,813 patients with ARDS, 436 (15.5%) were managed with NIV on days 1 and 2 following fulfillment of diagnostic criteria. Increasing ARDS severity, classified by the PaO2/FIO2 ratio, was associated with increased intensity of ventilatory support, NIV failure, and intensive care unit (ICU) mortality. NIV failure occurred in 22.2% of patients with mild, 42.3% with moderate, and 47.1% with severe ARDS. Hospital mortality in patients with NIV success and failure was 16.1% and 45.4%, respectively. NIV use was independently associated with increased ICU mortality (hazard ratio, 1.446 [95% confidence interval, 1.159-1.805]), but not with hospital mortality. In a propensity-matched analysis, ICU mortality was higher in NIV-treated than in invasively ventilated patients with a PaO2/FIO2 lower than 150 mm Hg. Conclusions: NIV was used in 15% of patients with ARDS, irrespective of severity category, and seems to be associated with higher ICU mortality in patients with a PaO2/FIO2 lower than 150 mm Hg.
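
    The severity bands above come from the Berlin definition's PaO2/FIO2 cut-offs; a minimal classifier, assuming the standard thresholds of 100, 200, and 300 mm Hg:

        # Berlin ARDS severity from PaO2/FIO2 (mm Hg); standard cut-offs assumed.
        def berlin_severity(pf_ratio: float) -> str:
            if pf_ratio <= 100:
                return "severe"
            if pf_ratio <= 200:
                return "moderate"
            if pf_ratio <= 300:
                return "mild"
            return "above the ARDS oxygenation threshold"

    Note that the 150 mm Hg threshold used in the propensity-matched analysis splits the moderate band rather than following a Berlin boundary.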

    Mechanical ventilation in patients with cardiogenic pulmonary edema: a sub-analysis of the LUNG SAFE study

    No full text
    Patients with acute respiratory failure caused by cardiogenic pulmonary edema (CPE) may require mechanical ventilation that can cause further lung damage. Our aim was to determine the impact of ventilatory settings on CPE mortality. Patients from the LUNG SAFE cohort, a multicenter prospective cohort study of patients undergoing mechanical ventilation, were studied. Relationships between ventilatory parameters and outcomes (ICU discharge/hospital mortality) were assessed using latent mixture analysis and a marginal structural model. From 4499 patients, 391 meeting CPE criteria (median age 70 [interquartile range 59-78], 40% female) were included. ICU and hospital mortality were 34% and 40%, respectively. ICU survivors were younger (67 [57-77] vs 74 [64-80] years, p < 0.001) and had lower driving (12 [8-16] vs 15 [11-17] cmH2O, p < 0.001), plateau (20 [15-23] vs 22 [19-26] cmH2O, p < 0.001) and peak (21 [17-27] vs 26 [20-32] cmH2O, p < 0.001) pressures. Latent mixture analysis of patients receiving invasive mechanical ventilation on ICU day 1 revealed a subgroup ventilated with high pressures, with a lower probability of being discharged alive from the ICU (hazard ratio [HR] 0.79 [95% confidence interval 0.60-1.05], p = 0.103) and increased hospital mortality (HR 1.65 [1.16-2.36], p = 0.005). In a marginal structural model, driving pressures in the first week (HR 1.12 [1.06-1.18], p < 0.001) and tidal volume after day 7 (HR 0.69 [0.52-0.93], p = 0.015) were related to survival. Higher airway pressures in invasively ventilated patients with CPE are related to mortality. These patients may be exposed to an increased risk of ventilator-induced lung injury. Trial registration: ClinicalTrials.gov NCT02010073.
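
    Driving pressure, the parameter most consistently related to survival above, is conventionally defined from the plateau pressure and PEEP (standard definition, assumed here rather than restated in the abstract):

        \Delta P = P_{\text{plat}} - \text{PEEP}

    For example, the survivors' median plateau pressure of 20 cmH2O combined with a hypothetical PEEP of 8 cmH2O gives ΔP = 12 cmH2O, consistent with their reported median driving pressure.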

    Early stage litter decomposition across biomes

    Get PDF
    Litter decomposition releases enormous amounts of carbon to the atmosphere. Numerous large-scale decomposition experiments have been conducted on this fundamental soil process in order to understand the controls on terrestrial carbon transfer to the atmosphere. However, previous studies were mostly based on site-specific litter and methodologies, adding major uncertainty to syntheses, comparisons, and meta-analyses across different experiments and sites. In the TeaComposition initiative, potential litter decomposition is investigated using standardized substrates (Rooibos and Green tea) to compare litter mass loss at 336 sites (ranging from −9 to +26 °C mean annual temperature (MAT) and from 60 to 3113 mm mean annual precipitation (MAP)) across different ecosystems. In this study we tested the effect of climate (temperature and moisture), litter type, and land use on early-stage decomposition (3 months) across nine biomes. We show that litter quality was the predominant controlling factor in early-stage litter decomposition, explaining about 65% of the variability in litter decomposition at a global scale. The effect of climate, on the other hand, was not litter specific and explained <0.5% of the variation for Green tea and 5% for Rooibos tea, and was significant only under unfavorable decomposition conditions (i.e., xeric versus mesic environments). When the data were aggregated at the biome scale, climate played a significant role in the decomposition of both litter types (explaining 64% of the variation for Green tea and 72% for Rooibos tea). No significant effect of land use on early-stage litter decomposition was noted within the temperate biome. Our results indicate that multiple drivers affect early-stage litter mass loss, with litter quality being dominant. To quantify the relative importance of the different drivers over time, long-term studies combined with experimental trials are needed.

    This work was performed within the TeaComposition initiative, carried out by 190 institutions worldwide. We thank Gabrielle Drozdowski for her help with the packaging and shipping of tea, Zora Wessely and Johannes Spiegel for the creative implementation of the acknowledgement card, Josip Dusper for the creative implementation of the graphical abstract, Christine Brendle for the GIS editing, and Marianne Debue for her help with the data cleaning. Further acknowledgements go to Adriana Principe, Melanie Köbel, Pedro Pinho, Thomas Parker, Steve Unger, Jon Gewirtzman, and Margot McKleeven for the implementation of the study at their respective sites. We are very grateful to UNILEVER for sponsoring the Lipton tea bags and to the COST action ClimMani for scientific discussions and for adopting and supporting the idea of TeaComposition as a common metric. The initiative was supported by the following grants: ILTER Initiative Grant, ClimMani Short-Term Scientific Missions Grant (COST action ES1308; COST-STSM-ES1308-36004; COST-STM-ES1308-39006; ES1308-231015-068365), INTERACT (EU H2020 Grant No. 730938), and Austrian Environment Agency (UBA). Franz Zehetner acknowledges the support granted by the Prometeo Project of Ecuador's Secretariat of Higher Education, Science, Technology and Innovation (SENESCYT), as well as the Charles Darwin Foundation for the Galapagos Islands (2190). Ana I. Sousa, Ana I. Lillebø, and Marta Lopes thank CESAM (UID/AMB/50017) for financial support, FCT/MEC through national funds (PIDDAC), and co-funding by FEDER within the PT2020 Partnership Agreement and Compete 2020. The research was also funded by the Portuguese Foundation for Science and Technology, FCT, through SFRH/BPD/107823/2015 (A.I. Sousa), co-funded by POPH/FSE. Thomas Mozdzer thanks US National Science Foundation grant NSF DEB-1557009. Helena C. Serrano thanks Fundação para a Ciência e Tecnologia (UID/BIA/00329/2013). Milan Barna acknowledges the Scientific Grant Agency VEGA (2/0101/18). Anzar A. Khuroo acknowledges financial support under the HIMADRI project from SAC-ISRO, India.
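
    Early-stage mass loss of the kind reported here is often converted to a first-order decomposition rate constant via a single-pool exponential model (a standard assumption in litter studies, not a formula stated in this abstract):

        M(t) = M_0\, e^{-kt} \quad\Longrightarrow\quad k = -\frac{1}{t}\,\ln\frac{M(t)}{M_0}

    For example, a purely illustrative 30% mass loss over t = 0.25 yr would give k = −ln(0.7)/0.25 ≈ 1.4 yr⁻¹.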