
    Suitability of Dried Blood Spots for Accelerating Veterinary Biobank Collections and Identifying Metabolomics Biomarkers With Minimal Resources

    Biomarker discovery using biobank samples collected from veterinary clinics would deliver insights into the diverse population of pets and accelerate diagnostic development. Acquiring, preparing, processing, and storing biofluid samples in sufficient volumes, and at a quality suitable for later analysis with the most suitable discovery methods, remains challenging. Metabolomics analysis is a valuable approach for detecting health/disease phenotypes. Pre-processing changes during preparation of plasma/serum samples can induce variability that may be overcome using dried blood spots (DBSs). We report a proof-of-principle study using UHPLC-MS metabolite fingerprinting of plasma and DBSs acquired from healthy adult dogs and cats (age range 1-9 years), representing four dog breeds (Labrador retriever, Beagle, Petit Basset Griffon Vendeen, and Norfolk terrier) and the British domestic shorthair cat (n = 10 per group). Blood samples (20 and 40 μL) for DBSs were loaded onto filter paper, air-dried at room temperature (3 h), then sealed and stored (4°C for ~72 h) prior to storage at -80°C. Plasma from the same blood draw (250 μL) was prepared and stored at -80°C within 1 h of sampling. Metabolite fingerprinting of the DBSs and plasma produced similar numbers of metabolite features, with similar abilities to discriminate between biological classes and to correctly assign blinded samples. These results provide evidence that DBSs, collected in a manner amenable to in-clinic/in-field processing, are a suitable sample type for biomarker discovery using UHPLC-MS metabolomics. Further, given appropriate owner consent, the volumes tested (20-40 μL) make remnant blood from samples drawn for other reasons available for biobanking and other research activities. Together, these findings make large-scale biobanking of veterinary samples possible, gathering sufficient material sooner and enabling faster identification of biomarkers of interest.
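
    As a hedged illustration of the discrimination step described above, the sketch below applies PCA followed by linear discriminant analysis to a simulated metabolite-feature matrix and scores assignment of held-out ("blinded") samples. It is a minimal sketch under stated assumptions: the data, feature counts, and pipeline choices are hypothetical, not the study's actual workflow.

```python
# Sketch: discriminating biological classes from metabolite-feature
# intensities and assigning held-out ("blinded") samples. Data and
# pipeline are hypothetical stand-ins for the fingerprinting analysis.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
n_per_group, n_features = 10, 500  # e.g., 10 animals per group
groups = ["labrador", "beagle", "pbgv", "norfolk", "dsh_cat"]  # 5 classes

# Hypothetical UHPLC-MS feature intensities with a small class-specific shift.
X = np.vstack([rng.lognormal(3, 1, (n_per_group, n_features)) + i
               for i in range(len(groups))])
y = np.repeat(groups, n_per_group)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
model = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
model.fit(X_tr, y_tr)
print("Blinded-sample assignment accuracy:", model.score(X_te, y_te))
```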

    Hyperoxemia and excess oxygen use in early acute respiratory distress syndrome: Insights from the LUNG SAFE study

    Background: Concerns exist regarding the prevalence and impact of unnecessary oxygen use in patients with acute respiratory distress syndrome (ARDS). We examined this issue in patients with ARDS enrolled in the Large observational study to UNderstand the Global impact of Severe Acute respiratory FailurE (LUNG SAFE) study. Methods: In this secondary analysis of the LUNG SAFE study, we determined the prevalence of, and the outcomes associated with, hyperoxemia on day 1, sustained hyperoxemia, and excessive oxygen use in patients with early ARDS. Patients who fulfilled the criteria for ARDS on day 1 and day 2 of acute hypoxemic respiratory failure were categorized based on the presence of hyperoxemia (PaO2 > 100 mmHg) on day 1, sustained hyperoxemia (i.e., present on both day 1 and day 2), or excessive oxygen use (FIO2 ≥ 0.60 during hyperoxemia). Results: Of the 2005 patients who met the inclusion criteria, 131 (6.5%) were hypoxemic (PaO2 < 55 mmHg), 607 (30%) had hyperoxemia on day 1, and 250 (12%) had sustained hyperoxemia. Excess FIO2 use occurred in 400 (66%) of the 607 patients with hyperoxemia. Excess FIO2 use decreased from day 1 to day 2 of ARDS, with most hyperoxemic patients on day 2 receiving relatively low FIO2. Multivariate analyses found no independent relationship between day 1 hyperoxemia, sustained hyperoxemia, or excess FIO2 use and adverse clinical outcomes. Mortality was 42% in patients with excess FIO2 use, compared with 39% in a propensity-matched sample of normoxemic (PaO2 55-100 mmHg) patients (P = 0.47). Conclusions: Hyperoxemia and excess oxygen use are both prevalent in early ARDS but are most often not sustained. No relationship was found between hyperoxemia or excessive oxygen use and patient outcome in this cohort. Trial registration: LUNG SAFE is registered with ClinicalTrials.gov, NCT02010073.
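
    The oxygenation categories above reduce to simple threshold rules; the function below is a minimal sketch of them, assuming per-patient PaO2 (mmHg) and FIO2 values on each day. The function and field names are illustrative, not taken from the study database.

```python
def classify_oxygenation(pao2_d1, fio2_d1, pao2_d2):
    """Categorize a patient per the definitions above (illustrative sketch).

    pao2_* in mmHg; fio2_d1 as a fraction (0-1).
    """
    hypoxemia = pao2_d1 < 55                        # PaO2 < 55 mmHg on day 1
    hyperoxemia_d1 = pao2_d1 > 100                  # PaO2 > 100 mmHg on day 1
    sustained = hyperoxemia_d1 and pao2_d2 > 100    # hyperoxemic on both days
    excess_o2 = hyperoxemia_d1 and fio2_d1 >= 0.60  # FIO2 >= 0.60 during hyperoxemia
    return {
        "hypoxemia_day1": hypoxemia,
        "hyperoxemia_day1": hyperoxemia_d1,
        "sustained_hyperoxemia": sustained,
        "excess_oxygen_use": excess_o2,
    }

# Example: PaO2 130 mmHg on day 1 at FIO2 0.65, PaO2 110 mmHg on day 2 ->
# hyperoxemia_day1, sustained_hyperoxemia, and excess_oxygen_use are all True.
print(classify_oxygenation(130, 0.65, 110))
```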

    Phosphorus cycling in the North and South Atlantic Ocean subtropical gyres

    Despite similar physical properties, the Northern and Southern Atlantic subtropical gyres have different biogeochemical regimes. The Northern subtropical gyre, which is subject to iron deposition from Saharan dust [1], is depleted in the nutrient phosphate, possibly as a result of iron-enhanced nitrogen fixation [2]. Although phosphate depleted, rates of carbon fixation in the euphotic zone of the North Atlantic subtropical gyre are comparable to those of the South Atlantic subtropical gyre [3], which is not phosphate limited. Here we use the activity of the phosphorus-specific enzyme alkaline phosphatase to show potentially enhanced utilization of dissolved organic phosphorus over much of the North Atlantic subtropical gyre. We find that during the boreal spring up to 30% of primary production in the North Atlantic gyre is supported by dissolved organic phosphorus. Our diagnostics and composite map of the surface distribution of dissolved organic phosphorus in the subtropical Atlantic Ocean reveal shorter residence times in the North Atlantic gyre than in the South Atlantic gyre. We interpret the asymmetry of dissolved organic phosphorus cycling in the two gyres as a consequence of enhanced nitrogen fixation in the North Atlantic Ocean [4], which forces the system towards phosphorus limitation. We suggest that dissolved organic phosphorus utilization may contribute to primary production in other phosphorus-limited ocean settings as well.
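
    As a hedged aside, the residence-time asymmetry described above follows from the standard pool-over-flux estimate (residence time = pool size / removal flux). The snippet below is a minimal sketch of that arithmetic; the DOP pool sizes and utilization rates are hypothetical numbers chosen only to illustrate the direction of the asymmetry, not values from the study.

```python
# Residence time of a nutrient pool: tau = pool size / removal flux.
# All numbers below are hypothetical and purely illustrative.

def residence_time_days(dop_pool_mmol_m2, dop_uptake_mmol_m2_d):
    """Residence time (days) of the DOP pool given its utilization flux."""
    return dop_pool_mmol_m2 / dop_uptake_mmol_m2_d

# Higher AP-mediated DOP utilization in the North shortens residence time
# there, even for a comparable pool size.
north = residence_time_days(dop_pool_mmol_m2=10.0, dop_uptake_mmol_m2_d=0.5)
south = residence_time_days(dop_pool_mmol_m2=12.0, dop_uptake_mmol_m2_d=0.1)

print(f"North Atlantic gyre (enhanced AP activity): ~{north:.0f} days")
print(f"South Atlantic gyre: ~{south:.0f} days")
```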

    Immunocompromised patients with acute respiratory distress syndrome: Secondary analysis of the LUNG SAFE database

    Background: The aim of this study was to describe data on the epidemiology, ventilatory management, and outcome of acute respiratory distress syndrome (ARDS) in immunocompromised patients. Methods: We performed a post hoc analysis of the cohort of immunocompromised patients enrolled in the Large Observational Study to Understand the Global Impact of Severe Acute Respiratory Failure (LUNG SAFE) study. The LUNG SAFE study was an international, prospective study including hypoxemic patients in 459 ICUs from 50 countries across 5 continents. Results: Of 2813 patients with ARDS, 584 (20.8%) were immunocompromised, 38.9% of whom had an unspecified cause. Pneumonia, nonpulmonary sepsis, and noncardiogenic shock were their most common risk factors for ARDS. Hospital mortality was higher in immunocompromised than in immunocompetent patients (52.4% vs 36.2%; p < 0.0001), despite similar severity of ARDS. Decisions regarding limiting life-sustaining measures were significantly more frequent in immunocompromised patients (27.1% vs 18.6%; p < 0.0001). Use of noninvasive ventilation (NIV) as first-line treatment was higher in immunocompromised patients (20.9% vs 15.9%; p = 0.0048), and immunodeficiency remained independently associated with the use of NIV after adjustment for confounders. Forty-eight percent of the patients treated with NIV were intubated, and their mortality was not different from that of the patients invasively ventilated ab initio. Conclusions: Immunosuppression is frequent in patients with ARDS, and infections are the main risk factors for ARDS in these immunocompromised patients. Their management differs from that of immunocompetent patients, particularly in the greater use of NIV as a first-line ventilation strategy. Compared with immunocompetent subjects, they have higher mortality regardless of ARDS severity, as well as a higher frequency of limitation of life-sustaining measures. Nonetheless, nearly half of these patients survive to hospital discharge. Trial registration: ClinicalTrials.gov, NCT02010073. Registered on 12 December 2013.
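
    The "independently associated after adjustment for confounders" claim above is the output of a multivariable model. The snippet below is a minimal sketch of that style of analysis using logistic regression on simulated data; the variable names, confounders, effect sizes, and data are all assumptions for illustration, not the study's actual model or results.

```python
# Sketch: confounder-adjusted association between immunodeficiency and
# first-line NIV use, via logistic regression. Everything here is simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000
immuno = rng.integers(0, 2, n)        # 1 = immunocompromised (hypothetical)
age = rng.normal(62, 15, n)
pf_ratio = rng.normal(150, 50, n)     # PaO2/FIO2 as a severity proxy

# Simulate first-line NIV use with a modest positive effect of immunodeficiency.
logit_p = -1.0 + 0.7 * immuno + 0.01 * (age - 62) - 0.004 * (pf_ratio - 150)
niv = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

df = pd.DataFrame({"niv_first_line": niv, "immunocompromised": immuno,
                   "age": age, "pf_ratio": pf_ratio})

# Odds of first-line NIV, adjusted for age and severity.
fit = smf.logit("niv_first_line ~ immunocompromised + age + pf_ratio",
                data=df).fit(disp=0)
print("Adjusted odds ratio:", np.exp(fit.params["immunocompromised"]).round(2))
```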

    Validation and utility of ARDS subphenotypes identified by machine-learning models using clinical data: an observational, multicohort, retrospective analysis

    Background: Two acute respiratory distress syndrome (ARDS) subphenotypes (hyperinflammatory and hypoinflammatory) with distinct clinical and biological features and differential treatment responses have been identified using latent class analysis (LCA) in seven individual cohorts. To facilitate bedside identification of subphenotypes, clinical classifier models using readily available clinical variables have been described in four randomised controlled trials. We aimed to assess the performance of these models in observational cohorts of ARDS. Methods: In this observational, multicohort, retrospective study, we validated two machine-learning clinical classifier models for assigning ARDS subphenotypes in two observational cohorts of patients with ARDS: Early Assessment of Renal and Lung Injury (EARLI; n=335) and Validating Acute Lung Injury Markers for Diagnosis (VALID; n=452), with LCA-derived subphenotypes as the gold standard. The primary model comprised only vital signs and laboratory variables; the secondary model comprised all predictors in the primary model with the addition of ventilatory variables and demographics. Model performance was assessed by calculating the area under the receiver operating characteristic curve (AUC) and calibration plots, and by assigning subphenotypes using a probability cutoff value of 0·5 to determine the sensitivity, specificity, and accuracy of the assignments. We also assessed the performance of the primary model in EARLI using data automatically extracted from an electronic health record (EHR; EHR-derived EARLI cohort). In the Large Observational Study to Understand the Global Impact of Severe Acute Respiratory Failure (LUNG SAFE; n=2813), a multinational, observational ARDS cohort, we applied a custom classifier model (with fewer variables than the primary model) to determine the prognostic value of the subphenotypes and tested their interaction with the positive end-expiratory pressure (PEEP) strategy, with 90-day mortality as the dependent variable. Findings: The primary clinical classifier model had an AUC of 0·92 (95% CI 0·90–0·95) in EARLI and 0·88 (0·84–0·91) in VALID. Performance of the primary model was similar when using exclusively EHR-derived predictors compared with manually curated predictors (AUC=0·88 [95% CI 0·81–0·94] vs 0·92 [0·88–0·97]). In LUNG SAFE, 90-day mortality was higher in patients assigned the hyperinflammatory subphenotype than in those with the hypoinflammatory subphenotype (414 [57%] of 725 vs 694 [33%] of 2088; p<0·0001). There was a significant treatment interaction between PEEP strategy and ARDS subphenotype (p=0·041), with lower 90-day mortality in the high-PEEP group of patients with the hyperinflammatory subphenotype (hyperinflammatory subphenotype: 169 [54%] of 313 patients in the high-PEEP group vs 127 [62%] of 205 patients in the low-PEEP group; hypoinflammatory subphenotype: 231 [34%] of 675 patients in the high-PEEP group vs 233 [32%] of 734 patients in the low-PEEP group). Interpretation: Classifier models using clinical variables alone can accurately assign ARDS subphenotypes in observational cohorts. Application of these models can provide valuable prognostic information and, once prospectively validated, could inform management strategies for personalised treatment, including the application of PEEP. Funding: US National Institutes of Health and European Society of Intensive Care Medicine.
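
    As a hedged illustration of the validation design described above, the sketch below trains a classifier on simulated vital-sign/laboratory predictors, assigns subphenotypes at a 0·5 probability cutoff, and reports AUC and accuracy against simulated gold-standard labels. The features, data, labels, and model are assumptions for illustration, not the published models or their coefficients.

```python
# Sketch: assigning ARDS subphenotypes with a clinical classifier at a 0.5
# probability cutoff, evaluated by AUC against gold-standard labels.
# All inputs are simulated stand-ins, not the study's models or data.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 800
# Hypothetical vital-sign/laboratory predictors (primary-model-style inputs).
X = np.column_stack([
    rng.normal(95, 15, n),    # heart rate
    rng.normal(110, 25, n),   # systolic blood pressure
    rng.normal(25, 8, n),     # bicarbonate
    rng.normal(1.5, 0.8, n),  # creatinine
])
# Hypothetical LCA-derived labels (1 = hyperinflammatory), the gold standard.
y = (X[:, 2] + rng.normal(0, 4, n) < 22).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)

prob = clf.predict_proba(X_te)[:, 1]   # P(hyperinflammatory subphenotype)
assigned = (prob >= 0.5).astype(int)   # 0.5 probability cutoff
print("AUC:", round(roc_auc_score(y_te, prob), 2))
print("Accuracy at 0.5 cutoff:", round((assigned == y_te).mean(), 2))
```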

    Death in hospital following ICU discharge: insights from the LUNG SAFE study

    Background: To determine the frequency of, and factors associated with, death in hospital following ICU discharge to the ward. Methods: The Large observational study to UNderstand the Global impact of Severe Acute respiratory FailurE (LUNG SAFE) study was an international, multicenter, prospective cohort study of patients with severe respiratory failure, conducted across 459 ICUs from 50 countries globally. This analysis aimed to understand the frequency of, and factors associated with, death in hospital in patients who survived their ICU stay. We examined outcomes in the subpopulation discharged with no limitations of life-sustaining treatments ('treatment limitations') and in the subpopulation with treatment limitations. Results: Of patients discharged from the ICU with no treatment limitations, 2186 (94%) survived to hospital discharge, while 142 (6%) died in hospital; of patients with treatment limitations, 118 (61%) survived, while 77 (39%) died in hospital. Patients without treatment limitations who died in hospital after ICU discharge were older; more likely to have COPD, immunocompromise, or chronic renal failure; and less likely to have trauma as a risk factor for ARDS. Patients who died after ICU discharge were less likely to have received neuromuscular blockade or any adjunctive measure, and had a higher pre-ICU-discharge non-pulmonary SOFA score. A similar pattern was seen in patients with treatment limitations who died in hospital following ICU discharge. Conclusions: A significant proportion of patients die in hospital following discharge from the ICU, with higher mortality in patients with limitations of life-sustaining treatments in place. Non-survivors had higher systemic illness severity scores at ICU discharge than survivors.