
    Outcome in patients perceived as receiving excessive care across different ethical climates: a prospective study in 68 intensive care units in Europe and the USA

    Purpose: Whether the quality of the ethical climate in the intensive care unit (ICU) improves the identification of patients receiving excessive care and affects patient outcomes is unknown. Methods: In this prospective observational study, perceptions of excessive care (PECs) by clinicians working in 68 ICUs in Europe and the USA were collected daily during a 28-day period. The quality of the ethical climate in the ICUs was assessed via a validated questionnaire. We compared the combined endpoint (death, not at home or poor quality of life at 1 year) of patients with PECs and the time from PECs until written treatment-limitation decisions (TLDs) and death across the four climates defined via cluster analysis. Results: Of the 4747 eligible clinicians, 2992 (63%) evaluated the ethical climate in their ICU. Of the 321 and 623 patients not admitted for monitoring only in ICUs with a good (n = 12, 18%) and poor (n = 24, 35%) climate, 36 (11%) and 74 (12%), respectively, were identified with PECs by at least two clinicians. Of the 35 and 71 identified patients with an available combined endpoint, 100% (95% CI 90.0–100) and 85.9% (75.4–92.0) (P = 0.02), respectively, attained that endpoint. The risk of death (HR 1.88, 95% CI 1.20–2.92) or receiving a written TLD (HR 2.32, 95% CI 1.11–4.85) in patients with PECs by at least two clinicians was higher in ICUs with a good climate than in those with a poor one. The differences between ICUs with an average climate, with (n = 12, 18%) or without (n = 20, 29%) nursing involvement at the end of life, and ICUs with a poor climate were less obvious but still in favour of the former. Conclusion: Enhancing the quality of the ethical climate in the ICU may improve both the identification of patients receiving excessive care and the decision-making process at the end of life.
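
    The climate comparisons above reduce to stratified tabulation of a binary endpoint plus time-to-event models. As a hedged illustration only (synthetic rows and assumed column names, not the study's analysis code), the sketch below tabulates the 1-year combined endpoint among patients flagged with PECs by at least two clinicians, stratified by ethical-climate cluster:

```python
# Illustrative sketch: combined-endpoint rates by ethical-climate cluster.
# The data frame and its columns ('climate', 'n_pec_clinicians',
# 'combined_endpoint') are hypothetical stand-ins for the study's dataset.
import pandas as pd

patients = pd.DataFrame({
    "climate": ["good", "good", "poor", "poor", "average"],
    "n_pec_clinicians": [2, 0, 3, 2, 1],       # clinicians reporting a PEC
    "combined_endpoint": [1, 0, 1, 0, 1],       # death/not at home/poor QoL at 1 year
})

# Keep only patients identified with PECs by at least two clinicians.
flagged = patients[patients["n_pec_clinicians"] >= 2]

# Endpoint frequency per climate cluster.
summary = flagged.groupby("climate")["combined_endpoint"].agg(["size", "mean"])
summary["pct_endpoint"] = 100 * summary["mean"]
print(summary)
```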

    Hyperoxemia and excess oxygen use in early acute respiratory distress syndrome: Insights from the LUNG SAFE study

    Background: Concerns exist regarding the prevalence and impact of unnecessary oxygen use in patients with acute respiratory distress syndrome (ARDS). We examined this issue in patients with ARDS enrolled in the Large observational study to UNderstand the Global impact of Severe Acute respiratory FailurE (LUNG SAFE) study. Methods: In this secondary analysis of the LUNG SAFE study, we wished to determine the prevalence and the outcomes associated with hyperoxemia on day 1, sustained hyperoxemia, and excessive oxygen use in patients with early ARDS. Patients who fulfilled criteria of ARDS on day 1 and day 2 of acute hypoxemic respiratory failure were categorized based on the presence of hyperoxemia (PaO2 > 100 mmHg) on day 1, sustained (i.e., present on day 1 and day 2) hyperoxemia, or excessive oxygen use (FIO2 ≥ 0.60 during hyperoxemia). Results: Of the 2005 patients who met the inclusion criteria, 131 (6.5%) were hypoxemic (PaO2 < 55 mmHg), 607 (30%) had hyperoxemia on day 1, and 250 (12%) had sustained hyperoxemia. Excess FIO2 use occurred in 400 (66%) of the 607 patients with hyperoxemia. Excess FIO2 use decreased from day 1 to day 2 of ARDS, with most hyperoxemic patients on day 2 receiving relatively low FIO2. Multivariate analyses found no independent relationship between day 1 hyperoxemia, sustained hyperoxemia, or excess FIO2 use and adverse clinical outcomes. Mortality was 42% in patients with excess FIO2 use, compared to 39% in a propensity-matched sample of normoxemic (PaO2 55–100 mmHg) patients (P = 0.47). Conclusions: Hyperoxemia and excess oxygen use are both prevalent in early ARDS but are most often non-sustained. No relationship was found between hyperoxemia or excessive oxygen use and patient outcome in this cohort. Trial registration: LUNG SAFE is registered with ClinicalTrials.gov, NCT02010073.
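
    The exposure definitions quoted above (hypoxemia PaO2 < 55 mmHg, hyperoxemia PaO2 > 100 mmHg, excess oxygen as FIO2 ≥ 0.60 during hyperoxemia) map directly onto a small classification rule. A minimal sketch, with assumed argument names and not taken from the LUNG SAFE analysis code:

```python
# Classify a patient's day-1 oxygenation status using the thresholds from the
# abstract. Argument names are illustrative assumptions.
def classify_day1(pao2_mmhg: float, fio2: float) -> dict:
    hyperoxemia = pao2_mmhg > 100
    return {
        "hypoxemia": pao2_mmhg < 55,
        "hyperoxemia": hyperoxemia,
        # "Excess oxygen use": high FiO2 delivered while already hyperoxemic.
        "excess_fio2": hyperoxemia and fio2 >= 0.60,
    }

print(classify_day1(pao2_mmhg=120, fio2=0.7))
# {'hypoxemia': False, 'hyperoxemia': True, 'excess_fio2': True}
```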

    Autopsy Study Defines Composition and Dynamics of the HIV-1 Reservoir after Allogeneic Hematopoietic Stem Cell Transplantation with CCR5Δ32/Δ32 Donor Cells

    Allo-HSCT with CCR5Δ32/Δ32 donor cells is the only curative HIV-1 intervention. We investigated the impact of allo-HSCT on the viral reservoir in PBMCs and post-mortem tissue in two patients, IciS-05 and IciS-11, who both received a CCR5Δ32/Δ32 allo-HSCT. Before allo-HSCT, we performed ultrasensitive HIV-1 RNA quantification, HIV-1 DNA quantification, co-receptor tropism analysis, deep sequencing, and viral characterization in PBMCs and bone marrow; after allo-HSCT, we performed ultrasensitive RNA and HIV-1 DNA quantification. Proviral quantification, deep sequencing, and viral characterization were done in post-mortem tissue samples. Both patients harbored subtype B CCR5-tropic HIV-1, as determined genotypically and functionally by virus culture. Pre-allo-HSCT, HIV-1 DNA could be detected in both patients in bone marrow, PBMCs, and T-cell subsets. Chimerism correlated with detectable HIV-1 DNA LTR copies in cells and tissues. Post-mortem analysis of IciS-05 revealed proviral DNA in all tissue biopsies, but not in PBMCs. In patient IciS-11, who was transplanted twice, no HIV-1 DNA could be detected in PBMCs at the time of death, whereas HIV-1 DNA was detectable in the lymph node. In conclusion, shortly after CCR5Δ32/Δ32 allo-HSCT, HIV-1 DNA became undetectable in PBMCs. However, HIV-1 DNA variants identical to those present before transplantation persisted in post-mortem tissues, indicating that these tissues play an important role as viral reservoirs.

    Immunocompromised patients with acute respiratory distress syndrome: Secondary analysis of the LUNG SAFE database

    Background: The aim of this study was to describe data on epidemiology, ventilatory management, and outcome of acute respiratory distress syndrome (ARDS) in immunocompromised patients. Methods: We performed a post hoc analysis on the cohort of immunocompromised patients enrolled in the Large Observational Study to Understand the Global Impact of Severe Acute Respiratory Failure (LUNG SAFE) study. The LUNG SAFE study was an international, prospective study including hypoxemic patients in 459 ICUs from 50 countries across 5 continents. Results: Of 2813 patients with ARDS, 584 (20.8%) were immunocompromised, 38.9% of whom had an unspecified cause. Pneumonia, nonpulmonary sepsis, and noncardiogenic shock were their most common risk factors for ARDS. Hospital mortality was higher in immunocompromised than in immunocompetent patients (52.4% vs 36.2%; p < 0.0001), despite similar severity of ARDS. Decisions regarding limiting life-sustaining measures were significantly more frequent in immunocompromised patients (27.1% vs 18.6%; p < 0.0001). Use of noninvasive ventilation (NIV) as first-line treatment was higher in immunocompromised patients (20.9% vs 15.9%; p = 0.0048), and immunodeficiency remained independently associated with the use of NIV after adjustment for confounders. Forty-eight percent of the patients treated with NIV were intubated, and their mortality was not different from that of the patients invasively ventilated ab initio. Conclusions: Immunosuppression is frequent in patients with ARDS, and infections are the main risk factors for ARDS in these immunocompromised patients. Their management differs from that of immunocompetent patients, particularly the greater use of NIV as first-line ventilation strategy. Compared with immunocompetent subjects, they have higher mortality regardless of ARDS severity as well as a higher frequency of limitation of life-sustaining measures. Nonetheless, nearly half of these patients survive to hospital discharge. Trial registration: ClinicalTrials.gov, NCT02010073. Registered on 12 December 2013.
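
    The statement that immunodeficiency remained independently associated with first-line NIV after adjustment implies a multivariable model of NIV use. A hedged sketch on synthetic data (the covariates age and PaO2/FiO2 are illustrative assumptions, not the study's adjustment set):

```python
# Logistic regression of first-line NIV use on immunocompromised status,
# adjusted for assumed confounders. Data are simulated for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "immunocompromised": rng.integers(0, 2, n),
    "age": rng.normal(62, 15, n),
    "pf_ratio": rng.normal(150, 50, n),   # PaO2/FiO2 as a severity proxy
})
# Simulate an NIV outcome with a positive effect of immunocompromise.
logit_p = -1.5 + 0.4 * df["immunocompromised"] - 0.004 * df["pf_ratio"]
df["niv_first_line"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit("niv_first_line ~ immunocompromised + age + pf_ratio", df).fit()
print(np.exp(model.params))  # adjusted odds ratios
```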

    AOB community structure and richness under European beech, sessile oak, Norway spruce and Douglas-fir at three temperate forest sites

    Background and aims: The relations between tree species, microbial diversity and activity can alter ecosystem functioning. We investigated ammonia-oxidizing bacteria (AOB) community structure and richness, microbial/environmental factors related to AOB diversity, and the relationship between AOB diversity and the nitrification process under several tree species. Methods: Forest floor (Of, Oh) was sampled under European beech, sessile oak, Norway spruce and Douglas-fir at three sites. AOB community structure was assessed by PCR-DGGE and sequencing. Samples were analyzed for net N mineralization, potential nitrification, basal respiration, microbial biomass, microbial or metabolic quotient, pH, total nitrogen, extractable ammonium, organic matter content and exchangeable cations. Results: AOB community structure and the tree species effect on AOB diversity were site-specific. AOB richness was not related to nitrification. Factors regulating ammonium availability, i.e. net N mineralization or microbial biomass, were related to AOB community structure. Conclusion: Our research shows that, at larger spatial scales, site-specific characteristics may be more important than the nature of the tree species in determining AOB diversity (richness and community structure). Within sites, tree species influence AOB diversity. The absence of a relation between AOB richness and nitrification points to a possible role of AOB abundance, phenotypic plasticity, or the involvement of ammonia-oxidizing archaea.
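
    Testing whether AOB richness relates to potential nitrification, the relation the study did not find, amounts to a simple rank correlation across samples. A sketch on synthetic values (sample size and units are assumptions):

```python
# Spearman rank correlation between AOB richness (e.g., DGGE band counts)
# and potential nitrification rates. Values are simulated for illustration.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
aob_richness = rng.integers(3, 12, size=24)              # bands per forest-floor sample
potential_nitrification = rng.gamma(2.0, 1.5, size=24)   # e.g. mg N per kg per day

rho, p = spearmanr(aob_richness, potential_nitrification)
print(f"Spearman rho={rho:.2f}, p={p:.3f}")
```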

    High treatment uptake in human immunodeficiency virus/hepatitis C virus-coinfected patients after unrestricted access to direct-acting antivirals in the Netherlands

    Background: The Netherlands has provided unrestricted access to direct-acting antivirals (DAAs) since November 2015. We analyzed the nationwide hepatitis C virus (HCV) treatment uptake among patients coinfected with human immunodeficiency virus (HIV) and HCV. Methods: Data were obtained from the ATHENA HIV observational cohort, in which >98% of HIV-infected patients ever registered since 1998 are included. Patients were included if they ever had 1 positive HCV RNA result, did not have spontaneous clearance, and were known to still be in care. Treatment uptake and outcome were assessed. When patients were treated more than once, data were included from only the most recent treatment episode. Data were updated until February 2017. In addition, each treatment center was queried in April 2017 for a data update on DAA treatment and achieved sustained virological response. Results: Of 23574 HIV-infected patients ever linked to care, 1471 HCV-coinfected patients (69% men who have sex with men, 15% persons who [formerly] injected drugs, and 15% with another HIV transmission route) fulfilled the inclusion criteria. Of these, 87% (1284 of 1471) had ever initiated HCV treatment between 2000 and 2017, and 76% (1124 of 1471) had their HCV infection cured; DAA treatment results were pending in 6% (92 of 1471). Among men who have sex with men, 83% (844 of 1022) had their HCV infection cured, and DAA treatment results were pending in 6% (66 of 1022). Overall, 187 patients had never initiated treatment, DAAs had failed in 14, and a pegylated interferon-alfa-based regimen had failed in 54. Conclusions: Fifteen months after unrestricted DAA availability, the majority of HIV/HCV-coinfected patients in the Netherlands have their HCV infection cured (76%) or are awaiting DAA treatment results (6%). This rapid treatment scale-up may contribute to future HCV elimination among these patients.
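
    The uptake and cure figures follow from applying the stated inclusion criteria and taking simple proportions. A small sketch with assumed column names (not the ATHENA cohort's actual data model):

```python
# Apply the abstract's inclusion criteria (ever HCV RNA positive, no
# spontaneous clearance, still in care) and compute uptake and cure fractions.
import pandas as pd

cohort = pd.DataFrame({
    "ever_hcv_rna_positive": [True, True, True, False],
    "spontaneous_clearance": [False, False, True, False],
    "still_in_care": [True, True, True, True],
    "ever_treated": [True, False, True, False],
    "hcv_cured": [True, False, True, False],
})

included = cohort[
    cohort["ever_hcv_rna_positive"]
    & ~cohort["spontaneous_clearance"]
    & cohort["still_in_care"]
]
uptake = included["ever_treated"].mean()
cured = included["hcv_cured"].mean()
print(f"uptake={uptake:.0%}, cured={cured:.0%}")
```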

    Validation and utility of ARDS subphenotypes identified by machine-learning models using clinical data: an observational, multicohort, retrospective analysis

    Background: Two acute respiratory distress syndrome (ARDS) subphenotypes (hyperinflammatory and hypoinflammatory) with distinct clinical and biological features and differential treatment responses have been identified using latent class analysis (LCA) in seven individual cohorts. To facilitate bedside identification of subphenotypes, clinical classifier models using readily available clinical variables have been described in four randomised controlled trials. We aimed to assess the performance of these models in observational cohorts of ARDS. Methods: In this observational, multicohort, retrospective study, we validated two machine-learning clinical classifier models for assigning ARDS subphenotypes in two observational cohorts of patients with ARDS: Early Assessment of Renal and Lung Injury (EARLI; n=335) and Validating Acute Lung Injury Markers for Diagnosis (VALID; n=452), with LCA-derived subphenotypes as the gold standard. The primary model comprised only vital signs and laboratory variables, and the secondary model comprised all predictors in the primary model, with the addition of ventilatory variables and demographics. Model performance was assessed by calculating the area under the receiver operating characteristic curve (AUC) and calibration plots, and by assigning subphenotypes using a probability cutoff value of 0·5 to determine sensitivity, specificity, and accuracy of the assignments. We also assessed the performance of the primary model in EARLI using data automatically extracted from an electronic health record (EHR; EHR-derived EARLI cohort). In the Large Observational Study to Understand the Global Impact of Severe Acute Respiratory Failure (LUNG SAFE; n=2813), a multinational, observational ARDS cohort, we applied a custom classifier model (with fewer variables than the primary model) to determine the prognostic value of the subphenotypes and tested their interaction with the positive end-expiratory pressure (PEEP) strategy, with 90-day mortality as the dependent variable. Findings: The primary clinical classifier model had an AUC of 0·92 (95% CI 0·90–0·95) in EARLI and 0·88 (0·84–0·91) in VALID. Performance of the primary model was similar when using exclusively EHR-derived predictors compared with manually curated predictors (AUC=0·88 [95% CI 0·81–0·94] vs 0·92 [0·88–0·97]). In LUNG SAFE, 90-day mortality was higher in patients assigned the hyperinflammatory subphenotype than in those with the hypoinflammatory subphenotype (414 [57%] of 725 vs 694 [33%] of 2088; p<0·0001). There was a significant interaction between PEEP strategy and ARDS subphenotype (p=0·041), with lower 90-day mortality in the high PEEP group among patients with the hyperinflammatory subphenotype (hyperinflammatory subphenotype: 169 [54%] of 313 patients in the high PEEP group vs 127 [62%] of 205 patients in the low PEEP group; hypoinflammatory subphenotype: 231 [34%] of 675 patients in the high PEEP group vs 233 [32%] of 734 patients in the low PEEP group). Interpretation: Classifier models using clinical variables alone can accurately assign ARDS subphenotypes in observational cohorts. Application of these models can provide valuable prognostic information and could inform management strategies for personalised treatment, including application of PEEP, once prospectively validated. Funding: US National Institutes of Health and European Society of Intensive Care Medicine.
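
    The validation workflow described above (apply a clinical classifier, compute AUC, then dichotomise at a 0·5 probability cutoff for sensitivity, specificity, and accuracy) can be illustrated as follows. This is a minimal sketch on synthetic data with a generic logistic model, not the published classifier or its feature set:

```python
# Evaluate a binary subphenotype classifier against reference (e.g. LCA-derived)
# labels using AUC and a 0.5 probability cutoff. Data and model are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(42)
n = 400
X = rng.normal(size=(n, 6))                     # stand-in clinical predictors
y = (X[:, 0] + 0.8 * X[:, 1] + rng.normal(scale=1.0, size=n) > 0).astype(int)

clf = LogisticRegression().fit(X, y)
prob = clf.predict_proba(X)[:, 1]               # predicted probability of class 1
pred = (prob >= 0.5).astype(int)                # probability cutoff of 0.5

tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
print(f"AUC={roc_auc_score(y, prob):.2f}, "
      f"sensitivity={tp / (tp + fn):.2f}, "
      f"specificity={tn / (tn + fp):.2f}, "
      f"accuracy={(tp + tn) / n:.2f}")
```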