Dynamics of disease characteristics and clinical management of critically ill COVID-19 patients over the time course of the pandemic: an analysis of the prospective, international, multicentre RISC-19-ICU registry.
BACKGROUND
It remains unclear how the characteristics, course of disease, clinical management, and outcomes of critically ill COVID-19 patients admitted to intensive care units (ICUs) worldwide have changed over the course of the pandemic.
METHODS
This prospective, observational registry comprised 90 ICUs across 22 countries worldwide and included patients with a laboratory-confirmed, critical presentation of COVID-19 requiring advanced organ support. Hierarchical generalized linear mixed-effects models accounting for hospital and country variability were used to analyse the continuous evolution of the studied variables over the pandemic (a minimal modelling sketch follows this abstract).
RESULTS
Four thousand forty-one patients were included from March 2020 to September 2021. Over this period, the age of the admitted patients (62 [95% CI 60-63] years vs 64 [62-66] years, p < 0.001) and the severity of organ dysfunction at ICU admission (Sequential Organ Failure Assessment 8.2 [7.6-9.0] vs 5.8 [5.3-6.4], p < 0.001) decreased, while more female patients (26 [23-29]% vs 41 [35-48]%, p < 0.001) were admitted. The time span between symptom onset and hospitalization as well as ICU admission became longer later in the pandemic (6.7 [6.2-7.2] days vs 9.7 [8.9-10.5] days, p < 0.001). The PaO2/FiO2 at admission was lower (132 [123-141] mmHg vs 101 [91-113] mmHg, p < 0.001) but showed faster improvement over the initial 5 days of ICU stay in late 2021 compared with early 2020 (34 [20-48] mmHg vs 70 [41-100] mmHg, p = 0.05). The number of patients treated with steroids and tocilizumab increased, while the use of therapeutic anticoagulation followed an inverse U-shaped course over the pandemic. The proportion of patients treated with high-flow oxygen (5 [4-7]% vs 20 [14-29]%, p < 0.001) and non-invasive mechanical ventilation (14 [11-18]% vs 24 [17-33]%, p < 0.001) increased throughout the pandemic, concomitant with a decrease in invasive mechanical ventilation (82 [76-86]% vs 74 [64-82]%, p < 0.001). ICU mortality (23 [19-26]% vs 17 [12-25]%, p < 0.001) and length of stay (14 [13-16] days vs 11 [10-13] days, p < 0.001) decreased over the 19 months of the pandemic.
CONCLUSION
The characteristics and disease course of critically ill COVID-19 patients, together with their clinical management, have evolved continuously throughout the pandemic, leading to a younger, less severely ill ICU population with distinctly different clinical, pulmonary and inflammatory presentations than at the onset of the pandemic.
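The hierarchical modelling described in the METHODS can be illustrated with a short sketch. This is not the registry's actual analysis code; it is a minimal example of a linear mixed-effects model with a fixed pandemic-time trend and random intercepts for country and for hospital nested within country, assuming a hypothetical long-format export with columns sofa, months_since_march_2020, country, and hospital.

```python
# Minimal sketch of a hierarchical mixed-effects model of the kind described
# above. All file and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("risc19_admissions.csv")  # hypothetical long-format export

model = smf.mixedlm(
    "sofa ~ months_since_march_2020",            # continuous pandemic-time trend
    data=df,
    groups="country",                            # country-level random intercept
    vc_formula={"hospital": "0 + C(hospital)"},  # hospitals nested within country
)
result = model.fit(reml=True)
print(result.summary())
```

For non-Gaussian outcomes (e.g., ICU mortality) the generalized models the registry describes would instead require a GLMM, such as statsmodels' BinomialBayesMixedGLM or an equivalent in lme4/glmmTMB.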
Hyperoxemia and excess oxygen use in early acute respiratory distress syndrome: Insights from the LUNG SAFE study
Background: Concerns exist regarding the prevalence and impact of unnecessary oxygen use in patients with acute respiratory distress syndrome (ARDS). We examined this issue in patients with ARDS enrolled in the Large observational study to UNderstand the Global impact of Severe Acute respiratory FailurE (LUNG SAFE) study. Methods: In this secondary analysis of the LUNG SAFE study, we determined the prevalence of, and the outcomes associated with, hyperoxemia on day 1, sustained hyperoxemia, and excessive oxygen use in patients with early ARDS. Patients who fulfilled the criteria for ARDS on day 1 and day 2 of acute hypoxemic respiratory failure were categorized based on the presence of hyperoxemia (PaO2 > 100 mmHg) on day 1, sustained (i.e., present on day 1 and day 2) hyperoxemia, or excessive oxygen use (FIO2 ≥ 0.60 during hyperoxemia). Results: Of 2005 patients who met the inclusion criteria, 131 (6.5%) were hypoxemic (PaO2 < 55 mmHg), 607 (30%) had hyperoxemia on day 1, and 250 (12%) had sustained hyperoxemia. Excess FIO2 use occurred in 400 (66%) of the 607 patients with hyperoxemia. Excess FIO2 use decreased from day 1 to day 2 of ARDS, with most hyperoxemic patients on day 2 receiving relatively low FIO2. Multivariate analyses found no independent relationship between day 1 hyperoxemia, sustained hyperoxemia, or excess FIO2 use and adverse clinical outcomes. Mortality was 42% in patients with excess FIO2 use, compared with 39% in a propensity-matched sample of normoxemic (PaO2 55-100 mmHg) patients (p = 0.47). Conclusions: Hyperoxemia and excess oxygen use are both prevalent in early ARDS but are most often non-sustained. No relationship was found between hyperoxemia or excessive oxygen use and patient outcome in this cohort. Trial registration: LUNG SAFE is registered with ClinicalTrials.gov, NCT02010073.
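The categorisation rules stated in the Methods are mechanical enough to express directly in code. The sketch below is a hypothetical re-implementation, not the study's analysis script; the patient-level column names (pao2_d1, pao2_d2, fio2_d1) are assumptions.

```python
# Hypothetical re-implementation of the LUNG SAFE oxygenation categories:
# hypoxemia (PaO2 < 55 mmHg), day-1 hyperoxemia (PaO2 > 100 mmHg),
# sustained hyperoxemia (present on days 1 and 2), and excess oxygen use
# (FIO2 >= 0.60 during hyperoxemia). Column names are illustrative.
import pandas as pd

def categorise(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    out["hypoxemia_d1"] = out["pao2_d1"] < 55
    out["hyperoxemia_d1"] = out["pao2_d1"] > 100
    out["sustained_hyperoxemia"] = (out["pao2_d1"] > 100) & (out["pao2_d2"] > 100)
    out["excess_fio2_d1"] = out["hyperoxemia_d1"] & (out["fio2_d1"] >= 0.60)
    return out
```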
Global disparities in surgeons’ workloads, academic engagement and rest periods: the on-calL shIft fOr geNEral SurgeonS (LIONESS) study
The workload of general surgeons is multifaceted, encompassing not only surgical procedures but also a myriad of other responsibilities. From April to May 2023, we conducted a CHERRIES-compliant internet-based survey analyzing clinical practice, academic engagement, and post-on-call rest. The questionnaire featured six sections with 35 questions. Statistical analysis used chi-square tests, ANOVA, and logistic regression (SPSS® v. 28). The survey received a total of 1,046 responses (65.4%). Over 78.0% of respondents came from Europe, and 65.1% came from a general surgery unit; 92.8% of European and 87.5% of North American respondents were involved in research, compared with 71.7% in Africa. Europe led in publishing research studies (6.6 ± 8.6 yearly). Teaching involvement was high in North America (100%) and Africa (91.7%). Surgeons reported an average of 6.7 ± 4.9 on-call shifts per month, with European and North American surgeons experiencing 6.5 ± 4.9 and 7.8 ± 4.1 on-calls monthly, respectively. African surgeons had the highest on-call frequency (8.7 ± 6.1). Post-on-call, only 35.1% of respondents received a day off. Europeans were most likely (40%) to have a day off, while African surgeons were least likely (6.7%). On adjusted multivariable analysis, HDI (Human Development Index) (aOR 1.993), hospital capacity > 400 beds (aOR 2.423), working in a specialty surgery unit (aOR 2.087), and taking on-call in-house (aOR 5.446) significantly predicted the likelihood of having a day off after an on-call shift. Our study revealed critical insights into the disparities in workload, access to research, and professional opportunities for surgeons across different continents, underscored by the HDI.
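As a rough illustration of the adjusted analysis reported above, the sketch below fits a logistic regression for receiving a day off after an on-call shift and converts coefficients to adjusted odds ratios. It assumes hypothetical variable names (day_off, hdi, large_hospital, specialty_unit, in_house_on_call); it is not the survey's SPSS workflow.

```python
# Hypothetical sketch: logistic regression with adjusted odds ratios (aORs)
# obtained by exponentiating the fitted coefficients. Variable names are
# illustrative, not the LIONESS dataset's actual fields.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("lioness_survey.csv")  # hypothetical export

fit = smf.logit(
    "day_off ~ hdi + large_hospital + specialty_unit + in_house_on_call",
    data=df,
).fit()

aors = np.exp(fit.params)      # adjusted odds ratios
ci = np.exp(fit.conf_int())    # 95% CIs on the odds-ratio scale
print(pd.concat([aors.rename("aOR"), ci], axis=1))
```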
How Do Droughts And Wildfires Alter Seasonal Radial Growth In Mediterranean Aleppo Pine Forests?
Climate models project increasing temperatures, evapotranspiration, and droughts for the Mediterranean Basin, which will trigger more frequent and dangerous fire events. Here, we evaluate the combined effects of drought and wildfire on seasonal tree growth in Aleppo pine stands at the intra- and inter-annual level. Indexed earlywood width (EWI), latewood width (LWI), and latewood proportion (LWPI) series were obtained from unburned and burned stands located at four sites along a precipitation gradient in southeastern Spain. The combined effect of drought in 1994 and 1995 and wildfire in August 1994 negatively impacted seasonal growth in the short term (1994-1999) at the site with higher water availability. At the driest site, however, no significant effects were found. We found fewer negative pointer years at the wettest burned stand than at the wettest unburned stand during the post-fire 1994-2012 period, and the opposite pattern at the driest site, i.e., more negative pointer years at the driest burned stand than at the driest unburned stand. This result indicates that the drier sites were more sensitive to the cumulative impact of drought and wildfire disturbances in the long term, whereas the wetter sites were more sensitive in the short term. Our results demonstrate that the seasonal growth plasticity of Aleppo pine under combined disturbances depends on site water availability. This study will help forest managers implement climate change strategies, such as prescribed fires (controlled low-to-medium severity fires), to prevent wildfire hazards more efficiently in Aleppo pine stands with high water availability.
Seasonal and synoptic climatic drivers of tree growth in the Bighorn Mountains, WY, USA (1654–1983 CE)
In the United States' (US) Northern Rockies, synoptic pressure systems and atmospheric circulation drive interannual variation in seasonal temperature and precipitation. The radial growth of high-elevation trees in this semi-arid region captures this temperature and precipitation variability and provides long time series with which to contextualize instrumental-era variability in synoptic-scale climate patterns. Such variability in climate patterns can trigger extreme climate events, such as droughts, floods, and forest fires, which damage human and natural systems. We developed 11 tree-ring width (TRW) chronologies from multiple species and sites to investigate the seasonal climatic drivers of tree growth in the Bighorn Mountains, WY. A principal component analysis of the chronologies identified 54% of shared common variance (1894-2014). Tree growth (expressed by PC1) was driven by multiple seasonal climate variables: previous October and current July temperatures, as well as previous December and current April precipitation, had a positive influence on growth, whereas growth was limited by July precipitation. These seasonal growth-climate relationships corresponded to circulation patterns at higher atmospheric levels over the Bighorn Mountains. Tree growth was enhanced when the winter jet stream was in a northward position, which led to warmer winters, and when the spring jet stream was further south, which led to wetter springs. The second principal component, explaining 19% of the variance, clustered sites by elevation and was strongly related to summer temperature. We leveraged this summer temperature signal in our TRW chronologies by combining them with an existing maximum latewood density (MXD) chronology in a nested approach. This allowed us to reconstruct Bighorn Mountains summer (June, July, and August) temperature (BMST) back to 1654, extending the instrumental temperature record by 250 years. Our BMST reconstruction explains 39-53% of the variance in regional summer temperature variability. The 1830s were the coolest decade and the 1930s the warmest over the reconstructed period (1654-1983 CE), which excludes the most recent three decades. Our results contextualize recent drivers and trends of climate variability in the US Northern Rockies, contributing information that managers of human and natural systems need in order to prepare for potential future variability.
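The principal component analysis step described above has a simple generic form: stack the site chronologies as columns over their common period and extract components. The sketch below is a hypothetical illustration, not the study's code; the file name and layout are assumptions.

```python
# Hypothetical sketch of a PCA over tree-ring width (TRW) chronologies:
# rows are years, columns are the 11 site chronologies. PC1 carries the
# shared growth signal; PC2 separated sites by elevation in the study.
import pandas as pd
from sklearn.decomposition import PCA

trw = pd.read_csv("bighorn_trw_chronologies.csv", index_col="year")
trw = trw.dropna()  # restrict to the common period (1894-2014 in the study)

pca = PCA()
scores = pca.fit_transform(trw.to_numpy())

print(pca.explained_variance_ratio_[:2])  # the study reports ~54% and ~19%
pc1 = pd.Series(scores[:, 0], index=trw.index, name="PC1")  # growth signal
```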
Vegetation structure of planted versus natural Aleppo pine stands along a climatic gradient in Spain
Immunocompromised patients with acute respiratory distress syndrome: Secondary analysis of the LUNG SAFE database
Background: The aim of this study was to describe data on epidemiology, ventilatory management, and outcome of acute respiratory distress syndrome (ARDS) in immunocompromised patients. Methods: We performed a post hoc analysis on the cohort of immunocompromised patients enrolled in the Large Observational Study to Understand the Global Impact of Severe Acute Respiratory Failure (LUNG SAFE) study. The LUNG SAFE study was an international, prospective study including hypoxemic patients in 459 ICUs from 50 countries across 5 continents. Results: Of 2813 patients with ARDS, 584 (20.8%) were immunocompromised, 38.9% of whom had an unspecified cause. Pneumonia, nonpulmonary sepsis, and noncardiogenic shock were their most common risk factors for ARDS. Hospital mortality was higher in immunocompromised than in immunocompetent patients (52.4% vs 36.2%; p < 0.0001), despite similar severity of ARDS. Decisions regarding limiting life-sustaining measures were significantly more frequent in immunocompromised patients (27.1% vs 18.6%; p < 0.0001). Use of noninvasive ventilation (NIV) as first-line treatment was higher in immunocompromised patients (20.9% vs 15.9%; p = 0.0048), and immunodeficiency remained independently associated with the use of NIV after adjustment for confounders. Forty-eight percent of the patients treated with NIV were intubated, and their mortality was not different from that of the patients invasively ventilated ab initio. Conclusions: Immunosuppression is frequent in patients with ARDS, and infections are the main risk factors for ARDS in these immunocompromised patients. Their management differs from that of immunocompetent patients, particularly the greater use of NIV as first-line ventilation strategy. Compared with immunocompetent subjects, they have higher mortality regardless of ARDS severity as well as a higher frequency of limitation of life-sustaining measures. Nonetheless, nearly half of these patients survive to hospital discharge. Trial registration: ClinicalTrials.gov, NCT02010073. Registered on 12 December 2013
Risk factors and rate of recurrence after Mohs surgery in basal cell and squamous cell carcinomas: a nationwide prospective cohort (REGESMOHS, Spanish Registry of Mohs Surgery)
Randomized studies to assess the efficacy of Mohs micrographic surgery in basal cell and squamous cell carcinomas are limited by methodological and ethical issues and a lack of long follow-up periods. This study presents the "real-life" results of a nationwide 7-year cohort of basal cell carcinoma and squamous cell carcinoma treated with Mohs micrographic surgery. A prospective cohort study was conducted in 22 Spanish centres (from July 2013 to February 2020), and a multivariate analysis, including characteristics of patients, tumours, surgeries and follow-up, was performed. A total of 4,402 patients followed up for 12,111 patient-years for basal cell carcinoma, and 371 patients with 915 patient-years of follow-up for squamous cell carcinoma, were recruited. Risk factors for recurrence included age, non-primary tumours, and more stages or unfinished surgeries for both tumours, and immunosuppression for squamous cell carcinoma. Incidence rates of recurrence were 1.3 per 100 person-years for basal cell carcinoma (95% confidence interval 1.1-1.5) and 4.5 for squamous cell carcinoma (95% confidence interval 3.3-6.1), and remained constant over time (0-5 years). In conclusion, follow-up strategies should be equally intense for at least the first 5 years, with special attention paid to squamous cell carcinoma (especially in immunosuppressed patients), elderly patients, non-primary tumours, and procedures requiring more stages or unfinished surgeries.
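The reported incidence rates can be sanity-checked from events and person-time. The sketch below computes a rate per 100 person-years with an exact Poisson 95% CI; the event counts are back-calculated from the reported rates and follow-up (≈157 basal cell and ≈41 squamous cell recurrences), so they are illustrative rather than figures taken from the paper.

```python
# Incidence rate per 100 person-years with an exact (Poisson) 95% CI.
# Event counts are back-calculated approximations, not published figures.
from scipy.stats import chi2

def incidence_rate_ci(events: int, person_years: float, alpha: float = 0.05):
    lower = chi2.ppf(alpha / 2, 2 * events) / 2 if events > 0 else 0.0
    upper = chi2.ppf(1 - alpha / 2, 2 * (events + 1)) / 2
    scale = 100.0 / person_years
    return events * scale, lower * scale, upper * scale

print(incidence_rate_ci(157, 12_111))  # ~ (1.30, 1.10, 1.52) per 100 p-y
print(incidence_rate_ci(41, 915))      # ~ (4.48, 3.21, 6.08) per 100 p-y
```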