A review of source tracking techniques for fine sediment within a catchment
Excessive transport of fine sediment, and of its associated pollutants, can cause detrimental impacts in aquatic environments. Accurate sediment source apportionment is therefore important for identifying hot spots of soil erosion. Various tracers have been adopted, often in combination, to identify sediment source type and its spatial origin; these include fallout radionuclides, geochemical tracers, mineral magnetic properties, and bulk and compound-specific stable isotopes. In this review, we assess the applicability of these techniques to particular settings, together with their advantages and limitations. By synthesizing existing approaches that combine multiple tracers with measured changes in channel geomorphological attributes, an integrated analysis of tracer profiles in sediments deposited in lakes and reservoirs can be made. Through a multi-scale approach to fine sediment tracking, temporal changes in soil erosion and sediment load can be reconstructed and the consequences of changing catchment practices evaluated. We recommend both long-term and short-term monitoring of riverine fine sediment and of the corresponding surface and subsurface sources at nested sites within a catchment. Such monitoring will inform the development and validation of models for predicting the dynamics of fine sediment transport as a function of hydro-climatic and geomorphological controls. Monitoring is particularly important for hilly catchments with complex and changing land use, and research should be prioritized for catchments dominated by sloping farmland.
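The source apportionment described above rests on a linear mixing assumption: a conservative tracer's concentration in river sediment is a blend of its concentrations in the contributing sources. As an illustrative sketch only (the function name and values are hypothetical, and real fingerprinting studies use many tracers with constrained optimisation and uncertainty analysis), the two-source, one-tracer case reduces to a single ratio:

```python
def two_source_apportionment(conc_source_a, conc_source_b, conc_mixture):
    """Two-source linear un-mixing for one conservative tracer.

    If a proportion x of the sediment comes from source A, then
    conc_mixture = x * conc_source_a + (1 - x) * conc_source_b,
    so x = (conc_mixture - conc_source_b) / (conc_source_a - conc_source_b).
    """
    return (conc_mixture - conc_source_b) / (conc_source_a - conc_source_b)
```

For example, a tracer measured at 10 units in topsoil, 2 units in channel banks, and 6 units in the sampled sediment implies a 50% topsoil contribution.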
Implementation of corticosteroids in treating COVID-19 in the ISARIC WHO Clinical Characterisation Protocol UK: prospective observational cohort study
BACKGROUND: Dexamethasone was the first intervention proven to reduce mortality in patients with COVID-19 being treated in hospital. We aimed to evaluate the adoption of corticosteroids in the treatment of COVID-19 in the UK after the RECOVERY trial publication on June 16, 2020, and to identify discrepancies in care. METHODS: We did an audit of clinical implementation of corticosteroids in a prospective, observational, cohort study in 237 UK acute care hospitals between March 16, 2020, and April 14, 2021, restricted to patients aged 18 years or older with proven or high likelihood of COVID-19 who received supplementary oxygen. The primary outcome was administration of dexamethasone, prednisolone, hydrocortisone, or methylprednisolone. This study is registered with ISRCTN, ISRCTN66726260. FINDINGS: Between June 17, 2020, and April 14, 2021, 47 795 (75·2%) of 63 525 patients on supplementary oxygen received corticosteroids; administration was more common among patients requiring critical care than among those who received ward care (11 185 [86·6%] of 12 909 vs 36 415 [72·4%] of 50 278). Patients 50 years or older were significantly less likely to receive corticosteroids than those younger than 50 years (adjusted odds ratio 0·79 [95% CI 0·70–0·89], p=0·0001, for 70–79 years; 0·52 [0·46–0·58], p<0·0001, for 80 years or older), independent of patient demographics and illness severity. 84 (54·2%) of 155 pregnant women received corticosteroids. Rates of corticosteroid administration increased from 27·5% in the week before June 16, 2020, to 75–80% in January, 2021. INTERPRETATION: Implementation of corticosteroids into clinical practice in the UK for patients with COVID-19 has been successful, but not universal. Patients older than 70 years, patients with chronic neurological disease or dementia, and pregnant women were less likely to receive corticosteroids, independent of illness severity.
This could reflect appropriate clinical decision making, but the possibility of inequitable access to life-saving care should be considered. FUNDING: UK National Institute for Health Research and UK Medical Research Council.
Procalcitonin Is Not a Reliable Biomarker of Bacterial Coinfection in People With Coronavirus Disease 2019 Undergoing Microbiological Investigation at the Time of Hospital Admission
Admission procalcitonin measurements and microbiology results were available for 1040 hospitalized adults with coronavirus disease 2019 (from 48 902 included in the International Severe Acute Respiratory and Emerging Infections Consortium World Health Organization Clinical Characterisation Protocol UK study). Although procalcitonin was higher in bacterial coinfection, the difference was neither clinically significant (median [IQR], 0.33 [0.11–1.70] ng/mL vs 0.24 [0.10–0.90] ng/mL) nor diagnostically useful (area under the receiver operating characteristic curve, 0.56 [95% confidence interval, .51–.60]).
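The diagnostic-usefulness claim above rests on the area under the ROC curve, which equals the probability that a randomly chosen coinfected patient has a higher procalcitonin value than a randomly chosen non-coinfected patient (so 0.56 is barely better than a coin flip). A minimal stdlib sketch, not the study's code, computes this via the Mann–Whitney pairwise comparison; the labels and scores below are hypothetical:

```python
def roc_auc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the fraction of (positive, negative) pairs in which the positive
    case scores higher, counting ties as 1/2."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

For instance, labels [0, 0, 1, 1] with scores [0.1, 0.4, 0.35, 0.8] give an AUC of 0.75, since three of the four positive-negative pairs are ranked correctly.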
Effect of Hydrocortisone on Mortality and Organ Support in Patients With Severe COVID-19: The REMAP-CAP COVID-19 Corticosteroid Domain Randomized Clinical Trial.
Importance: Evidence regarding corticosteroid use for severe coronavirus disease 2019 (COVID-19) is limited. Objective: To determine whether hydrocortisone improves outcome for patients with severe COVID-19. Design, Setting, and Participants: An ongoing adaptive platform trial testing multiple interventions within multiple therapeutic domains, for example, antiviral agents, corticosteroids, or immunoglobulin. Between March 9 and June 17, 2020, 614 adult patients with suspected or confirmed COVID-19 were enrolled and randomized within at least 1 domain following admission to an intensive care unit (ICU) for respiratory or cardiovascular organ support at 121 sites in 8 countries. Of these, 403 were randomized to open-label interventions within the corticosteroid domain. The domain was halted after results from another trial were released. Follow-up ended August 12, 2020. Interventions: The corticosteroid domain randomized participants to a fixed 7-day course of intravenous hydrocortisone (50 mg or 100 mg every 6 hours) (n = 143), a shock-dependent course (50 mg every 6 hours when shock was clinically evident) (n = 152), or no hydrocortisone (n = 108). Main Outcomes and Measures: The primary end point was organ support-free days (days alive and free of ICU-based respiratory or cardiovascular support) within 21 days, where patients who died were assigned -1 day. The primary analysis was a bayesian cumulative logistic model that included all patients enrolled with severe COVID-19, adjusting for age, sex, site, region, time, assignment to interventions within other domains, and domain and intervention eligibility. Superiority was defined as the posterior probability of an odds ratio greater than 1 (threshold for trial conclusion of superiority >99%). 
Results: After excluding 19 participants who withdrew consent, there were 384 patients (mean age, 60 years; 29% female) randomized to the fixed-dose (n = 137), shock-dependent (n = 146), and no (n = 101) hydrocortisone groups; 379 (99%) completed the study and were included in the analysis. The mean age for the 3 groups ranged between 59.5 and 60.4 years; most patients were male (range, 70.6%-71.5%); mean body mass index ranged between 29.7 and 30.9; and patients receiving mechanical ventilation ranged between 50.0% and 63.5%. For the fixed-dose, shock-dependent, and no hydrocortisone groups, respectively, the median organ support-free days were 0 (IQR, -1 to 15), 0 (IQR, -1 to 13), and 0 (IQR, -1 to 11) days (composed of 30%, 26%, and 33% mortality rates and 11.5, 9.5, and 6 median organ support-free days among survivors). The median adjusted odds ratio and bayesian probability of superiority were 1.43 (95% credible interval, 0.91-2.27) and 93% for fixed-dose hydrocortisone, respectively, and were 1.22 (95% credible interval, 0.76-1.94) and 80% for shock-dependent hydrocortisone compared with no hydrocortisone. Serious adverse events were reported in 4 (3%), 5 (3%), and 1 (1%) patients in the fixed-dose, shock-dependent, and no hydrocortisone groups, respectively. Conclusions and Relevance: Among patients with severe COVID-19, treatment with a 7-day fixed-dose course of hydrocortisone or shock-dependent dosing of hydrocortisone, compared with no hydrocortisone, resulted in 93% and 80% probabilities of superiority with regard to the odds of improvement in organ support-free days within 21 days. However, the trial was stopped early and no treatment strategy met prespecified criteria for statistical superiority, precluding definitive conclusions. Trial Registration: ClinicalTrials.gov Identifier: NCT02735707
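The primary end point above is a composite ordinal score: days alive and free of ICU-based organ support within 21 days, with death assigned -1 so it ranks below every surviving outcome (which is why a median of 0 can coexist with 30% mortality). A minimal sketch of that coding rule, with hypothetical function and parameter names rather than the trial's actual code:

```python
def organ_support_free_days(died, days_on_support, horizon=21):
    """Composite ordinal endpoint in the style described by the trial:
    days alive and free of ICU-based organ support within the horizon,
    with death assigned -1 so it is the worst possible value."""
    if died:
        return -1
    return max(horizon - days_on_support, 0)
```

A survivor who needed 6 days of support scores 15; any patient who died scores -1 regardless of support received.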
BioEarth: Envisioning and developing a new regional earth system model to inform natural and agricultural resource management
As managers of agricultural and natural resources are confronted with uncertainties in global change impacts, the complexities associated with the interconnected cycling of nitrogen, carbon, and water present daunting management challenges. Existing models provide detailed information on specific sub-systems (e.g., land, air, water, and economics). An increasing awareness of the unintended consequences of management decisions resulting from the interconnectedness of these sub-systems, however, necessitates coupled regional earth system models (EaSMs). Decision makers’ needs and priorities can be integrated into the model design and development processes to enhance the decision-making relevance and “usability” of EaSMs. BioEarth is a research initiative currently under development, with a focus on the U.S. Pacific Northwest region, that explores the coupling of multiple stand-alone EaSMs to generate usable information for resource decision-making. Direct engagement between model developers and non-academic stakeholders involved in resource and environmental management decisions throughout the model development process is a critical component of this effort. BioEarth uses a bottom-up approach for its land surface model that preserves fine spatial-scale sensitivities and lateral hydrologic connectivity, a feature that distinguishes it from many regional EaSMs. This paper describes the BioEarth initiative and highlights opportunities and challenges associated with coupling multiple stand-alone models to generate usable information for agricultural and natural resource decision-making.
Viral coinfections in hospitalized coronavirus disease 2019 patients recruited to the international severe acute respiratory and emerging infections consortium WHO clinical characterisation protocol UK study
Background
We conducted this study to assess the prevalence of viral coinfection in a well characterized cohort of hospitalized coronavirus disease 2019 (COVID-19) patients and to investigate the impact of coinfection on disease severity.
Methods
Multiplex real-time polymerase chain reaction testing for endemic respiratory viruses was performed on upper respiratory tract samples from 1002 patients with COVID-19, aged <1 to 102 years, recruited to the International Severe Acute Respiratory and Emerging Infections Consortium WHO Clinical Characterisation Protocol UK study. Comprehensive demographic, clinical, and outcome data were collected prospectively up to 28 days post-discharge.
Results
A coinfecting virus was detected in 20 (2.0%) participants. Multivariable analysis revealed no significant risk factors for coinfection, although this may reflect the rarity of coinfection. Likewise, ordinal logistic regression analysis did not demonstrate a significant association between coinfection and increased disease severity.
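Ordinal logistic regression of the kind used here is typically a proportional-odds model: the ordered severity scale is dichotomised at each threshold, and the model assumes a single common odds ratio across all the cuts. A stdlib sketch of that underlying idea, with entirely hypothetical severity scores (it is not the study's analysis, which would also adjust for covariates and fit the common odds ratio by maximum likelihood):

```python
def cumulative_odds_ratios(severity_group_a, severity_group_b, levels):
    """Proportional-odds view of an ordinal outcome: dichotomise the
    severity scale at each threshold and compute the odds ratio of
    being at or above it; the model assumes these ratios share one
    common value across cuts."""
    def odds(outcomes, cut):
        at_or_above = sum(1 for s in outcomes if s >= cut)
        below = len(outcomes) - at_or_above
        return at_or_above / below
    return [odds(severity_group_a, cut) / odds(severity_group_b, cut)
            for cut in levels[1:]]
```

Inspecting how stable these per-cut ratios are is a quick informal check of the proportional-odds assumption before fitting the full model.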
Conclusions
Viral coinfection was rare among hospitalized COVID-19 patients in the United Kingdom during the first 18 months of the pandemic. With unbiased prospective sampling, we found no evidence of an association between viral coinfection and disease severity. Public health interventions disrupted the normal seasonal transmission of respiratory viruses; the relaxation of these measures means it will be important to monitor the prevalence and impact of respiratory viral coinfections going forward.