
    Management Effects on Greenhouse Gas Dynamics in Fen Ditches

    Globally, large areas of peatland have been drained through the digging of ditches, generally to increase agricultural production. By lowering the water table, drainage is often assumed to reduce landscape-scale emissions of methane (CH4) into the atmosphere to negligible levels. However, drainage ditches themselves are known sources of CH4 and other greenhouse gases (GHGs), but emissions data are scarce, particularly for carbon dioxide (CO2) and nitrous oxide (N2O), and show high spatial and temporal variability. Here, we report dissolved GHGs and diffusive fluxes of CH4 and CO2 from ditches at three UK lowland fens under different management: semi-natural fen, cropland, and cropland restored to low-intensity grassland. Ditches at all three fens emitted GHGs to the atmosphere, but fluxes and dissolved GHGs varied extensively both seasonally and within sites. CH4 fluxes were particularly large, with medians peaking at all three sites in August at 120-230 mg m-2 d-1. Significant differences were detected between the cropland and the other two sites for CO2 flux and all three dissolved GHGs, suggesting that intensive agriculture has major effects on ditch biogeochemistry. Multiple regression models using environmental and water chemistry data explained 29-59% of the observed variation in dissolved GHGs. Annual CH4 fluxes from the ditches were 37.8, 18.3 and 27.2 g CH4 m-2 yr-1 for the semi-natural fen, grassland and cropland, respectively, and annual CO2 fluxes were similar among sites (1100 to 1440 g CO2 m-2 yr-1). We suggest that fen ditches are important contributors to landscape-scale GHG emissions, particularly for CH4. Ditch emissions should be included in GHG budgets of human-modified fens, particularly where drainage has removed the original terrestrial CH4 source, e.g. agricultural peatlands.
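    As a quick plausibility check on the units above, the sketch below converts the reported August median CH4 fluxes (mg m-2 d-1) to annualised figures (g m-2 yr-1). This is illustrative arithmetic only: the annual totals quoted in the abstract come from the study's own integration over seasonal measurements, not from scaling a single monthly median.

```python
# Illustrative unit conversion only (not from the paper): the annual
# fluxes quoted above come from the study's integration over seasonal
# measurements, not from scaling a single monthly median like this.

MG_PER_G = 1000.0
DAYS_PER_YEAR = 365.0

def daily_to_annual(flux_mg_m2_d: float) -> float:
    """Convert a CH4 flux from mg m-2 d-1 to g m-2 yr-1."""
    return flux_mg_m2_d * DAYS_PER_YEAR / MG_PER_G

for peak in (120.0, 230.0):  # range of August median fluxes reported above
    print(f"{peak:.0f} mg m-2 d-1 -> {daily_to_annual(peak):.1f} g m-2 yr-1")
```

    Sustained year-round, the August medians would imply roughly 44-84 g CH4 m-2 yr-1, well above the reported annual totals of 18.3-37.8 g CH4 m-2 yr-1, which is consistent with August being the seasonal peak.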

    Overriding water table control on managed peatland greenhouse gas emissions

    Global peatlands store more carbon than is naturally present in the atmosphere [1,2]. However, many peatlands are under pressure from drainage-based agriculture, plantation development and fire, with the equivalent of around 3% of all anthropogenic greenhouse gases emitted from drained peatland [3–5]. Efforts to curb such emissions are intensifying through the conservation of undrained peatlands and rewetting of drained systems [6]. Here we report CO2 eddy covariance data from 16 locations and CH4 data from 41 locations in the British Isles, and combine them with published data from sites across all major peatland biomes. We find that the mean annual effective water-table depth (WTDe; that is, the average depth of the aerated peat layer) overrides all other ecosystem- and management-related controls on greenhouse gas fluxes. We estimate that every 10 cm of reduction in WTDe could reduce the net warming impact of CO2 and CH4 emissions (100-year Global Warming Potentials) by at least 3 t CO2e ha-1 yr-1, until WTDe is < 30 cm. Raising water levels further would continue to have a net cooling effect until WTDe is < 10 cm. Our results suggest that greenhouse gas emissions from peatlands drained for agriculture could be greatly reduced without necessarily halting their productive use. Halving WTDe in all drained agricultural peatlands, for example, could reduce emissions by the equivalent of over 1% of global anthropogenic emissions.
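    To make the headline numbers concrete, here is a minimal sketch of the quoted relationship, assuming a conservative reading of the abstract: a saving of at least 0.3 t CO2e ha-1 yr-1 per cm of water-table rise while WTDe exceeds 30 cm, with the further but unquantified benefit below 30 cm counted as zero. The function name and the 60 cm example depth are hypothetical, not figures from the paper.

```python
# Minimal sketch of the abstract's headline relationship; the function
# name and the 60 cm example depth are illustrative assumptions.

RATE_T_CO2E_PER_CM = 0.3      # "at least 3 t CO2e ha-1 yr-1" per 10 cm
QUANTIFIED_FLOOR_CM = 30.0    # quoted rate applies only while WTDe > 30 cm

def min_annual_saving(wtde_from_cm: float, wtde_to_cm: float) -> float:
    """Lower-bound saving (t CO2e ha-1 yr-1) from raising the water table,
    i.e. reducing WTDe from wtde_from_cm to wtde_to_cm.

    Only the portion of the reduction above the 30 cm floor carries the
    quoted rate; benefits below 30 cm are real but unquantified in the
    abstract, so they are conservatively counted as zero here."""
    quantified_span = max(0.0, wtde_from_cm - max(wtde_to_cm, QUANTIFIED_FLOOR_CM))
    return quantified_span * RATE_T_CO2E_PER_CM

# Hypothetical example: halving WTDe on a field drained to 60 cm.
print(min_annual_saving(60.0, 30.0))  # -> 9.0 t CO2e ha-1 yr-1, at minimum
```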

    Eco-evolutionary dynamics on deformable fitness landscapes

    Conventional approaches to modelling ecological dynamics often do not include evolutionary changes in the genetic makeup of component species and, conversely, conventional approaches to modelling evolutionary change often do not include ecological dynamics. Recently, however, there has been considerable interest in understanding evolutionary and ecological dynamics as coupled processes. In the context of complex multi-species ecosystems, especially where ecological and evolutionary timescales are similar, it is difficult to identify general organising principles that help us understand the structure and behaviour of complex ecosystems. Here we introduce a simple abstraction of coevolutionary interactions in a multi-species ecosystem. We model non-trophic ecological interactions based on a continuous but low-dimensional trait/niche space, where the location of each species in trait space determines the overlap of its resource utilisation with that of other species. The local depletion of available resources creates, in effect, a deformable fitness landscape that governs how the evolution of one species affects the selective pressures on other species. This enables us to study the coevolution of ecological interactions in an intuitive and easily visualisable manner. We observe that this model can exhibit either of the two behavioural modes discussed in the literature: evolutionary stasis, or Red Queen dynamics, i.e., continued evolutionary change. Which mode is observed depends on the lag, or latency, between the movement of a species in trait space and its effect on available resources. Specifically, if ecological change is nearly instantaneous compared with evolutionary change, stasis results; conversely, if evolutionary timescales are close to ecological timescales, such that resource depletion is not instantaneous on evolutionary timescales, Red Queen dynamics result. We also observe that in the stasis mode the ecosystem's overall utilisation of resources is relatively efficient, with diverse species occupying different niches, whereas in the Red Queen mode species tend to clump together, competing for overlapping resources. These models thereby suggest basic conditions that influence the organisation of inter-species interactions and the balance of individual and collective adaptation in ecosystems, as well as factors that might be useful in engineering artificial coevolution.
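    The mechanism described here lends itself to a compact simulation. The sketch below is a minimal illustration of this kind of model, not the authors' implementation: species are points on a periodic 1-D trait axis, each depletes resources under a Gaussian utilisation kernel, and each hill-climbs the local gradient of available (undepleted) resource. A single `lag` parameter controls how slowly depletion tracks species positions. All names and parameter values are illustrative assumptions.

```python
# Toy deformable-fitness-landscape model; an illustrative sketch of the
# mechanism described in the abstract, not the authors' implementation.
import numpy as np

rng = np.random.default_rng(0)
n_species, n_bins, width = 10, 200, 5.0     # species, trait bins, kernel width
x = np.arange(n_bins)
traits = rng.uniform(0, n_bins, n_species)  # species positions in trait space

def footprint(positions):
    """Summed Gaussian resource-utilisation footprint on a periodic axis."""
    d = np.zeros(n_bins)
    for p in positions:
        dist = np.minimum(np.abs(x - p), n_bins - np.abs(x - p))
        d += np.exp(-0.5 * (dist / width) ** 2)
    return d

def step(traits, depletion, lag, step_size=20.0):
    # Ecology: depletion relaxes toward the current footprint; a large lag
    # means resource drawdown responds slowly to species movement.
    depletion = depletion + (footprint(traits) - depletion) / lag
    available = 1.0 - depletion / depletion.max()
    # Evolution: each species climbs the available-resource gradient.
    grad = (np.roll(available, -1) - np.roll(available, 1)) / 2.0
    traits = (traits + step_size * grad[traits.astype(int)]) % n_bins
    return traits, depletion

for lag in (1.0, 50.0):  # near-instant vs sluggish ecological feedback
    t, d = traits.copy(), np.zeros(n_bins)
    moved = []
    for i in range(3000):
        t_new, d = step(t, d, lag)
        if i >= 2500:  # measure late-time movement, after transients
            diff = np.abs(t_new - t)
            moved.append(np.minimum(diff, n_bins - diff).mean())
        t = t_new
    print(f"lag={lag:5.1f}  mean movement per step (late): {np.mean(moved):.3f}")
```

    In this toy setup one would expect small lag values to produce stasis (depletion equals the footprint almost immediately, so species disperse into spaced niches and late-time movement collapses toward zero), while large lag values let each species keep outrunning the hole it digs behind itself, sustaining Red Queen-style movement.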

    Procalcitonin Is Not a Reliable Biomarker of Bacterial Coinfection in People With Coronavirus Disease 2019 Undergoing Microbiological Investigation at the Time of Hospital Admission

    Admission procalcitonin measurements and microbiology results were available for 1040 hospitalized adults with coronavirus disease 2019 (from 48 902 included in the International Severe Acute Respiratory and Emerging Infections Consortium World Health Organization Clinical Characterisation Protocol UK study). Although procalcitonin was higher in bacterial coinfection, the difference was neither clinically significant (median [IQR], 0.33 [0.11–1.70] ng/mL vs 0.24 [0.10–0.90] ng/mL) nor diagnostically useful (area under the receiver operating characteristic curve, 0.56 [95% confidence interval, .51–.60]).

    Adjunctive rifampicin for Staphylococcus aureus bacteraemia (ARREST): a multicentre, randomised, double-blind, placebo-controlled trial.

    BACKGROUND: Staphylococcus aureus bacteraemia is a common cause of severe community-acquired and hospital-acquired infection worldwide. We tested the hypothesis that adjunctive rifampicin would reduce bacteriologically confirmed treatment failure or disease recurrence, or death, by enhancing early S aureus killing, sterilising infected foci and blood faster, and reducing risks of dissemination and metastatic infection. METHODS: In this multicentre, randomised, double-blind, placebo-controlled trial, adults (≥18 years) with S aureus bacteraemia who had received ≤96 h of active antibiotic therapy were recruited from 29 UK hospitals. Patients were randomly assigned (1:1) via a computer-generated sequential randomisation list to receive 2 weeks of adjunctive rifampicin (600 mg or 900 mg per day according to weight, oral or intravenous) versus identical placebo, together with standard antibiotic therapy. Randomisation was stratified by centre. Patients, investigators, and those caring for the patients were masked to group allocation. The primary outcome was time to bacteriologically confirmed treatment failure or disease recurrence, or death (all-cause), from randomisation to 12 weeks, adjudicated by an independent review committee masked to the treatment. Analysis was intention to treat. This trial was registered, number ISRCTN37666216, and is closed to new participants. FINDINGS: Between Dec 10, 2012, and Oct 25, 2016, 758 eligible participants were randomly assigned: 370 to rifampicin and 388 to placebo. 485 (64%) participants had community-acquired S aureus infections, and 132 (17%) had nosocomial S aureus infections. 47 (6%) had meticillin-resistant infections. 301 (40%) participants had an initial deep infection focus. Standard antibiotics were given for a median of 29 days (IQR 18-45); 619 (82%) participants received flucloxacillin. By week 12, 62 (17%) participants who received rifampicin versus 71 (18%) who received placebo experienced treatment failure or disease recurrence, or died (absolute risk difference -1·4%, 95% CI -7·0 to 4·3; hazard ratio 0·96, 0·68-1·35, p=0·81). From randomisation to 12 weeks, no evidence of differences in serious (p=0·17) or grade 3-4 (p=0·36) adverse events was observed; however, 63 (17%) participants in the rifampicin group versus 39 (10%) in the placebo group had antibiotic or trial drug-modifying adverse events (p=0·004), and 24 (6%) versus six (2%) had drug interactions (p=0·0005). INTERPRETATION: Adjunctive rifampicin provided no overall benefit over standard antibiotic therapy in adults with S aureus bacteraemia. FUNDING: UK National Institute for Health Research Health Technology Assessment

    Implementation of corticosteroids in treating COVID-19 in the ISARIC WHO Clinical Characterisation Protocol UK: prospective observational cohort study

    BACKGROUND: Dexamethasone was the first intervention proven to reduce mortality in patients with COVID-19 being treated in hospital. We aimed to evaluate the adoption of corticosteroids in the treatment of COVID-19 in the UK after the RECOVERY trial publication on June 16, 2020, and to identify discrepancies in care. METHODS: We did an audit of clinical implementation of corticosteroids in a prospective, observational, cohort study in 237 UK acute care hospitals between March 16, 2020, and April 14, 2021, restricted to patients aged 18 years or older with proven or high likelihood of COVID-19 who received supplementary oxygen. The primary outcome was administration of dexamethasone, prednisolone, hydrocortisone, or methylprednisolone. This study is registered with ISRCTN, ISRCTN66726260. FINDINGS: Between June 17, 2020, and April 14, 2021, 47 795 (75·2%) of 63 525 patients on supplementary oxygen received corticosteroids; rates were higher among patients requiring critical care than among those receiving ward care (11 185 [86·6%] of 12 909 vs 36 415 [72·4%] of 50 278). Patients 50 years or older were significantly less likely to receive corticosteroids than those younger than 50 years (adjusted odds ratio 0·79 [95% CI 0·70–0·89], p=0·0001, for 70–79 years; 0·52 [0·46–0·58], p<0·0001, for >80 years), independent of patient demographics and illness severity. 84 (54·2%) of 155 pregnant women received corticosteroids. Rates of corticosteroid administration increased from 27·5% in the week before June 16, 2020, to 75–80% in January, 2021. INTERPRETATION: Implementation of corticosteroids into clinical practice in the UK for patients with COVID-19 has been successful, but not universal. Patients older than 70 years, independent of illness severity, chronic neurological disease, and dementia, were less likely to receive corticosteroids than those who were younger, as were pregnant women. This could reflect appropriate clinical decision making, but the possibility of inequitable access to life-saving care should be considered. FUNDING: UK National Institute for Health Research and UK Medical Research Council

    Impact of COVID-19 on cardiovascular testing in the United States versus the rest of the world

    Objectives: This study sought to quantify and compare the decline in volumes of cardiovascular procedures between the United States and non-U.S. institutions during the early phase of the coronavirus disease 2019 (COVID-19) pandemic. Background: The COVID-19 pandemic has disrupted the care of many non-COVID-19 illnesses. Reductions in diagnostic cardiovascular testing around the world have led to concerns over the implications of reduced testing for cardiovascular disease (CVD) morbidity and mortality. Methods: Data were submitted to the INCAPS-COVID (International Atomic Energy Agency Non-Invasive Cardiology Protocols Study of COVID-19), a multinational registry comprising 909 institutions in 108 countries (including 155 facilities in 40 U.S. states), assessing the impact of the COVID-19 pandemic on volumes of diagnostic cardiovascular procedures. Data were obtained for April 2020 and compared with baseline procedure volumes from March 2019. We compared laboratory characteristics, practices, and procedure volumes between U.S. and non-U.S. facilities and between U.S. geographic regions and identified factors associated with volume reduction in the United States. Results: Reductions in the volumes of procedures in the United States were similar to those in non-U.S. facilities (68% vs. 63%, respectively; p = 0.237), although U.S. facilities reported greater reductions in invasive coronary angiography (69% vs. 53%, respectively; p < 0.001). Significantly more U.S. facilities reported increased use of telehealth and patient screening measures than non-U.S. facilities, such as temperature checks, symptom screenings, and COVID-19 testing. Reductions in volumes of procedures differed between U.S. regions, with larger declines observed in the Northeast (76%) and Midwest (74%) than in the South (62%) and West (44%). Prevalence of COVID-19, staff redeployments, outpatient centers, and urban centers were associated with greater reductions in volume in U.S. facilities in a multivariable analysis. Conclusions: We observed marked reductions in U.S. cardiovascular testing in the early phase of the pandemic and significant variability between U.S. regions. The association between reductions of volumes and COVID-19 prevalence in the United States highlighted the need for proactive efforts to maintain access to cardiovascular testing in areas most affected by outbreaks of COVID-19 infection

    Co-infections, secondary infections, and antimicrobial use in patients hospitalised with COVID-19 during the first pandemic wave from the ISARIC WHO CCP-UK study: a multicentre, prospective cohort study

    Background: Microbiological characterisation of co-infections and secondary infections in patients with COVID-19 is lacking, and antimicrobial use is high. We aimed to describe microbiologically confirmed co-infections and secondary infections, and antimicrobial use, in patients admitted to hospital with COVID-19. Methods: The International Severe Acute Respiratory and Emerging Infections Consortium (ISARIC) WHO Clinical Characterisation Protocol UK (CCP-UK) study is an ongoing, prospective cohort study recruiting inpatients from 260 hospitals in England, Scotland, and Wales, conducted by the ISARIC Coronavirus Clinical Characterisation Consortium. Patients with a confirmed or clinician-defined high likelihood of SARS-CoV-2 infection were eligible for inclusion in the ISARIC WHO CCP-UK study. For this specific study, we excluded patients with a recorded negative SARS-CoV-2 test result and those without a recorded outcome at 28 days after admission. Demographic, clinical, laboratory, therapeutic, and outcome data were collected using a prespecified case report form. Organisms considered clinically insignificant were excluded. Findings: We analysed data from 48 902 patients admitted to hospital between Feb 6 and June 8, 2020. The median patient age was 74 years (IQR 59–84) and 20 786 (42·6%) of 48 765 patients were female. Microbiological investigations were recorded for 8649 (17·7%) of 48 902 patients, with clinically significant COVID-19-related respiratory or bloodstream culture results recorded for 1107 patients. 762 (70·6%) of 1080 infections were secondary, occurring more than 2 days after hospital admission. Staphylococcus aureus and Haemophilus influenzae were the most common pathogens causing respiratory co-infections (diagnosed ≤2 days after admission), with Enterobacteriaceae and S aureus most common in secondary respiratory infections. Bloodstream infections were most frequently caused by Escherichia coli and S aureus. Among patients with available data, 13 390 (37·0%) of 36 145 had received antimicrobials in the community for this illness episode before hospital admission and 39 258 (85·2%) of 46 061 patients with inpatient antimicrobial data received one or more antimicrobials at some point during their admission (highest for patients in critical care). We identified frequent use of broad-spectrum agents and use of carbapenems rather than carbapenem-sparing alternatives. Interpretation: In patients admitted to hospital with COVID-19, microbiologically confirmed bacterial infections are rare, and more likely to be secondary infections. Gram-negative organisms and S aureus are the predominant pathogens. The frequency and nature of antimicrobial use are concerning, but tractable targets for stewardship interventions exist. Funding: National Institute for Health Research (NIHR), UK Medical Research Council, Wellcome Trust, UK Department for International Development, Bill & Melinda Gates Foundation, EU Platform for European Preparedness Against (Re-)emerging Epidemics, NIHR Health Protection Research Unit (HPRU) in Emerging and Zoonotic Infections at University of Liverpool, and NIHR HPRU in Respiratory Infections at Imperial College London.