24 research outputs found

    Drivers of understory plant communities in Sierra Nevada mixed conifer forests with pyrodiversity

    Background: Fire suppression in western North America increased and homogenized overstory cover in conifer forests, which likely affected understory plant communities. We sought to characterize understory plant communities and their drivers using plot-based observations from two contemporary reference sites in the Sierra Nevada, USA. These sites have long-established natural fire programs, which have restored natural fire regimes. In this study, we investigated how pyrodiversity—the diversity of fire size, severity, season, and frequency—and other environmental factors influence the species composition and cover of forest understory plant communities. Results: Understory plant communities were influenced by a combination of environmental factors, plot-scale recent fire history, and plot-neighborhood pyrodiversity within 50 m. Canopy cover was inversely related to understory plant cover, Simpson’s diversity, and evenness. Species richness was strongly influenced by the interaction of plot-scale fire history and plot-neighborhood pyrodiversity within 50 m. Conclusions: Pyrodiversity appears to contribute both directly and indirectly to diverse understory plant communities in Sierra Nevada mixed conifer forests. The indirect influence is mediated through variability in tree canopy cover, which is partially related to variation in fire severity, while the direct influence is an interaction between local and neighborhood fire activity.

    Fire, water, and biodiversity in the Sierra Nevada: A possible triple win

    Reducing the risk of large, severe wildfires while also increasing the security of mountain water supplies and enhancing biodiversity are urgent priorities in western US forests. After a century of fire suppression, Yosemite and Sequoia-Kings Canyon National Parks, located in California’s Sierra Nevada, initiated programs to manage wildfires, and these areas now present a rare opportunity to study the effects of restored fire regimes. Forest cover decreased during the managed wildfire period while meadow and shrubland cover increased, especially in Yosemite’s Illilouette Creek basin, which experienced a 20% reduction in forest area. These areas now support greater pyrodiversity and, consequently, greater landscape and species diversity. Soil moisture increased and drought-induced tree mortality decreased, especially in Illilouette, where wildfires have been allowed to burn more freely, resulting in a 30% increase in summer soil moisture. Modeling suggests that the ecohydrological co-benefits of restoring fire regimes are robust to projected climatic warming. Support will be needed from the highest levels of government and from the public to maintain existing programs and expand them to other forested areas.

    South Georgia marine productivity over the past 15 ka and implications for glacial evolution

    The subantarctic islands of South Georgia are located in the Southern Ocean, and they may be sensitive to future climate warming. However, due to a lack of well-dated subantarctic palaeoclimate archives, there is still uncertainty about South Georgia’s response to past climate change. Here, we reconstruct primary productivity changes and infer Holocene glacial evolution by analysing two marine gravity cores: one near Cumberland Bay on the inner South Georgia shelf (GC673: ca. 9.5 to 0.3 cal. kyr BP) and one offshore of Royal Bay on the mid-shelf (GC666: ca. 15.2 cal. kyr BP to present). We identify three distinct benthic foraminiferal assemblages characterised by the dominance of Miliammina earlandi, Fursenkoina fusiformis, and Cassidulinoides parkerianus that are considered alongside foraminiferal stable isotopes and the organic carbon and biogenic silica accumulation rates of the host sediment. The M. earlandi assemblage is prevalent during intervals of dissolution in GC666 and reduced productivity in GC673. The F. fusiformis assemblage coincides with enhanced productivity in both cores. Our multiproxy analysis provides evidence that the latest Pleistocene to earliest Holocene (ca. 15.2 to 10.5 cal. kyr BP) was a period of high productivity associated with increased glacial meltwater discharge. The mid–late Holocene (ca. 8 to 1 cal. kyr BP), coinciding with a fall in sedimentation rates and lower productivity, was likely a period of reduced glacial extent but with several short-lived episodes of increased productivity from minor glacial readvances. The latest Holocene (from ca. 1 cal. kyr BP) saw an increase in productivity and glacial advance associated with cooling temperatures and increased precipitation, which may have been influenced by changes in the southwesterly winds over South Georgia. We interpret the elevated relative abundance of F. fusiformis as a proxy for increased primary productivity which, at proximal site GC673, was forced by terrestrial runoff associated with the spring–summer melting of glaciers in Cumberland Bay. Our study refines the glacial history of South Georgia and provides a more complete record of mid–late Holocene glacial readvances with robust chronology. Our results suggest that South Georgia glaciers were sensitive to modest climate changes within the Holocene.

    Circulating microRNAs in sera correlate with soluble biomarkers of immune activation but do not predict mortality in ART treated individuals with HIV-1 infection: A case control study

    Introduction: The use of antiretroviral therapy (ART) has dramatically reduced HIV-1-associated morbidity and mortality. However, HIV-1-infected individuals have increased rates of morbidity and mortality compared to the non-HIV-1-infected population, and this appears to be related to end-organ diseases collectively referred to as Serious Non-AIDS Events (SNAEs). Circulating miRNAs have been reported as promising biomarkers for a number of human disease conditions, including those that constitute SNAEs. Our study sought to investigate the potential of selected miRNAs in predicting mortality in HIV-1-infected, ART-treated individuals. Materials and Methods: A set of miRNAs was chosen based on published associations with human disease conditions that constitute SNAEs. This case-control study compared 126 cases (individuals who died whilst on therapy) and 247 matched controls (individuals who remained alive). Cases and controls were ART-treated participants of two pivotal HIV-1 trials. The relative abundance of each miRNA in serum was measured by RT-qPCR. Associations with mortality (all-cause, cardiovascular and malignancy) were assessed by logistic regression analysis. Correlations between miRNAs and CD4+ T cell count, hs-CRP, IL-6 and D-dimer were also assessed. Results: None of the selected miRNAs was associated with all-cause, cardiovascular or malignancy mortality. The levels of three miRNAs (miR-21, miR-122 and miR-200a) correlated with IL-6, while miR-21 also correlated with D-dimer. Additionally, the abundance of miR-31, miR-150 and miR-223 correlated with baseline CD4+ T cell count, while the same three miRNAs plus miR-145 correlated with nadir CD4+ T cell count. Discussion: No associations with mortality were found for any circulating miRNA studied. These results cast doubt on the effectiveness of circulating miRNAs as early predictors of mortality or of the major underlying diseases that contribute to mortality in participants treated for HIV-1 infection.

    Mortality and pulmonary complications in patients undergoing surgery with perioperative SARS-CoV-2 infection: an international cohort study

    Background: The impact of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) on postoperative recovery needs to be understood to inform clinical decision making during and after the COVID-19 pandemic. This study reports 30-day mortality and pulmonary complication rates in patients with perioperative SARS-CoV-2 infection. Methods: This international, multicentre, cohort study at 235 hospitals in 24 countries included all patients undergoing surgery who had SARS-CoV-2 infection confirmed within 7 days before or 30 days after surgery. The primary outcome measure was 30-day postoperative mortality and was assessed in all enrolled patients. The main secondary outcome measure was pulmonary complications, defined as pneumonia, acute respiratory distress syndrome, or unexpected postoperative ventilation. Findings: This analysis includes 1128 patients who had surgery between Jan 1 and March 31, 2020, of whom 835 (74·0%) had emergency surgery and 280 (24·8%) had elective surgery. SARS-CoV-2 infection was confirmed preoperatively in 294 (26·1%) patients. 30-day mortality was 23·8% (268 of 1128). Pulmonary complications occurred in 577 (51·2%) of 1128 patients; 30-day mortality in these patients was 38·0% (219 of 577), accounting for 81·7% (219 of 268) of all deaths. In adjusted analyses, 30-day mortality was associated with male sex (odds ratio 1·75 [95% CI 1·28–2·40], p<0·0001), age 70 years or older versus younger than 70 years (2·30 [1·65–3·22], p<0·0001), American Society of Anesthesiologists grades 3–5 versus grades 1–2 (2·35 [1·57–3·53], p<0·0001), malignant versus benign or obstetric diagnosis (1·55 [1·01–2·39], p=0·046), emergency versus elective surgery (1·67 [1·06–2·63], p=0·026), and major versus minor surgery (1·52 [1·01–2·31], p=0·047). Interpretation: Postoperative pulmonary complications occur in half of patients with perioperative SARS-CoV-2 infection and are associated with high mortality. Thresholds for surgery during the COVID-19 pandemic should be higher than during normal practice, particularly in men aged 70 years and older. Consideration should be given to postponing non-urgent procedures and promoting non-operative treatment to delay or avoid the need for surgery. Funding: National Institute for Health Research (NIHR), Association of Coloproctology of Great Britain and Ireland, Bowel and Cancer Research, Bowel Disease Research Foundation, Association of Upper Gastrointestinal Surgeons, British Association of Surgical Oncology, British Gynaecological Cancer Society, European Society of Coloproctology, NIHR Academy, Sarcoma UK, Vascular Society for Great Britain and Ireland, and Yorkshire Cancer Research.
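    The adjusted odds ratios above come from multivariable logistic regression. As a minimal illustration of the underlying quantity, the sketch below computes an unadjusted odds ratio with a 95% Wald confidence interval from a 2×2 table; the counts are hypothetical placeholders, not the study's data:

    ```python
    import math

    def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
        """Odds ratio and 95% Wald CI for a 2x2 table:
        a = exposed cases, b = exposed non-cases,
        c = unexposed cases, d = unexposed non-cases."""
        or_ = (a * d) / (b * c)
        # Standard error of log(OR) from the cell counts
        se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lower = math.exp(math.log(or_) - z * se_log)
        upper = math.exp(math.log(or_) + z * se_log)
        return or_, lower, upper

    # Illustrative counts only: 10/30 exposed vs 5/35 unexposed had the outcome
    print(odds_ratio_ci(10, 20, 5, 30))
    ```

    An adjusted analysis such as the one reported here would instead fit all covariates (sex, age, ASA grade, diagnosis, urgency, extent of surgery) jointly in a logistic model, but the interpretation of each odds ratio and its CI is the same.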

    Observation of gravitational waves from the coalescence of a 2.5−4.5 M⊙ compact object and a neutron star


    Reducing the environmental impact of surgery on a global scale: systematic review and co-prioritization with healthcare workers in 132 countries

    Abstract Background: Healthcare cannot achieve net-zero carbon without addressing operating theatres. The aim of this study was to prioritize feasible interventions to reduce the environmental impact of operating theatres. Methods: This study adopted a four-phase Delphi consensus co-prioritization methodology. In phase 1, a systematic review of published interventions and a global consultation of perioperative healthcare professionals were used to longlist interventions. In phase 2, iterative thematic analysis consolidated comparable interventions into a shortlist. In phase 3, the shortlist was co-prioritized based on patient and clinician views on acceptability, feasibility, and safety. In phase 4, ranked lists of interventions were presented by their relevance to high-income countries and low–middle-income countries. Results: In phase 1, 43 interventions were identified, which had low uptake in practice according to 3042 professionals globally. In phase 2, a shortlist of 15 intervention domains was generated. In phase 3, interventions were deemed acceptable by more than 90 per cent of patients, except for reducing general anaesthesia (84 per cent) and re-sterilization of ‘single-use’ consumables (86 per cent). In phase 4, the top three shortlisted interventions for high-income countries were: introducing recycling; reducing use of anaesthetic gases; and appropriate clinical waste processing. The top three shortlisted interventions for low–middle-income countries were: introducing reusable surgical devices; reducing use of consumables; and reducing the use of general anaesthesia. Conclusion: This is a step toward environmentally sustainable operating environments, with actionable interventions applicable to both high-income and low–middle-income countries.

    Development and Validation of a Risk Score for Chronic Kidney Disease in HIV Infection Using Prospective Cohort Data from the D:A:D Study

    Ristola M. is a member of the D:A:D Study Group, the Royal Free Hospital Clinic Cohort, the INSIGHT Study Group, the SMART Study Group, and the ESPRIT Study Group. Background: Chronic kidney disease (CKD) is a major health issue for HIV-positive individuals, associated with increased morbidity and mortality. Development and implementation of a risk score model for CKD would allow comparison of the risks and benefits of adding potentially nephrotoxic antiretrovirals to a treatment regimen and would identify those at greatest risk of CKD. The aims of this study were to develop a simple, externally validated, and widely applicable long-term risk score model for CKD in HIV-positive individuals that can guide decision making in clinical practice. Methods and Findings: A total of 17,954 HIV-positive individuals from the Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) study with >= 3 estimated glomerular filtration rate (eGFR) values after 1 January 2004 were included. Baseline was defined as the first eGFR > 60 ml/min/1.73 m2 after 1 January 2004; individuals with exposure to tenofovir, atazanavir, atazanavir/ritonavir, lopinavir/ritonavir, or other boosted protease inhibitors before baseline were excluded. CKD was defined as confirmed (> 3 mo apart) eGFR <= 60 ml/min/1.73 m2. In the D:A:D study, 641 individuals developed CKD during 103,185 person-years of follow-up (PYFU; incidence 6.2/1,000 PYFU, 95% CI 5.7-6.7; median follow-up 6.1 y, range 0.3-9.1 y). Older age, intravenous drug use, hepatitis C coinfection, lower baseline eGFR, female gender, lower CD4 count nadir, hypertension, diabetes, and cardiovascular disease (CVD) predicted CKD. The adjusted incidence rate ratios of these nine categorical variables were scaled and summed to create the risk score. The median risk score at baseline was -2 (interquartile range -4 to 2). There was a 1:393 chance of developing CKD in the next 5 y in the low risk group, with correspondingly higher risks in the medium and high risk groups (risk score >= 5, 505 events). Number needed to harm (NNTH) at 5 y when starting unboosted atazanavir or lopinavir/ritonavir among those with a low risk score was 1,702 (95% CI 1,166-3,367); NNTH was 202 (95% CI 159-278) and 21 (95% CI 19-23), respectively, for those with a medium and high risk score. NNTH was 739 (95% CI 506-1,462), 88 (95% CI 69-121), and 9 (95% CI 8-10) for those with a low, medium, and high risk score, respectively, starting tenofovir, atazanavir/ritonavir, or another boosted protease inhibitor. The Royal Free Hospital Clinic Cohort included 2,548 individuals, of whom 94 (3.7%) developed CKD during 18,376 PYFU (median follow-up 7.4 y, range 0.3-12.7 y). Of 2,013 individuals included from the SMART/ESPRIT control arms, 32 (1.6%) developed CKD during 8,452 PYFU (median follow-up 4.1 y, range 0.6-8.1 y). External validation showed that the risk score predicted well in these cohorts. Limitations of this study included limited data on race and no information on proteinuria. Conclusions: Both traditional and HIV-related risk factors were predictive of CKD. These factors were used to develop an externally validated risk score for CKD in HIV infection that has direct clinical relevance for patients and clinicians to weigh the benefits of certain antiretrovirals against the risk of CKD and to identify those at greatest risk of CKD.
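    The scale-and-sum score construction and the NNTH figures reported above can be sketched as follows. The point weights here are hypothetical placeholders, not the published D:A:D weights; NNTH is simply the reciprocal of the absolute risk increase between exposed and unexposed groups:

    ```python
    # Hypothetical point weights (NOT the published D:A:D weights): each
    # categorical risk factor contributes scaled points, and a patient's
    # score is the sum of the points for the factors they carry.
    POINTS = {
        "older_age": 4,
        "ivdu": 2,
        "hcv_coinfection": 2,
        "low_baseline_egfr": 6,
        "female": 1,
        "low_cd4_nadir": 2,
        "hypertension": 2,
        "diabetes": 3,
        "cvd": 3,
    }

    def risk_score(factors: set) -> int:
        """Sum the scaled points for a patient's risk factors."""
        return sum(POINTS[f] for f in factors)

    def nnth(risk_exposed: float, risk_unexposed: float) -> float:
        """Number needed to harm: patients exposed per one extra CKD
        event, i.e. the reciprocal of the absolute risk increase."""
        return 1.0 / (risk_exposed - risk_unexposed)

    score = risk_score({"hypertension", "diabetes"})
    # Illustrative 5-y risks: 0.5% on a nephrotoxic regimen vs 0.1% off it
    print(score, nnth(0.005, 0.001))
    ```

    With these illustrative risks the NNTH is 250, meaning roughly one additional CKD event per 250 patients started on the regimen; the study's stratified NNTH values (1,702 down to 9) reflect how sharply that trade-off worsens as the baseline risk score rises.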