81 research outputs found

    Can forest management based on natural disturbances maintain ecological resilience?

    Given the increasing global stresses on forests, many ecologists argue that managers must maintain ecological resilience: the capacity of ecosystems to absorb disturbances without undergoing fundamental change. In this review we ask: Can the emerging paradigm of natural-disturbance-based management (NDBM) maintain ecological resilience in managed forests? Applying resilience theory requires careful articulation of the ecosystem state under consideration, the disturbances and stresses that affect the persistence of possible alternative states, and the spatial and temporal scales of management relevance. Implementing NDBM while maintaining resilience means recognizing that (i) biodiversity is important for long-term ecosystem persistence, (ii) natural disturbances play a critical role as generators of structural and compositional heterogeneity at multiple scales, and (iii) traditional management tends to produce forests more homogeneous than those disturbed naturally and increases the likelihood of unexpected catastrophic change by constraining variation of key environmental processes. NDBM may maintain resilience if silvicultural strategies retain the structures and processes that perpetuate desired states while reducing those that enhance the resilience of undesirable states. Such strategies require an understanding of harvesting impacts on slow ecosystem processes, such as seed-bank or nutrient dynamics, which in the long term can lead to ecological surprises by altering the forest's capacity to reorganize after disturbance.

    Global, regional, and national comparative risk assessment of 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks, 1990-2015: A systematic analysis for the Global Burden of Disease Study 2015

    Background: The Global Burden of Diseases, Injuries, and Risk Factors Study 2015 provides an up-to-date synthesis of the evidence for risk factor exposure and the attributable burden of disease. By providing national and subnational assessments spanning the past 25 years, this study can inform debates on the importance of addressing risks in context. Methods: We used the comparative risk assessment framework developed for previous iterations of the Global Burden of Disease Study to estimate attributable deaths, disability-adjusted life-years (DALYs), and trends in exposure by age group, sex, year, and geography for 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks from 1990 to 2015. This study included 388 risk-outcome pairs that met World Cancer Research Fund-defined criteria for convincing or probable evidence. We extracted relative risk and exposure estimates from randomised controlled trials, cohorts, pooled cohorts, household surveys, census data, satellite data, and other sources. We used statistical models to pool data, adjust for bias, and incorporate covariates. We developed a metric, the summary exposure value, that allows comparisons of exposure across risk factors. Using the counterfactual scenario of theoretical minimum risk level, we estimated the portion of deaths and DALYs that could be attributed to a given risk. We decomposed trends in attributable burden into contributions from population growth, population age structure, risk exposure, and risk-deleted cause-specific DALY rates. We characterised risk exposure in relation to a Socio-demographic Index (SDI). Findings: Between 1990 and 2015, global exposure to unsafe sanitation, household air pollution, childhood underweight, childhood stunting, and smoking each decreased by more than 25%. Global exposure for several occupational risks, high body-mass index (BMI), and drug use increased by more than 25% over the same period. All risks jointly evaluated in 2015 accounted for 57·8% (95% CI 56·6–58·8) of global deaths and 41·2% (39·8–42·8) of DALYs. In 2015, the ten largest contributors to global DALYs among Level 3 risks were high systolic blood pressure (211·8 million [192·7 million to 231·1 million] global DALYs), smoking (148·6 million [134·2 million to 163·1 million]), high fasting plasma glucose (143·1 million [125·1 million to 163·5 million]), high BMI (120·1 million [83·8 million to 158·4 million]), childhood undernutrition (113·3 million [103·9 million to 123·4 million]), ambient particulate matter (103·1 million [90·8 million to 115·1 million]), high total cholesterol (88·7 million [74·6 million to 105·7 million]), household air pollution (85·6 million [66·7 million to 106·1 million]), alcohol use (85·0 million [77·2 million to 93·0 million]), and diets high in sodium (83·0 million [49·3 million to 127·5 million]). From 1990 to 2015, attributable DALYs declined for micronutrient deficiencies, childhood undernutrition, unsafe sanitation and water, and household air pollution; reductions in risk-deleted DALY rates rather than reductions in exposure drove these declines. Rising exposure contributed to notable increases in attributable DALYs from high BMI, high fasting plasma glucose, occupational carcinogens, and drug use. Environmental risks and childhood undernutrition declined steadily with SDI; low physical activity, high BMI, and high fasting plasma glucose increased with SDI. In 119 countries, metabolic risks, such as high BMI and fasting plasma glucose, contributed the most attributable DALYs in 2015. Regionally, smoking still ranked among the leading five risk factors for attributable DALYs in 109 countries; childhood underweight and unsafe sex remained primary drivers of early death and disability in much of sub-Saharan Africa. Interpretation: Declines in some key environmental risks have contributed to declines in critical infectious diseases. Some risks appear to be invariant to SDI. Increasing risks, including high BMI, high fasting plasma glucose, drug use, and some occupational exposures, contribute to rising burden from some conditions, but also provide opportunities for intervention. Some highly preventable risks, such as smoking, remain major causes of attributable DALYs, even as exposure is declining. Policymakers need to pay attention to the risks that are increasingly major contributors to global burden. Funding: Bill & Melinda Gates Foundation.
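
    The attributable-burden step described in the methods follows the general comparative risk assessment logic: total burden is scaled by a population attributable fraction (PAF) computed against the theoretical-minimum-risk counterfactual. The following is a generic formulation of that step in standard notation, not the study's own estimation procedure, with P*(x) denoting the counterfactual exposure distribution at the theoretical minimum risk level:

        \mathrm{PAF} = \frac{\int \mathrm{RR}(x)\,P(x)\,dx - \int \mathrm{RR}(x)\,P^{*}(x)\,dx}{\int \mathrm{RR}(x)\,P(x)\,dx},
        \qquad \mathrm{DALYs}_{\mathrm{attributable}} = \mathrm{PAF} \times \mathrm{DALYs}_{\mathrm{total}}

    Here RR(x) is the relative risk at exposure level x and P(x) is the observed exposure distribution; the decomposition mentioned in the methods then apportions changes in attributable burden across population growth, age structure, exposure, and risk-deleted DALY rates.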

    Altered TMPRSS2 usage by SARS-CoV-2 Omicron impacts infectivity and fusogenicity

    The SARS-CoV-2 Omicron BA.1 variant emerged in 2021 [1] and has multiple mutations in its spike protein [2]. Here we show that the spike protein of Omicron has a higher affinity for ACE2 than that of Delta, and a marked change in its antigenicity increases Omicron’s evasion of therapeutic monoclonal and vaccine-elicited polyclonal neutralizing antibodies after two doses. mRNA vaccination as a third vaccine dose rescues and broadens neutralization. Importantly, the antiviral drugs remdesivir and molnupiravir retain efficacy against Omicron BA.1. Replication was similar for Omicron and Delta virus isolates in human nasal epithelial cultures. However, in lung cells and gut cells, Omicron demonstrated lower replication. Omicron spike protein was less efficiently cleaved than Delta spike protein. The differences in replication were mapped to the entry efficiency of the virus on the basis of spike-pseudotyped virus assays. The entry defect of Omicron pseudotyped virus in specific cell types correlated with higher cellular RNA expression of TMPRSS2, and deletion of TMPRSS2 affected Delta entry to a greater extent than Omicron entry. Furthermore, drug inhibitors targeting specific entry pathways [3] demonstrated that the Omicron spike makes inefficient use of the cellular protease TMPRSS2, which promotes cell entry through plasma membrane fusion, and instead depends more heavily on cell entry through the endocytic pathway. Consistent with suboptimal S1/S2 cleavage and inefficient TMPRSS2 use, syncytium formation by the Omicron spike was substantially impaired compared with the Delta spike. The less efficient spike cleavage of Omicron at S1/S2 is associated with a shift in cellular tropism away from TMPRSS2-expressing cells, with implications for altered pathogenesis.

    The Cholecystectomy As A Day Case (CAAD) Score: A Validated Score of Preoperative Predictors of Successful Day-Case Cholecystectomy Using the CholeS Data Set

    Background Day-case surgery is associated with significant patient and cost benefits. However, only 43% of cholecystectomy patients are discharged home the same day. One hypothesis is that day-case cholecystectomy rates, defined as the proportion of patients discharged on the same day as their operation, may be improved by better assessment of patients using standard preoperative variables. Methods Data were extracted from a prospectively collected data set of cholecystectomy patients from 166 UK and Irish hospitals (CholeS). Cholecystectomies performed as elective procedures were divided into main (75%) and validation (25%) data sets. Preoperative predictors were identified, and a risk score for failed day-case surgery was devised using multivariate logistic regression. Receiver operating characteristic (ROC) curve analysis was used to validate the score in the validation data set. Results Of the 7426 elective cholecystectomies performed, 49% were discharged home the same day. Same-day discharge following cholecystectomy was less likely with older patients (OR 0.18, 95% CI 0.15–0.23), higher ASA scores (OR 0.19, 95% CI 0.15–0.23), complicated cholelithiasis (OR 0.38, 95% CI 0.31–0.48), male gender (OR 0.66, 95% CI 0.58–0.74), previous acute gallstone-related admissions (OR 0.54, 95% CI 0.48–0.60) and preoperative endoscopic intervention (OR 0.40, 95% CI 0.34–0.47). The CAAD score was developed using these variables. When applied to the validation subgroup, a CAAD score of ≤5 was associated with an 80.8% successful day-case cholecystectomy rate, compared with 19.2% for a CAAD score >5 (p < 0.001). Conclusions The CAAD score, which utilises data readily available from clinic letters and electronic sources, can predict same-day discharge following cholecystectomy.
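
    The score-building workflow described in the methods (multivariable logistic regression on a 75% development split, coefficients collapsed into an additive integer score, discrimination checked by ROC analysis on the 25% validation split) can be sketched in a few lines. The sketch below is a hypothetical illustration using synthetic data and made-up predictor weights, assuming numpy and scikit-learn; it is not the CholeS data set or the published CAAD coefficients.

        # Hypothetical sketch of a CAAD-style score: fit a logistic model for
        # failed day-case discharge on a development split, round coefficients
        # into an additive integer score, and validate discrimination by ROC.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n = 7426  # matches the cohort size reported above; data are synthetic
        # Binary stand-ins for the six preoperative predictors named above
        X = rng.integers(0, 2, size=(n, 6))
        # Synthetic outcome: 1 = failed day case (illustrative weights only)
        logit = -1.0 + X @ np.array([1.7, 1.6, 1.0, 0.4, 0.6, 0.9])
        y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

        # 75% development / 25% validation split, as in the study design
        X_dev, X_val, y_dev, y_val = train_test_split(
            X, y, test_size=0.25, random_state=0)
        model = LogisticRegression().fit(X_dev, y_dev)

        # Collapse coefficients into integer points and sum them per patient
        points = np.round(model.coef_[0]).astype(int)
        score_val = X_val @ points

        print("validation AUC of integer score:",
              round(roc_auc_score(y_val, score_val), 3))

    A cut-point on the summed score (the study uses CAAD ≤5 versus >5) would then be chosen to separate likely successful day cases from likely failures.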

    Adding 6 months of androgen deprivation therapy to postoperative radiotherapy for prostate cancer: a comparison of short-course versus no androgen deprivation therapy in the RADICALS-HD randomised controlled trial

    Background Previous evidence indicates that adjuvant, short-course androgen deprivation therapy (ADT) improves metastasis-free survival when given with primary radiotherapy for intermediate-risk and high-risk localised prostate cancer. However, the value of ADT with postoperative radiotherapy after radical prostatectomy is unclear. Methods RADICALS-HD was an international randomised controlled trial to test the efficacy of ADT used in combination with postoperative radiotherapy for prostate cancer. Key eligibility criteria were indication for radiotherapy after radical prostatectomy for prostate cancer, prostate-specific antigen less than 5 ng/mL, absence of metastatic disease, and written consent. Participants were randomly assigned (1:1) to radiotherapy alone (no ADT) or radiotherapy with 6 months of ADT (short-course ADT), using monthly subcutaneous gonadotropin-releasing hormone analogue injections, daily oral bicalutamide monotherapy 150 mg, or monthly subcutaneous degarelix. Randomisation was done centrally through minimisation with a random element, stratified by Gleason score, positive margins, radiotherapy timing, planned radiotherapy schedule, and planned type of ADT, in a computerised system. The allocated treatment was not masked. The primary outcome measure was metastasis-free survival, defined as distant metastasis arising from prostate cancer or death from any cause. Standard survival analysis methods were used, accounting for randomisation stratification factors. The trial had 80% power with two-sided α of 5% to detect an absolute increase in 10-year metastasis-free survival from 80% to 86% (hazard ratio [HR] 0·67). Analyses followed the intention-to-treat principle. The trial is registered with the ISRCTN registry, ISRCTN40814031, and ClinicalTrials.gov, NCT00541047. Findings Between Nov 22, 2007, and June 29, 2015, 1480 patients (median age 66 years [IQR 61–69]) were randomly assigned to receive no ADT (n=737) or short-course ADT (n=743) in addition to postoperative radiotherapy at 121 centres in Canada, Denmark, Ireland, and the UK. With a median follow-up of 9·0 years (IQR 7·1–10·1), metastasis-free survival events were reported for 268 participants (142 in the no ADT group and 126 in the short-course ADT group; HR 0·886 [95% CI 0·688–1·140], p=0·35). 10-year metastasis-free survival was 79·2% (95% CI 75·4–82·5) in the no ADT group and 80·4% (76·6–83·6) in the short-course ADT group. Toxicity of grade 3 or higher was reported for 121 (17%) of 737 participants in the no ADT group and 100 (14%) of 743 in the short-course ADT group (p=0·15), with no treatment-related deaths. Interpretation Metastatic disease is uncommon following postoperative bed radiotherapy after radical prostatectomy. Adding 6 months of ADT to this radiotherapy did not improve metastasis-free survival compared with no ADT. These findings do not support the use of short-course ADT with postoperative radiotherapy in this patient population.
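
    The design hazard ratio quoted above can be sanity-checked from the stated survival targets using the standard proportional-hazards identity S1(t) = S0(t)^HR, a textbook relation rather than anything spelled out in the report:

        \mathrm{HR} = \frac{\ln S_1(t)}{\ln S_0(t)} = \frac{\ln 0.86}{\ln 0.80} \approx \frac{-0.151}{-0.223} \approx 0.68

    which is consistent with the planning value of 0·67 once rounding in the design assumptions is allowed for.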

    Diurnal Variation of the Cough Response to Inhaled Citric Acid

    No full text available