19 research outputs found

    Fire ecology and fire management in the Southern Appalachians: rationale for and effects of prescribed fire

    Fire suppression in the Southern Appalachians has led to changes in forests dominated by yellow pine (Pinus subgenus Pinus) and oak (Quercus) species. Recently, management agencies have begun to prescribe fire with the aim of restoring pre-suppression conditions. Here, I examine the use of prescribed fire in the Southern Appalachians from two perspectives. First, I review the values and goals that underlie fire management, how they apply in the Southern Appalachians, and what their implications are for fire management planning. Second, I use long-term monitoring data to examine how prescribed fire affects forest structure and composition in the Great Smoky Mountains National Park and how these effects vary with environment and fire severity. I find that prescribed fire creates conditions conducive to pine reproduction and is particularly effective at high severity and at lower-elevation sites where fire-sensitive species are still confined to smaller size classes. Master of Science

    Land-use dynamics influence estimates of carbon sequestration potential in tropical second-growth forest

    Many countries have made major commitments to carbon sequestration through reforestation under the Paris Climate Agreement, and recent studies have illustrated the potential for large amounts of carbon sequestration in tropical second-growth forests. However, carbon gains in second-growth forests are threatened by non-permanence, i.e. release of carbon into the atmosphere from clearing or disturbance. The benefits of second-growth forests require long-term persistence on the landscape, but estimates of carbon potential rarely consider the spatio-temporal landscape dynamics of second-growth forests. In this study, we used remotely sensed imagery from a landscape in the Peruvian Amazon to examine patterns of second-growth forest regrowth and permanence over 28 years (1985–2013). By 2013, 44% of all forest cover in the study area was second growth and more than 50% of second-growth forest pixels were less than 5 years old. We modeled probabilities of forest regrowth and clearing as a function of landscape factors. The amount of neighboring forest and variables related to pixel position (i.e. distance to edge) were important for predicting both clearing and regrowth. Forest age was the strongest predictor of clearing probability, and the relationship suggests a threshold response of clearing to age. Finally, we simulated future trajectories of carbon sequestration using the parameters from our models and compared them with the amount of biomass that would accumulate under the assumption of second-growth permanence. Estimates differed by 900 000 tonnes, equivalent to over 80% of Peru's commitment to carbon sequestration through 'community reforestation' under the Paris Agreement. Though the study area has more than 40 000 hectares of second-growth forest, only a small proportion is likely to accumulate significant carbon. Instead, cycles between forest and non-forest are common. Our results illustrate the importance of considering landscape dynamics when assessing the carbon sequestration potential of second-growth forests.
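
    To make the permanence point concrete, the sketch below simulates pixel-level clearing and regrowth with a hypothetical age-dependent clearing probability and a hypothetical regrowth curve, then compares accumulated biomass against the same pixels under an assumed-permanence scenario. It is illustrative only; it is not the authors' model, and every parameter value is made up.

```python
# Illustrative sketch (not the authors' model): a minimal pixel-level
# simulation contrasting carbon accumulation under clearing dynamics
# versus an assumption of second-growth permanence.
# All parameter values below are hypothetical.
import numpy as np

rng = np.random.default_rng(42)
n_pixels, n_years = 10_000, 28

def p_clear(age):
    # Hypothetical threshold-like decline of clearing probability with stand age.
    return np.where(age < 5, 0.30, 0.05)

def biomass(age):
    # Hypothetical saturating regrowth curve (Mg C per pixel).
    return 120 * (1 - np.exp(-0.08 * age))

age_dynamic = np.zeros(n_pixels)    # stand ages when clearing can occur
age_permanent = np.zeros(n_pixels)  # stand ages assuming permanence

for _ in range(n_years):
    cleared = rng.random(n_pixels) < p_clear(age_dynamic)
    age_dynamic = np.where(cleared, 0.0, age_dynamic + 1)
    age_permanent += 1

gap = biomass(age_permanent).sum() - biomass(age_dynamic).sum()
print(f"Carbon overestimate under the permanence assumption: {gap:,.0f} Mg C")
```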

    GENDER AND SOCIAL MOVEMENTS


    Risk of COVID-19 after natural infection or vaccination

    Background: While vaccines have established utility against COVID-19, phase 3 efficacy studies have generally not comprehensively evaluated protection provided by previous infection or hybrid immunity (previous infection plus vaccination). Individual patient data from US government-supported harmonized vaccine trials provide an unprecedented sample population to address this issue. We characterized the protective efficacy of previous SARS-CoV-2 infection and hybrid immunity against COVID-19 early in the pandemic over three- to six-month follow-up and compared it with vaccine-associated protection. Methods: In this post-hoc cross-protocol analysis of the Moderna, AstraZeneca, Janssen, and Novavax COVID-19 vaccine clinical trials, we allocated participants into four groups based on previous-infection status at enrolment and treatment: no previous infection/placebo; previous infection/placebo; no previous infection/vaccine; and previous infection/vaccine. The main outcome was RT-PCR-confirmed COVID-19 >7–15 days (per original protocols) after final study injection. We calculated crude and adjusted efficacy measures. Findings: Previous infection/placebo participants had a 92% decreased risk of future COVID-19 compared to no previous infection/placebo participants (overall hazard ratio [HR]: 0.08; 95% CI: 0.05–0.13). Among single-dose Janssen participants, hybrid immunity conferred greater protection than vaccine alone (HR: 0.03; 95% CI: 0.01–0.10). Too few infections were observed to draw statistical inferences comparing hybrid immunity to vaccine alone for the other trials. Vaccination, previous infection, and hybrid immunity all provided near-complete protection against severe disease. Interpretation: Previous infection, any hybrid immunity, and two-dose vaccination all provided substantial protection against symptomatic and severe COVID-19 through the early Delta period. Thus, as a surrogate for natural infection, vaccination remains the safest approach to protection. Funding: National Institutes of Health.
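
    As a hedged illustration of the kind of comparison reported above, the sketch below fits a Cox proportional hazards model to synthetic data to estimate a hazard ratio for previous infection versus no previous infection among placebo recipients. It is not the trial analysis; the data, event rate, effect size, and follow-up window are hypothetical.

```python
# Illustrative sketch only: estimating a hazard ratio for previous infection
# with a Cox model on synthetic (hypothetical) data, not trial data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 5000
prev_infection = rng.integers(0, 2, n)   # 1 = previously infected at enrolment
baseline_rate = 0.004                    # hypothetical daily event rate, uninfected
hr_true = 0.08                           # hypothetical protective effect
rate = baseline_rate * np.where(prev_infection == 1, hr_true, 1.0)
time_to_event = rng.exponential(1 / rate)
follow_up = 180                          # ~6-month follow-up window (hypothetical)

df = pd.DataFrame({
    "prev_infection": prev_infection,
    "duration": np.minimum(time_to_event, follow_up),
    "event": (time_to_event <= follow_up).astype(int),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="event")
cph.print_summary()  # exp(coef) for prev_infection is the estimated hazard ratio
```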

    A Bayesian reanalysis of the Standard versus Accelerated Initiation of Renal-Replacement Therapy in Acute Kidney Injury (STARRT-AKI) trial

    Background: Timing of initiation of kidney-replacement therapy (KRT) in critically ill patients remains controversial. The Standard versus Accelerated Initiation of Renal-Replacement Therapy in Acute Kidney Injury (STARRT-AKI) trial compared two strategies of KRT initiation (accelerated versus standard) in critically ill patients with acute kidney injury and found neutral results for 90-day all-cause mortality. Probabilistic exploration of the trial endpoints may enable greater understanding of the trial findings. We aimed to perform a reanalysis using a Bayesian framework. Methods: We performed a secondary analysis of all 2927 patients randomized in the multinational STARRT-AKI trial, conducted at 168 centers in 15 countries. The primary endpoint, 90-day all-cause mortality, was evaluated using hierarchical Bayesian logistic regression. A spectrum of priors was used, including optimistic, neutral, and pessimistic priors, along with priors informed by earlier clinical trials. Secondary endpoints (KRT-free days and hospital-free days) were assessed using zero–one inflated beta regression. Results: The posterior probability of benefit comparing an accelerated versus a standard KRT initiation strategy for the primary endpoint suggested no important difference, regardless of the prior used (absolute difference of 0.13% [95% credible interval (CrI) −3.30% to 3.40%], −0.39% [95% CrI −3.46% to 3.00%], and 0.64% [95% CrI −2.53% to 3.88%] for neutral, optimistic, and pessimistic priors, respectively). There was a very low probability that the effect size was equal to or larger than a consensus-defined minimal clinically important difference. Patients allocated to the accelerated strategy had fewer KRT-free days (median absolute difference of −3.55 days [95% CrI −6.38 to −0.48]), and the probability that the accelerated strategy was associated with more KRT-free days was 0.008. Hospital-free days were similar between strategies: the accelerated strategy had a median absolute difference of 0.48 more hospital-free days (95% CrI −1.87 to 2.72) compared with the standard strategy, and the probability that the accelerated strategy had more hospital-free days was 0.66. Conclusions: In a Bayesian reanalysis of the STARRT-AKI trial, we found a very low probability that an accelerated strategy has clinically important benefits compared with the standard strategy. Patients receiving the accelerated strategy probably have fewer days alive and KRT-free. These findings do not support the adoption of an accelerated strategy of KRT initiation.
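
    The sketch below illustrates, in a simplified (non-hierarchical) form, the core idea of re-estimating a treatment effect under neutral, optimistic, and pessimistic priors with Bayesian logistic regression. It is not the STARRT-AKI analysis code; the simulated data and the prior means and scales are hypothetical placeholders.

```python
# Minimal sketch of a Bayesian reanalysis under different priors
# (hypothetical data and priors; not the study code).
import numpy as np
import pymc as pm

rng = np.random.default_rng(1)
n = 2927
accelerated = rng.integers(0, 2, n)      # hypothetical treatment allocation
died = rng.binomial(1, 0.44, n)          # hypothetical ~44% 90-day mortality

priors = {                               # hypothetical prior means/scales on log-OR
    "neutral":     {"mu": 0.0,  "sigma": 0.35},
    "optimistic":  {"mu": -0.2, "sigma": 0.35},
    "pessimistic": {"mu": 0.2,  "sigma": 0.35},
}

for name, p in priors.items():
    with pm.Model():
        intercept = pm.Normal("intercept", 0.0, 2.0)
        log_or = pm.Normal("log_or", p["mu"], p["sigma"])  # treatment-effect prior
        theta = pm.math.invlogit(intercept + log_or * accelerated)
        pm.Bernoulli("died", p=theta, observed=died)
        idata = pm.sample(1000, tune=1000, chains=2, progressbar=False)
    prob_benefit = (idata.posterior["log_or"].values < 0).mean()
    print(f"{name}: P(accelerated strategy reduces mortality) = {prob_benefit:.2f}")
```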

    Initiation of continuous renal replacement therapy versus intermittent hemodialysis in critically ill patients with severe acute kidney injury: a secondary analysis of STARRT-AKI trial

    Background: There is controversy regarding the optimal renal-replacement therapy (RRT) modality for critically ill patients with acute kidney injury (AKI). Methods: We conducted a secondary analysis of the Standard versus Accelerated Initiation of Renal-Replacement Therapy in Acute Kidney Injury (STARRT-AKI) trial to compare outcomes among patients who initiated RRT with either continuous renal replacement therapy (CRRT) or intermittent hemodialysis (IHD). We generated a propensity score for the likelihood of receiving CRRT and used inverse probability of treatment weighting with overlap weighting to address baseline inter-group differences. The primary outcome was a composite of death or RRT dependence at 90 days after randomization. Results: We identified 1590 trial participants who initially received CRRT and 606 who initially received IHD. The composite outcome of death or RRT dependence at 90 days occurred in 823 (51.8%) patients who commenced CRRT and 329 (54.3%) patients who commenced IHD (unadjusted odds ratio [OR] 0.90; 95% confidence interval [CI] 0.75–1.09). After balancing baseline characteristics with overlap weighting, initial receipt of CRRT was associated with a lower risk of death or RRT dependence at 90 days compared with initial receipt of IHD (OR 0.81; 95% CI 0.66–0.99). This association was predominantly driven by a lower risk of RRT dependence at 90 days (OR 0.61; 95% CI 0.39–0.94). Conclusions: In critically ill patients with severe AKI, initiation of CRRT, as compared to IHD, was associated with a significant reduction in the composite outcome of death or RRT dependence at 90 days.
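
    For readers unfamiliar with overlap weighting, the sketch below shows one common construction: fit a propensity model for treatment, weight treated patients by one minus the propensity score and comparator patients by the propensity score, and fit a weighted outcome model. It is a generic illustration on hypothetical data, not the study's code; all variable names are made up.

```python
# Illustrative sketch of propensity-score overlap weighting
# (hypothetical data; not the study analysis).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 2196
severity = rng.normal(0, 1, n)                        # hypothetical confounder
crrt = rng.binomial(1, 1 / (1 + np.exp(-severity)), n)
outcome = rng.binomial(1, 1 / (1 + np.exp(-(0.3 * severity - 0.2 * crrt))), n)

# 1. Propensity score: probability of initial CRRT given baseline covariates.
X_ps = sm.add_constant(pd.DataFrame({"severity": severity}))
ps = sm.Logit(crrt, X_ps).fit(disp=0).predict(X_ps)

# 2. Overlap weights: treated weighted by (1 - ps), comparators by ps.
w = np.where(crrt == 1, 1 - ps, ps)

# 3. Weighted outcome model for the composite endpoint at 90 days.
X_out = sm.add_constant(pd.DataFrame({"crrt": crrt, "severity": severity}))
fit = sm.GLM(outcome, X_out, family=sm.families.Binomial(), freq_weights=w).fit()
print("Weighted OR for CRRT vs IHD:", np.exp(fit.params["crrt"]))
```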