Earth, wind, water, fire: Interactions between land-use and natural disturbance in tropical second-growth forest landscapes
Climate models predict changes to the frequency and intensity of extreme events, with large effects on tropical forests likely. Predicting these impacts requires understanding how landscape configuration and land-use change influence the susceptibility of forests to disturbances such as wind, drought, and fire. This is important because most tropical forests are regenerating from anthropogenic disturbance and are located in landscape mosaics of forest, agriculture, and other land uses. This dissertation consists of four chapters that combine remote sensing and field data to examine causes and consequences of disturbance and land-use change in tropical second-growth forests. In Chapter 1, I use satellite data to identify factors associated with the permanence of second-growth forest and assess how estimates of carbon sequestration vary under different assumptions about second-growth forest permanence. I show that most second-growth forest is cleared when young, limiting carbon sequestration. In Chapter 2, I combine data from weather stations, remote sensing, and landowner surveys to model fire activity on 732 farms in the study area over ten years. The relative importance of the drivers of fire activity differs across scales and with the metric of fire activity considered, so implications for fire prevention and mitigation depend on which metric is used. Chapter 3 combines Landsat imagery and field data to map wind damage from a severe convective storm, providing strong empirical evidence that vulnerability to wind disturbance is elevated in tropical forest fragments. Finally, in Chapter 4 I integrate annual forest census data with LiDAR-derived topography metrics and tree functional traits in a hierarchical Bayesian modeling framework to explore how drought, topography, and neighborhood crowding affect tree growth, and how functional traits modulate those effects. Together, these studies demonstrate innovative approaches to understanding spatial variation in forest vulnerability to disturbance at multiple scales and have implications for managing forests in a changing climate.
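To make the Chapter 4 approach concrete, the sketch below shows one way a trait-modulated hierarchical Bayesian growth model of this kind could be written. It is not the dissertation's actual model: the simulated data and all variable names (log_growth, drought, topo, crowd, trait) are hypothetical placeholders.

```python
# Hedged sketch (not the dissertation's actual model): a hierarchical Bayesian
# growth model in which species-level slopes for drought, topography, and
# crowding are partly determined by a functional trait.
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n_obs, n_sp = 500, 20
sp = rng.integers(0, n_sp, n_obs)              # species index for each stem
trait = rng.normal(size=n_sp)                  # one standardized trait per species
drought, topo, crowd = rng.normal(size=(3, n_obs))
log_growth = rng.normal(size=n_obs)            # placeholder response

with pm.Model() as growth_model:
    # Trait-dependent species slopes: slope_s = g0 + g1 * trait_s + species noise
    g0 = pm.Normal("g0", 0, 1, shape=3)
    g1 = pm.Normal("g1", 0, 1, shape=3)
    sd_sp = pm.HalfNormal("sd_sp", 1, shape=3)
    slopes = pm.Normal("slopes", mu=g0 + g1 * trait[:, None],
                       sigma=sd_sp, shape=(n_sp, 3))

    intercept = pm.Normal("intercept", 0, 1)
    mu = (intercept
          + slopes[sp, 0] * drought
          + slopes[sp, 1] * topo
          + slopes[sp, 2] * crowd)
    sigma = pm.HalfNormal("sigma", 1)
    pm.Normal("obs", mu=mu, sigma=sigma, observed=log_growth)
    idata = pm.sample(1000, tune=1000, target_accept=0.9)
```

In this structure, the trait coefficients (g1) capture how a species' trait value shifts its sensitivity to each driver, which is the kind of trait modulation the chapter describes.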
Fire ecology and fire management in the Southern Appalachians: rationale for and effects of prescribed fire
Fire suppression in the Southern Appalachians has led to changes in forests dominated by yellow pine (Pinus subgenus Pinus) and oak (Quercus) species. Recently, management agencies have begun to prescribe fire with the aim of restoring pre-suppression conditions. Here, I examine the use of prescribed fire in the Southern Appalachians from two perspectives. First, I review the values and goals that underlie fire management, how they apply in the Southern Appalachians, and what their implications are for fire management planning. Second, I use long-term monitoring data to examine how prescribed fire affects forest structure and composition in Great Smoky Mountains National Park and how these effects vary with environment and fire severity. I find that prescribed fire creates conditions conducive to pine reproduction and is particularly effective at high severity and at lower-elevation sites where fire-sensitive species are still confined to smaller size classes.
Land-use dynamics influence estimates of carbon sequestration potential in tropical second-growth forest
Many countries have made major commitments to carbon sequestration through reforestation under the Paris Climate Agreement, and recent studies have illustrated the potential for large amounts of carbon sequestration in tropical second-growth forests. However, carbon gains in second-growth forests are threatened by non-permanence, i.e. release of carbon into the atmosphere through clearing or disturbance. The benefits of second-growth forests require long-term persistence on the landscape, yet estimates of carbon potential rarely consider the spatio-temporal landscape dynamics of second-growth forests. In this study, we used remotely sensed imagery from a landscape in the Peruvian Amazon to examine patterns of second-growth forest regrowth and permanence over 28 years (1985–2013). By 2013, 44% of all forest cover in the study area was second growth, and more than 50% of second-growth forest pixels were less than 5 years old. We modeled probabilities of forest regrowth and clearing as a function of landscape factors. The amount of neighboring forest and variables related to pixel position (i.e. distance to edge) were important for predicting both clearing and regrowth. Forest age was the strongest predictor of clearing probability, with results suggesting a threshold response of clearing to forest age. Finally, we simulated future trajectories of carbon sequestration using the parameters from our models and compared them with the amount of biomass that would accumulate under the assumption of second-growth permanence. Estimates differed by 900,000 tonnes, equivalent to over 80% of Peru's commitment to carbon sequestration through 'community reforestation' under the Paris Agreement. Although the study area has more than 40,000 hectares of second-growth forest, only a small proportion is likely to accumulate significant carbon; instead, cycles between forest and non-forest are common. Our results illustrate the importance of considering landscape dynamics when assessing the carbon sequestration potential of second-growth forests.
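The general workflow described here, fitting clearing probabilities from landscape covariates and then simulating carbon trajectories with and without a permanence assumption, can be sketched as follows. The covariates, coefficients, and biomass accumulation rate are illustrative placeholders, not the paper's fitted values or data.

```python
# Hedged sketch of the general approach (not the paper's fitted model):
# fit an annual clearing probability from pixel covariates, then simulate
# forest age trajectories and compare accumulated biomass against a
# "permanence" scenario in which second growth is never re-cleared.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 10_000
age = rng.integers(1, 28, n)                    # second-growth age (yr), hypothetical
neighbor_forest = rng.uniform(0, 1, n)          # neighboring forest fraction
dist_edge = rng.exponential(100, n)             # distance to edge (m)
# Hypothetical labels: 1 = pixel cleared in the following year
cleared = rng.binomial(1, 1 / (1 + np.exp(0.15 * age + 2 * neighbor_forest - 1)))

X = np.column_stack([age, neighbor_forest, dist_edge])
clearing_model = LogisticRegression(max_iter=1000).fit(X, cleared)

def simulate(years, p_clear, biomass_rate=5.0):
    """Expected biomass (Mg/ha) for one pixel under an annual clearing hazard."""
    age, biomass = 0, 0.0
    for _ in range(years):
        if rng.random() < p_clear(age):
            age, biomass = 0, 0.0              # cleared: accumulated carbon lost
        else:
            age += 1
            biomass += biomass_rate            # simple linear accumulation
    return biomass

p_clear = lambda a: clearing_model.predict_proba([[a, 0.5, 100.0]])[0, 1]
with_dynamics = np.mean([simulate(30, p_clear) for _ in range(200)])
with_permanence = simulate(30, lambda a: 0.0)   # permanence assumption: never cleared
print(with_dynamics, with_permanence)
```

The gap between the two printed values is the kind of discrepancy the study quantifies when comparing dynamic landscapes against a permanence assumption.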
Ephemeral forest regeneration limits carbon sequestration potential in the Brazilian Atlantic Forest
Although deforestation remains widespread in the tropics, many places are now experiencing significant forest recovery (i.e., forest transition), offering an optimistic outlook for natural ecosystem recovery and carbon sequestration. Naturally regenerated forests, however, may not persist, so a more nuanced understanding of the drivers of forest change in the tropics is critical to ensure the success of reforestation efforts and carbon sequestration targets. Here we use 35 years of detailed land cover data to investigate forest trajectories in 3014 municipalities in the Brazilian Atlantic Forest (AF), a biodiversity and conservation hotspot. Although deforestation was evident in some regions, deforestation reversals, the typical forest transition trajectory, were the prevalent trend in the AF, accounting for 38% of municipalities. However, simultaneous reforestation reversals in the region (13% of municipalities) suggest that these short-term increases in native forest cover do not necessarily translate into persistent trends. In the absence of reversals in reforestation, forests in the region could have sequestered 1.75 Pg C, over three times the actual estimated carbon sequestration (0.52 Pg C). We also showed that failure to distinguish native and planted forests would have masked native forest cover loss in the region and overestimated reforestation by 3.2 Mha and carbon sequestration from natural forest regeneration by 0.37 Pg C. Deforestation reversals were prevalent in urbanized municipalities with limited forest cover and high agricultural productivity, highlighting the importance of favorable socioeconomic conditions in promoting reforestation. Successful forest restoration efforts will require development and enforcement of environmental policies that promote forest regeneration and ensure the permanence of regrowing forests. This is crucial not only for the fate and conservation of the AF, but also for other tropical nations to achieve their restoration and carbon sequestration commitments.
Data from: Fragmentation increases wind disturbance impacts on forest structure and carbon stocks in a Western Amazonian landscape
This file contains the raw, non-spatial data used for the analysis of fragmentation effects on wind damage in Schwartz et al. 2017. For spatially explicit data, please contact the author. Variable names are as follows: PID = ID of the patch in which the pixel is located; dam = pixel wind damage (deltaNPV); dist.to.edge = pixel distance to edge; secondary = binary indicator for secondary forest (0 = old growth, 1 = secondary); AREA = patch size; SHAPE = shape index (i.e. edginess) from FRAGSTATS; PROX = proximity index from FRAGSTATS multiplied by −1 (i.e. isolation).
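A minimal sketch of how the described table might be loaded and checked; the file name is hypothetical, while the column names follow the variable list above.

```python
# Hedged sketch for loading and inspecting the described table; the file name
# is a placeholder, but column names follow the data description above.
import pandas as pd

df = pd.read_csv("schwartz_2017_wind_damage.csv")   # hypothetical file name

# Summarize the documented variables
print(df[["PID", "dam", "dist.to.edge", "secondary", "AREA", "SHAPE", "PROX"]].describe())

# Example questions from the study: does damage (deltaNPV) change with distance
# to edge and patch metrics, and does it differ between secondary and old growth?
print(df.groupby("secondary")["dam"].mean())
print(df[["dam", "dist.to.edge", "AREA", "SHAPE", "PROX"]].corr()["dam"])
```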
Risk of COVID-19 after natural infection or vaccination
Background: While vaccines have established utility against COVID-19, phase 3 efficacy studies have generally not comprehensively evaluated protection provided by previous infection or hybrid immunity (previous infection plus vaccination). Individual patient data from US government-supported harmonized vaccine trials provide an unprecedented sample population to address this issue. We characterized the protective efficacy of previous SARS-CoV-2 infection and hybrid immunity against COVID-19 early in the pandemic over three- to six-month follow-up and compared it with vaccine-associated protection. Methods: In this post-hoc cross-protocol analysis of the Moderna, AstraZeneca, Janssen, and Novavax COVID-19 vaccine clinical trials, we allocated participants into four groups based on previous-infection status at enrolment and treatment: no previous infection/placebo; previous infection/placebo; no previous infection/vaccine; and previous infection/vaccine. The main outcome was RT-PCR-confirmed COVID-19 >7–15 days (per original protocols) after final study injection. We calculated crude and adjusted efficacy measures. Findings: Previous infection/placebo participants had a 92% decreased risk of future COVID-19 compared to no previous infection/placebo participants (overall hazard ratio [HR] ratio: 0.08; 95% CI: 0.05–0.13). Among single-dose Janssen participants, hybrid immunity conferred greater protection than vaccine alone (HR: 0.03; 95% CI: 0.01–0.10). Too few infections were observed to draw statistical inferences comparing hybrid immunity to vaccine alone for other trials. Vaccination, previous infection, and hybrid immunity all provided near-complete protection against severe disease. Interpretation: Previous infection, any hybrid immunity, and two-dose vaccination all provided substantial protection against symptomatic and severe COVID-19 through the early Delta period. Thus, as a surrogate for natural infection, vaccination remains the safest approach to protection. Funding: National Institutes of Health
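The hazard ratios reported here come from survival comparisons between the enrolment groups. The sketch below illustrates that style of analysis on synthetic data with a hypothetical prev_infection covariate; it is not the trial datasets or the study's adjusted models.

```python
# Hedged sketch of the kind of comparison behind the reported hazard ratios
# (synthetic data only): a Cox model contrasting previous-infection/placebo
# participants with no-previous-infection/placebo participants.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 2000
prev_infection = rng.binomial(1, 0.1, n)
# Synthetic follow-up: lower event hazard for previously infected participants
hazard = np.where(prev_infection == 1, 0.01, 0.12)
event_time = rng.exponential(1 / hazard)
follow_up = np.minimum(event_time, 180)          # censor at ~6 months
event = (event_time <= 180).astype(int)

df = pd.DataFrame({"prev_infection": prev_infection,
                   "duration": follow_up,
                   "event": event})

cph = CoxPHFitter().fit(df, duration_col="duration", event_col="event")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```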
A Bayesian reanalysis of the Standard versus Accelerated Initiation of Renal-Replacement Therapy in Acute Kidney Injury (STARRT-AKI) trial
Background
Timing of initiation of kidney-replacement therapy (KRT) in critically ill patients remains controversial. The Standard versus Accelerated Initiation of Renal-Replacement Therapy in Acute Kidney Injury (STARRT-AKI) trial compared two strategies of KRT initiation (accelerated versus standard) in critically ill patients with acute kidney injury and found neutral results for 90-day all-cause mortality. Probabilistic exploration of the trial endpoints may enable greater understanding of the trial findings. We aimed to perform a reanalysis using a Bayesian framework.
Methods
We performed a secondary analysis of all 2927 patients randomized in the multinational STARRT-AKI trial, conducted at 168 centers in 15 countries. The primary endpoint, 90-day all-cause mortality, was evaluated using hierarchical Bayesian logistic regression. A spectrum of priors was used, including optimistic, neutral, and pessimistic priors, along with priors informed by earlier clinical trials. Secondary endpoints (KRT-free days and hospital-free days) were assessed using zero–one inflated beta regression.
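As a rough illustration of the prior-sensitivity approach described above, the sketch below fits a simplified (non-hierarchical) Bayesian logistic regression for 90-day mortality under alternative priors on the treatment effect. The data are synthetic and the prior means and scales are placeholders, not the values used in the reanalysis.

```python
# Hedged sketch (synthetic data, simplified model): a Bayesian logistic
# regression for 90-day mortality with alternative priors on the treatment
# effect, illustrating the prior-sensitivity approach described above.
import numpy as np
import pymc as pm

rng = np.random.default_rng(3)
n = 2927
accelerated = rng.binomial(1, 0.5, n)
died = rng.binomial(1, 0.44, n)                 # synthetic outcomes, no true effect

# Priors on the log-odds ratio: neutral, optimistic (favoring accelerated),
# and pessimistic (favoring standard); values are illustrative only.
priors = {"neutral": (0.0, 0.35), "optimistic": (-0.2, 0.35), "pessimistic": (0.2, 0.35)}

for name, (mu0, sd0) in priors.items():
    with pm.Model():
        intercept = pm.Normal("intercept", 0, 1.5)
        log_or = pm.Normal("log_or", mu0, sd0)
        p = pm.math.invlogit(intercept + log_or * accelerated)
        pm.Bernoulli("obs", p=p, observed=died)
        idata = pm.sample(1000, tune=1000, progressbar=False)
    post = idata.posterior["log_or"].values.ravel()
    print(name, "P(OR < 1) =", np.mean(post < 0))
```

Reporting the posterior probability of benefit under each prior, as in the loop above, is the core idea behind concluding that the result is robust regardless of the prior used.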
Results
The posterior probability of benefit comparing an accelerated versus a standard KRT initiation strategy for the primary endpoint suggested no important difference, regardless of the prior used (absolute difference of 0.13% [95% credible interval (CrI) −3.30% to 3.40%], −0.39% [95% CrI −3.46% to 3.00%], and 0.64% [95% CrI −2.53% to 3.88%] for neutral, optimistic, and pessimistic priors, respectively). There was a very low probability that the effect size was equal to or larger than a consensus-defined minimal clinically important difference. Patients allocated to the accelerated strategy had fewer KRT-free days (median absolute difference of −3.55 days [95% CrI −6.38 to −0.48]), with a probability of only 0.008 that the accelerated strategy was associated with more KRT-free days. Hospital-free days were similar between strategies: the accelerated strategy had a median of 0.48 more hospital-free days (95% CrI −1.87 to 2.72) than the standard strategy, and the probability that the accelerated strategy had more hospital-free days was 0.66.
Conclusions
In a Bayesian reanalysis of the STARRT-AKI trial, we found a very low probability that an accelerated strategy has clinically important benefits compared with the standard strategy. Patients receiving the accelerated strategy probably have fewer days alive and KRT-free. These findings do not support the adoption of an accelerated strategy of KRT initiation.
Initiation of continuous renal replacement therapy versus intermittent hemodialysis in critically ill patients with severe acute kidney injury: a secondary analysis of the STARRT-AKI trial
Background: There is controversy regarding the optimal renal-replacement therapy (RRT) modality for critically ill patients with acute kidney injury (AKI). Methods: We conducted a secondary analysis of the STandard versus Accelerated initiation of Renal Replacement Therapy in Acute Kidney Injury (STARRT-AKI) trial to compare outcomes among patients who initiated RRT with either continuous renal replacement therapy (CRRT) or intermittent hemodialysis (IHD). We generated a propensity score for the likelihood of receiving CRRT and used inverse probability of treatment weighting with overlap weights to address baseline inter-group differences. The primary outcome was a composite of death or RRT dependence at 90 days after randomization. Results: We identified 1590 trial participants who initially received CRRT and 606 who initially received IHD. The composite outcome of death or RRT dependence at 90 days occurred in 823 (51.8%) patients who commenced CRRT and 329 (54.3%) patients who commenced IHD (unadjusted odds ratio [OR] 0.90; 95% confidence interval [CI] 0.75–1.09). After balancing baseline characteristics with overlap weighting, initial receipt of CRRT was associated with a lower risk of death or RRT dependence at 90 days compared with initial receipt of IHD (OR 0.81; 95% CI 0.66–0.99). This association was predominantly driven by a lower risk of RRT dependence at 90 days (OR 0.61; 95% CI 0.39–0.94). Conclusions: In critically ill patients with severe AKI, initiation of CRRT, as compared with IHD, was associated with a significant reduction in the composite outcome of death or RRT dependence at 90 days.
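The overlap-weighting step described in the Methods can be sketched as follows; the covariates, outcome, and effect estimate are synthetic placeholders, and the snippet shows only the core weighting logic, not the trial's full adjustment set or variance estimation.

```python
# Hedged sketch (synthetic data) of propensity-score overlap weighting as
# described above: treated units are weighted by 1 - PS and controls by PS,
# then a weighted comparison of the outcome is made between modalities.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n = 2196
severity, vasopressors = rng.normal(size=(2, n))
# Treatment assignment (1 = CRRT, 0 = IHD) depends on baseline covariates
crrt = rng.binomial(1, 1 / (1 + np.exp(-(0.8 * severity + 0.5 * vasopressors))))
outcome = rng.binomial(1, 0.5, n)               # death or RRT dependence (synthetic)

X = np.column_stack([severity, vasopressors])
ps = LogisticRegression(max_iter=1000).fit(X, crrt).predict_proba(X)[:, 1]

# Overlap weights emphasize patients who plausibly could have received either modality
w = np.where(crrt == 1, 1 - ps, ps)

risk_crrt = np.average(outcome[crrt == 1], weights=w[crrt == 1])
risk_ihd = np.average(outcome[crrt == 0], weights=w[crrt == 0])
odds = lambda p: p / (1 - p)
print("weighted OR (CRRT vs IHD):", odds(risk_crrt) / odds(risk_ihd))
```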