
    Persistence and quality of vegetation cover in expired Conservation Reserve Program fields

    For nearly 40 years, the Conservation Reserve Program (CRP) has implemented practices to reduce soil erosion, improve water quality, and provide habitat for wildlife and pollinators on highly erodible cropland in the United States. However, a decline of approximately 4 million ha (10 million acres) in enrolled CRP land over the last decade has greatly reduced the program's environmental benefits. We sought to assess the program's enduring benefits in the central and western United States by (1) determining the proportion of fields that persist in CRP cover after contracts expired, (2) identifying the type of agricultural production that CRP fields shift to after contract expiration, (3) comparing the vegetation characteristics of expired CRP fields that persist in CRP-type cover with those of enrolled CRP fields, and (4) identifying differences in management activities (e.g., haying, grazing) between expired and enrolled CRP fields. We conducted edge-of-field vegetation cover surveys in 1092 CRP fields with contracts that expired ≥3 years prior and 1786 currently enrolled CRP fields in 14 states. We found that 41% of expired CRP fields retained at least half of their area in CRP-type cover, with persistence varying significantly among regions, from 19% to 84%. When expired fields retained CRP vegetation, bare ground was low in all regions and grass cover was somewhat greater than in fields with current CRP contracts, but at the expense of forb cover in some regions. Evidence of more frequent management in expired CRP fields may explain differences between active and expired CRP fields. Overall, there is clear evidence that CRP-type cover frequently persists and provides benefits for more than three years after contract expiration. Retaining CRP-type cover post-contract is an under-recognized program benefit that persists across the central and western United States long after the initial retirement from cropland.

    Increasing frailty is associated with higher prevalence and reduced recognition of delirium in older hospitalised inpatients: results of a multi-centre study

    Purpose: Delirium is a neuropsychiatric disorder delineated by an acute change in cognition, attention, and consciousness. It is common, particularly in older adults, but poorly recognised. Frailty is the accumulation of deficits conferring an increased risk of adverse outcomes. We set out to determine how severity of frailty, as measured using the Clinical Frailty Scale (CFS), affected delirium rates and recognition in hospitalised older people in the United Kingdom. Methods: Adults over 65 years were included in an observational multi-centre audit across UK hospitals, comprising two prospective rounds and one retrospective note review. CFS score, delirium status, and 30-day outcomes were recorded. Results: The overall prevalence of delirium was 16.3% (483). Patients with delirium were more frail than patients without delirium (median CFS 6 vs 4). The risk of delirium was greater with increasing frailty [OR 2.9 (1.8–4.6) in CFS 4 vs 1–3; OR 12.4 (6.2–24.5) in CFS 8 vs 1–3]. Higher CFS was associated with reduced recognition of delirium (OR 0.7 (0.3–1.9) in CFS 4 compared to 0.2 (0.1–0.7) in CFS 8). Both associations were independent of age and dementia. Conclusion: We have demonstrated an incremental increase in the risk of delirium with increasing frailty. This has important clinical implications, suggesting that frailty may provide a more nuanced measure of vulnerability to delirium and poor outcomes. However, the most frail patients are least likely to have their delirium diagnosed, and there is a significant lack of research into the underlying pathophysiology of both of these common geriatric syndromes.
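
    The odds ratios above come from models of delirium risk by CFS category, adjusted for age and dementia. As a rough illustration of that kind of analysis, the sketch below fits a logistic regression with statsmodels on simulated data; the variable names, CFS groupings, and all numbers are hypothetical assumptions, not the study's data or code.

        # Minimal sketch (simulated data): adjusted odds ratios for delirium by
        # Clinical Frailty Scale (CFS) category, with CFS 1-3 as the reference.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 1000
        df = pd.DataFrame({
            "age": rng.integers(65, 95, n),
            "dementia": rng.integers(0, 2, n),
            "cfs": rng.integers(1, 9, n),   # CFS scores 1-8
        })
        # Simulate delirium risk rising with frailty (illustrative only).
        lin_pred = -4 + 0.4 * df["cfs"] + 0.02 * (df["age"] - 65) + 0.8 * df["dementia"]
        df["delirium"] = (rng.random(n) < 1 / (1 + np.exp(-lin_pred))).astype(int)

        # Group CFS into categories, with 1-3 as the reference level.
        df["cfs_group"] = pd.cut(df["cfs"], bins=[0, 3, 4, 7, 8],
                                 labels=["1-3", "4", "5-7", "8"])

        model = smf.logit("delirium ~ C(cfs_group) + age + dementia", data=df).fit(disp=0)
        print(np.exp(model.params).round(2))      # exponentiated coefficients (odds ratios per term)
        print(np.exp(model.conf_int()).round(2))  # 95% confidence intervals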

    Cumulative Effects of Fire and Fuels Management on Stream Water Quality and Ecosystem Dynamics

    Prescribed fires and wildland fire use are increasingly important management tools used to reduce fuel loads and restore the ecological integrity of western forests. Although a basic understanding of the effects of fire on aquatic ecosystems exists, the cumulative and possibly synergistic effects of wildfire following prescribed fire are unknown. Wildfires following prescribed fire may produce different burn severities and effects on riparian and stream ecosystems than wildfires in fire-suppressed forests (e.g., fires absent >70 yrs) or prescribed fires alone. The goal of this study was to quantify and compare the effects of wildfire on stream and riparian ecosystems under three fire management practices: (1) wildfire following prescribed fire, (2) wildfire in fire-suppressed forests, and (3) wildfire occurring at historic fire return intervals. We compared 6-7 years (2001-2006/07) of stream and riparian data collected prior to two large wildfire events with 3 years (2008-2010) of similar data collected after wildfire in catchments along the South Fork Salmon River and Big Creek in central Idaho. Here we report our preliminary findings on riparian- and catchment-level burn severity patterns, riparian forest structure, hydrology, amphibians, aquatic macroinvertebrates, periphyton, and instream habitat, including temperature, chemistry, substrate, sedimentation, and large woody debris. We found that prescribed fire treatment prior to wildfire significantly reduced wildfire burn severity in treated catchments relative to untreated catchments. This reduction in burn severity appeared to reduce wildfire effects on stream and riparian ecosystems rather than cause cumulative effects of prescribed fire plus wildfire. Instead, we found that natural inter-annual variability in stream flow and stochastic disturbances, such as debris flows and channel-scouring events, are the dominant drivers of change in stream and riparian habitats in this region, with fire management practices playing a much smaller role.

    Stream Restoration Is Influenced by Details of Engineered Habitats at a Headwater Mine Site

    A lack of information regarding which ecological factors influence restoration success or failure has hindered scientifically based restoration decision-making. We focus on one headwater site to examine factors influencing divergent ecological outcomes of two post-mining stream restoration projects designed to improve instream conditions following 70 years of mining impacts. One project was designed to simulate natural stream conditions by creating a morphologically complex channel with high habitat heterogeneity (HH-reach). A second project was designed to reduce contaminants and sediment using a sand filter along a straight, armored channel, which resulted in different habitat characteristics and comparatively low habitat heterogeneity (LH-reach). Within 2 years of completion, stream habitat parameters and community composition within the HH-reach were similar to those of reference reaches. In contrast, habitat and community composition within the LH-reach differed substantially from reference reaches, even 7–8 years after project completion. We found that an interaction between low gradient and high light availability, created by the LH-reach design, facilitated a chironomid-Nostoc mutualism. These symbionts dominated the epilithic surfaces of rocks, leaving little habitat for tailed frog larvae, bioavailable macroinvertebrates, and fish. After controlling for habitat quantity, potential colonizing species’ traits, and biogeographic factors, we found that habitat characteristics combined to facilitate different ecological outcomes, whereas time since treatment implementation was less influential. We demonstrate that stream communities can respond quickly to restoration of physical characteristics and increased heterogeneity, but “details matter”: interactions among the habitats we create and the species that occupy them can be complex and unpredictable, and they can influence restoration effectiveness.

    Performance of Quantitative Vegetation Sampling Methods Across Gradients of Cover in Great Basin Plant Communities

    Resource managers and scientists need efficient, reliable methods for quantifying vegetation to conduct basic research, evaluate land management actions, and monitor trends in habitat conditions. We examined three methods for quantifying vegetation in 1-ha plots among different plant communities in the northern Great Basin: photography-based grid-point intercept (GPI), line-point intercept (LPI), and point-quarter (PQ). We also evaluated each method for within-plot subsampling adequacy and effort requirements relative to information gain. We found that, for most functional groups, percent cover measurements collected with the LPI, GPI, and PQ methods were strongly correlated. These correlations were even stronger when we used data from the upper canopy only (i.e., the top “hit” of pin flags) in LPI to estimate cover. PQ was best at quantifying cover of sparse plants such as shrubs in early successional habitats. As cover of a given functional group decreased within plots, the variance of the cover estimate increased substantially, requiring more subsamples per plot (i.e., transect lines, quadrats) to achieve reliable precision. For GPI, we found that six to nine quadrats per hectare were sufficient to characterize the vegetation in most of the plant communities sampled. All three methods reasonably characterized the vegetation in our plots, and each has advantages depending on characteristics of the vegetation, such as cover or heterogeneity, study goals, precision of measurements required, and efficiency needed.
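
    The cover estimates compared above are ratio-of-hits calculations: the share of sample points that intercept a given functional group. As a rough illustration, the sketch below computes percent cover from simulated line-point intercept pin records and contrasts upper-canopy (top-hit) with any-hit estimates; the data, group names, and function are hypothetical, not the study's code.

        # Minimal sketch (simulated data): percent cover from line-point intercept
        # (LPI) pin records, using either the top "hit" only or any hit on the pin.
        import numpy as np

        rng = np.random.default_rng(1)
        groups = ["shrub", "perennial_grass", "forb", "annual_grass"]

        # Simulate 100 pins on a transect; each pin records an ordered list of
        # hits, with the first entry representing the upper canopy (top hit).
        pins = []
        for _ in range(100):
            n_hits = rng.integers(1, 4)
            pins.append(list(rng.choice(groups, size=n_hits, replace=False)))

        def percent_cover(pins, group, top_hit_only=False):
            """Share of pins intercepting `group`, optionally upper canopy only."""
            if top_hit_only:
                count = sum(hits[0] == group for hits in pins)
            else:
                count = sum(group in hits for hits in pins)
            return 100 * count / len(pins)

        for g in groups:
            print(g, round(percent_cover(pins, g, top_hit_only=True), 1),
                  round(percent_cover(pins, g, top_hit_only=False), 1))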

    Pilliod_et_al_2013_MER_DryadData_7-31-13

    We handled, stored, and extracted eDNA from 0.45 µm cellulose nitrate filter paper the same way for each experiment, using protocols described in Pilliod et al. (2013). We extracted DNA from half of each filter paper using the QIAshredder/DNeasy Blood & Tissue DNA extraction kit method described in Goldberg et al. (2011). We used the quantitative PCR assay described in Pilliod et al. (2013) to estimate the amount of eDNA in each sample. The assay was run with the QuantiTect Multiplex PCR Mix (Qiagen, Inc., Gaithersburg, Maryland, USA) at the recommended multiplexing concentrations and parameters on an Applied Biosystems 7500 Fast Real-Time PCR System. We used 2 µL of DNA extract in each reaction and ran all reactions in triplicate. Standard curves were constructed from whole genomic DNA extracted from tail tissue and diluted to 0.001, 0.01, and 0.1 ng. The average r² for curves used in this study was 0.99. We considered a test result negative if there was no exponential phase at any point during 50 cycles. See Pilliod et al. (2013) for additional methodological details.
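
    The standard curve described above relates the qPCR cycle threshold (Ct) to log-transformed DNA quantity across the three dilutions, and sample quantities are read back off that line. The sketch below shows that calculation with made-up Ct values; the Ct numbers and the sample value are illustrative assumptions, not values from the study.

        # Minimal sketch (illustrative Ct values): fitting a qPCR standard curve
        # from the three genomic-DNA dilutions (0.001, 0.01, 0.1 ng) and back-
        # calculating the eDNA quantity in a sample from its Ct.
        import numpy as np

        standard_ng = np.array([0.001, 0.01, 0.1])
        standard_ct = np.array([34.1, 30.8, 27.4])   # hypothetical Ct values

        # Standard curve: Ct = slope * log10(quantity) + intercept
        slope, intercept = np.polyfit(np.log10(standard_ng), standard_ct, 1)
        r = np.corrcoef(np.log10(standard_ng), standard_ct)[0, 1]
        efficiency = 10 ** (-1 / slope) - 1          # amplification efficiency

        def estimate_ng(ct):
            """Back-calculate DNA quantity (ng per reaction) from a Ct value."""
            return 10 ** ((ct - intercept) / slope)

        sample_ct = 32.0                             # hypothetical sample
        print(f"slope={slope:.2f}, r^2={r ** 2:.3f}, efficiency={efficiency:.1%}")
        print(f"estimated eDNA: {estimate_ng(sample_ct):.4f} ng per reaction")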

    Data from: Factors influencing detection of eDNA from a stream-dwelling amphibian

    Environmental DNA (eDNA) methods for detecting and estimating abundance of aquatic species are emerging rapidly, but little is known about how processes such as secretion rate, environmental degradation, and time since colonization or extirpation from a given site affect eDNA measurements. Using stream-dwelling salamanders and quantitative PCR (qPCR) analysis, we conducted three experiments to assess eDNA: (1) production rate, (2) persistence time under different temperature and light conditions, and (3) detectability and concentration through time following experimental introduction and removal of salamanders into previously unoccupied streams. We found that 44–50 g individuals held in aquaria produced 77 ng eDNA/hr for two hours, after which production either slowed considerably or began to equilibrate with degradation. eDNA in both full-sun and shaded treatments degraded exponentially to <1% of the original concentration after 3 days. eDNA was no longer detectable in full-sun samples after 8 days, whereas eDNA was detected in 20% of shaded samples after 11 days and 100% of refrigerated control samples after 18 days. When translocated into unoccupied streams, salamanders were detectable after 6 hours, but only when densities were relatively high (0.2481 individuals/m²) and when samples were collected within 5 m of the animals. Concentrations of eDNA detected were very low and increased steadily from 6–24 hours after introduction, reaching 0.0022 ng/L. Within 1 hour of removing salamanders from the stream, eDNA was no longer detectable. These results suggest that eDNA detectability and concentration depend on production rates of individuals, environmental conditions, density of animals, and their residence time.
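
    The reported decline to <1% of starting concentration within 3 days implies a rapid first-order decay rate. The sketch below back-calculates the implied decay constant and half-life under a simple exponential model; the model choice, starting quantity, and detection threshold are assumptions for illustration, not the authors' analysis.

        # Minimal sketch (illustrative): first-order decay constant and half-life
        # implied by eDNA falling to <1% of its starting concentration in 3 days,
        # assuming C(t) = C0 * exp(-k * t).
        import numpy as np

        fraction_remaining = 0.01       # <1% of original concentration
        days = 3.0

        k = -np.log(fraction_remaining) / days   # decay constant (per day), lower bound
        half_life = np.log(2) / k                # days, upper bound

        print(f"decay constant k >= {k:.2f} per day")
        print(f"half-life <= {half_life:.2f} days ({half_life * 24:.1f} hours)")

        # Time until a hypothetical starting quantity falls below a hypothetical
        # detection threshold under the same model:
        c0, threshold = 100.0, 0.01              # arbitrary units
        t_detectable = np.log(c0 / threshold) / k
        print(f"detectable for roughly {t_detectable:.1f} days under these assumptions")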

    Molecular detection of vertebrates in stream water: a demonstration using Rocky Mountain tailed frogs and Idaho giant salamanders

    Stream ecosystems harbor many secretive and imperiled species, and studies of vertebrates in these systems face the challenges of relatively low detection rates and high costs. Environmental DNA (eDNA) has recently been confirmed as a sensitive and efficient tool for documenting aquatic vertebrates in wetlands and in a large river and canal system. However, it was unclear whether this tool could be used to detect low-density vertebrates in fast-moving streams, where shed cells may travel rapidly away from their source. To evaluate the potential utility of eDNA techniques in stream systems, we designed targeted primers to amplify a short, species-specific DNA fragment for two secretive stream amphibian species in the northwestern United States (Rocky Mountain tailed frogs, Ascaphus montanus, and Idaho giant salamanders, Dicamptodon aterrimus). We tested three DNA extraction and five PCR protocols to determine whether we could detect eDNA of these species in filtered water samples from five streams with varying densities of these species in central Idaho, USA. We successfully amplified and sequenced the targeted DNA regions for both species from stream water filter samples. We detected Idaho giant salamanders in all samples and Rocky Mountain tailed frogs in four of five streams, and found some indication that these species are more difficult to detect using eDNA in early spring than in early fall. While the sensitivity of this method across taxa remains to be determined, the use of eDNA could revolutionize surveys for rare and invasive stream species. With this study, the utility of eDNA techniques for detecting aquatic vertebrates has been demonstrated across the majority of freshwater systems, setting the stage for an innovative transformation in approaches for aquatic research.