
    Investigating the Potential for Nitrate-N Removal in Rhode Island Transient Headwater Streams

    Excess nitrogen (N) can have detrimental effects on the environment, particularly in coastal waters, where inputs from septic systems and agricultural runoff can lead to algal blooms and hypoxic zones. However, transient headwater streams, which comprise a significant portion of streams in watersheds, may have the potential to remove N given their low flow rates, high surface-to-volume ratios, long retention times, and hydric soils. We investigated the physical characteristics and N removal capacity of transient headwater streams. Four bromide (Br) and nitrate-N slug tests were conducted in four streams in southern Rhode Island. Streams were sampled repeatedly as the plume traveled 30 m. Nitrogen removal was determined from the change in the Br:N ratio between the start and end of each test. Three of the four slug tests demonstrated substantial N removal (25-65%). Removal occurred towards the end of the slug tests due to hyporheic interactions. Streams with longer retention times demonstrated greater N removal. Transient headwater streams may be important N sinks, and future research should focus on determining the in-stream processes that facilitate N removal.
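
    The removal calculation described above relies on bromide being conservative: any decline in N relative to Br along the reach is attributed to in-stream removal. A minimal sketch of that calculation follows; the function name and the concentrations are illustrative assumptions, not the study's data.

    ```python
    def percent_n_removal(br_start, n_start, br_end, n_end):
        """Percent nitrate-N removed, inferred from the change in the
        N:Br ratio between the upstream and downstream sampling points.
        Bromide is assumed conservative, so the ratio only drops if
        nitrate-N is removed in the reach."""
        ratio_start = n_start / br_start
        ratio_end = n_end / br_end
        return 100.0 * (1.0 - ratio_end / ratio_start)

    # Example with made-up concentrations (mg/L):
    print(percent_n_removal(br_start=10.0, n_start=5.0,
                            br_end=8.0, n_end=2.8))  # -> 30.0 (% removal)
    ```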

    Riparian Zone Nitrogen Management through the Development of the Riparian Ecosystem Management Model (REMM) in a Formerly Glaciated Watershed of the US Northeast

    The Riparian Ecosystem Management Model (REMM) was developed, calibrated, and validated against both hydrologic and water quality data for eight riparian buffers located in a formerly glaciated watershed (upper Pawcatuck River Watershed, Rhode Island) of the US Northeast. The Annualized AGricultural Non-Point Source model (AnnAGNPS) was used to predict the runoff and sediment loading to the riparian buffers. Overall, REMM simulated water table depths (WTDs) and groundwater NO3-N concentrations at the stream edge (Zone 1) in good agreement with measured values. The model evaluation statistics showed that REMM performed better hydrologically for sites 1, 4, and 8 among the eight buffers, whereas it simulated groundwater NO3-N concentrations better for sites 1, 5, and 7 than for the other five sites. The interquartile range of the mean absolute error for WTDs was 3.5 cm for both the calibration and validation periods. For NO3-N concentration predictions, the interquartile range of the root mean square error was 0.25 mg/L and 0.69 mg/L for the calibration and validation periods, respectively, whereas the interquartile range of d for NO3-N concentrations was 0.20 and 0.48 for the calibration and validation periods, respectively. Moreover, REMM estimated N removal from Zone 3 to Zone 1 of 19.7% and 19.8%, against measured values of 19.1% and 26.6%, at site 7 and site 8, respectively. The sensitivity analyses showed that changes in the volumetric water content between field capacity and saturation (soil porosity) drove both the water table and denitrification.
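
    The evaluation statistics quoted above can be computed as in the sketch below. The statistic d is assumed here to be Willmott's index of agreement, which is the usual meaning in hydrologic model evaluation; this is generic code, not code from the study.

    ```python
    import numpy as np

    def mae(obs, sim):
        """Mean absolute error, as reported for water table depths (cm)."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return np.mean(np.abs(sim - obs))

    def rmse(obs, sim):
        """Root mean square error, as reported for NO3-N (mg/L)."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return np.sqrt(np.mean((sim - obs) ** 2))

    def willmott_d(obs, sim):
        """Index of agreement d (0 to 1; higher means better agreement)."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        num = np.sum((sim - obs) ** 2)
        den = np.sum((np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
        return 1.0 - num / den
    ```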

    Evaluation of AnnAGNPS Model for Runoff Simulation on Watersheds from Glaciated Landscape of USA Midwest and Northeast

    Runoff modeling of glaciated watersheds is required to predict runoff for water supply, aquatic ecosystem management, and flood prediction, and to address questions concerning the impact of climate and land use change on the hydrological system and watershed export of contaminants. A widely used pollutant loading model, the Annualized Agricultural Non-Point Source Pollution model (AnnAGNPS), was applied to simulate runoff from three watersheds in glaciated geomorphic settings. The objective of this study was to evaluate the suitability of the AnnAGNPS model for predicting runoff volume in glaciated landscapes. The study area included Sugar Creek watershed, Indiana; Fall Creek watershed, New York; and Pawcatuck River watershed, Rhode Island, USA. The AnnAGNPS model was developed, calibrated, and validated for runoff estimation for these watersheds. The daily and monthly calibration and validation statistics (NSE > 0.50, RSR < 0.70, and PBIAS within ±25%) of the developed model were satisfactory for runoff simulation for all the studied watersheds. Once AnnAGNPS successfully simulated runoff, a parameter sensitivity analysis was carried out for runoff simulation in all three watersheds. The output from our hydrological models applied to glaciated areas will provide the capacity to couple edge-of-field hydrologic modeling with the examination of riparian or riverine functions and behaviors.
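
    The satisfactory-performance criteria quoted above are the standard goodness-of-fit statistics for hydrologic models. A minimal sketch of the three statistics, with the abstract's thresholds noted in comments (generic formulas, not the study's code):

    ```python
    import numpy as np

    def nse(obs, sim):
        """Nash-Sutcliffe efficiency; > 0.50 counted as satisfactory here."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def rsr(obs, sim):
        """RMSE-to-observation-standard-deviation ratio; < 0.70 satisfactory."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return (np.sqrt(np.sum((obs - sim) ** 2))
                / np.sqrt(np.sum((obs - obs.mean()) ** 2)))

    def pbias(obs, sim):
        """Percent bias; within +/-25% counted as satisfactory for runoff."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 100.0 * np.sum(obs - sim) / np.sum(obs)
    ```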

    Denitrifying Bioreactors for Nitrate Removal: A Meta-Analysis

    Meta-analysis approaches were used in this first quantitative synthesis of denitrifying woodchip bioreactors. Nitrate removal across environmental and design conditions was assessed from 26 published studies representing 57 separate bioreactor units (i.e., walls, beds, and laboratory columns). Effect size calculations weighted the data based on variance and number of measurements for each bioreactor unit. Nitrate removal rates in bed and column studies were not significantly different, but both were significantly higher than in wall studies. In denitrifying beds, wood source did not significantly affect nitrate removal rates. Nitrate removal (mass per volume) was significantly lower in beds with <6-h hydraulic retention times, which argues for ensuring that bed designs incorporate sufficient time for nitrate removal. Rates significantly declined after the first year of bed operation but then stabilized. Nitrogen limitation significantly affected bed nitrate removal. Categorical and linear assessments found significant effects of bed temperature on nitrate removal; a Q10 of 2.15 was similar to values reported in other studies. Lessons from this meta-analysis can be incorporated into bed designs, especially extending hydraulic retention times to increase nitrate removal under low temperature and high flow conditions. Additional column studies are warranted for comparative assessments, as are field-based studies for assessing in situ conditions, especially in aging beds, with careful collection and reporting of design and environmental data. Future assessment of these systems might take a holistic view, reviewing nitrate removal in conjunction with other processes, including the production of greenhouse gases and other unfavorable by-products.
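
    The Q10 relationship quoted above describes how removal rates scale with temperature: the rate changes by a factor of Q10 for every 10 °C. A minimal sketch with illustrative numbers (the 2.15 value is from the abstract; the rates and temperatures are made up):

    ```python
    def q10_scaled_rate(rate_ref, temp_ref, temp, q10=2.15):
        """Scale a nitrate removal rate to another temperature using the
        Q10 relationship: rate(T) = rate(T_ref) * Q10 ** ((T - T_ref) / 10)."""
        return rate_ref * q10 ** ((temp - temp_ref) / 10.0)

    # Example: a rate measured at 20 C, projected to 10 C (values illustrative):
    print(q10_scaled_rate(rate_ref=10.0, temp_ref=20.0, temp=10.0))
    # -> 10 / 2.15 ~= 4.65, i.e. roughly half the removal at low temperature,
    # which is why the abstract recommends longer retention times in cold,
    # high-flow conditions.
    ```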

    Beaver Ponds: Resurgent Nitrogen Sinks for Rural Watersheds in the Northeastern United States

    Beaver-created ponds and dams, on the rise in the northeastern United States, reshape headwater stream networks from extensive, free-flowing reaches to complexes of ponds, wetlands, and connecting streams. We examined seasonal and annual rates of nitrate transformations in three beaver ponds in Rhode Island under enriched nitrate-nitrogen (N) conditions through the use of 15N mass balance techniques on soil core mesocosm incubations. We recovered approximately 93% of the nitrate N from our mesocosm incubations. Of the added nitrate N, 22 to 39% was transformed during the course of the incubation. Denitrification had the highest rates of transformation (97–236 mg N m−2 d−1), followed by assimilation into the organic soil N pool (41–93 mg N m−2 d−1) and ammonium generation (11–14 mg N m−2 d−1). Our denitrification rates exceeded those in several studies of freshwater ponds and wetlands; however, rates in those ecosystems may have been limited by low concentrations of nitrate. Assuming a density of 0.7 beaver ponds km−2 of catchment area, we estimated that in nitrate-enriched watersheds, beaver pond denitrification can remove approximately 50 to 450 kg nitrate N km−2 of catchment area. In rural watersheds of southern New England with high N loading (i.e., 1000 kg km−2), denitrification in beaver ponds may remove 5 to 45% of watershed nitrate N loading. Beaver ponds represent a relatively new and substantial sink for watershed N if current beaver populations persist.
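
    A back-of-envelope check of the watershed-scale estimate is straightforward: scale the measured areal denitrification rate by the pond area within each square kilometre of catchment and by days per year, then compare against loading. The pond area per km2 below is an illustrative assumption; the abstract does not report it.

    ```python
    # Back-of-envelope check of the watershed-scale numbers quoted above.
    MG_PER_KG = 1e6
    M2_PER_HA = 1e4

    def annual_removal_kg_per_km2(rate_mg_m2_d, pond_ha_per_km2, days=365):
        """Nitrate-N removed per km2 of catchment per year, given an areal
        denitrification rate and the pond area within each km2 (assumed)."""
        pond_m2 = pond_ha_per_km2 * M2_PER_HA
        return rate_mg_m2_d * pond_m2 * days / MG_PER_KG

    # e.g. ~0.35 ha of pond per km2 at the low and high measured rates:
    low = annual_removal_kg_per_km2(97, 0.35)    # ~124 kg N km-2 yr-1
    high = annual_removal_kg_per_km2(236, 0.35)  # ~301 kg N km-2 yr-1
    loading = 1000.0  # kg N km-2 yr-1, the high-loading case in the text
    print(low / loading * 100, high / loading * 100)  # ~12% and ~30% removed,
    # within the 5-45% range the abstract reports across its assumptions
    ```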

    Connectivity and Nitrate Uptake Potential of Intermittent Streams in the Northeast USA

    Non-perennial streams dominate the extent of stream networks worldwide. Intermittent streams can provide ecosystem services to the entire network, including nitrate uptake to alleviate eutrophication of coastal waters, and are threatened by a lack of legal protection. We examined 12 intermittent streams in the temperate, humid climate of the Northeast USA. Over 3 years of monitoring, continuous flow was observed a median of 277 d yr−1, with no-flow conditions from early summer into fall. Estimated median discharge was 2.9 L s−1, or 0.36 mm d−1. All intermittent streams originated from source wetlands (median area: 0.27 ha), and the median length of the intermittent stream from the source wetland to the downstream perennial stream was 344 m. Through regional geospatial analysis with high-resolution orthophotography, we estimated that widely available, "high resolution" (1:24,000) hydrography databases (e.g., NHDPlus HR) displayed only 43% of the total number of intermittent streams. Whole-stream gross nitrate-N uptake rates were estimated at six intermittent streams during continuous flow conditions using pulse additions of nitrate and a conservative tracer. These rates displayed high temporal variability (range: non-detect to over 6,000 mg N m−1 d−1); hot moments were noted in nine of the 65 pulse additions. Whole-stream gross nitrate-N uptake rates were significantly inversely related to discharge, with no measurable rates above 7 L s−1. Temperature was significantly positively correlated with whole-stream gross nitrate-N uptake rates, with more hot moments in the spring. Microbial assays demonstrated that nitrate cycling in intermittent streams is consistent with results from low-order, perennial forested streams and highlighted the importance of debris dams and pools as potential locations for transient storage. Our assessment suggests that intermittent streams in our region may annually contribute 24–47% of the flow to perennial streams and potentially remove 4.1 to 80.4 kg nitrate-N km−2 annually. If development in these areas continues, perennial streams are in danger of losing a portion of their headwaters, and potential nitrate uptake areas may become nitrate sources to downstream areas. These results argue for managing fluvial systems with a holistic approach that couples intermittent and perennial components.
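
    A hedged sketch of how a whole-stream gross uptake rate could be derived from a pulse addition: the conservative tracer corrects for dilution and incomplete recovery, after which the nitrate-N mass lost over the reach is normalised per metre of stream and per day of travel time, matching the mg N m−1 d−1 units above. All names and numbers here are illustrative assumptions, not the study's data.

    ```python
    def gross_uptake_mg_per_m_day(n_added_mg, n_recovered_mg,
                                  reach_length_m, travel_time_days):
        """Nitrate-N mass lost over the reach, normalised per metre of
        stream and per day of travel time. n_recovered_mg is assumed to
        be already corrected for dilution using the conservative tracer."""
        lost = n_added_mg - n_recovered_mg
        return lost / (reach_length_m * travel_time_days)

    # Example: 10 g added, 9 g recovered over a 344 m reach in ~0.1 d:
    print(gross_uptake_mg_per_m_day(10_000, 9_000, 344, 0.1))
    # -> ~29 mg N m-1 d-1
    ```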

    Ghosts of Landuse Past: Legacy Effects of Milldams for Riparian Nitrogen (N) Processing and Water Quality Functions

    Milldams and their legacies have significantly influenced fluvial processes and geomorphology. However, less is known about their effects on riparian zone hydrology, biogeochemistry, and water quality. Here, we discuss the potential effects of existing and breached milldams on riparian nitrogen (N) processing through multiple competing hypotheses and observations from complementary studies. Competing hypotheses characterize riparian zone processes that remove (sink) or release (source) N. Elevated groundwater levels and reducing soil conditions upstream of milldams suggest that riparian zones above dams could be hotspots for N removal via denitrification and plant N uptake. On the other hand, dam removals and the subsequent drops in stream and riparian groundwater levels result in drained, oxic soils, which could increase soil nitrification and decrease riparian plant uptake as groundwater bypasses the root zone. Whether dam removals would result in a net increase or decrease of N in riparian groundwaters is unknown and needs to be investigated. While nitrification, denitrification, and plant N uptake have typically received the most attention in riparian studies, other N cycle processes such as dissimilatory nitrate reduction to ammonium (DNRA) need to be considered. We also propose the novel concept of a riparian discontinuum, which highlights the hydrologic and biogeochemical discontinuities introduced in riparian zones by anthropogenic structures such as milldams. Understanding and quantifying how milldams and similar structures influence the net source or sink behavior of riparian zones is urgently needed to guide watershed management practices and to inform decision-making with regard to dam removals.

    Of cattle, sand flies and men: a systematic review of risk factor analyses for South Asian visceral leishmaniasis and implications for elimination

    Background: Studies performed over the past decade have identified fairly consistent epidemiological patterns of risk factors for visceral leishmaniasis (VL) in the Indian subcontinent. Methods and Principal Findings: To inform the current regional VL elimination effort and identify key gaps in knowledge, we performed a systematic review of the literature, with a special emphasis on data regarding the role of cattle, because primary risk factor studies have yielded apparently contradictory results. Because humans form the sole infection reservoir, clustering of kala-azar cases is a prominent epidemiological feature, both at the household level and on a larger scale. Subclinical infection also tends to cluster around kala-azar cases. Within villages, areas become saturated over a period of several years; kala-azar incidence then decreases while neighboring areas see increases. More recently, post-kala-azar dermal leishmaniasis (PKDL) cases have followed kala-azar peaks. Mud walls, palpable dampness in houses, and peridomestic vegetation may increase infection risk through enhanced density and prolonged survival of the sand fly vector. Bed net use, sleeping on a cot, and indoor residual spraying are generally associated with decreased risk. Poor micronutrient status increases the risk of progression to kala-azar. The presence of cattle is associated with increased risk in some studies and decreased risk in others, reflecting the complexity of the effect of bovines on sand fly abundance, aggregation, feeding behavior, and leishmanial infection rates. Poverty is an overarching theme, interacting with individual risk factors on multiple levels. Conclusions: Carefully designed demonstration projects, taking into account the complex web of interconnected risk factors, are needed to provide direct proof of principle for elimination and to identify the most effective maintenance activities to prevent a rapid resurgence when interventions are scaled back. More effective, short-course treatment regimens for PKDL are urgently needed to enable the elimination initiative to succeed.

    Efficiency and safety of varying the frequency of whole blood donation (INTERVAL): a randomised trial of 45 000 donors

    Background: Limits on the frequency of whole blood donation exist primarily to safeguard donor health. However, there is substantial variation across blood services in the maximum frequency of donations allowed. We compared standard practice in the UK with the shorter inter-donation intervals used in other countries. Methods: In this parallel group, pragmatic, randomised trial, we recruited whole blood donors aged 18 years or older from 25 centres across England, UK. By use of a computer-based algorithm, men were randomly assigned (1:1:1) to 12-week (standard) versus 10-week versus 8-week inter-donation intervals, and women were randomly assigned (1:1:1) to 16-week (standard) versus 14-week versus 12-week intervals. Participants were not masked to their allocated intervention group. The primary outcome was the number of donations over 2 years. Secondary outcomes related to safety were quality of life, symptoms potentially related to donation, physical activity, cognitive function, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin. This trial is registered with ISRCTN, number ISRCTN24760606, and is ongoing but no longer recruiting participants. Findings: 45 263 whole blood donors (22 466 men, 22 797 women) were recruited between June 11, 2012, and June 15, 2014. Data were analysed for 45 042 (99·5%) participants. Men were randomly assigned to the 12-week (n=7452) versus 10-week (n=7449) versus 8-week (n=7456) groups; and women to the 16-week (n=7550) versus 14-week (n=7567) versus 12-week (n=7568) groups. In men, compared with the 12-week group, the mean amount of blood collected per donor over 2 years increased by 1·69 units (95% CI 1·59–1·80; approximately 795 mL) in the 8-week group and by 0·79 units (0·69–0·88; approximately 370 mL) in the 10-week group (p<0·0001 for both). In women, compared with the 16-week group, it increased by 0·84 units (95% CI 0·76–0·91; approximately 395 mL) in the 12-week group and by 0·46 units (0·39–0·53; approximately 215 mL) in the 14-week group (p<0·0001 for both). No significant differences were observed in quality of life, physical activity, or cognitive function across randomised groups. However, more frequent donation resulted in more donation-related symptoms (eg, tiredness, breathlessness, feeling faint, dizziness, and restless legs, especially among men [for all listed symptoms]), lower mean haemoglobin and ferritin concentrations, and more deferrals for low haemoglobin (p<0·0001 for each) than observed in the standard frequency groups. Interpretation: Over 2 years, more frequent donation than is standard practice in the UK collected substantially more blood without having a major effect on donors' quality of life, physical activity, or cognitive function, but resulted in more donation-related symptoms, deferrals, and iron deficiency. Funding: NHS Blood and Transplant, National Institute for Health Research, UK Medical Research Council, and British Heart Foundation.
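
    The unit-to-volume figures quoted above imply roughly 470 mL per whole blood unit (e.g. 795 mL / 1·69 units). The short sketch below just makes that conversion explicit; the 470 mL figure is an inference from the reported numbers, not a value stated in the paper.

    ```python
    # ~470 mL per unit, derived from the men's 8-week arm (795 mL / 1.69 units);
    # an inference from the abstract, not a stated trial parameter.
    ML_PER_UNIT = 795 / 1.69

    def extra_volume_ml(extra_units):
        """Approximate additional blood volume per donor over 2 years."""
        return extra_units * ML_PER_UNIT

    for units in (1.69, 0.79, 0.84, 0.46):
        print(f"{units:.2f} units ~ {extra_volume_ml(units):.0f} mL")
    # -> ~795, ~372, ~395, ~216 mL, matching the abstract's approximations
    ```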