
    Sources and Mechanisms of Low-Flow River Phosphorus Elevations: A Repeated Synoptic Survey Approach

    High-resolution water quality monitoring indicates recurring elevation of stream phosphorus concentrations during low-flow periods. These increased concentrations may exceed Water Framework Directive (WFD) environmental quality standards during ecologically sensitive periods. The objective of this research was to identify the source, mobilization, and pathway factors controlling in-stream total reactive phosphorus (TRP) concentrations during low-flow periods. Synoptic surveys were conducted in three agricultural catchments during spring, summer, and autumn. Up to 50 water samples were obtained across each watercourse per sampling round. Samples were analysed for TRP and total phosphorus (TP), along with supplementary parameters (temperature, conductivity, dissolved oxygen, and oxidation-reduction potential). Bed sediment was analysed at a subset of locations for Mehlich P, Al, Ca, and Fe. The greatest percentages of water sampling points exceeding the WFD threshold of 0.035 mg L−1 TRP occurred during summer (57%, 11%, and 71% for the well-drained grassland, well-drained arable, and poorly drained grassland catchments, respectively). These percentages declined during autumn but did not return to spring levels, as winter flushing had not yet occurred. Different controls were elucidated for each catchment: diffuse transport through groundwater and a lack of dilution in the well-drained grassland catchment, in-stream mobilization in the well-drained arable catchment, and a combination of point sources and cumulative loading in the poorly drained grassland catchment. This diversity in controlling factors necessitates investigative protocols that go beyond low spatial and temporal resolution water sampling and that incorporate both repeated surveys and complementary understanding of sediment chemistry and anthropogenic phosphorus sources. Despite similarities in the elevation of P at low flow, catchments will require customised solutions depending on their typology, and both legislative deadlines and target baseline standards must acknowledge these inherent differences.
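    For illustration, the exceedance percentages quoted above amount to counting the share of synoptic sampling points whose TRP concentration is above the 0.035 mg L−1 WFD threshold. The short Python sketch below uses made-up sample values and only illustrates that arithmetic; it is not the authors' analysis.

        # Share of synoptic sampling points exceeding the WFD TRP threshold (0.035 mg L-1).
        # The concentration values below are hypothetical, for illustration only.
        trp_mg_per_l = [0.012, 0.041, 0.060, 0.028, 0.075, 0.033, 0.052]
        threshold = 0.035
        exceeding = sum(c > threshold for c in trp_mg_per_l)
        print(f"{100 * exceeding / len(trp_mg_per_l):.0f}% of points exceed {threshold} mg L-1")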

    Incidental nutrient transfers: Assessing critical times in agricultural catchments using high-resolution data

    Managing incidental losses associated with liquid slurry applications during closed periods has significant cost and policy implications, and the environmental data required to review such a measure are difficult to capture due to storm dependencies. Over four years (2010–2014) in five intensive agricultural catchments, this study used high-resolution total and total reactive phosphorus (TP and TRP), total oxidised nitrogen (TON), and suspended sediment (SS) concentrations together with river discharge data to investigate the magnitude and timing of nutrient losses. A large dataset of storm events (defined as 90th percentile discharges), with associated flow-weighted mean (FWM) nutrient concentrations and TP:SS ratios, was used to indicate when losses were indicative of residual or incidental nutrient transfers. The beginning of the slurry closed period reflected both incidental and residual transfers, with high storm FWM P (TP and TRP) concentrations and, in some catchments, elevated storm TP:SS ratios. This pattern diminished by the end of the closed period in all catchments. Total oxidised N behaved similarly to P during storms in the poorly drained catchments and showed a long lag time in the other catchments. Low storm FWM P concentrations and TP:SS ratios during the weeks following the closed period suggest that nutrients either were not applied during this time (best times chosen) or were applied to less risky areas (best places chosen). For other periods, such as late autumn and wet summers, when storm FWM P concentrations and TP:SS ratios were high, it is recommended that farmer knowledge of soil drainage characteristics be augmented with detailed local information on current and forecast soil moisture conditions, to strengthen existing regulatory frameworks and avoid storm-driven incidental nutrient transfers.
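    As a rough illustration of the flow-weighted mean (FWM) concentration and the 90th-percentile storm definition mentioned above, the Python sketch below computes a storm FWM TP concentration from a hypothetical discharge and concentration record; the column names and values are assumptions, not the authors' dataset or processing code.

        import pandas as pd

        def flow_weighted_mean(df, conc_col="TP", q_col="Q"):
            # Flow-weighted mean concentration: sum(C_i * Q_i) / sum(Q_i),
            # assuming equally spaced time steps.
            return (df[conc_col] * df[q_col]).sum() / df[q_col].sum()

        # Hypothetical high-resolution record: discharge Q (m3 s-1) and TP (mg L-1).
        ts = pd.DataFrame({"Q": [0.2, 0.5, 3.1, 4.0, 2.2, 0.4],
                           "TP": [0.03, 0.05, 0.18, 0.22, 0.12, 0.04]})

        q90 = ts["Q"].quantile(0.9)        # storm events defined by the 90th percentile discharge
        storm = ts[ts["Q"] >= q90]
        print("Storm FWM TP:", round(flow_weighted_mean(storm), 3), "mg L-1")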

    A framework for determining unsaturated zone time lags at catchment scale

    The responses of waterbodies to agricultural programmes of measures are frequently delayed by hydrological time lags through the unsaturated zone and groundwater. Time lag may therefore impede the achievement of remediation deadlines such as those described in the EU Water Framework Directive (WFD). Omitting time lag from catchment characterisation renders evaluation of management practices impossible. Time lag aside, regulators at national scale can only manage the expectations of policy-makers at larger scales (e.g. the European Union) by demonstrating positive nutrient trajectories in catchments failing to achieve at least good status. At present, a flexible tool for developing spatial and temporal estimates of trends in water quality/nutrient transport and time lags is not available. The objectives of the present study were, first, to develop such a flexible, parsimonious framework incorporating existing soil maps, meteorological data, and a structured modelling approach, and, second, to demonstrate its use in a grassland and an arable catchment (~10 km2) in Ireland, assuming full implementation of measures in 2012. Data pertaining to solute transport (meteorology, soil hydraulics, depth of profile, and boundary conditions) were collected for both catchments. Low-complexity textural data alone gave comparable estimates of nutrient trajectories and time lags, but with no spatial or soil-series information. Taking a high-complexity approach, coupling high-resolution soil mapping (1:10,000) with national-scale (1:25,000) representative profile datasets to […]
    This research was funded by the Teagasc Walsh Fellowship Scheme.
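    The framework above couples soil maps, meteorological data, and a structured modelling approach. As a loose, first-order illustration of the underlying idea, an unsaturated zone time lag can be approximated as the water stored in the soil profile divided by the effective annual recharge; the Python sketch below uses hypothetical parameter values and is not the framework described in the paper.

        def unsaturated_zone_lag_years(depth_m, water_content, recharge_mm_per_yr):
            # First-order travel-time estimate: stored water depth / annual effective recharge.
            stored_water_mm = depth_m * 1000.0 * water_content
            return stored_water_mm / recharge_mm_per_yr

        # Hypothetical profile: 5 m deep, volumetric water content 0.30,
        # effective recharge 300 mm per year.
        print(f"Approximate lag: {unsaturated_zone_lag_years(5.0, 0.30, 300.0):.1f} years")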

    Improving the identification of hydrologically sensitive areas using LiDAR DEMs for the delineation and mitigation of critical source areas of diffuse pollution

    Identifying critical source areas (CSAs) of diffuse pollution in agricultural catchments requires the accurate identification of hydrologically sensitive areas (HSAs) with the highest propensity for generating surface runoff and transporting pollutants. A new GIS-based HSA Index is presented that improves the identification of HSAs at the sub-field scale by accounting for microtopographic controls. The Index is based on high-resolution LiDAR data and a soil topographic index (STI), and also considers the hydrological disconnection of overland flow via topographic impediment from flow sinks. The HSA Index was applied to four intensive agricultural catchments (~7.5–12 km2) with contrasting topography and soil types, and validated using rainfall-quickflow measurements during saturated winter storm events in 2009–2014. Total flow sink volume capacities ranged from 8,298 to 59,584 m3 and caused 8.5–24.2% of overland flow generating areas and 16.8–33.4% of catchment areas to become hydrologically disconnected from the open drainage channel network. HSA maps identified 'breakthrough points' and 'delivery points' along surface runoff pathways as vulnerable points where diffuse pollutants could be transported between fields or delivered to the open drainage network, respectively. Using these as proposed locations for targeting mitigation measures such as riparian buffer strips reduced potential costs, including LiDAR DEM acquisition costs, by 66% and 91% over 1 and 5 years respectively, compared with blanket implementation within an example agri-environment scheme. The HSA Index can be used as a hydrologically realistic transport component within a fully evolved sub-field-scale CSA model, and can also guide the implementation of 'treatment-train' mitigation strategies concurrent with sustainable agricultural intensification.
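    The soil topographic index underpinning the HSA Index is commonly written as ln(a / (T tan β)), where a is the upslope contributing area per unit contour length and T is the soil transmissivity. The Python sketch below shows that standard calculation on a few hypothetical cells; it is a simplified illustration, not the published HSA Index implementation (which also handles flow sinks and hydrological disconnection).

        import numpy as np

        def soil_topographic_index(upslope_area_m2_per_m, transmissivity_m2_per_day, slope_rad):
            # STI = ln(a / (T * tan(beta))); inputs are hypothetical per-cell values
            # derived from a LiDAR DEM and a soil map.
            tan_beta = np.maximum(np.tan(slope_rad), 1e-6)   # avoid division by zero on flat cells
            return np.log(upslope_area_m2_per_m / (transmissivity_m2_per_day * tan_beta))

        a = np.array([50.0, 400.0, 1200.0])        # upslope area per unit contour length (m2/m)
        T = np.array([5.0, 1.0, 0.5])              # soil transmissivity (m2/day)
        beta = np.radians([8.0, 3.0, 1.0])         # local slope in degrees, converted to radians
        print(soil_topographic_index(a, T, beta))  # higher values indicate higher runoff propensity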