9 research outputs found

    Quantifying MCPA load pathways at catchment scale using high temporal resolution data

    Get PDF
    Peer-reviewed. Detection of the agricultural acid herbicide MCPA (2-methyl-4-chlorophenoxyacetic acid) in drinking water source catchments is of growing concern, with economic and environmental implications for water utilities and wider ecosystem services. MCPA is poorly adsorbed to soil and highly mobile in water, but its hydrological pathway processes are poorly understood at the catchment scale, where investigation has been limited by coarse-resolution data. This understanding is required to target mitigation measures and to provide a framework for monitoring their effectiveness. To address this knowledge gap, this study reports findings from river discharge and synchronous MCPA concentration datasets (continuous 7-hourly sampling, with additional hourly sampling during storm events) collected over a 7-month herbicide spraying season. The study was undertaken in a surface (source) water catchment (384 km², of which 154 km² is agricultural land use) in the cross-border area of Ireland. Combined into loads, and using two pathway separation techniques, the MCPA data were apportioned into event and baseload components, and the former was further separated to quantify quickflow (QF) and other event pathways. Based on the 7-hourly dataset, 85.2 kg (0.22 kg km⁻² by catchment area, or 0.55 kg km⁻² by agricultural area) of MCPA was exported from the catchment in 7 months. Of this load, 87.7 % was transported via event flow pathways, with 72.0 % transported via surface-dominated (QF) pathways. Approximately 12 % of the MCPA load was transported via deep baseflows, indicating persistence in this delayed pathway, which was the primary pathway condition monitored in a weekly regulatory sampling programme. Overall, however, the data indicated a dominant acute, storm-dependent process of incidental MCPA loss during the spraying season. Reducing use and/or implementing extensive surface pathway disconnection measures are the mitigation options with greatest potential, the success of which can only be assessed using high temporal resolution monitoring techniques.
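
    The headline figures above follow from a simple load calculation: the instantaneous flux is discharge times concentration, integrated over the monitoring period and then normalised by catchment or agricultural area. The sketch below illustrates that arithmetic only; it is not the study's code, and the function and variable names (load_kg, q_m3s, c_ugl, dt_s) are hypothetical stand-ins for the 7-hourly monitoring series described in the abstract.

        import numpy as np

        # Hedged sketch, not the study's code: integrate a herbicide load from paired
        # discharge (m3/s) and concentration (ug/L) series, then area-normalise it.
        def load_kg(q_m3s, c_ugl, dt_s):
            """m3/s * ug/L = mg/s; multiply by the time step (s), sum, convert mg -> kg."""
            return float(np.sum(q_m3s * c_ugl * dt_s) / 1e6)

        # Reproducing the abstract's area-normalised yields from the seasonal total:
        total_load_kg = 85.2            # MCPA exported over the 7-month season
        print(total_load_kg / 384.0)    # ~0.22 kg km-2 by catchment area
        print(total_load_kg / 154.0)    # ~0.55 kg km-2 by agricultural area

    Applying the same integration to the hydrograph-separated components then gives the event, quickflow and baseflow shares (87.7 %, 72.0 % and roughly 12 % of the total) reported above.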

    Comparing in situ turbidity sensor measurements as a proxy for suspended sediments in North-Western European streams

    Get PDF
    Climate change in combination with land use alterations may lead to significant changes in soil erosion and sediment fluxes in streams. Optical turbidity sensors can monitor at high frequency and can be used as a proxy for suspended sediment concentration (SSC), provided there is an acceptable calibration curve between turbidity measured by the sensors and SSC from water samples. This study used such calibration data from 31 streams in 11 different research projects or monitoring programmes in six Northern European countries. The aim was to find patterns in the turbidity–SSC correlations based on stream characteristics such as mean and maximum turbidity and SSC, catchment area, land use, hydrology, soil type, topography, and the number and representativeness of the data used for the calibration. There were large variations, but the best correlations between turbidity and SSC were found in streams with a mean and maximum SSC of >30–200 mg/l and a mean and maximum turbidity above 60–200 NTU/FNU, respectively. Streams draining agricultural areas with fine-grained soils had better correlations than forested streams draining more coarse-grained soils. However, the study also revealed considerable differences in methodological approaches, including analytical methods to determine SSC, water sampling strategies, quality control procedures, and the use of sensors based on different measuring principles. Relatively few national monitoring programmes in the six countries involved in the study included optical turbidity sensors, which may partly explain this lack of methodological harmonisation. Given the risk of future changes in soil erosion and sediment fluxes, increased harmonisation is highly recommended, so that turbidity data from optical sensors can be better evaluated and intercalibrated across streams in comparable geographical regions.
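
    The proxy relationship the study evaluates is a calibration curve fitted between paired sensor turbidity readings and laboratory SSC values. The sketch below shows one common way to build such a curve (ordinary least squares on paired samples) purely as an illustration; the paired values and variable names are hypothetical, and the study itself compares the strength of these correlations across 31 streams rather than prescribing a single model.

        import numpy as np

        # Hedged sketch: fit a turbidity-SSC calibration curve from paired field samples.
        # The paired values below are invented illustration data, not from the study.
        turbidity_ntu = np.array([12.0, 35.0, 60.0, 110.0, 240.0, 400.0])   # sensor (NTU/FNU)
        ssc_mg_l      = np.array([ 8.0, 30.0, 55.0, 120.0, 260.0, 430.0])   # lab SSC (mg/l)

        # Ordinary least-squares calibration: SSC ~= slope * turbidity + intercept
        slope, intercept = np.polyfit(turbidity_ntu, ssc_mg_l, deg=1)

        # Correlation strength of the kind compared across streams in the study
        r = np.corrcoef(turbidity_ntu, ssc_mg_l)[0, 1]
        print(f"SSC ~= {slope:.2f} * turbidity + {intercept:.1f} (r^2 = {r**2:.3f})")

        # Once calibrated, continuous sensor turbidity can be converted to an SSC proxy
        ssc_proxy = slope * np.array([75.0, 150.0]) + intercept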

    The James Webb Space Telescope Mission

    Full text link
    Twenty-six years ago a small committee report, building on earlier studies, expounded a compelling and poetic vision for the future of astronomy, calling for an infrared-optimized space telescope with an aperture of at least 4 m. With the support of their governments in the US, Europe, and Canada, 20,000 people realized that vision as the 6.5 m James Webb Space Telescope. A generation of astronomers will celebrate their accomplishments for the life of the mission, potentially as long as 20 years, and beyond. This report and the scientific discoveries that follow are extended thank-you notes to the 20,000 team members. The telescope is working perfectly, with much better image quality than expected. In this and accompanying papers, we give a brief history, describe the observatory, outline its objectives and current observing program, and discuss the inventions and people who made it possible. We cite detailed reports on the design and the measured performance on orbit. Comment: Accepted by PASP for the special issue on The James Webb Space Telescope Overview; 29 pages, 4 figures.

    Generic acquisition protocol for quantitative MRI of the spinal cord

    Get PDF
    Quantitative spinal cord (SC) magnetic resonance imaging (MRI) presents many challenges, including a lack of standardized imaging protocols. Here we present a prospectively harmonized quantitative MRI protocol, which we refer to as the spine generic protocol, for users of 3T MRI systems from the three main manufacturers: GE, Philips and Siemens. The protocol provides guidance for assessing SC macrostructural and microstructural integrity: T1-weighted and T2-weighted imaging for SC cross-sectional area computation, multi-echo gradient echo for gray matter cross-sectional area, and magnetization transfer and diffusion-weighted imaging for assessing white matter microstructure. In a companion paper from the same authors, the spine generic protocol was used to acquire data across 42 centers in 260 healthy subjects. The key details of the spine generic protocol are also available in an open-access document that can be found at https://github.com/spine-generic/protocols. The protocol will serve as a starting point for researchers and clinicians implementing new SC imaging initiatives so that, in the future, inclusion of the SC in neuroimaging protocols will be more common. The protocol could be implemented by any trained MR technician or by a researcher/clinician familiar with MRI acquisition.
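
    As a quick orientation to what the protocol covers, the sketch below restates the sequence-to-purpose mapping from the abstract as a plain data structure. This is a schematic summary only, not the official protocol file; the exact vendor-specific parameters for GE, Philips and Siemens 3T systems are in the open-access document linked above.

        # Schematic summary of the spine generic protocol's sequences, per the abstract.
        # Not the official protocol file; see https://github.com/spine-generic/protocols.
        SPINE_GENERIC_SEQUENCES = {
            "T1w":    "spinal cord cross-sectional area (macrostructure)",
            "T2w":    "spinal cord cross-sectional area (macrostructure)",
            "ME-GRE": "gray matter cross-sectional area (multi-echo gradient echo)",
            "MT":     "white matter microstructure (magnetization transfer)",
            "DWI":    "white matter microstructure (diffusion-weighted imaging)",
        }

        for sequence, purpose in SPINE_GENERIC_SEQUENCES.items():
            print(f"{sequence:7s} -> {purpose}")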

    Open-access quantitative MRI data of the spinal cord and reproducibility across participants, sites and manufacturers (vol 8, 219, 2021)

    Get PDF
    In a companion paper by Cohen-Adad et al. we introduce the spine generic quantitative MRI protocol that provides valuable metrics for assessing spinal cord macrostructural and microstructural integrity. This protocol was used to acquire a single-subject dataset across 19 centers and a multi-subject dataset across 42 centers (for a total of 260 participants), spanning the three main MRI manufacturers: GE, Philips and Siemens. Both datasets are publicly available via git-annex. Data were analysed using the Spinal Cord Toolbox to produce normative values as well as inter/intra-site and inter/intra-manufacturer statistics. Reproducibility for the spine generic protocol was high across sites and manufacturers, with an average inter-site coefficient of variation of less than 5% for all the metrics. Full documentation and results can be found at https://spine-generic.rtfd.io/. The datasets and analysis pipeline will help pave the way towards accessible and reproducible quantitative MRI in the spinal cord.

    Open-access quantitative MRI data of the spinal cord and reproducibility across participants, sites and manufacturers

    Get PDF
    In a companion paper by Cohen-Adad et al. we introduce the spine generic quantitative MRI protocol that provides valuable metrics for assessing spinal cord macrostructural and microstructural integrity. This protocol was used to acquire a single-subject dataset across 19 centers and a multi-subject dataset across 42 centers (for a total of 260 participants), spanning the three main MRI manufacturers: GE, Philips and Siemens. Both datasets are publicly available via git-annex. Data were analysed using the Spinal Cord Toolbox to produce normative values as well as inter/intra-site and inter/intra-manufacturer statistics. Reproducibility for the spine generic protocol was high across sites and manufacturers, with an average inter-site coefficient of variation of less than 5% for all the metrics. Full documentation and results can be found at https://spine-generic.rtfd.io/. The datasets and analysis pipeline will help pave the way towards accessible and reproducible quantitative MRI in the spinal cord.
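
    The reproducibility statistic quoted here, an inter-site coefficient of variation below 5%, is the standard deviation of a metric across sites divided by its mean. The sketch below shows that computation on invented site means; it is not the published analysis pipeline, which is the Spinal Cord Toolbox workflow documented at the link above.

        import numpy as np

        # Hedged sketch: inter-site coefficient of variation (COV) for one metric.
        # The site means are hypothetical illustration values (e.g. cord CSA in mm^2).
        site_means = np.array([72.1, 70.8, 73.0, 71.5, 72.6])

        inter_site_cov_pct = site_means.std(ddof=1) / site_means.mean() * 100
        print(f"inter-site COV = {inter_site_cov_pct:.2f}%")   # well under the reported 5%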
