
    Quantifying uncertainty in an industrial approach: an emerging consensus in an old epistemological debate

    Uncertainty is ubiquitous in modern decision-making supported by quantitative modeling. While uncertainty treatment was initially developed largely in risk or environmental assessment, it is gaining widespread interest in many industrial fields, generating knowledge and practices that go beyond the classical risk-versus-uncertainty or epistemic-versus-aleatory debates. On the basis of years of applied research in different sectors at the European scale, this paper discusses the emergence of a methodological consensus across a number of fields of engineering and applied science, such as metrology, safety and reliability, protection against natural risk, manufacturing statistics, numerical design, and scientific computing. In relation to the applicable regulation and standards and a relevant quantity of interest for decision-making, this approach involves the proper identification of key steps: the quantification (or modeling) of the sources of uncertainty, possibly involving an inverse approach; their propagation through a pre-existing physical-industrial model; the ranking of importance or sensitivity analysis; and sometimes a subsequent optimisation step. It aims at giving a consistent and industrially realistic framework for practical mathematical modeling, admittedly restricted to quantitative and quantifiable uncertainty, and is illustrated on three typical examples. Axes of further research proving critical for environmental or industrial issues are outlined: the information challenges posed by uncertainty modeling in the context of data scarcity, and the corresponding calibration and inverse probabilistic techniques that must be developed to make the best use of industrial or environmental monitoring and data-acquisition systems under uncertainty; the numerical challenges, entailing considerable development of high-performance computing in the field; and the acceptability challenges in the context of the precautionary principle.
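The sequence of steps this abstract describes (quantification of the sources of uncertainty, propagation through a pre-existing model, importance ranking) can be illustrated with a minimal Monte Carlo sketch. The physical model and input distributions below are hypothetical, chosen only to make the steps concrete, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Step 1: quantification of the sources of uncertainty
# (hypothetical input distributions, for illustration only)
load = rng.normal(10.0, 1.5, n)          # e.g. an applied load
strength = rng.lognormal(3.0, 0.2, n)    # e.g. a material strength

# Step 2: propagation through a (here deliberately trivial) physical model
margin = strength - load                 # safety margin

# Step 3: quantity of interest for decision-making: failure probability
p_fail = float(np.mean(margin < 0))

# Step 4: importance ranking via a rank (Spearman-type) correlation
def ranks(x):
    return x.argsort().argsort()

rho_load = float(np.corrcoef(ranks(load), ranks(margin))[0, 1])
rho_strength = float(np.corrcoef(ranks(strength), ranks(margin))[0, 1])
print(p_fail, rho_load, rho_strength)
```

In a realistic study the trivial model would be replaced by an expensive simulation code, which is what drives the high-performance-computing challenge the abstract mentions.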

    Addressing Uncertainty in TMDLS: Short Course at Arkansas Water Resources Center 2001 Annual Conference

    Management of a critical natural resource like water requires information on the status of that resource. The US Environmental Protection Agency (EPA) reported in the 1998 National Water Quality Inventory that more than 291,000 miles of assessed rivers and streams and 5 million acres of lakes do not meet State water quality standards. This inventory represents a compilation of State assessments of 840,000 miles of rivers and 17.4 million acres of lakes; a 22 percent increase in river miles and a 4 percent increase in lake acres over the 1996 reports. Siltation, bacteria, nutrients, and metals were the leading pollutants of impaired waters, according to EPA. The sources of these pollutants were presumed to be runoff from agricultural lands and urban areas. EPA suggests that the majority of Americans, over 218 million, live within ten miles of a polluted waterbody. This seems to contradict recent proclamations of the success of the Clean Water Act, the Nation's water pollution control law. EPA also claims that, while water quality is still threatened in the US, the amount of water safe for fishing and swimming has doubled since 1972, and that the number of people served by sewage treatment plants has more than doubled.

    Vulnerability assessments of pesticide leaching to groundwater

    Pesticides may have adverse environmental effects if they are transported to groundwater and surface waters. The vulnerability of water resources to contamination by pesticides must therefore be evaluated. Different stakeholders, with different objectives and requirements, are interested in such vulnerability assessments. Various assessment methods have been developed in the past. For example, the vulnerability of groundwater to pesticide leaching may be evaluated by indices and overlay-based methods, by statistical analyses of monitoring data, or by using process-based models of pesticide fate. No single tool or methodology is likely to be appropriate for all end-users and stakeholders, since their suitability depends on the available data and the specific goals of the assessment. The overall purpose of this thesis was to develop tools, based on different process-based models of pesticide leaching, that may be used in groundwater vulnerability assessments. Four different tools have been developed for end-users with varying goals and interests: (i) a tool based on the attenuation factor implemented in a GIS, where vulnerability maps are generated for the islands of Hawaii (U.S.A.), (ii) a simulation tool based on the MACRO model developed to support decision-makers at local authorities in assessing potential risks of leaching of pesticides to groundwater following normal usage in drinking water abstraction districts, (iii) linked models of the soil root zone and groundwater to investigate leaching of the pesticide mecoprop to shallow and deep groundwater in fractured till, and (iv) a meta-model of the pesticide fate model MACRO developed for 'worst-case' groundwater vulnerability assessments in southern Sweden. The strengths and weaknesses of the different approaches are discussed.
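The attenuation-factor index mentioned in point (i) is commonly computed, following Rao et al. (1985), from the pesticide's travel time to a reference depth and its degradation half-life. A minimal sketch, with hypothetical soil and compound parameters (all values below are illustrative, not from the thesis):

```python
import math

def attenuation_factor(d, q, theta_fc, rho_b, f_oc, k_oc, t_half):
    """Attenuation factor (AF): estimated fraction of applied pesticide
    reaching depth d. Units must be consistent: d in m, recharge q in
    m/day, bulk density rho_b in kg/L, k_oc in L/kg, half-life in days."""
    rf = 1.0 + rho_b * f_oc * k_oc / theta_fc   # retardation factor
    travel_time = d * rf * theta_fc / q          # days to reach depth d
    return math.exp(-math.log(2) * travel_time / t_half)

# Example: a mobile compound vs. a strongly sorbed one (hypothetical values)
af_mobile = attenuation_factor(d=1.0, q=0.002, theta_fc=0.3, rho_b=1.4,
                               f_oc=0.01, k_oc=20.0, t_half=100.0)
af_sorbed = attenuation_factor(d=1.0, q=0.002, theta_fc=0.3, rho_b=1.4,
                               f_oc=0.01, k_oc=500.0, t_half=100.0)
```

Mapping such an index over gridded soil and climate data is what makes the GIS implementation in tool (i) possible: AF is cheap to evaluate per cell, unlike a full simulation model.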

    Water Quality Trading and Agricultural Nonpoint Source Pollution: An Analysis of the Effectiveness and Fairness of EPA's Policy on Water Quality Trading

    Water quality problems continue to plague our nation, even though Congress passed the Clean Water Act (CWA) to "restore and maintain the chemical, physical, and biological integrity of the Nation's waters" more than three decades ago. During the past thirty years, the dominant sources of water pollution have changed, requiring us to seek new approaches for cleaning up our waters. Water quality trading has been heralded as an approach that can integrate market mechanisms into the effort of cleaning up our water. This Article examines the Environmental Protection Agency's (EPA) policy on water quality trading and the prospects for water quality trading to help improve water quality. Part II briefly describes our water quality problems and causes. Part III examines the theoretical basis for trading and the EPA's Water Quality Trading Policy. Part IV discusses the potential impact of total maximum daily loads (TMDLs) on water quality trading, and Part V analyzes potential problems that water quality trading programs confront. Part VI addresses distributional and efficiency concerns that arise when considering trading and agricultural nonpoint source pollution. Part VII then examines issues relating to water quality trading and state laws before reaching conclusions and recommendations in Part VIII.

    Draft Regional Recommendations for the Pacific Northwest on Water Quality Trading

    In March 2013, water quality agency staff from Idaho, Oregon, and Washington, U.S. EPA Region 10, Willamette Partnership, and The Freshwater Trust convened a working group for the first of a series of four interagency workshops on water quality trading in the Pacific Northwest. Facilitated by Willamette Partnership through a USDA-NRCS Conservation Innovation Grant, those who assembled over the subsequent eight months discussed and evaluated water quality trading policies, practices, and programs across the country in an effort to better understand and draw from EPA's January 13, 2003, Water Quality Trading Policy and its 2007 Permit Writers' Toolkit, as well as existing state guidance and regulations on water quality trading. All documents presented at those conversations, along with meeting summaries, are posted on the Willamette Partnership's website. The final product is intended to be a set of recommended practices for each state to consider as it develops water quality trading. The goals of this effort are to help ensure that water quality trading programs have the quality, credibility, and transparency necessary to be consistent with the Clean Water Act (CWA), its implementing regulations, and state and local water quality laws.

    Water Quality Trading and Offset Initiatives in the U.S.: A Comprehensive Survey

    This document summarizes water quality trading and offset initiatives in the United States, including state-wide policies and recent proposals. A common format was used to present information on each program. We attempted to have each program summary reviewed by at least one contact person to ensure program accuracy. In cases where this review occurred, we added the statement "Reviewed by.." at the end of the case summary.

    A comparison of statistical and machine learning methods for creating national daily maps of ambient PM2.5 concentration

    A typical problem in air pollution epidemiology is exposure assessment for individuals for whom health data are available. Due to the sparsity of monitoring sites and the limited temporal frequency with which measurements of air pollutant concentrations are collected (for most pollutants, once every 3 or 6 days), epidemiologists have been moving away from characterizing ambient air pollution exposure solely using measurements. In the last few years, substantial research effort has been devoted to developing statistical methods or machine learning techniques to generate estimates of air pollution at finer spatial and temporal scales (daily, usually) with complete coverage. Some of these methods include: geostatistical techniques, such as kriging; spatial statistical models that use the information contained in air quality model outputs (statistical downscaling models); linear regression modeling approaches that leverage the information in GIS covariates (land use regression); and machine learning methods that mine the information contained in relevant variables (neural network and deep learning approaches). Although some of these exposure modeling approaches have been used in several air pollution epidemiological studies, it is not clear how much the predicted exposures generated by these methods differ, or which method generates more reliable estimates. In this paper, we aim to address this gap by evaluating a variety of exposure modeling approaches, comparing their predictive performance and computational difficulty. Using PM2.5 in 2011 over the continental U.S. as a case study, we examine the methods' performances across seasons, rural vs. urban settings, and levels of PM2.5 concentration (low, medium, high).
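Two of the families of approaches this abstract compares, geostatistical interpolation and land use regression, can be contrasted with a simple cross-validation loop of the kind such evaluations rely on. The sketch below uses simulated monitor data (not real PM2.5 measurements), and inverse-distance weighting stands in for kriging to keep the example dependency-free:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
# Synthetic "monitor" data: coordinates, one GIS covariate, PM2.5-like values
xy = rng.uniform(0, 100, (n, 2))
covariate = rng.normal(0, 1, n)          # e.g. a land-use variable
pm25 = 10 + 2 * covariate + 0.05 * xy[:, 0] + rng.normal(0, 1, n)

def idw_predict(train_xy, train_y, test_xy, power=2.0):
    """Inverse-distance weighting: a simple stand-in for kriging."""
    d = np.linalg.norm(train_xy[None, :, :] - test_xy[:, None, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-9) ** power
    return (w * train_y).sum(axis=1) / w.sum(axis=1)

def lur_predict(train_X, train_y, test_X):
    """Land-use-regression-style ordinary least squares on covariates."""
    A = np.column_stack([np.ones(len(train_X)), train_X])
    beta, *_ = np.linalg.lstsq(A, train_y, rcond=None)
    return np.column_stack([np.ones(len(test_X)), test_X]) @ beta

# 5-fold cross-validated RMSE for each approach
X = np.column_stack([covariate, xy])
rmse = {"idw": [], "lur": []}
for fold in np.array_split(rng.permutation(n), 5):
    test = np.zeros(n, bool)
    test[fold] = True
    pred_idw = idw_predict(xy[~test], pm25[~test], xy[test])
    pred_lur = lur_predict(X[~test], pm25[~test], X[test])
    rmse["idw"].append(np.sqrt(np.mean((pred_idw - pm25[test]) ** 2)))
    rmse["lur"].append(np.sqrt(np.mean((pred_lur - pm25[test]) ** 2)))
print({k: float(np.mean(v)) for k, v in rmse.items()})
```

Here the regression wins by construction, since the simulated concentrations are a linear function of the covariates; on real data the ranking depends on the spatial structure of the pollutant, which is precisely the empirical question the paper investigates.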