
    Reducing bias and quantifying uncertainty in watershed flux estimates: the R package loadflex

    Many ecological insights into the function of rivers and watersheds emerge from quantifying the flux of solutes or suspended materials in rivers. Numerous methods for flux estimation have been described, and each has its strengths and weaknesses. Currently, the largest practical challenges in flux estimation are to select among these methods and to implement or apply whichever method is chosen. To ease this process of method selection and application, we have written an R software package called loadflex that implements several of the most popular methods for flux estimation, including regressions, interpolations, and the special case of interpolation known as the period-weighted approach. Our package also implements a lesser-known and empirically promising approach called the “composite method,” to which we have added an algorithm for estimating prediction uncertainty. Here we describe the structure and key features of loadflex, with a special emphasis on the rationale and details of our composite method implementation. We then demonstrate the use of loadflex by fitting four different models to nitrate data from the Lamprey River in southeastern New Hampshire, where two large floods in 2006–2007 are hypothesized to have driven a long-term shift in nitrate concentrations and fluxes from the watershed. The models each give believable estimates, and yet they yield different answers for whether and how the floods altered nitrate loads. In general, the best modeling approach for each new dataset will depend on the specific site and solute of interest, and researchers need to make an informed choice among the many possible models. Our package addresses this need by making it simple to apply and compare multiple load estimation models, ultimately allowing researchers to estimate riverine concentrations and fluxes with greater ease and accuracy.
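    The composite method mentioned above combines a regression model with interpolation of the regression residuals. A minimal Python sketch of that idea, assuming a simple log-linear concentration-discharge relationship and hypothetical input arrays (illustrative only, not the loadflex implementation):

```python
import numpy as np

def composite_method(t_obs, conc_obs, q_obs, t_pred, q_pred):
    """Composite-method sketch: regression prediction plus an
    interpolated residual correction (log-log rating curve).
    t_obs must be sorted ascending for np.interp to be valid."""
    # 1. Fit a simple log-log concentration-discharge regression.
    X = np.column_stack([np.ones_like(q_obs, dtype=float), np.log(q_obs)])
    beta, *_ = np.linalg.lstsq(X, np.log(conc_obs), rcond=None)

    # 2. Residuals at the observation times.
    resid = np.log(conc_obs) - X @ beta

    # 3. Interpolate residuals to the prediction times and add them
    #    back to the regression predictions (the "composite" step).
    resid_interp = np.interp(t_pred, t_obs, resid)
    log_pred = beta[0] + beta[1] * np.log(q_pred) + resid_interp
    return np.exp(log_pred)  # concentration; flux = concentration * q_pred
```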

    Quantile-based bias correction and uncertainty quantification of extreme event attribution statements

    Extreme event attribution characterizes how anthropogenic climate change may have influenced the probability and magnitude of selected individual extreme weather and climate events. Attribution statements often involve quantification of the fraction of attributable risk (FAR) or the risk ratio (RR) and associated confidence intervals. Many such analyses use climate model output to characterize extreme event behavior with and without anthropogenic influence. However, such climate models may have biases in their representation of extreme events. To account for discrepancies in the probabilities of extreme events between observational datasets and model datasets, we demonstrate an appropriate rescaling of the model output based on the quantiles of the datasets to estimate an adjusted risk ratio. Our methodology accounts for various components of uncertainty in estimation of the risk ratio. In particular, we present an approach to construct a one-sided confidence interval on the lower bound of the risk ratio when the estimated risk ratio is infinite. We demonstrate the methodology using the summer 2011 central US heatwave and output from the Community Earth System Model. In this example, we find that the lower bound of the risk ratio is relatively insensitive to the magnitude and probability of the actual event.
    Comment: 28 pages, 4 figures, 3 tables.
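    The two ingredients, quantile-based rescaling and a risk ratio whose lower bound stays finite even when the point estimate is infinite, can be sketched as follows. A simple percentile bootstrap stands in for the paper's exact interval construction, and all arrays and thresholds are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def quantile_map(model_vals, obs_vals, x):
    """Map a threshold x from observation space into model space by
    matching empirical quantiles (simple quantile mapping)."""
    p = np.mean(obs_vals <= x)          # quantile of x among observations
    return np.quantile(model_vals, p)   # matching model-space value

def risk_ratio(factual, counterfactual, threshold):
    """RR = P(exceedance | forced run) / P(exceedance | natural run)."""
    p1 = np.mean(factual > threshold)
    p0 = np.mean(counterfactual > threshold)
    return np.inf if p0 == 0 else p1 / p0

def rr_lower_bound(factual, counterfactual, threshold, alpha=0.05, B=2000):
    """One-sided percentile-bootstrap lower bound on RR; typically
    finite even when no counterfactual sample exceeds the threshold."""
    rrs = []
    for _ in range(B):
        f = rng.choice(factual, size=factual.size, replace=True)
        c = rng.choice(counterfactual, size=counterfactual.size, replace=True)
        rrs.append(risk_ratio(f, c, threshold))
    return np.quantile(rrs, alpha)

# Usage idea: first move the observed event threshold into model space,
# e.g. model_threshold = quantile_map(model_current, obs, obs_threshold),
# then evaluate rr_lower_bound at that adjusted threshold.
```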

    Multivariate adaptive regression splines for estimating riverine constituent concentrations

    Regression-based methods are commonly used for riverine constituent concentration/flux estimation, which is essential for guiding water quality protection practices and environmental decision making. This paper develops a multivariate adaptive regression splines model for estimating riverine constituent concentrations (MARS-EC). The process, interpretability, and flexibility of the MARS-EC modelling approach were demonstrated for total nitrogen in the Patuxent River, a major river input to Chesapeake Bay. Model accuracy and uncertainty of the MARS-EC approach were further analysed using nitrate plus nitrite datasets from eight tributary rivers to Chesapeake Bay. Results showed that the MARS-EC approach integrated the advantages of both parametric and nonparametric regression methods, and model accuracy was demonstrated to be superior to the traditionally used ESTIMATOR model. MARS-EC is flexible and allows consideration of auxiliary variables; the variables and interactions can be selected automatically. MARS-EC does not constrain concentration-predictor curves to be constant but rather is able to identify shifts in these curves from mathematical expressions and visual graphics. The MARS-EC approach provides an effective and complementary tool along with existing approaches for estimating riverine constituent concentrations.
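    MARS models are built from paired hinge functions max(0, x − k) and max(0, k − x). A hand-rolled illustration with fixed knots shows the basis construction; real MARS, including MARS-EC, selects knots and interaction terms adaptively and prunes them by generalized cross-validation. The data below are synthetic:

```python
import numpy as np

def hinge(x, knot):
    """MARS basis pair: max(0, x - knot) and max(0, knot - x)."""
    return np.maximum(0.0, x - knot), np.maximum(0.0, knot - x)

def fit_mars_like(x, y, knots):
    """Ordinary least squares on hinge features at fixed knots; a
    stand-in for MARS's adaptive forward selection and pruning."""
    cols = [np.ones_like(x)]
    for k in knots:
        h1, h2 = hinge(x, k)
        cols.extend([h1, h2])
    X = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return X @ beta, beta

# Toy usage: concentration vs. log-discharge with a slope shift at x = 5,
# the kind of curve shift MARS-EC can identify automatically.
x = np.linspace(0.0, 10.0, 200)
y = np.where(x < 5, 1.0 + 0.2 * x, 2.0 + 0.4 * (x - 5)) \
    + np.random.default_rng(1).normal(0.0, 0.1, x.size)
yhat, _ = fit_mars_like(x, y, knots=[3.0, 5.0, 7.0])
```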

    Treatment of bimodality in proficiency test of pH in bioethanol matrix

    The pH value in bioethanol is a quality control parameter related to its acidity and to the corrosiveness of vehicle engines when it is used as fuel. In order to verify the comparability and reliability of pH measurements in a bioethanol matrix among experienced chemical laboratories, a bioethanol reference material (RM) developed by Inmetro, the Brazilian National Metrology Institute, was used in a proficiency testing (PT) scheme. A difference of more than one unit in the measured pH value arose from the type of internal filling electrolyte solution (potassium chloride, KCl, or lithium chloride, LiCl) in the commercial pH combination electrodes used by the participating laboratories. Consequently, a bimodal distribution occurred in the data of this PT scheme. This work presents the possibilities that a PT scheme provider can use to overcome the bimodality problem. Data from the PT of pH in bioethanol were treated by two different statistical approaches: a kernel density model and a mixture of distributions. Application of these statistical treatments improved the PT provider's initial diagnosis, solving the bimodality problem and contributing to a better performance evaluation in measuring the pH of bioethanol.
    Comment: 20 pages, 6 figures. Accepted for publication in Accreditation and Quality Assurance (ACQUAL).
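    Both statistical treatments named above are standard and easy to prototype: a kernel density estimate reveals the two modes, and a two-component mixture assigns each laboratory to a mode so each subpopulation can be scored separately. A sketch with scipy and scikit-learn on hypothetical pH results (illustrative values, not the PT data):

```python
import numpy as np
from scipy.stats import gaussian_kde
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
# Hypothetical PT results: two electrode populations (e.g. KCl vs. LiCl).
ph = np.concatenate([rng.normal(6.2, 0.1, 15), rng.normal(7.3, 0.1, 12)])

# Treatment 1: kernel density model; two peaks flag bimodality.
kde = gaussian_kde(ph)
grid = np.linspace(ph.min() - 0.5, ph.max() + 0.5, 500)
density = kde(grid)

# Treatment 2: two-component Gaussian mixture; each component gives an
# assigned value and standard deviation for per-mode performance scoring.
gm = GaussianMixture(n_components=2, random_state=0).fit(ph.reshape(-1, 1))
means = gm.means_.ravel()                    # per-mode assigned values
sds = np.sqrt(gm.covariances_.ravel())       # per-mode spreads
labels = gm.predict(ph.reshape(-1, 1))       # which mode each lab belongs to
```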

    Rapid Computation of Thermodynamic Properties Over Multidimensional Nonbonded Parameter Spaces using Adaptive Multistate Reweighting

    We show how thermodynamic properties of molecular models can be computed over a large, multidimensional parameter space by combining multistate reweighting analysis with a linear basis function approach. This approach reduces the computational cost of estimating thermodynamic properties from molecular simulations for over 130,000 tested parameter combinations from over a thousand CPU years to tens of CPU days. This speed increase is achieved primarily by computing the potential energy as a linear combination of basis functions, computed from either modified simulation code or as the difference of energy between two reference states, which can be done without any simulation code modification. The thermodynamic properties are then estimated with the Multistate Bennett Acceptance Ratio (MBAR) as a function of multiple model parameters without the need to define a priori how the states are connected by a pathway. Instead, we adaptively sample a set of points in parameter space to create mutual configuration space overlap. Regions of poor configuration space overlap are detected by analyzing the eigenvalues of the sampled states' overlap matrix. The configuration space overlap with the sampled states is monitored alongside the mean and maximum uncertainty to determine convergence, as neither the uncertainty nor the configuration space overlap alone is a sufficient metric of convergence. This adaptive sampling scheme is demonstrated by estimating with high precision the solvation free energies of charged particles of Lennard-Jones plus Coulomb functional form. We also compute the entropy, enthalpy, and radial distribution functions of unsampled parameter combinations using only the data from the sampled states, and we use the free energy estimates to examine the deviation of simulations from the Born approximation to the solvation free energy.
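    The cost saving rests on writing the potential as a linear combination of basis functions, U(x; λ) = Σᵢ λᵢ fᵢ(x), so the energy at any untested parameter combination is a dot product over stored per-sample basis values. A minimal sketch, with single-state exponential (Zwanzig) reweighting standing in for the full MBAR estimator the paper uses; array shapes and the basis choice are assumptions:

```python
import numpy as np

def u_of_lambda(basis_energies, lam):
    """Energy at parameters lam as a linear combination of precomputed
    basis-function values f_i(x_n); basis_energies has shape (n_basis, N).
    For Lennard-Jones plus Coulomb, the basis might be the r^-12, r^-6,
    and 1/r sums. No re-simulation is needed for a new lam."""
    return lam @ basis_energies  # shape (N,)

def reweight_free_energy(basis_energies, lam_sampled, lam_target, beta=1.0):
    """Dimensionless free energy difference (units of kT) from the sampled
    state to a target state via dF = -ln <exp(-beta * dU)>. MBAR pools
    all sampled states using these same ingredients."""
    du = beta * (u_of_lambda(basis_energies, lam_target)
                 - u_of_lambda(basis_energies, lam_sampled))
    shift = du.min()  # max-exponent shift for numerical stability
    return shift - np.log(np.mean(np.exp(-(du - shift))))
```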

    A review of applied methods in Europe for flood-frequency analysis in a changing environment

    The report presents a review of methods used in Europe for trend analysis, climate change projections and non-stationary analysis of extreme precipitation and flood frequency. In addition, main findings of the analyses are presented, including a comparison of trend analysis results and climate change projections. Existing guidelines in Europe on design flood and design rainfall estimation that incorporate climate change are reviewed. The report concludes with a discussion of research needs on non-stationary frequency analysis for considering the effects of climate change and inclusion in design guidelines.

    Trend analyses are reported for 21 countries in Europe with results for extreme precipitation, extreme streamflow or both. A large number of national and regional trend studies have been carried out. Most studies are based on statistical methods applied to individual time series of extreme precipitation or extreme streamflow using the non-parametric Mann-Kendall trend test or regression analysis. Some studies have been reported that use field significance or regional consistency tests to analyse trends over larger areas, and some also include analysis of trend attribution. The studies reviewed indicate that there is some evidence of a general increase in extreme precipitation, whereas there are no clear indications of significant increasing trends at regional or national level in extreme streamflow. For some smaller regions, increases in extreme streamflow are reported. Several studies from regions dominated by snowmelt-induced peak flows report decreases in extreme streamflow and earlier spring snowmelt peak flows.

    Climate change projections have been reported for 14 countries in Europe with results for extreme precipitation, extreme streamflow or both. The review shows various approaches for producing climate projections of extreme precipitation and flood frequency based on alternative climate forcing scenarios, climate projections from available global and regional climate models, methods for statistical downscaling and bias correction, and alternative hydrological models. A large number of the reported studies are based on an ensemble modelling approach that uses several climate forcing scenarios and climate model projections in order to address the uncertainty in the projections of extreme precipitation and flood frequency. Some studies also include alternative statistical downscaling and bias correction methods and hydrological modelling approaches. Most studies reviewed indicate an increase in extreme precipitation under a future climate, which is consistent with the observed trend in extreme precipitation. Hydrological projections of peak flows and flood frequency show both positive and negative changes. Large increases in peak flows are reported for some catchments with rainfall-dominated peak flows, whereas a general decrease in flood magnitude and earlier spring floods are reported for catchments with snowmelt-dominated peak flows. The latter is consistent with the observed trends.

    The review of existing guidelines in Europe on design floods and design rainfalls shows that only a few countries explicitly address climate change. These guidelines are based on climate change adjustment factors to be applied to current design estimates, and the factors may depend on the design return period and projection horizon. The review indicates a gap between the need for considering climate change impacts in design and the published guidelines that actually incorporate climate change in extreme precipitation and flood frequency. Most of the studies reported are based on frequency analysis assuming stationary conditions within a certain time window (typically 30 years) representing the current or future climate. There is a need to develop more consistent non-stationary frequency analysis methods that can account for the transient nature of a changing climate.
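    The Mann-Kendall test that dominates these trend studies is short enough to state in full: the S statistic counts sign agreements among all pairs of observations and is compared against its variance under the no-trend null. A minimal version without the tie correction used in production implementations:

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Nonparametric Mann-Kendall trend test (no tie correction).
    Returns the S statistic, standardized Z, and two-sided p-value."""
    x = np.asarray(x, dtype=float)
    n = x.size
    # S = sum over all pairs i < j of sign(x[j] - x[i]).
    s = sum(np.sign(x[j] - x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0  # null variance, no ties
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)  # continuity correction
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - norm.cdf(abs(z)))
    return s, z, p

# Usage: s, z, p = mann_kendall(annual_maximum_flows)
```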

    Probabilistic methods for seasonal forecasting in a changing climate: Cox-type regression models

    For climate risk management, cumulative distribution functions (CDFs) are an important source of information. They are ideally suited to compare probabilistic forecasts of primary (e.g. rainfall) or secondary data (e.g. crop yields). Summarised as CDFs, such forecasts allow an easy quantitative assessment of possible alternative actions. Although the degree of uncertainty associated with CDF estimation could influence decisions, such information is rarely provided. Hence, we propose Cox-type regression models (CRMs) as a statistical framework for making inferences on CDFs in climate science. CRMs were designed for modelling probability distributions rather than just mean or median values. This makes the approach appealing for risk assessments where probabilities of extremes are often more informative than central tendency measures. CRMs are semi-parametric approaches originally designed for modelling risks arising from time-to-event data. Here we extend this original concept to other positive variables of interest beyond the time domain. We also provide tools for estimating CDFs and surrounding uncertainty envelopes from empirical data. These statistical techniques intrinsically account for non-stationarities in time series that might be the result of climate change. This feature makes CRMs attractive candidates to investigate the feasibility of developing rigorous global circulation model (GCM)-CRM interfaces for provision of user-relevant forecasts. To demonstrate the applicability of CRMs, we present two examples for El Niño/Southern Oscillation (ENSO)-based forecasts: the onset date of the wet season (Cairns, Australia) and total wet season rainfall (Quixeramobim, Brazil). This study emphasises the methodological aspects of CRMs rather than discussing merits or limitations of the ENSO-based predictor.
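    A Cox-type model applied to a positive non-time variable can be prototyped with the lifelines library, assuming it is installed; the toy data and column names below are hypothetical, with total wet-season rainfall playing the role of the "duration" and an ENSO indicator as the covariate, in the spirit of (but not identical to) the paper's approach. The forecast CDF is one minus the predicted survival function:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical records: total wet-season rainfall (mm) as the positive
# "duration" variable, with a binary ENSO-phase covariate.
df = pd.DataFrame({
    "rainfall": [310.0, 420.0, 255.0, 510.0, 180.0, 390.0],
    "enso":     [1, 0, 1, 0, 1, 0],  # 1 = El Nino year (illustrative)
})
df["observed"] = 1  # all rainfall totals fully observed (no censoring)

cph = CoxPHFitter()
cph.fit(df, duration_col="rainfall", event_col="observed")

# Predicted survival S(r | enso); the conditional forecast CDF is 1 - S.
new = pd.DataFrame({"enso": [0, 1]})
surv = cph.predict_survival_function(new)
cdf = 1.0 - surv  # one CDF column per covariate row
```

    An uncertainty envelope around each CDF could then be obtained by refitting on bootstrap resamples of the rows, in keeping with the semi-parametric character of the model.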