
    Understanding predictive uncertainty in hydrologic modeling: The challenge of identifying input and structural errors

    Meaningful quantification of data and structural uncertainties in conceptual rainfall-runoff modeling is a major scientific and engineering challenge. This paper focuses on the total predictive uncertainty and its decomposition into input and structural components under different inference scenarios. Several Bayesian inference schemes are investigated, differing in the treatment of rainfall and structural uncertainties and in the precision of the priors describing rainfall uncertainty. Compared with traditional lumped additive error approaches, the quantification of the total predictive uncertainty in the runoff is improved when rainfall and/or structural errors are characterized explicitly. However, the decomposition of the total uncertainty into individual sources is more challenging. In particular, poor identifiability may arise when the inference scheme represents rainfall and structural errors using separate probabilistic models. The inference becomes ill-posed unless sufficiently precise prior knowledge of data uncertainty is supplied; this ill-posedness can often be detected from the behavior of the Monte Carlo sampling algorithm. Moreover, the priors on the data quality must also be sufficiently accurate if the inference is to be reliable and support meaningful uncertainty decomposition. Our findings highlight the inherent limitations of inferring inaccurate hydrologic models using rainfall-runoff data with large unknown errors. Bayesian total error analysis can overcome these problems using independent prior information. The need for deriving independent descriptions of the uncertainties in the input and output data is clearly demonstrated.
    Benjamin Renard, Dmitri Kavetski, George Kuczera, Mark Thyer, and Stewart W. Franks
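The contrast between the lumped additive treatment and the explicit characterization of rainfall errors can be sketched with a toy model. Everything below (the single linear reservoir, the error magnitudes, and the function names) is an illustrative assumption, not the paper's actual model or inference setup:

```python
import numpy as np

def reservoir(rain, k=0.5, s0=1.0):
    # Toy linear reservoir: add rain, then release a fraction k each step
    s, flows = s0, []
    for p in rain:
        s = s + p
        q = k * s
        s = s - q
        flows.append(q)
    return np.array(flows)

rng = np.random.default_rng(0)
true_rain = rng.gamma(2.0, 1.0, size=50)
obs_rain = true_rain * rng.lognormal(0.0, 0.2, size=50)   # corrupted input
obs_flow = reservoir(true_rain) + rng.normal(0.0, 0.05, size=50)

def lumped_log_lik(k, sigma):
    """Traditional lumped approach: every error source is folded into a
    single additive Gaussian residual term."""
    res = obs_flow - reservoir(obs_rain, k)
    return -0.5 * np.sum((res / sigma) ** 2) - len(res) * np.log(sigma)

def explicit_log_post(k, sigma, log_mult, prior_sd=0.2):
    """Explicit scheme: latent rainfall multipliers with their own prior.
    With a vague prior_sd the multipliers trade off against the residual
    term, and the inference becomes ill-posed as discussed above."""
    res = obs_flow - reservoir(obs_rain * np.exp(log_mult), k)
    log_lik = -0.5 * np.sum((res / sigma) ** 2) - len(res) * np.log(sigma)
    log_prior = -0.5 * np.sum((log_mult / prior_sd) ** 2)
    return log_lik + log_prior
```

The explicit scheme has 50 extra latent variables; only a sufficiently precise `prior_sd` keeps them identifiable alongside the residual variance.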

    A limited memory acceleration strategy for MCMC sampling in hierarchical Bayesian calibration of hydrological models

    Hydrological calibration and prediction using conceptual models is affected by forcing/response data uncertainty and structural model error. The Bayesian Total Error Analysis methodology uses a hierarchical representation of individual sources of uncertainty. However, it is shown that standard multiblock "Metropolis-within-Gibbs" Markov chain Monte Carlo (MCMC) samplers commonly used in Bayesian hierarchical inference are exceedingly computationally expensive when applied to hydrologic models, which use recursive numerical solutions of coupled nonlinear differential equations to describe the evolution of catchment states such as soil and groundwater storages. This note develops a "limited-memory" algorithm for accelerating multiblock MCMC sampling from the posterior distributions of such models using low-dimensional jump distributions. The new algorithm exploits the decaying memory of hydrological systems to provide accurate tolerance-based approximations of traditional "full-memory" MCMC methods and is orders of magnitude more efficient than the latter.
    George Kuczera, Dmitri Kavetski, Benjamin Renard and Mark Thyer
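The decaying-memory idea can be illustrated with a minimal sketch. The linear store, window length, and one-block perturbation below are assumed purely for illustration; the actual algorithm applies tolerance-based windows to full hydrological models inside a multiblock sampler:

```python
import numpy as np

K = 0.3   # storage decay rate: the system's memory fades like (1 - K)**n

def full_run(rain, s0=0.0):
    """Full-memory simulation: every state recomputed from t = 0."""
    s, states = s0, np.empty(len(rain))
    for t, p in enumerate(rain):
        s = (1 - K) * s + p
        states[t] = s
    return states

rng = np.random.default_rng(1)
rain = rng.gamma(2.0, 1.0, 200)
cached = full_run(rain)          # trajectory under the current MCMC state

def limited_memory_run(new_rain, t, window=40):
    """Re-simulate only [t, t + window) after a proposal perturbs the
    input block at time t; states beyond the window are approximated by
    the cached trajectory, with error decaying like (1 - K)**window."""
    states = cached.copy()
    s = cached[t - 1] if t > 0 else 0.0
    for u in range(t, min(t + window, len(new_rain))):
        s = (1 - K) * s + new_rain[u]
        states[u] = s
    return states

proposal = rain.copy()
proposal[100] += 1.0                         # perturb one latent input
approx = limited_memory_run(proposal, 100)
exact = full_run(proposal)
```

With a 200-step record and low-dimensional blocks, each proposal costs a 40-step window instead of a full re-run, which is where the order-of-magnitude savings come from.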

    Ancient numerical daemons of conceptual hydrological modeling 2. Impact of time stepping schemes on model analysis and prediction

    Despite the widespread use of conceptual hydrological models in environmental research and operations, they remain frequently implemented using numerically unreliable methods. This paper considers the impact of the time stepping scheme on model analysis (sensitivity analysis, parameter optimization, and Markov chain Monte Carlo-based uncertainty estimation) and prediction. It builds on the companion paper (Clark and Kavetski, 2010), which focused on numerical accuracy, fidelity, and computational efficiency. Empirical and theoretical analysis of eight distinct time stepping schemes for six different hydrological models in 13 diverse basins demonstrates several critical conclusions. (1) Unreliable time stepping schemes, in particular, fixed-step explicit methods, suffer from troublesome numerical artifacts that severely deform the objective function of the model. These deformations are not rare isolated instances but can arise in any model structure, in any catchment, and under common hydroclimatic conditions. (2) Sensitivity analysis can be severely contaminated by numerical errors, often to the extent that it becomes dominated by the sensitivity of truncation errors rather than the model equations. (3) Robust time stepping schemes generally produce "better behaved" objective functions, free of spurious local optima, and with sufficient numerical continuity to permit parameter optimization using efficient quasi Newton methods. When implemented within a multistart framework, modern Newton-type optimizers are robust even when started far from the optima and provide valuable diagnostic insights not directly available from evolutionary global optimizers. (4) Unreliable time stepping schemes lead to inconsistent and biased inferences of the model parameters and internal states. 
    (5) Even when interactions between hydrological parameters and numerical errors provide "the right result for the wrong reason" and the calibrated model performance appears adequate, unreliable time stepping schemes make the model unnecessarily fragile in predictive mode, undermining validation assessments and operational use. Erroneous or misleading conclusions of model analysis and prediction arising from numerical artifacts in hydrological models are intolerable, especially given that robust numerics are accepted as mainstream in other areas of science and engineering. We hope that the vivid empirical findings will encourage the conceptual hydrological community to close its Pandora's box of numerical problems, paving the way for more meaningful model application and interpretation. Copyright 2010 by the American Geophysical Union.
    Dmitri Kavetski and Martyn P. Clark
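A minimal sketch of why fixed-step explicit schemes misbehave, using the classic test equation dS/dt = -k*S as an illustrative stand-in for a model's storage dynamics (not one of the paper's six models). The explicit update amplifies the state whenever k*dt > 2, and in calibration such blow-ups and oscillations are exactly what deform the objective function:

```python
import numpy as np

def explicit_euler(k, dt, s0, n):
    """Fixed-step explicit Euler for dS/dt = -k*S.
    The update s -> (1 - k*dt)*s oscillates and diverges when k*dt > 2."""
    s = np.empty(n + 1)
    s[0] = s0
    for i in range(n):
        s[i + 1] = s[i] - dt * k * s[i]
    return s

def implicit_euler(k, dt, s0, n):
    """Implicit (backward) Euler: unconditionally stable for this ODE."""
    s = np.empty(n + 1)
    s[0] = s0
    for i in range(n):
        s[i + 1] = s[i] / (1.0 + dt * k)
    return s

k, dt, s0, n = 3.0, 1.0, 1.0, 20     # k*dt = 3: explicit scheme is unstable
unstable = explicit_euler(k, dt, s0, n)
stable = implicit_euler(k, dt, s0, n)
```

Because k is a calibrated parameter, the stability boundary k*dt = 2 cuts through parameter space, producing the spurious local optima and truncation-error-dominated sensitivities described above.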

    Critical evaluation of parameter consistency and predictive uncertainty in hydrological modeling: A case study using Bayesian total error analysis

    The lack of a robust framework for quantifying the parametric and predictive uncertainty of conceptual rainfall-runoff (CRR) models remains a key challenge in hydrology. The Bayesian total error analysis (BATEA) methodology provides a comprehensive framework to hypothesize, infer, and evaluate probability models describing input, output, and model structural error. This paper assesses the ability of BATEA and standard calibration approaches (standard least squares (SLS) and weighted least squares (WLS)) to address two key requirements of uncertainty assessment: (1) reliable quantification of predictive uncertainty and (2) reliable estimation of parameter uncertainty. The case study presents a challenging calibration of the lumped GR4J model to a catchment with ephemeral responses and large rainfall gradients. Postcalibration diagnostics, including checks of predictive distributions using quantile-quantile analysis, suggest that while still far from perfect, BATEA satisfied its assumed probability models better than SLS and WLS. In addition, WLS/SLS parameter estimates were highly dependent on the selected rain gauge and calibration period. This dependence will obscure potential relationships between CRR parameters and catchment attributes and prevent the development of meaningful regional relationships. Conversely, BATEA provided consistent, albeit more uncertain, parameter estimates and thus overcomes one of the obstacles to parameter regionalization. However, significant departures from the calibration assumptions remained even in BATEA, e.g., systematic overestimation of predictive uncertainty, especially in validation. This is likely due to the inferred rainfall errors compensating for the simplified treatment of model structural error.
    Mark Thyer, Benjamin Renard, Dmitri Kavetski, George Kuczera, Stewart William Franks and Sri Srikanthan
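The quantile-quantile check of predictive distributions mentioned above can be sketched via the probability integral transform. The synthetic Gaussian predictions below are assumed purely for illustration (the study applies the check to GR4J streamflow predictions):

```python
import math
import numpy as np

rng = np.random.default_rng(2)
obs = rng.normal(0.0, 1.0, 500)          # stand-in for observed flows
pred_mean = np.zeros(500)

def pit_values(obs, mean, sd):
    """Probability integral transform: the predictive CDF evaluated at
    each observation; uniform on [0, 1] iff predictions are reliable."""
    z = (obs - mean) / sd
    return np.array([0.5 * (1.0 + math.erf(v / math.sqrt(2.0))) for v in z])

def qq_discrepancy(p):
    """Largest gap between sorted PIT values and the uniform quantiles
    they should follow on a quantile-quantile plot."""
    n = len(p)
    return np.max(np.abs(np.sort(p) - (np.arange(1, n + 1) - 0.5) / n))

good = qq_discrepancy(pit_values(obs, pred_mean, np.ones(500)))
over = qq_discrepancy(pit_values(obs, pred_mean, np.full(500, 3.0)))
# Overestimated predictive sd concentrates PIT values near 0.5, giving
# the bowed quantile-quantile pattern symptomatic of overestimated
# predictive uncertainty described above.
```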

    Elements of a flexible approach for conceptual hydrological modeling: 2. Application and experimental insights

    In this article's companion paper, flexible approaches for conceptual hydrological modeling at the catchment scale were motivated, and the SUPERFLEX framework, based on generic model components, was introduced. In this article, the SUPERFLEX framework and the "fixed structure" GR4H model (an hourly version of the popular GR4J model) are applied to four hydrologically distinct experimental catchments in Europe and New Zealand. The estimated models are scrutinized using several diagnostic measures, ranging from statistical metrics, such as the statistical reliability and precision of the predictive distribution of streamflow, to more process-oriented diagnostics based on flow-duration curves and the correspondence between model states and groundwater piezometers. Model performance was clearly catchment specific, with a single fixed structure unable to accommodate intercatchment differences in hydrological behavior, including seasonality and thresholds. This highlights an important limitation of any "fixed" model structure. In the experimental catchments, the ability of competing model hypotheses to reproduce hydrological signatures of interest could be interpreted on the basis of independent fieldwork insights. The potential of flexible frameworks such as SUPERFLEX is then examined with respect to systematic and stringent hypothesis testing in hydrological modeling, for characterizing catchment diversity, and, more generally, for aiding progress toward a more unified formulation of hydrological theory at the catchment scale. When interpreted in physical process-oriented terms, the flexible approach can also serve as a language for dialogue between modeler and experimentalist, facilitating the understanding, representation, and interpretation of catchment behavior.
    Dmitri Kavetski and Fabrizio Fenicia
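The generic-component idea behind a flexible framework can be sketched in a few lines. The linear reservoir element and series routing below are illustrative assumptions, not the actual SUPERFLEX component library; the point is that swapping the element list swaps the model hypothesis:

```python
import numpy as np

def linear_reservoir(k):
    """Generic storage component: returns a state-transition function."""
    def step(state, inflow):
        state = state + inflow
        outflow = k * state
        return state - outflow, outflow
    return step

def run_structure(elements, rain):
    """Route rainfall through components connected in series; changing
    the element list changes the model structure being tested."""
    states = [0.0] * len(elements)
    flows = []
    for p in rain:
        q = p
        for i, element in enumerate(elements):
            states[i], q = element(states[i], q)
        flows.append(q)
    return np.array(flows)

rain = np.array([5.0, 0.0, 0.0, 2.0, 0.0])
one_bucket = run_structure([linear_reservoir(0.5)], rain)
two_bucket = run_structure([linear_reservoir(0.5), linear_reservoir(0.2)], rain)
```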

    Toward a reliable decomposition of predictive uncertainty in hydrological modeling: Characterizing rainfall errors using conditional simulation

    This study explores the decomposition of predictive uncertainty in hydrological modeling into its contributing sources. This is pursued by developing data-based probability models describing uncertainties in rainfall and runoff data and incorporating them into the Bayesian total error analysis methodology (BATEA). A case study based on the Yzeron catchment (France) and the conceptual rainfall-runoff model GR4J is presented. It exploits a calibration period where dense rain gauge data are available to characterize the uncertainty in the catchment average rainfall using geostatistical conditional simulation. The inclusion of information about rainfall and runoff data uncertainties overcomes ill-posedness problems and enables simultaneous estimation of forcing and structural errors as part of the Bayesian inference. This yields more reliable predictions than approaches that ignore or lump different sources of uncertainty in a simplistic way (e.g., standard least squares). It is shown that independently derived data quality estimates are needed to decompose the total uncertainty in the runoff predictions into the individual contributions of rainfall, runoff, and structural errors. In this case study, the total predictive uncertainty appears dominated by structural errors. Although further research is needed to interpret and verify this decomposition, it can provide strategic guidance for investments in environmental data collection and/or modeling improvement. More generally, this study demonstrates the power of the Bayesian paradigm to improve the reliability of environmental modeling using independent estimates of sampling and instrumental data uncertainties.
    Benjamin Renard, Dmitri Kavetski, Etienne Leblois, Mark Thyer, George Kuczera, Stewart W. Franks
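A minimal sketch of Gaussian conditional simulation, the building block of the geostatistical rainfall characterization described above. The gauge layout, the exponential covariance model, and the observed value are all hypothetical:

```python
import numpy as np

# Hypothetical setup: one rain gauge and two unobserved locations, with
# an assumed exponential correlation model (3 km length scale)
dists = np.array([[0.0, 2.0, 5.0],
                  [2.0, 0.0, 4.0],
                  [5.0, 4.0, 0.0]])
cov = np.exp(-dists / 3.0)

gauge, targets = [0], [1, 2]
gauge_obs = np.array([1.5])               # observed (transformed) rainfall

# Condition the Gaussian field on the gauge observation
c11 = cov[np.ix_(gauge, gauge)]
c21 = cov[np.ix_(targets, gauge)]
c22 = cov[np.ix_(targets, targets)]
weights = c21 @ np.linalg.inv(c11)        # simple kriging weights
cond_mean = weights @ gauge_obs
cond_cov = c22 - weights @ c21.T

# Conditional simulation: an ensemble of equally likely rainfall fields,
# all honouring the gauge value, yielding a distribution (not a single
# estimate) of the catchment average rainfall
rng = np.random.default_rng(3)
chol = np.linalg.cholesky(cond_cov)
sims = cond_mean[:, None] + chol @ rng.normal(size=(2, 1000))
areal_rain = sims.mean(axis=0)
```

The spread of `areal_rain` is exactly the independently derived input-uncertainty description that the study feeds into BATEA.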

    Impact of temporal data resolution on parameter inference and model identification in conceptual hydrological modeling: insights from an experimental catchment

    This study presents quantitative and qualitative insights into the time scale dependencies of hydrological parameters, predictions and their uncertainties, and examines the impact of the time resolution of the calibration data on the identifiable system complexity. Data from an experimental basin (Weierbach, Luxembourg) are used to analyze four conceptual models of varying complexity, over time scales of 30 min to 3 days, using several combinations of numerical implementations and inference equations. Large spurious time scale trends arise in the parameter estimates when unreliable time-stepping approximations are employed and/or when the heteroscedasticity of the model residual errors is ignored. Conversely, the use of robust numerics and more adequate (albeit still clearly imperfect) likelihood functions markedly stabilizes and, in many cases, reduces the time scale dependencies and improves the identifiability of increasingly complex model structures. Parameters describing slow flow remained essentially constant over the range of subhourly to daily scales considered here, while parameters describing quick flow converged toward increasingly precise and stable estimates as the data resolution approached the characteristic time scale of these faster processes. These results are consistent with theoretical expectations based on numerical error analysis and data-averaging considerations. Additional diagnostics confirmed the improved ability of the more complex models to reproduce distinct signatures in the observed data. More broadly, this study provides insights into the information content of hydrological data and, by advocating careful attention to robust numerico-statistical analysis and stringent process-oriented diagnostics, furthers the utilization of dense-resolution data and experimental insights to advance hypothesis-based hydrological modeling at the catchment scale.
    Dmitri Kavetski, Fabrizio Fenicia and Martyn P. Clark
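The role of an adequate heteroscedastic likelihood can be sketched as follows. The error model sd = a + b*q and all numbers below are illustrative assumptions, not the study's actual inference equations:

```python
import numpy as np

rng = np.random.default_rng(4)
sim = rng.gamma(2.0, 2.0, 300)                 # simulated flows
obs = sim + rng.normal(0.0, 0.05 + 0.1 * sim)  # errors grow with flow

def homoscedastic_ll(sigma):
    """Constant-variance residual model: the inadequate assumption that
    contributes to spurious time scale trends in parameter estimates."""
    res = obs - sim
    return np.sum(-0.5 * (res / sigma) ** 2 - np.log(sigma))

def heteroscedastic_ll(a, b):
    """Residual sd modeled as a + b*sim, matching how streamflow errors
    actually scale with flow magnitude."""
    sd = a + b * sim
    res = obs - sim
    return np.sum(-0.5 * (res / sd) ** 2 - np.log(sd))

sigma_mle = np.sqrt(np.mean((obs - sim) ** 2))  # best constant residual sd
```

Even the best constant-variance fit scores worse than the error model that matches the data-generating process, which is why ignoring heteroscedasticity distorts the inference as the data resolution changes.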

    Hydrogenation of CO2 by a Bifunctional PC(sp3)P Iridium(III) Pincer Complex Equipped with Tertiary Amine as a Functional Group

    Reversible hydrogen storage in the form of stable and mostly harmless chemical substances such as formic acid (FA) is a cornerstone of a fossil-fuel-free economy. In the past, we have reported a primary amine-functionalized bifunctional iridium(III)-PC(sp3)P pincer complex as a mild and chemoselective catalyst for the additive-free decomposition of neat formic acid. In this manuscript, we report on the successful application of a redesigned complex bearing a tertiary amine functionality as a catalyst for the mild hydrogenation of CO2 to formic acid. The catalyst demonstrates a TON of up to 6×10⁴ and a TOF of up to 1.7×10⁴ h⁻¹. In addition to the practical value of the catalyst, experimental and computational mechanistic studies provide the rationale for the design of improved next-generation catalysts.
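For readers unfamiliar with the metrics, TON and TOF reduce to simple ratios. The amounts below are hypothetical, chosen only to reproduce the order of magnitude reported in the abstract:

```python
# Illustrative arithmetic only: these quantities are not from the paper.
mol_catalyst = 1.0e-6      # mol of the Ir pincer complex
mol_product = 0.06         # mol of formic acid formed from CO2
hours = 3.5                # reaction time

ton = mol_product / mol_catalyst   # turnover number: mol FA per mol Ir
tof = ton / hours                  # turnover frequency, in h^-1
```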