    Multi-source data assimilation for physically based hydrological modeling of an experimental hillslope

    Data assimilation has recently been the focus of much attention for integrated surface–subsurface hydrological models, for which joint assimilation of water table, soil moisture, and river discharge measurements with the ensemble Kalman filter (EnKF) has been extensively applied. Although the EnKF has been specifically developed to deal with nonlinear models, integrated hydrological models based on the Richards equation still represent a challenge, due to strong nonlinearities that may significantly affect the filter performance. Thus, more studies are needed to investigate the capabilities of the EnKF to correct the system state and identify parameters in cases where the unsaturated zone dynamics are dominant, as well as to quantify possible tradeoffs associated with assimilation of multi-source data. Here, the CATHY (CATchment HYdrology) model is applied to reproduce the hydrological dynamics observed in an experimental two-layered hillslope, equipped with tensiometers, water content reflectometer probes, and tipping bucket flow gages to monitor the hillslope response to a series of artificial rainfall events. Pressure head, soil moisture, and subsurface outflow are assimilated with the EnKF in a number of scenarios, and the challenges and issues arising from the assimilation of multi-source data in this real-world test case are discussed. Our results demonstrate that the EnKF is able to effectively correct states and parameters even in a real application characterized by strong nonlinearities. However, multi-source data assimilation may lead to significant tradeoffs: the assimilation of additional variables can degrade model predictions for other variables that are otherwise well reproduced. Furthermore, we show that integrated observations such as outflow discharge cannot compensate for the lack of well-distributed data in heterogeneous hillslopes.
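
    For readers unfamiliar with the joint state-parameter update at the core of such studies, the following minimal sketch shows a single stochastic (perturbed-observations) EnKF analysis step on an augmented state vector in NumPy. The dimensions, the linear observation operator H, and the synthetic data are hypothetical placeholders, not the CATHY setup used in the paper.

```python
import numpy as np

# Minimal sketch of a stochastic EnKF analysis step with an augmented
# state vector (model states + uncertain parameters). All dimensions,
# the observation operator, and the data are hypothetical placeholders.

rng = np.random.default_rng(0)

n_state, n_param, n_obs, n_ens = 50, 2, 5, 100
n_aug = n_state + n_param

# Hypothetical prior (forecast) ensemble: each column is one member.
X = rng.normal(size=(n_aug, n_ens))

# Hypothetical linear observation operator picking 5 state entries
# (e.g. pressure head at sensor locations).
H = np.zeros((n_obs, n_aug))
H[np.arange(n_obs), np.arange(0, n_state, n_state // n_obs)] = 1.0

R = 0.1**2 * np.eye(n_obs)          # observation-error covariance
y = rng.normal(size=n_obs)          # hypothetical measurement vector

# Ensemble anomalies and sample covariances.
X_mean = X.mean(axis=1, keepdims=True)
A = X - X_mean
HX = H @ X
HA = HX - HX.mean(axis=1, keepdims=True)

P_xy = A @ HA.T / (n_ens - 1)       # state/observation cross-covariance
P_yy = HA @ HA.T / (n_ens - 1) + R  # innovation covariance
K = P_xy @ np.linalg.inv(P_yy)      # Kalman gain

# Perturbed-observations update: states and parameters corrected jointly.
Y_pert = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
X_analysis = X + K @ (Y_pert - HX)

print("analysis mean of first parameter:", X_analysis[n_state, :].mean())
```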

    Parameter estimation by implicit sampling

    Implicit sampling is a weighted sampling method that is used in data assimilation, where one sequentially updates estimates of the state of a stochastic model based on a stream of noisy or incomplete data. Here we describe how to use implicit sampling in parameter estimation problems, where the goal is to find parameters of a numerical model, e.g., a partial differential equation (PDE), such that the output of the numerical model is compatible with (noisy) data. We use the Bayesian approach to parameter estimation, in which a posterior probability density describes the probability of the parameter conditioned on data, and compute an empirical estimate of this posterior with implicit sampling. Our approach generates independent samples, so that some of the practical difficulties one encounters with Markov chain Monte Carlo methods, e.g., burn-in time or correlations among dependent samples, are avoided. We describe a new implementation of implicit sampling for parameter estimation problems that makes use of multiple grids (coarse to fine) and BFGS optimization coupled to adjoint equations for the required gradient calculations. The implementation is "dimension independent", in the sense that a well-defined finite-dimensional subspace is sampled as the mesh used for discretization of the PDE is refined. We illustrate the algorithm with an example where we estimate a diffusion coefficient in an elliptic equation from sparse and noisy pressure measurements. In the example, dimension/mesh independence is achieved via Karhunen-Loève expansions.
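
    As a rough illustration of the idea, the sketch below applies the linear-map variant of implicit sampling to a toy scalar parameter-estimation problem: a MAP point is found by optimization, reference Gaussian samples are mapped through the local curvature of the negative log-posterior, and importance weights make the independent samples consistent with the posterior. The forward model, prior, noise level, and finite-difference Hessian are illustrative assumptions and stand in for the PDE solves, adjoint gradients, and multigrid machinery described in the abstract.

```python
import numpy as np
from scipy.optimize import minimize

# Minimal sketch of implicit sampling (linear-map variant) for a Bayesian
# parameter-estimation problem. The "PDE" is replaced by a trivial scalar
# forward model so the example stays self-contained; model, prior, and
# data are hypothetical.

rng = np.random.default_rng(1)

def forward(theta):
    # Hypothetical forward model standing in for a PDE solve.
    x = np.linspace(0.0, 1.0, 20)
    return np.exp(-theta * x)

theta_true, sigma = 2.0, 0.05
data = forward(theta_true) + sigma * rng.normal(size=20)

def neg_log_posterior(theta):
    # F(theta) = -log p(theta | data): Gaussian likelihood + Gaussian prior.
    misfit = np.sum((forward(theta[0]) - data) ** 2) / (2 * sigma**2)
    prior = theta[0] ** 2 / (2 * 10.0**2)
    return misfit + prior

# Step 1: find the MAP point (the minimum of F) by optimization.
res = minimize(neg_log_posterior, x0=np.array([1.0]))
theta_map, phi = res.x[0], res.fun

# Step 2: approximate the Hessian of F at the MAP by finite differences.
h = 1e-4
hess = (neg_log_posterior([theta_map + h]) - 2 * phi
        + neg_log_posterior([theta_map - h])) / h**2
L = 1.0 / np.sqrt(hess)

# Step 3: map independent reference samples xi ~ N(0,1) to parameter
# samples and attach importance weights; unlike MCMC, the draws are
# independent, so no burn-in is needed.
xi = rng.normal(size=2000)
samples = theta_map + L * xi
log_w = phi + 0.5 * xi**2 - np.array([neg_log_posterior([t]) for t in samples])
w = np.exp(log_w - log_w.max())
w /= w.sum()

print("posterior mean estimate:", np.sum(w * samples))
```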

    Imaging of a fluid injection process using geophysical data - A didactic example

    In many subsurface industrial applications, fluids are injected into or withdrawn from a geologic formation. It is of practical interest to quantify precisely where, when, and by how much the injected fluid alters the state of the subsurface. Routine geophysical monitoring of such processes attempts to image the way that geophysical properties, such as seismic velocities or electrical conductivity, change through time and space and to then make qualitative inferences as to where the injected fluid has migrated. The more rigorous formulation of the time-lapse geophysical inverse problem forecasts how the subsurface evolves during the course of a fluid-injection application. Using time-lapse geophysical signals as the data to be matched, the model unknowns to be estimated are the multiphysics forward-modeling parameters controlling the fluid-injection process. By properly reproducing the geophysical signature of the flow process, subsequent simulations can predict the fluid migration and alteration in the subsurface. The dynamic nature of fluid-injection processes renders imaging problems more complex than conventional geophysical imaging for static targets. This work intends to clarify the related hydrogeophysical parameter estimation concepts.
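
    The parameter-estimation concept can be made concrete with a deliberately simple sketch: a hypothetical flow parameter (an injection rate) is estimated by matching synthetic time-lapse geophysical data through a toy petrophysical mapping, after which the calibrated model can forecast the plume. The forward chain and noise level below are invented for illustration and are not the didactic example of the paper.

```python
import numpy as np
from scipy.optimize import least_squares

# Minimal sketch of the coupled inversion idea: the unknown is a flow
# parameter (a hypothetical injection rate), and the data are time-lapse
# geophysical observations produced by a petrophysical mapping. The
# forward chain below (plume radius -> relative resistivity) is a toy
# placeholder, not a real multiphysics simulator.

rng = np.random.default_rng(2)
times = np.linspace(1.0, 10.0, 10)            # monitoring times (days)

def geophysical_response(rate, t):
    radius = np.sqrt(rate * t)                # toy flow model: plume radius
    return 1.0 / (1.0 + radius)               # toy petrophysics: relative resistivity

true_rate = 3.0
observed = geophysical_response(true_rate, times) + 0.01 * rng.normal(size=times.size)

# Estimate the flow parameter by matching the time-lapse geophysical data.
fit = least_squares(lambda p: geophysical_response(p[0], times) - observed,
                    x0=[1.0], bounds=(0.0, np.inf))
print("estimated injection rate:", fit.x[0])

# With the calibrated parameter, subsequent simulations can forecast the
# plume response at unmonitored times.
print("forecast response at t=20:", geophysical_response(fit.x[0], 20.0))
```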

    Ensemble Kalman Filter Assimilation of ERT Data for Numerical Modeling of Seawater Intrusion in a Laboratory Experiment

    Seawater intrusion in coastal aquifers is a worldwide problem exacerbated by aquifer overexploitation and climate change. To limit the deterioration of water quality caused by saline intrusion, research studies are needed to identify and assess the performance of possible countermeasures, e.g., underground barriers. Within this context, numerical models are fundamental to fully understand the process and to evaluate the effectiveness of the proposed solutions to contain the saltwater wedge; on the other hand, they are typically affected by uncertainty in hydrogeological parameters, as well as in initial and boundary conditions. Data assimilation methods such as the ensemble Kalman filter (EnKF) represent promising tools that can reduce such uncertainties. Here, we present an application of the EnKF to the numerical modeling of a laboratory experiment where seawater intrusion was reproduced in a specifically designed sandbox and continuously monitored with electrical resistivity tomography (ERT). Combining the EnKF and the SUTRA model for the simulation of density-dependent flow and transport in porous media, we assimilated the collected ERT data by means of joint and sequential assimilation approaches. In the joint approach, raw ERT data (electrical resistances) are assimilated to update both salt concentration and soil parameters, without the need for an electrical inversion. In the sequential approach, we assimilated electrical conductivities computed from a previously performed electrical inversion. Within both approaches, we suggest dual-step update strategies to minimize the effects of spurious correlations in parameter estimation. The results show that, in both cases, ERT data assimilation can reduce the uncertainty not only on the system state in terms of salt concentration, but also on the most relevant soil parameters, i.e., saturated hydraulic conductivity and longitudinal dispersivity. However, the sequential approach is more prone to filter inbreeding due to the large number of observations assimilated relative to the ensemble size.
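
    One way such a dual-step update can be organized is sketched below: the same ensemble innovations are used twice, first in a damped analysis that corrects only the parameters, then in a full analysis of the states, so that spurious parameter corrections are limited. The splitting, the damping factor, and all dimensions and data are illustrative assumptions rather than the strategy actually implemented with SUTRA and the ERT observations.

```python
import numpy as np

# Minimal sketch of a dual-step EnKF update in which parameters and states
# are corrected in two separate analyses. The specific splitting and the
# damping factor are illustrative assumptions; dimensions, the observation
# operator, and the data are hypothetical.

rng = np.random.default_rng(3)
n_state, n_param, n_obs, n_ens = 30, 2, 10, 80

states = rng.normal(size=(n_state, n_ens))       # e.g. salt concentrations
params = rng.normal(size=(n_param, n_ens))       # e.g. log K_sat, dispersivity
H = np.zeros((n_obs, n_state))
H[np.arange(n_obs), np.arange(n_obs)] = 1.0      # observe the first 10 states
R = 0.05**2 * np.eye(n_obs)
y = rng.normal(size=n_obs)                       # hypothetical observations

def enkf_gain(Z, HX):
    # Sample Kalman gain for any ensemble block Z against predicted obs HX.
    A = Z - Z.mean(axis=1, keepdims=True)
    HA = HX - HX.mean(axis=1, keepdims=True)
    P_zy = A @ HA.T / (n_ens - 1)
    P_yy = HA @ HA.T / (n_ens - 1) + R
    return P_zy @ np.linalg.inv(P_yy)

HX = H @ states
Y_pert = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
innov = Y_pert - HX

# Step 1: damped parameter update, limiting spurious parameter corrections.
damping = 0.5
params = params + damping * enkf_gain(params, HX) @ innov

# Step 2: full state update with the same observations.
states = states + enkf_gain(states, HX) @ innov

print("updated parameter means:", params.mean(axis=1))
```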

    Estimating model evidence using data assimilation

    We review the field of data assimilation (DA) from a Bayesian perspective and show that, in addition to its by now common application to state estimation, DA may be used for model selection. An important special case of the latter is the discrimination between a factual model – which corresponds, to the best of the modeller's knowledge, to the situation in the actual world in which a sequence of events has occurred – and a counterfactual model, in which a particular forcing or process might be absent or just quantitatively different from the actual world. Three different ensemble-DA methods are reviewed for this purpose: the ensemble Kalman filter (EnKF), the ensemble four-dimensional variational smoother (En-4D-Var), and the iterative ensemble Kalman smoother (IEnKS). An original contextual formulation of model evidence (CME) is introduced. It is shown how to apply these three methods to compute CME, using the approximated time-dependent probability distribution functions (pdfs) that each of them provides in the process of state estimation. The theoretical formulae so derived are applied to two simplified nonlinear and chaotic models: (i) the Lorenz three-variable convection model (L63) and (ii) the Lorenz 40-variable midlatitude atmospheric dynamics model (L95). The numerical results of these three DA-based methods are compared with those of an integration based on importance sampling. It is found that better CME estimates are obtained by using DA, and the IEnKS method appears to be best among the DA methods. Differences in the performance of the three DA-based methods are discussed as a function of model properties. Finally, the methodology is implemented for parameter estimation and for event attribution.
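
    The generic mechanism behind DA-based evidence estimates can be illustrated with a scalar toy problem: the log evidence is accumulated step by step as the Gaussian likelihood of each innovation, evaluated with the ensemble-predicted observation variance, alongside a standard perturbed-observations EnKF analysis. The toy dynamics, noise levels, and data below are hypothetical, and the sketch uses only the generic prediction-error decomposition of the evidence, not the paper's contextual formulation or the smoother-based variants.

```python
import numpy as np

# Minimal sketch of estimating log model evidence from an EnKF run by
# accumulating the Gaussian likelihood of each innovation, evaluated with
# the ensemble-predicted observation variance. The scalar dynamics, noise
# levels, and data are hypothetical.

rng = np.random.default_rng(4)
n_ens, n_steps = 200, 20
sigma_model, sigma_obs = 0.1, 0.2

def model(x):
    return 0.9 * x + 1.0            # hypothetical scalar dynamics

ensemble = rng.normal(size=n_ens)
truth = 0.0
log_evidence = 0.0

for _ in range(n_steps):
    # Propagate the truth, generate a synthetic observation, forecast the ensemble.
    truth = model(truth) + sigma_model * rng.normal()
    y = truth + sigma_obs * rng.normal()
    ensemble = model(ensemble) + sigma_model * rng.normal(size=n_ens)

    # Innovation likelihood: y | past data ~ N(forecast mean, P_f + R).
    mean_f, var_f = ensemble.mean(), ensemble.var(ddof=1)
    s = var_f + sigma_obs**2
    log_evidence += -0.5 * (np.log(2 * np.pi * s) + (y - mean_f) ** 2 / s)

    # Standard EnKF analysis with perturbed observations.
    gain = var_f / s
    y_pert = y + sigma_obs * rng.normal(size=n_ens)
    ensemble = ensemble + gain * (y_pert - ensemble)

print("log model evidence estimate:", log_evidence)
```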