    A Stochastic Covariance Shrinkage Approach in Ensemble Transform Kalman Filtering

    Ensemble Kalman filters (EnKF) employ a Monte Carlo approach to represent covariance information and are affected by sampling errors in operational settings, where the number of model realizations is much smaller than the model state dimension. To alleviate the effects of these errors, the EnKF relies on model-specific heuristics such as covariance localization, which takes advantage of the spatial locality of correlations among the model variables. This work proposes an approach to alleviating sampling errors that exploits the locally averaged-in-time dynamics of the model, described in terms of a climatological covariance of the dynamical system. We use this covariance as the target matrix in covariance shrinkage methods and develop a stochastic covariance shrinkage approach in which synthetic ensemble members are drawn to enrich both the ensemble subspace and the ensemble transformation.
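    As a rough illustration of the shrinkage idea, the sketch below blends the ensemble sample covariance with a climatological target matrix and draws synthetic members from the target to enlarge the ensemble subspace. It is a minimal sketch under assumed ingredients (the shrinkage weight gamma, the stand-in target B, and the function names are illustrative, not the paper's actual transform-space construction):

        import numpy as np

        def shrink_covariance(X, B, gamma):
            # X: n x m matrix of ensemble members (state dim n, ensemble size m).
            # Blend the Monte-Carlo sample covariance with the climatological
            # target B using a fixed shrinkage weight gamma in [0, 1].
            m = X.shape[1]
            A = X - X.mean(axis=1, keepdims=True)    # ensemble anomalies
            P_sample = A @ A.T / (m - 1)             # sampling-error-prone estimate
            return (1.0 - gamma) * P_sample + gamma * B

        def enrich_ensemble(X, B, k, rng):
            # Append k synthetic members drawn from N(ensemble mean, B), which
            # enlarges the subspace spanned by the ensemble anomalies.
            xbar = X.mean(axis=1)
            synth = rng.multivariate_normal(xbar, B, size=k).T
            return np.hstack([X, synth])

        # Toy usage: a 40-dimensional state with a 10-member ensemble.
        rng = np.random.default_rng(0)
        n, m = 40, 10
        B = np.eye(n)                                # stand-in climatological covariance
        X = rng.multivariate_normal(np.zeros(n), B, size=m).T
        P = shrink_covariance(enrich_ensemble(X, B, 5, rng), B, gamma=0.3)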

    Non-linear model reduction for uncertainty quantification in large-scale inverse problems

    We present a model reduction approach to the solution of large-scale statistical inverse problems in a Bayesian inference setting. A key to the model reduction is an efficient representation of the non-linear terms in the reduced model. To achieve this, we present a formulation that employs masked projection of the discrete equations; that is, we compute an approximation of the non-linear term using a select subset of interpolation points. Further, through this formulation we show similarities among the existing techniques of gappy proper orthogonal decomposition, missing point estimation, and empirical interpolation via coefficient-function approximation. The resulting model reduction methodology is applied to a highly non-linear combustion problem governed by an advection-diffusion-reaction partial differential equation (PDE). Our reduced model is used as a surrogate for a finite element discretization of the non-linear PDE within the Markov chain Monte Carlo sampling employed by the Bayesian inference approach. In two spatial dimensions, we show that this approach yields accurate results while reducing the computational cost by several orders of magnitude. For the full three-dimensional problem, a forward solve using a reduced model that has high fidelity over the input parameter space is more than two million times faster than the full-order finite element model, making tractable the solution of the statistical inverse problem that would otherwise require many years of CPU time. Copyright © 2009 John Wiley & Sons, Ltd.
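    The masked-projection idea can be sketched as follows: the non-linear term is evaluated only at a select subset of interpolation points and then lifted through a basis built from snapshots of the non-linear term. This is a minimal sketch of empirical interpolation under assumed inputs (the toy snapshot set and function names are illustrative; the paper's combustion application is far richer):

        import numpy as np

        def deim_indices(U):
            # Greedy selection of interpolation points, one per basis column.
            idx = [int(np.argmax(np.abs(U[:, 0])))]
            for l in range(1, U.shape[1]):
                # Residual of column l after interpolating at the current points.
                c = np.linalg.solve(U[np.ix_(idx, range(l))], U[idx, l])
                r = U[:, l] - U[:, :l] @ c
                idx.append(int(np.argmax(np.abs(r))))
            return np.array(idx)

        def masked_projection(f_at_idx, U, idx):
            # Reconstruct the full non-linear term from its values at the
            # interpolation points: solve U[idx, :] c = f(u)[idx], return U c.
            return U @ np.linalg.solve(U[idx, :], f_at_idx)

        # Toy demo: approximate f(u) = u**3 on a 1-D grid from 5 basis vectors.
        x = np.linspace(0.0, 1.0, 200)
        snaps = np.array([np.sin(2 * np.pi * k * x) ** 3 for k in range(1, 20)]).T
        U = np.linalg.svd(snaps, full_matrices=False)[0][:, :5]
        idx = deim_indices(U)
        u = np.sin(2 * np.pi * 3 * x)
        f_approx = masked_projection((u ** 3)[idx], U, idx)  # 5 evaluations, not 200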

    Distributed Particle Filters for Data Assimilation in Simulation of Large Scale Spatial Temporal Systems

    Assimilating real-time sensor data into a running simulation model can improve simulation results for large-scale spatial temporal systems such as wildfires, road traffic, and floods. Particle filters are important methods for supporting this data assimilation. While particle filters work effectively with sophisticated simulation models, they have a high computation cost due to the large number of particles needed to converge to the true system state. This is especially true for large-scale spatial temporal simulation systems, which have high-dimensional state spaces and high computation costs of their own. To address the performance issue of particle filter-based data assimilation, this dissertation develops distributed particle filters and applies them to large-scale spatial temporal systems. We first implemented a particle filter-based data assimilation framework and carried out data assimilation to estimate system state and model parameters in an application of wildfire spread simulation. We then developed advanced particle routing methods in distributed particle filters to route particles among the Processing Units (PUs) after resampling in an effective and efficient manner. In particular, for distributed particle filters with centralized resampling, we developed two routing policies, named the minimal transfer particle routing policy and the maximal balance particle routing policy. For distributed particle filters with decentralized resampling, we developed a hybrid particle routing approach that combines global routing with local routing to take advantage of both. The developed routing policies are evaluated in terms of communication cost and data assimilation accuracy on the application of data assimilation for large-scale wildfire spread simulations. Moreover, as cloud computing gains popularity, we developed a parallel and distributed particle filter based on Hadoop and MapReduce to support large-scale data assimilation.
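    To make the routing idea concrete, here is a minimal sketch of a surplus-to-deficit transfer plan after resampling, in the spirit of a minimal-transfer policy. The PU counts, targets, and the greedy two-pointer sweep are illustrative assumptions, not the dissertation's exact algorithms:

        def transfer_plan(counts, targets):
            # counts[i]:  particles held by PU i after resampling
            # targets[i]: particles PU i should hold for balanced load
            # Returns (src, dst, n) triples; only surplus particles move, so the
            # total number of transferred particles equals the total surplus.
            surplus = [[i, c - t] for i, (c, t) in enumerate(zip(counts, targets)) if c > t]
            deficit = [[i, t - c] for i, (c, t) in enumerate(zip(counts, targets)) if c < t]
            plan, si, di = [], 0, 0
            while si < len(surplus) and di < len(deficit):
                n = min(surplus[si][1], deficit[di][1])
                plan.append((surplus[si][0], deficit[di][0], n))
                surplus[si][1] -= n
                deficit[di][1] -= n
                si += surplus[si][1] == 0    # advance past exhausted senders
                di += deficit[di][1] == 0    # advance past satisfied receivers
            return plan

        # Example: 4 PUs, 24 particles total, balanced target of 6 each.
        print(transfer_plan([12, 3, 9, 0], [6, 6, 6, 6]))
        # -> [(0, 1, 3), (0, 3, 3), (2, 3, 3)]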

    Data Assimilation Based on Sequential Monte Carlo Methods for Dynamic Data Driven Simulation

    Simulation models are widely used for studying and predicting the dynamic behavior of complex systems. Inaccurate simulation results are often inevitable due to imperfect models and inaccurate inputs. With advances in sensor technology, it is possible to collect large amounts of real-time observation data from real systems during simulation. This gives rise to a new paradigm of Dynamic Data Driven Simulation (DDDS), in which a simulation system dynamically assimilates real-time observation data into a running model to improve simulation results. Data assimilation for DDDS is a challenging task because sophisticated simulation models often have: 1) nonlinear, non-Gaussian behavior; 2) non-analytical expressions for the probability density functions involved; 3) high-dimensional state spaces; and 4) high computation costs. Due to these properties, most existing data assimilation methods fail, in one way or another, to effectively support data assimilation for DDDS. This work develops algorithms and software to perform data assimilation for dynamic data driven simulation through non-parametric statistical inference based on sequential Monte Carlo (SMC) methods (also called particle filters). A bootstrap particle filter-based data assimilation framework is first developed, where the proposal distribution is constructed from the simulation models and the statistical properties of the noises. The bootstrap particle filter-based framework is relatively easy to implement. However, it is ineffective when the uncertainty of the simulation model is much larger than that of the observation model (i.e., a peaked likelihood) or when rare events happen. To improve the effectiveness of data assimilation, a new data assimilation framework, named the SenSim framework, is then proposed; it has a more advanced proposal distribution that uses knowledge from both simulation models and sensor readings. Both the bootstrap particle filter-based framework and the SenSim framework are applied and evaluated in two case studies: wildfire spread simulation and lane-based traffic simulation. Experimental results demonstrate the effectiveness of the proposed data assimilation methods. A software package is also created to encapsulate the different components of SMC methods to support data assimilation for general simulation models.
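    The bootstrap construction, in which the simulation model itself serves as the proposal distribution, can be sketched in a few lines. This is a generic single-cycle sketch with an assumed toy model (random-walk dynamics and a Gaussian observation likelihood), not the dissertation's framework or the SenSim proposal:

        import numpy as np

        def bootstrap_pf_step(particles, weights, step_model, likelihood, y, rng):
            # Proposal: run the simulation model forward with process noise.
            particles = np.array([step_model(p, rng) for p in particles])
            # Weight update: score each particle against the observation y.
            weights = weights * np.array([likelihood(y, p) for p in particles])
            weights = weights / weights.sum()
            # Multinomial resampling to concentrate particles in likely states.
            idx = rng.choice(len(particles), size=len(particles), p=weights)
            return particles[idx], np.full(len(particles), 1.0 / len(particles))

        # Toy model: 1-D random-walk dynamics, Gaussian observation noise.
        rng = np.random.default_rng(0)
        step = lambda x, rng: x + rng.normal(0.0, 0.5)
        lik = lambda y, x: np.exp(-0.5 * (y - x) ** 2)       # obs std dev = 1
        particles = rng.normal(0.0, 1.0, size=500)
        weights = np.full(500, 1.0 / 500)
        particles, weights = bootstrap_pf_step(particles, weights, step, lik, 1.2, rng)
        print(particles.mean())                              # posterior mean estimate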