479 research outputs found

    Data-Driven Model Reduction for the Bayesian Solution of Inverse Problems

    One of the major challenges in the Bayesian solution of inverse problems governed by partial differential equations (PDEs) is the computational cost of repeatedly evaluating numerical PDE models, as required by Markov chain Monte Carlo (MCMC) methods for posterior sampling. This paper proposes a data-driven projection-based model reduction technique to reduce this computational cost. The proposed technique has two distinctive features. First, the model reduction strategy is tailored to inverse problems: the snapshots used to construct the reduced-order model are computed adaptively from the posterior distribution. Posterior exploration and model reduction are thus pursued simultaneously. Second, to avoid repeated evaluations of the full-scale numerical model as in a standard MCMC method, we couple the full-scale model and the reduced-order model together in the MCMC algorithm. This maintains accurate inference while reducing the overall computational cost. In numerical experiments considering steady-state flow in a porous medium, the data-driven reduced-order model achieves better accuracy than a reduced-order model constructed using the classical approach. It also improves posterior sampling efficiency by several orders of magnitude compared to a standard MCMC method.
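    The coupling of the full-scale and reduced-order models can be sketched as a two-stage (delayed-acceptance-style) Metropolis step: the cheap reduced model screens each proposal, and only survivors trigger a full-model solve, with a second-stage ratio that keeps the exact posterior invariant. The sketch below is illustrative, not the paper's implementation: the 1-D Gaussian log-posteriors and all function names are assumptions standing in for expensive PDE solves.

    ```python
    import math
    import random

    def _accept(log_ratio):
        """Metropolis accept/reject decision on a log acceptance ratio."""
        return log_ratio >= 0 or random.random() < math.exp(log_ratio)

    def two_stage_metropolis(log_post_full, log_post_red, x0, n_steps, step=0.5):
        """Two-stage Metropolis: proposals are first screened with a cheap
        reduced-order posterior; only survivors pay for a full-model solve.
        The second-stage ratio corrects for the screening so the full
        posterior remains the stationary distribution."""
        x, lp_full_x = x0, log_post_full(x0)
        samples, full_solves = [], 1
        for _ in range(n_steps):
            y = x + random.gauss(0.0, step)
            lr_red = log_post_red(y) - log_post_red(x)
            if _accept(lr_red):                   # stage 1: cheap screen
                lp_full_y = log_post_full(y)      # stage 2: full model
                full_solves += 1
                if _accept((lp_full_y - lp_full_x) - lr_red):
                    x, lp_full_x = y, lp_full_y
            samples.append(x)
        return samples, full_solves
    ```

    In this scheme the count of full-model solves is bounded by the number of proposals that survive the cheap screen, which is where the computational savings come from.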

    A Repelling-Attracting Metropolis Algorithm for Multimodality

    Although the Metropolis algorithm is simple to implement, it often has difficulties exploring multimodal distributions. We propose the repelling-attracting Metropolis (RAM) algorithm that maintains the simple-to-implement nature of the Metropolis algorithm, but is more likely to jump between modes. The RAM algorithm is a Metropolis-Hastings algorithm with a proposal that consists of a downhill move in density that aims to make local modes repelling, followed by an uphill move in density that aims to make local modes attracting. The downhill move is achieved via a reciprocal Metropolis ratio so that the algorithm prefers downward movement. The uphill move does the opposite using the standard Metropolis ratio which prefers upward movement. This down-up movement in density increases the probability of a proposed move to a different mode. Because the acceptance probability of the proposal involves a ratio of intractable integrals, we introduce an auxiliary variable which creates a term in the acceptance probability that cancels with the intractable ratio. Using several examples, we demonstrate the potential for the RAM algorithm to explore a multimodal distribution more efficiently than a Metropolis algorithm and with less tuning than is commonly required by tempering-based methods.
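    The down-up proposal mechanics can be sketched as follows. Note that this illustrates only the forced downhill move (reciprocal Metropolis ratio) and forced uphill move (standard ratio) described above; the paper's auxiliary-variable acceptance step, which is required for the overall chain to be exact, is deliberately omitted, and the bimodal target and all names are illustrative assumptions.

    ```python
    import math
    import random

    def forced_move(log_pi, start, step, downhill):
        """Repeat a Gaussian proposal until one is accepted.  A downhill
        move uses the reciprocal ratio min(1, pi(start)/pi(cand)), which
        prefers lower density; an uphill move uses the standard Metropolis
        ratio min(1, pi(cand)/pi(start)), which prefers higher density."""
        while True:
            cand = start + random.gauss(0.0, step)
            if downhill:
                log_r = log_pi(start) - log_pi(cand)
            else:
                log_r = log_pi(cand) - log_pi(start)
            if log_r >= 0 or random.random() < math.exp(log_r):
                return cand

    def ram_proposal(log_pi, x, step=1.0):
        """RAM-style down-up proposal: first forced downhill away from the
        current mode, then forced uphill, which tends to climb whichever
        mode is nearest -- possibly a different one."""
        z = forced_move(log_pi, x, step, downhill=True)
        return forced_move(log_pi, z, step, downhill=False)
    ```

    On a well-separated bimodal density, proposals generated this way land in the opposite mode far more often than plain random-walk proposals of the same step size.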

    Multilevel Delayed Acceptance MCMC

    We develop a novel Markov chain Monte Carlo (MCMC) method that exploits a hierarchy of models of increasing complexity to efficiently generate samples from an unnormalized target distribution. Broadly, the method rewrites the Multilevel MCMC approach of Dodwell et al. (2015) in terms of the Delayed Acceptance (DA) MCMC of Christen & Fox (2005). In particular, DA is extended to use a hierarchy of models of arbitrary depth, and to allow subchains of arbitrary length. We show that the algorithm satisfies detailed balance, hence is ergodic for the target distribution. Furthermore, multilevel variance reduction is derived that exploits the multiple levels and subchains, and an adaptive multilevel correction to coarse-level biases is developed. Three numerical examples of Bayesian inverse problems are presented that demonstrate the advantages of these novel methods. The software and examples are available in PyMC3.
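    A minimal two-level version of the subchain idea can be sketched as below, assuming synthetic coarse and fine log-posteriors (in practice the hierarchy would come from successively refined model discretisations; the function names are illustrative). The fine-level correction ratio pi_f(y) pi_c(x) / (pi_f(x) pi_c(y)) is the standard delayed-acceptance correction for a pi_c-reversible subchain used as a proposal.

    ```python
    import math
    import random

    def metropolis_accept(log_ratio):
        """Metropolis accept/reject decision on a log acceptance ratio."""
        return log_ratio >= 0 or random.random() < math.exp(log_ratio)

    def mlda_two_level(log_post_fine, log_post_coarse, x0, n_steps,
                       subchain_len=5, step=0.5):
        """Two-level multilevel delayed acceptance sketch: run a cheap
        Metropolis subchain on the coarse posterior, then accept or reject
        its end point at the fine level so that the fine posterior remains
        the stationary distribution."""
        x, lf_x = x0, log_post_fine(x0)
        samples = []
        for _ in range(n_steps):
            y, lc_y = x, log_post_coarse(x)
            lc_x = lc_y                        # coarse density at current state
            for _ in range(subchain_len):      # subchain of arbitrary length
                z = y + random.gauss(0.0, step)
                lc_z = log_post_coarse(z)
                if metropolis_accept(lc_z - lc_y):
                    y, lc_y = z, lc_z
            # Fine-level correction: min(1, pi_f(y) pi_c(x) / (pi_f(x) pi_c(y))).
            lf_y = log_post_fine(y)
            if metropolis_accept((lf_y - lf_x) - (lc_y - lc_x)):
                x, lf_x = y, lf_y
            samples.append(x)
        return samples
    ```

    Because the coarse subchain satisfies detailed balance with respect to the coarse posterior, the proposal-density ratio in the fine-level Metropolis-Hastings step reduces to a ratio of coarse densities, which is what the log-ratio above computes.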

    Efficient posterior sampling for high-dimensional imbalanced logistic regression

    High-dimensional data are routinely collected in many areas. We are particularly interested in Bayesian classification models in which one or more variables are imbalanced. Current Markov chain Monte Carlo algorithms for posterior computation are inefficient as n and/or p increase, due to worsening time per step and mixing rates. One strategy is to use a gradient-based sampler to improve mixing while using data sub-samples to reduce per-step computational complexity. However, usual sub-sampling breaks down when applied to imbalanced data. Instead, we generalize piecewise-deterministic Markov chain Monte Carlo algorithms to include importance-weighted and mini-batch sub-sampling. These approaches maintain the correct stationary distribution with arbitrarily small sub-samples, and substantially outperform current competitors. We provide theoretical support and illustrate gains in simulated and real data applications.
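    The importance-weighted sub-sampling idea can be illustrated in isolation with an unbiased mini-batch estimate of the logistic-regression score: indices are drawn with non-uniform probabilities (e.g. upweighting the rare class) and each sampled term is reweighted by 1/(batch * p_i), so the estimator is unbiased for the full gradient however small the batch. This toy sketch and all names are assumptions; in the paper such estimators are embedded inside piecewise-deterministic samplers, not plain gradient sums.

    ```python
    import math
    import random

    def grad_i(beta, x_i, y_i):
        """Per-observation gradient of the logistic log-likelihood."""
        p = 1.0 / (1.0 + math.exp(-sum(b * v for b, v in zip(beta, x_i))))
        return [(y_i - p) * v for v in x_i]

    def subsampled_grad(beta, X, y, probs, batch=10):
        """Unbiased importance-weighted mini-batch estimate of the full
        gradient sum_i grad_i: draw indices with probabilities probs
        (which must sum to 1) and reweight each term by 1/(batch*probs[i]),
        so the expectation equals the full-data gradient."""
        idx = random.choices(range(len(X)), weights=probs, k=batch)
        g = [0.0] * len(beta)
        for i in idx:
            gi = grad_i(beta, X[i], y[i])
            w = 1.0 / (batch * probs[i])
            g = [a + w * b for a, b in zip(g, gi)]
        return g
    ```

    Choosing probs to upweight the minority class shrinks the estimator's variance on imbalanced data relative to uniform sub-sampling, while unbiasedness holds for any valid probs.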

    MCMC methods for inference in a mathematical model of pulmonary circulation

    This study performs parameter inference in a system of partial differential equations modelling the pulmonary circulation. We use a fluid dynamics network model that takes selected parameter values and mimics the behaviour of the pulmonary haemodynamics under normal physiological and pathological conditions. This is of medical interest as it enables tracking the progression of pulmonary hypertension. We show how we make the fluids model tractable by reducing the parameter dimension from a 55D to a 5D problem. The Delayed Rejection Adaptive Metropolis algorithm, coupled with constrained non-linear optimization, is successfully used to learn the parameter values and quantify the uncertainty in the parameter estimates. To accommodate the different magnitudes of the parameter values, we introduce an improved parameter scaling technique in the Delayed Rejection Adaptive Metropolis algorithm. Formal convergence diagnostics are employed to check for convergence of the Markov chains. Additionally, we perform model selection using different information criteria, including the Watanabe-Akaike Information Criterion.
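    Why such scaling matters can be seen in a plain random-walk Metropolis whose proposal standard deviation is set per parameter relative to a nominal scale. This is only a rough stand-in for the scaling used inside Delayed Rejection Adaptive Metropolis, on an assumed toy posterior whose parameters differ by orders of magnitude; all names are illustrative.

    ```python
    import math
    import random

    def scaled_rw_metropolis(log_post, x0, scales, n_steps, rel_step=0.3):
        """Random-walk Metropolis whose proposal standard deviation for each
        parameter is rel_step times that parameter's nominal scale, so that
        parameters differing by orders of magnitude mix at comparable rates
        instead of the large one dominating the proposal."""
        x, lp_x = list(x0), log_post(x0)
        samples = []
        for _ in range(n_steps):
            y = [xi + random.gauss(0.0, rel_step * s)
                 for xi, s in zip(x, scales)]
            lp_y = log_post(y)
            if lp_y - lp_x >= 0 or random.random() < math.exp(lp_y - lp_x):
                x, lp_x = y, lp_y
            samples.append(list(x))
        return samples
    ```

    With a single unscaled step size, a parameter of order 1000 would barely move (or a parameter of order 1 would be proposed far outside its posterior mass); per-parameter scaling sidesteps both failure modes.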