1,106 research outputs found

    Semivariogram methods for modeling Whittle-Matérn priors in Bayesian inverse problems

    We present a new technique, based on semivariogram methodology, for obtaining point estimates for use in prior modeling for solving Bayesian inverse problems. This method requires a connection between Gaussian processes with covariance operators defined by the Matérn covariance function and Gaussian processes with precision (inverse-covariance) operators defined by the Green's functions of a class of elliptic stochastic partial differential equations (SPDEs). We present a detailed mathematical description of this connection. We show that there is an equivalence between these two Gaussian processes when the domain is infinite -- for us, $\mathbb{R}^2$ -- which breaks down when the domain is finite due to the effect of boundary conditions on the Green's functions of PDEs. We show how this connection can be re-established using extended domains. We then introduce the semivariogram method for estimating the Matérn covariance parameters, which specify the Gaussian prior needed for stabilizing the inverse problem. Results are extended from the isotropic case to the anisotropic case, where the correlation length in one direction is larger than in another. Finally, we consider the situation where the correlation length is spatially dependent rather than constant. We implement each method in two-dimensional image inpainting test cases to show that it works on practical examples.
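
    The abstract describes the semivariogram workflow only in prose, so the following is a minimal sketch of the general idea, not the authors' implementation: bin squared field differences by pairwise distance to form an empirical isotropic semivariogram, then fit Matérn parameters (variance, correlation length, smoothness) by least squares. The function names, the binning scheme, and the standard sqrt(2*nu)*r/ell Matérn parameterization are illustrative assumptions.

```python
# Sketch: empirical semivariogram + Matern fit (illustrative, hedged).
import numpy as np
from scipy.special import kv, gamma
from scipy.optimize import least_squares
from scipy.spatial.distance import pdist

def matern_cov(r, sigma2, ell, nu):
    """Matern covariance C(r) with C(0) = sigma2 (common sqrt(2*nu) parameterization)."""
    r = np.asarray(r, dtype=float)
    c = np.full_like(r, sigma2)
    m = r > 0
    s = np.sqrt(2.0 * nu) * r[m] / ell
    c[m] = sigma2 * (2.0 ** (1.0 - nu) / gamma(nu)) * s ** nu * kv(nu, s)
    return c

def empirical_semivariogram(x, z, n_bins=20):
    """Bin 0.5*(z_i - z_j)^2 by pairwise distance of the 2-D locations x."""
    d = pdist(x)                         # pairwise distances between locations
    g_cloud = 0.5 * pdist(z[:, None]) ** 2   # semivariogram cloud
    bins = np.linspace(0.0, d.max(), n_bins + 1)
    idx = np.clip(np.digitize(d, bins) - 1, 0, n_bins - 1)
    h = 0.5 * (bins[:-1] + bins[1:])
    g = np.array([g_cloud[idx == k].mean() if np.any(idx == k) else np.nan
                  for k in range(n_bins)])
    keep = ~np.isnan(g)
    return h[keep], g[keep]

def fit_matern(h, g):
    """Fit (sigma2, ell, nu) so that sigma2 - C(h) matches the empirical semivariogram."""
    def resid(p):
        sigma2, ell, nu = np.exp(p)      # log-parameters keep everything positive
        return (sigma2 - matern_cov(h, sigma2, ell, nu)) - g
    sol = least_squares(resid, np.log([g.max(), h.mean(), 1.0]))
    return np.exp(sol.x)
```

    The fitted parameters would then specify the Gaussian (Whittle-Matérn) prior; the paper's SPDE-based precision formulation and the anisotropic and spatially varying extensions are not reproduced here.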

    Analysis of the Gibbs sampler for hierarchical inverse problems

    Many inverse problems arising in applications come from continuum models where the unknown parameter is a field. In practice the unknown field is discretized, resulting in a problem in $\mathbb{R}^N$, with the understanding that refining the discretization, that is increasing $N$, will often be desirable. In the context of Bayesian inversion this situation suggests the importance of two issues: (i) defining hyper-parameters in such a way that they are interpretable in the continuum limit $N \to \infty$ and so that their values may be compared between different discretization levels; (ii) understanding the efficiency of algorithms for probing the posterior distribution as a function of large $N$. Here we address these two issues in the context of linear inverse problems subject to additive Gaussian noise, within a hierarchical modelling framework based on a Gaussian prior for the unknown field and an inverse-gamma prior for a hyper-parameter, namely the amplitude of the prior variance. The structure of the model is such that the Gibbs sampler can be easily implemented for probing the posterior distribution. Subscribing to the dogma that one should think infinite-dimensionally before implementing in finite dimensions, we present function-space intuition and provide rigorous theory showing that as $N$ increases, the component of the Gibbs sampler for sampling the amplitude of the prior variance becomes increasingly slow. We discuss a reparametrization of the prior variance that is robust with respect to the increase in dimension; we give numerical experiments which show that our reparametrization prevents the slowing down. Our intuition on the behaviour of the prior hyper-parameter, with and without reparametrization, is sufficiently general to include a broad class of nonlinear inverse problems as well as other families of hyper-priors.
    Comment: to appear, SIAM/ASA Journal on Uncertainty Quantification
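
    As a concrete illustration of the sampler structure described in the abstract (not the paper's code), the sketch below alternates between the Gaussian conditional for the discretized field u given the variance amplitude theta and the inverse-gamma conditional for theta given u, for the model y = A u + eta, eta ~ N(0, lam^{-1} I), u | theta ~ N(0, theta * L^{-1}), theta ~ InvGamma(alpha, beta). The matrices A and L, the noise precision lam, and the hyper-parameters alpha, beta are assumed given; only the plain (non-reparametrized) scheme, whose slowdown with N is the paper's subject, is shown.

```python
# Sketch: centred Gibbs sampler for a linear-Gaussian hierarchical model (illustrative).
import numpy as np

def gibbs_hierarchical(y, A, L, lam, alpha, beta, n_iter=5000, seed=0):
    """Alternate exact draws of the field u and the prior-variance amplitude theta."""
    rng = np.random.default_rng(seed)
    N = A.shape[1]
    theta = 1.0                                    # initial variance amplitude
    u_samps, theta_samps = [], []
    for _ in range(n_iter):
        # (1) u | theta, y ~ N(m, Q^{-1}) with Q = lam*A^T A + L/theta,
        #     m = Q^{-1} (lam*A^T y)
        Q = lam * (A.T @ A) + L / theta
        R = np.linalg.cholesky(Q)                  # Q = R R^T, R lower triangular
        m = np.linalg.solve(Q, lam * (A.T @ y))
        u = m + np.linalg.solve(R.T, rng.standard_normal(N))  # adds N(0, Q^{-1}) noise
        # (2) theta | u ~ InvGamma(alpha + N/2, beta + u^T L u / 2)
        shape = alpha + 0.5 * N
        rate = beta + 0.5 * (u @ L @ u)
        theta = 1.0 / rng.gamma(shape, 1.0 / rate)  # draw a Gamma, invert it
        u_samps.append(u)
        theta_samps.append(theta)
    return np.asarray(u_samps), np.asarray(theta_samps)
```

    The paper's reparametrization would change how theta is updated so that its mixing does not degrade as N grows; that variant is not reproduced here.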

    M 221.01: Introduction to Linear Algebra

    M 540.B01: Numerical Methods for Computational and Data Science

    M 221.02: Introduction to Linear Algebra

    M 274.01: Introduction to Differential Equations (Applied Differential Equations)

    M 273.01: Multivariable Calculus

    M 221.01: Introduction to Linear Algebra

    M 440.01: Numerical Analysis
