Tight skew adjustment to the Laplace approximation in high dimensions

Abstract

In Bayesian inference, a simple and popular approach to reduce the burden of computing high-dimensional integrals against a posterior π is to make the Laplace approximation γ̂. This is a Gaussian distribution, so computing ∫f dπ via the approximation ∫f dγ̂ is significantly less expensive. In this paper, we make two general contributions to the topic of high-dimensional Laplace approximations, as well as a third contribution specific to a logistic regression model. First, we tighten the dimension dependence of the error |∫f dπ − ∫f dγ̂| for a broad class of functions f. Second, we derive a higher-accuracy approximation γ̂_S to π, which is a skew-adjusted modification to γ̂. Our third contribution, in the setting of Bayesian inference for logistic regression with Gaussian design, is to use the first two results to derive upper bounds on the Laplace mean approximation error which hold uniformly over different sample realizations, as well as lower bounds on this error. In particular, we prove a skewed Bernstein-von Mises theorem in this logistic regression setting.
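To make the setup concrete, the following is a minimal sketch of the Laplace approximation in one dimension, for a toy Bayesian logistic regression with a standard normal prior and all covariates equal to 1. The data vector `y` and the integration range are illustrative assumptions, not taken from the paper; the mode is found numerically and the Gaussian variance is the inverse Hessian of the negative log posterior at the mode.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.integrate import quad

# Toy data for a 1D logistic regression (all x_i = 1); made up for illustration.
y = np.array([1, 1, 0, 1, 0, 1, 1], dtype=float)

def neg_log_post(t):
    """Negative unnormalized log posterior: N(0,1) prior + logistic likelihood."""
    return 0.5 * t**2 + np.sum(y * np.log1p(np.exp(-t))
                               + (1 - y) * np.log1p(np.exp(t)))

# Laplace approximation: Gaussian centered at the posterior mode theta_hat,
# with variance equal to the inverse Hessian of neg_log_post at theta_hat.
mode = minimize_scalar(neg_log_post).x
sig = 1.0 / (1.0 + np.exp(-mode))            # sigmoid at the mode
hess = 1.0 + len(y) * sig * (1.0 - sig)      # prior curvature + likelihood curvature
laplace_mean, laplace_var = mode, 1.0 / hess

# Exact posterior mean of f(theta) = theta, by 1D quadrature, for comparison.
Z = quad(lambda t: np.exp(-neg_log_post(t)), -10, 10)[0]
exact_mean = quad(lambda t: t * np.exp(-neg_log_post(t)), -10, 10)[0] / Z

print(f"Laplace mean: {laplace_mean:.4f}, exact mean: {exact_mean:.4f}")
```

The gap between `laplace_mean` and `exact_mean` is exactly the kind of error |∫f dπ − ∫f dγ̂| (here with f(θ) = θ) whose dimension dependence the paper tightens, and which the skew-adjusted approximation γ̂_S reduces further.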
