In Bayesian inference, a simple and popular approach to reduce the burden of
computing high dimensional integrals against a posterior π is to make the
Laplace approximation γ̂. This is a Gaussian distribution, so
computing ∫f dπ via the approximation ∫f dγ̂ is
significantly less expensive. In this paper, we make two general contributions
to the topic of high-dimensional Laplace approximations, as well as a third
contribution specific to a logistic regression model. First, we tighten the
dimension dependence of the error |∫f dπ − ∫f dγ̂| for a
broad class of functions f. Second, we derive a higher-accuracy approximation
γ̂_S to π, which is a skew-adjusted modification of γ̂.
Our third contribution - in the setting of Bayesian inference for logistic
regression with Gaussian design - is to use the first two results to derive
upper bounds on the Laplace mean approximation error which hold uniformly over
different sample realizations, as well as lower bounds on this error. In
particular, we prove a skewed Bernstein-von Mises theorem in this logistic
regression setting.
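The Laplace approximation described above can be sketched in a few lines: find the posterior mode (MAP) of a toy one-parameter Bayesian logistic regression by Newton's method, take γ̂ to be the Gaussian centered at the mode with variance equal to the inverse Hessian of the negative log posterior there, and compare its mean against the true posterior mean computed by quadrature. This is a minimal illustrative sketch; the data values, the one-dimensional model, and all variable names below are assumptions for illustration, not the paper's construction.

```python
import numpy as np

# Toy 1-D Bayesian logistic regression with a N(0, sigma2) prior.
# Data values are illustrative only (not from the paper).
x = np.array([1.0, -0.5, 2.0, 0.3, -1.2])
y = np.array([1.0, 0.0, 1.0, 1.0, 0.0])
sigma2 = 4.0

def neg_log_post(theta):
    # U(theta) = sum_i [log(1 + e^{x_i theta}) - y_i x_i theta] + theta^2/(2 sigma2)
    t = x * theta
    return np.sum(np.logaddexp(0.0, t) - y * t) + theta**2 / (2.0 * sigma2)

def grad(theta):
    s = 1.0 / (1.0 + np.exp(-x * theta))  # sigmoid(x_i theta)
    return np.sum(x * (s - y)) + theta / sigma2

def hess(theta):
    s = 1.0 / (1.0 + np.exp(-x * theta))
    return np.sum(x**2 * s * (1.0 - s)) + 1.0 / sigma2

# Newton's method for the MAP; U is strictly convex here, so this converges.
theta = 0.0
for _ in range(50):
    theta -= grad(theta) / hess(theta)
theta_map, H = theta, hess(theta)

# Laplace approximation: gamma_hat = N(theta_map, 1/H).
# Its mean is theta_map; compare with the true posterior mean via a
# Riemann sum of the (unnormalized) posterior density on a grid.
grid = np.linspace(-10.0, 10.0, 4001)
dens = np.exp(-np.array([neg_log_post(t) for t in grid]))
true_mean = np.sum(grid * dens) / np.sum(dens)

print("MAP (Laplace mean):", theta_map)
print("Laplace variance:  ", 1.0 / H)
print("Posterior mean:    ", true_mean)
```

Because the logistic likelihood makes the posterior skewed, the Laplace mean θ̂ differs from the true posterior mean; the gap printed here is exactly the kind of mean approximation error the bounds in the paper control, and the skew-adjusted γ̂_S is designed to reduce it.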