Fixed-Form Variational Posterior Approximation through Stochastic Linear Regression
We propose a general algorithm for approximating nonstandard Bayesian
posterior distributions. The algorithm minimizes the Kullback-Leibler
divergence from an approximating distribution to the intractable posterior
distribution. Our method can be used to approximate any posterior
distribution, provided it is given in closed form up to a proportionality constant.
The approximation can be any distribution in the exponential family or any
mixture of such distributions, which means that it can be made arbitrarily
precise. Several examples illustrate the speed and accuracy of our
approximation method in practice.
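As a concrete illustration of the recipe described in this abstract, here is a minimal sketch (not the authors' code) for a one-dimensional Gaussian approximation: the natural parameters are re-estimated by regressing the unnormalized log posterior on the sufficient statistics of q, with damping for stability. The target log_p_tilde, the step size, and the sample count are illustrative assumptions.

```python
import numpy as np

def fit_fixed_form_gaussian(log_p_tilde, n_iters=2000, n_samples=64,
                            step=0.05, seed=0):
    # Fixed-form VB via stochastic linear regression: regress
    # log p~(x) on the sufficient statistics T(x) = (x, x^2) of a
    # 1-D Gaussian; the regression slopes estimate the natural
    # parameters eta = (mu / sigma^2, -1 / (2 sigma^2)).
    rng = np.random.default_rng(seed)
    eta = np.array([0.0, -0.5])              # start from N(0, 1)
    for _ in range(n_iters):
        sigma2 = -0.5 / eta[1]               # current variance
        mu = eta[0] * sigma2                 # current mean
        x = rng.normal(mu, np.sqrt(sigma2), n_samples)
        # design matrix: intercept plus sufficient statistics
        T = np.column_stack([np.ones_like(x), x, x ** 2])
        beta, *_ = np.linalg.lstsq(T, log_p_tilde(x), rcond=None)
        # damped move toward the regression estimate of eta;
        # damping keeps eta[1] negative (a valid variance)
        eta = (1 - step) * eta + step * beta[1:]
    sigma2 = -0.5 / eta[1]
    return eta[0] * sigma2, sigma2           # mean and variance of q

# Unnormalized, slightly non-Gaussian target: log p~(x) = -x^2/2 + 0.3 sin(2x)
mu, var = fit_fixed_form_gaussian(lambda x: -0.5 * x ** 2 + 0.3 * np.sin(2 * x))
```

The fixed point of this iteration is the point where the natural parameters equal the regression coefficients of the log posterior on the sufficient statistics, which is the stationarity condition for the KL divergence in the exponential-family case.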
Copula-like Variational Inference
This paper considers a new family of variational distributions motivated by
Sklar's theorem. This family is based on new copula-like densities on the
hypercube with non-uniform marginals which can be sampled efficiently, i.e.
with a complexity linear in the dimension of the state space. The variational
densities we propose can then be seen as arising from these copula-like
densities used as base distributions on the hypercube, with Gaussian
quantile functions and sparse rotation matrices as normalizing flows. The
latter correspond to a rotation of the marginals with complexity O(d log d). We provide some empirical evidence that such a variational family can
also approximate non-Gaussian posteriors and can be beneficial compared to
Gaussian approximations. Our method performs largely comparably to
state-of-the-art variational approximations on standard regression and
classification benchmarks for Bayesian Neural Networks.
Comment: 33rd Conference on Neural Information Processing Systems (NeurIPS 2019), Vancouver, Canada.
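To make the sampling path concrete, a minimal sketch follows; it is not the authors' implementation. Independent Beta marginals stand in for the paper's copula-like base density on the hypercube, and a product of Givens rotations is one convenient way to realize a sparse rotation; both choices are assumptions for illustration.

```python
import numpy as np
from scipy.stats import norm

def apply_sparse_rotation(z, pairs, angles):
    # A sparse rotation as a product of Givens rotations: each
    # coordinate pair (i, j) is rotated by its angle, so the cost
    # is O(number of pairs) rather than O(d^2) for a dense matrix.
    x = z.copy()
    for (i, j), theta in zip(pairs, angles):
        c, s = np.cos(theta), np.sin(theta)
        xi, xj = x[:, i].copy(), x[:, j].copy()
        x[:, i] = c * xi - s * xj
        x[:, j] = s * xi + c * xj
    return x

def sample_copula_like(n, alphas, betas, pairs, angles, seed=0):
    # Sampling path from the abstract: (1) draw from a cheap base
    # density on the hypercube (independent Betas stand in for the
    # paper's copula-like base), (2) map through Gaussian quantile
    # functions, (3) rotate the marginals with a sparse rotation.
    rng = np.random.default_rng(seed)
    u = rng.beta(alphas, betas, size=(n, len(alphas)))  # on (0, 1)^d
    z = norm.ppf(u)                                     # Gaussian quantiles
    return apply_sparse_rotation(z, pairs, angles)

# 4-D example with two Givens rotations
x = sample_copula_like(n=1000,
                       alphas=np.array([2.0, 2.0, 1.0, 1.0]),
                       betas=np.array([1.0, 1.0, 2.0, 2.0]),
                       pairs=[(0, 1), (2, 3)],
                       angles=[np.pi / 6, np.pi / 4])
```

Because each Givens rotation touches only two coordinates, a chain of them gives the cheap, structured rotation of the marginals that the abstract attributes to sparse rotation matrices.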