Fixed-Form Variational Posterior Approximation through Stochastic Linear Regression
We propose a general algorithm for approximating nonstandard Bayesian
posterior distributions. The algorithm minimizes the Kullback-Leibler
divergence of an approximating distribution to the intractable posterior
distribution. Our method can be used to approximate any posterior distribution,
provided that it is given in closed form up to a proportionality constant.
The approximation can be any distribution in the exponential family or any
mixture of such distributions, which means that it can be made arbitrarily
precise. Several examples illustrate the speed and accuracy of our
approximation method in practice.
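As a concrete illustration of the objective the abstract describes, the sketch below fits a Gaussian approximation to an unnormalized target by stochastically maximizing the evidence lower bound (equivalently, minimizing the KL divergence to the posterior). This is a generic reparameterization-gradient sketch, not the paper's stochastic linear-regression updates, and all names are hypothetical.

```python
import math
import random

def fit_gaussian_vi(log_p_grad, steps=20000, lr=0.02, seed=0):
    """Fit q = Normal(mu, sigma) to an unnormalized log posterior log p(x)
    by stochastic gradient ascent on the ELBO. Only the gradient of
    log p with respect to x is needed, so p may be known only up to
    a proportionality constant."""
    rng = random.Random(seed)
    mu, log_sigma = 0.0, 0.0
    mu_sum, ls_sum, n = 0.0, 0.0, 0
    for t in range(steps):
        eps = rng.gauss(0.0, 1.0)
        sigma = math.exp(log_sigma)
        x = mu + sigma * eps              # reparameterized sample from q
        g = log_p_grad(x)                 # d log p(x) / dx
        grad_mu = g                       # chain rule: dx/dmu = 1
        grad_ls = g * sigma * eps + 1.0   # dx/dlog_sigma, plus entropy term
        mu += lr * grad_mu
        log_sigma += lr * grad_ls
        if t >= steps // 2:               # average the tail of the iterates
            mu_sum += mu
            ls_sum += log_sigma
            n += 1
    return mu_sum / n, math.exp(ls_sum / n)

# Target: an unnormalized Normal(3, 2) density, so the exact optimum
# of the KL objective is mu = 3, sigma = 2.
mu, sigma = fit_gaussian_vi(lambda x: -(x - 3.0) / 4.0)
```

Because the approximating family here contains the target, the fitted `mu` and `sigma` converge to the true posterior mean and standard deviation; for a non-Gaussian target the same loop would return the best Gaussian fit in KL divergence.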
Automatic Differentiation Variational Inference
Probabilistic modeling is iterative. A scientist posits a simple model, fits
it to her data, refines it according to her analysis, and repeats. However,
fitting complex models to large data is a bottleneck in this process. Deriving
algorithms for new models can be both mathematically and computationally
challenging, which makes it difficult to efficiently cycle through the steps.
To this end, we develop automatic differentiation variational inference (ADVI).
Using our method, the scientist only provides a probabilistic model and a
dataset, nothing else. ADVI automatically derives an efficient variational
inference algorithm, freeing the scientist to refine and explore many models.
ADVI supports a broad class of models: no conjugacy assumptions are required. We
study ADVI across ten different models and apply it to a dataset with millions
of observations. ADVI is integrated into Stan, a probabilistic programming
system, and is available for immediate use.
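To make the workflow concrete: the user writes only the model and supplies a dataset. A hypothetical nonconjugate model of the kind ADVI handles might look like this in Stan (the model and variable names are illustrative, not from the paper):

```stan
// Logistic regression with a heavy-tailed Cauchy prior --
// a nonconjugate model, which ADVI handles without derivations.
data {
  int<lower=0> N;                      // number of observations
  int<lower=0> K;                      // number of predictors
  matrix[N, K] x;                      // design matrix
  array[N] int<lower=0, upper=1> y;    // binary outcomes
}
parameters {
  vector[K] beta;
}
model {
  beta ~ cauchy(0, 2.5);               // no conjugacy assumed
  y ~ bernoulli_logit(x * beta);
}
```

In CmdStan this model is fit with ADVI via the `variational` method; interfaces such as CmdStanPy expose the same functionality through `CmdStanModel.variational()`.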
Automatic Variational Inference in Stan
Variational inference is a scalable technique for approximate Bayesian
inference. Deriving variational inference algorithms requires tedious
model-specific calculations; this makes it difficult to automate. We propose an
automatic variational inference algorithm, automatic differentiation
variational inference (ADVI). The user provides only a Bayesian model and a
dataset, nothing else. We make no conjugacy assumptions and support a broad
class of models. The algorithm automatically determines an appropriate
variational family and optimizes the variational objective. We implement ADVI
in Stan (code available now), a probabilistic programming framework. We compare
ADVI to MCMC sampling across hierarchical generalized linear models,
nonconjugate matrix factorization, and a mixture model. We train the mixture
model on a quarter million images. With ADVI we can use variational inference
on any model we write in Stan.