Distilling importance sampling
The two main approaches to Bayesian inference are sampling and optimisation
methods. However, many complicated posteriors are difficult to approximate well
by either. We therefore propose a novel approach combining features of both. We
use a flexible parameterised family of densities, such as a normalising flow.
Given a density from this family approximating the posterior, we use importance
sampling to produce a weighted sample from a more accurate posterior
approximation. This sample is then used in optimisation to update the
parameters of the approximate density, which we view as distilling the
importance sampling results. We iterate these steps and gradually improve the
quality of the posterior approximation. We illustrate our method in two
challenging examples: a queueing model and a stochastic differential equation
model.

Comment: This version adds a second application and fixes some minor errors.
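The iteration described above (importance sample from the current approximation, then refit the approximation to the weighted sample) can be sketched in a minimal form. This is an illustrative toy, not the paper's method: it swaps the normalising flow for a single Gaussian, whose weighted maximum-likelihood update has a closed form, and uses a made-up unnormalised target; all names here are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative unnormalised target posterior: Normal(3, 0.5) up to a constant.
def log_p_tilde(x):
    return -(x - 3.0) ** 2 / (2 * 0.5 ** 2)

# Approximating family: a Gaussian q(x; mu, sigma), standing in for a
# flexible parameterised density such as a normalising flow.
mu, sigma = 0.0, 2.0
n = 5000

for step in range(20):
    # Importance sampling: draw from the current q and weight by p_tilde / q.
    x = rng.normal(mu, sigma, size=n)
    log_q = -(x - mu) ** 2 / (2 * sigma ** 2) - np.log(sigma)
    log_w = log_p_tilde(x) - log_q
    w = np.exp(log_w - log_w.max())  # stabilise before normalising
    w /= w.sum()
    # Distillation: update q by weighted maximum likelihood.
    # (For a Gaussian this is the weighted mean and standard deviation;
    # for a flow it would be a gradient-based fit to the weighted sample.)
    mu = float(np.sum(w * x))
    sigma = float(np.sqrt(np.sum(w * (x - mu) ** 2)))

print(mu, sigma)  # should approach the target's mean 3 and scale 0.5
```

Each pass tightens the proposal around the target, so the importance weights become progressively less variable, which is the sense in which the approximation quality improves across iterations.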