Hamiltonian ABC
Approximate Bayesian computation (ABC) is a powerful and elegant framework
for performing inference in simulation-based models. However, due to the
difficulty in scaling likelihood estimates, ABC remains useful for relatively
low-dimensional problems. We introduce Hamiltonian ABC (HABC), a set of
likelihood-free algorithms that apply recent advances in scaling Bayesian
learning using Hamiltonian Monte Carlo (HMC) and stochastic gradients. We find
that a small number of forward simulations can effectively approximate the ABC
gradient, allowing Hamiltonian dynamics to efficiently traverse parameter
spaces. We also describe a new simple yet general approach of incorporating
random seeds into the state of the Markov chain, further reducing the random
walk behavior of HABC. We demonstrate HABC on several typical ABC problems, and
show that HABC samples comparably to regular Bayesian inference using true
gradients on a high-dimensional problem from machine learning.
Comment: Submission to UAI 201
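The abstract's two key ideas, approximating the ABC gradient with a small number of forward simulations and fixing the simulator's random seeds as part of the chain's state, can be sketched as below. The toy simulator, Gaussian ABC kernel, finite-difference stencil, and plain leapfrog trajectory (without a Metropolis correction) are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def simulate(theta, seed):
    # Toy simulator: a draw from N(theta, 1). Fixing the seed makes the
    # simulator a deterministic function of theta (common random numbers),
    # which is the "seeds in the chain state" idea from the abstract.
    rng = np.random.default_rng(seed)
    return theta + rng.standard_normal()

def abc_log_kernel(theta, y_obs, seeds, eps=0.5):
    # Synthetic log-likelihood: Gaussian ABC kernel of bandwidth eps,
    # averaged over a small number of forward simulations.
    x = np.array([simulate(theta, s) for s in seeds])
    return np.log(np.mean(np.exp(-0.5 * ((x - y_obs) / eps) ** 2)) + 1e-300)

def abc_grad(theta, y_obs, seeds, h=1e-4):
    # Stochastic finite-difference estimate of the ABC gradient; the fixed
    # seeds keep both evaluations on the same realization of the simulator.
    return (abc_log_kernel(theta + h, y_obs, seeds)
            - abc_log_kernel(theta - h, y_obs, seeds)) / (2 * h)

def habc_step(theta, y_obs, step=0.05, n_leap=10, n_sims=5, rng=None):
    # One leapfrog trajectory driven by the stochastic ABC gradient.
    rng = rng or np.random.default_rng()
    seeds = rng.integers(0, 2**31, size=n_sims)  # resample seeds per step
    p = rng.standard_normal()                    # momentum refreshment
    p += 0.5 * step * abc_grad(theta, y_obs, seeds)
    for _ in range(n_leap - 1):
        theta += step * p
        p += step * abc_grad(theta, y_obs, seeds)
    theta += step * p
    p += 0.5 * step * abc_grad(theta, y_obs, seeds)
    return theta
```

Because the seeds are held fixed across the finite-difference stencil, the gradient estimate is not swamped by simulator noise even with only a handful of simulations per step.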
Implicit Langevin Algorithms for Sampling From Log-concave Densities
For sampling from a log-concave density, we study implicit integrators
resulting from θ-method discretization of the overdamped Langevin
diffusion stochastic differential equation. Theoretical and algorithmic
properties of the resulting sampling methods for θ ∈ [1/2, 1] and a
range of step sizes are established. Our results generalize and extend prior
works in several directions. In particular, for θ = 1, we prove
geometric ergodicity and stability of the resulting methods for all step sizes.
We show that obtaining subsequent samples amounts to solving a strongly-convex
optimization problem, which is readily achievable using one of numerous
existing methods. Numerical examples supporting our theoretical analysis are
also presented.
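The claim that "obtaining subsequent samples amounts to solving a strongly-convex optimization problem" can be illustrated for the fully implicit case θ = 1. The implicit Euler step x' = x − h ∇U(x') + √(2h) ξ is exactly the minimizer of the strongly convex objective U(z) + ‖z − (x + √(2h) ξ)‖² / (2h). The sketch below solves that subproblem with an off-the-shelf optimizer; the choice of L-BFGS-B is an assumption, not the paper's prescribed solver.

```python
import numpy as np
from scipy.optimize import minimize

def implicit_langevin_step(x, U, grad_U, h, rng):
    # One theta = 1 (implicit Euler) step for dX = -grad U(X) dt + sqrt(2) dW.
    # The update x' = x - h * grad U(x') + sqrt(2h) * xi is recovered as the
    # minimizer of the strongly convex objective
    #     U(z) + ||z - v||^2 / (2h),   v = x + sqrt(2h) * xi,
    # so any convex optimizer yields the next sample.
    xi = rng.standard_normal(x.shape)
    v = x + np.sqrt(2 * h) * xi
    obj = lambda z: U(z) + np.sum((z - v) ** 2) / (2 * h)
    jac = lambda z: grad_U(z) + (z - v) / h
    return minimize(obj, x, jac=jac, method="L-BFGS-B").x
```

For a standard Gaussian target, U(z) = ‖z‖²/2, the subproblem even has the closed form x' = v / (1 + h), which is a convenient sanity check on the optimizer's output.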