A Compilation Target for Probabilistic Programming Languages
Forward inference techniques such as sequential Monte Carlo and particle
Markov chain Monte Carlo for probabilistic programming can be implemented in
any programming language by creative use of standardized operating system
functionality including processes, forking, mutexes, and shared memory.
Exploiting this we have defined, developed, and tested a probabilistic
programming language intermediate representation language we call probabilistic
C, which itself can be compiled to machine code by standard compilers and
linked to operating system libraries yielding an efficient, scalable, portable
probabilistic programming compilation target. This opens up a new hardware and
systems research path for optimizing probabilistic programming systems.
Comment: In Proceedings of the 31st International Conference on Machine
Learning (ICML), 201
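The core mechanism described above can be illustrated with a minimal sketch. In the paper's design, each particle is an operating-system process, replicated by fork() at observation points, with weights kept in shared memory; the sketch below performs the same importance-sampling bookkeeping sequentially. The model and all names are illustrative, not taken from Probabilistic C itself.

```python
import math
import random

# Serial sketch of one forward-inference step. In the fork()-based design,
# run_particle would be a forked process and the weights would live in
# shared memory; here the same computation is done in a loop.

def run_particle(seed):
    rng = random.Random(seed)
    x = rng.gauss(0.0, 1.0)            # latent draw from a N(0, 1) prior
    log_w = -0.5 * (x - 1.0) ** 2      # log-weight for observation y = 1
    return x, log_w

def smc_step(n_particles=500):
    results = [run_particle(s) for s in range(n_particles)]
    # self-normalized importance-sampling estimate of the posterior mean
    m = max(lw for _, lw in results)
    ws = [math.exp(lw - m) for _, lw in results]
    total = sum(ws)
    return sum(x * w for (x, _), w in zip(results, ws)) / total
```

Under this toy model the exact posterior mean is 0.5, so the estimate should land nearby; the point of the OS-level design is that each `run_particle` call can become an independent forked process with no change to the probabilistic program.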
Parameter elimination in particle Gibbs sampling
Bayesian inference in state-space models is challenging due to
high-dimensional state trajectories. A viable approach is particle Markov chain
Monte Carlo, combining MCMC and sequential Monte Carlo to form "exact
approximations" to otherwise intractable MCMC methods. The performance of the
approximation is limited to that of the exact method. We focus on particle
Gibbs and particle Gibbs with ancestor sampling, improving their performance
beyond that of the underlying Gibbs sampler (which they approximate) by
marginalizing out one or more parameters. This is possible when the parameter
prior is conjugate to the complete data likelihood. Marginalization yields a
non-Markovian model for inference, but we show that, in contrast to the general
case, this method still scales linearly in time. While marginalization can be
cumbersome to implement, recent advances in probabilistic programming have
enabled its automation. We show that the marginalized methods are viable as
efficient inference backends in probabilistic programming, and demonstrate
this with examples in ecology and epidemiology.
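The linear-in-time claim above hinges on conjugacy: the parameter-marginalized predictive depends on the whole past only through constant-size sufficient statistics. A minimal sketch, assuming a Normal prior on an unknown mean with known observation variance (all names illustrative):

```python
import math

# With prior mu ~ N(m0, v0) and observations x_t ~ N(mu, v), marginalizing
# mu gives a non-Markovian predictive p(x_t | x_{1:t-1}), yet it can be
# evaluated with an O(1)-per-step update of the posterior mean/variance.

def marginal_loglik(xs, m0=0.0, v0=1.0, v=1.0):
    m, v_post = m0, v0
    total = 0.0
    for x in xs:
        pv = v_post + v                  # predictive variance
        total += -0.5 * (math.log(2 * math.pi * pv) + (x - m) ** 2 / pv)
        # conjugate Normal-Normal posterior update (sufficient statistics)
        k = v_post / pv
        m = m + k * (x - m)
        v_post = v_post * v / pv
    return total
```

Because only `m` and `v_post` are carried forward, the marginalized weight computation in particle Gibbs stays linear in the trajectory length, despite the loss of the Markov property.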
A New Approach to Probabilistic Programming Inference
We introduce and demonstrate a new approach to inference in expressive
probabilistic programming languages based on particle Markov chain Monte Carlo.
Our approach is simple to implement and easy to parallelize. It applies to
Turing-complete probabilistic programming languages and supports accurate
inference in models that make use of complex control flow, including stochastic
recursion. It also includes primitives from Bayesian nonparametric statistics.
Our experiments show that this approach can be more efficient than previously
introduced single-site Metropolis-Hastings methods.
Comment: Updated version of the 2014 AISTATS paper (to reflect changes in new
language syntax). 10 pages, 3 figures. Proceedings of the Seventeenth
International Conference on Artificial Intelligence and Statistics, JMLR
Workshop and Conference Proceedings, Vol 33, 201
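The particle MCMC idea underlying this line of work can be sketched with particle-marginal Metropolis-Hastings, one member of the PMCMC family: an SMC sweep supplies an unbiased marginal-likelihood estimate that is plugged into an ordinary MH acceptance ratio. The toy linear-Gaussian state-space model and all names below are illustrative, not the paper's algorithm or language.

```python
import math
import random

def smc_logZ(theta, ys, n=200, rng=None):
    # SMC estimate of log p(y_{1:T} | theta) for x_t = theta * x_{t-1} + noise,
    # y_t ~ N(x_t, 1). The constant -0.5*log(2*pi) is dropped from the
    # log-weights; it cancels in the MH acceptance ratio.
    rng = rng or random.Random(0)
    logZ = 0.0
    particles = [rng.gauss(0.0, 1.0) for _ in range(n)]
    for y in ys:
        particles = [theta * x + rng.gauss(0.0, 0.5) for x in particles]
        lws = [-0.5 * (y - x) ** 2 for x in particles]
        m = max(lws)
        ws = [math.exp(lw - m) for lw in lws]
        logZ += m + math.log(sum(ws) / n)
        particles = rng.choices(particles, weights=ws, k=n)  # resample
    return logZ

def pmmh(ys, iters=50, rng=None):
    # Random-walk MH on theta (implicit flat prior), using the SMC
    # estimate of the marginal likelihood in place of the exact one.
    rng = rng or random.Random(1)
    theta = 0.5
    logZ = smc_logZ(theta, ys, rng=rng)
    samples = []
    for _ in range(iters):
        prop = theta + rng.gauss(0.0, 0.1)
        logZ_prop = smc_logZ(prop, ys, rng=rng)
        if math.log(rng.random()) < logZ_prop - logZ:
            theta, logZ = prop, logZ_prop
        samples.append(theta)
    return samples
```

The key property, which makes this an "exact approximation", is that the noisy likelihood estimate is unbiased, so the chain still targets the correct posterior over theta.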