Symbolic Exact Inference for Discrete Probabilistic Programs
The computational burden of probabilistic inference remains a hurdle for
applying probabilistic programming languages to practical problems of interest.
In this work, we provide a semantic and algorithmic foundation for efficient
exact inference on discrete-valued finite-domain imperative probabilistic
programs. We leverage and generalize efficient inference procedures for
Bayesian networks, which exploit the structure of the network to decompose the
inference task, thereby avoiding full path enumeration. To do this, we first
compile probabilistic programs to a symbolic representation. Then we adapt
techniques from the probabilistic logic programming and artificial intelligence
communities in order to perform inference on the symbolic representation. We
formalize our approach, prove it sound, and experimentally validate it against
existing exact and approximate inference techniques. We show that our inference
approach is competitive with inference procedures specialized for Bayesian
networks, thereby expanding the class of probabilistic programs that can be
practically analyzed.
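The baseline that the abstract's symbolic decomposition improves on can be illustrated with a minimal sketch (not the paper's algorithm): exact inference on a tiny discrete program by full enumeration of all variable assignments, summing the probability mass consistent with the observation and normalizing. The toy program and its distributions here are illustrative assumptions.

```python
from itertools import product

# Toy discrete program: x ~ Bernoulli(0.5); y ~ Bernoulli(0.9 if x else 0.1);
# observe y == 1; query the posterior of x.
def joint(x, y):
    px = 0.5
    py = 0.9 if x else 0.1
    return (px if x else 1 - px) * (py if y else 1 - py)

# Exact inference by full path enumeration: sum the mass of every assignment
# consistent with the evidence y == 1, then normalize.
evidence = sum(joint(x, y) for x, y in product([0, 1], repeat=2) if y == 1)
query = sum(joint(x, y) for x, y in product([0, 1], repeat=2) if y == 1 and x == 1)

posterior = query / evidence
print(posterior)  # P(x=1 | y=1) = 0.45 / 0.5 = 0.9
```

Full enumeration is exponential in the number of variables; the paper's contribution is a symbolic representation that decomposes this sum the way Bayesian-network inference does, avoiding enumerating every path explicitly.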
Adaptive Neural Compilation
This paper proposes an adaptive neural-compilation framework to address the
problem of efficient program learning. Traditional code optimisation strategies
used in compilers are based on applying a pre-specified set of transformations
that make the code faster to execute without changing its semantics. In
contrast, our work involves adapting programs to make them more efficient while
considering correctness only on a target input distribution. Our approach is
inspired by the recent works on differentiable representations of programs. We
show that it is possible to compile programs written in a low-level language to
a differentiable representation. We also show how programs in this
representation can be optimised to make them efficient on a target distribution
of inputs. Experimental results demonstrate that our approach enables learning
specifically-tuned algorithms for given data distributions with a high success
rate.
Comment: Submitted to NIPS 2016, code and supplementary materials will be
available on the author's page.
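The idea of a differentiable program representation optimised for a target input distribution can be sketched as follows. This is a hedged, illustrative toy, not the paper's compiler or instruction set: a single "soft instruction" blends two primitive operations via a softmax over learnable logits, and gradient descent on a target input distribution drives the blend toward the operation that fits the data.

```python
import math

logits = [0.0, 0.0]  # learnable preference for op 0 (x + 1) vs op 1 (2 * x)

def softmax(l):
    m = max(l)
    e = [math.exp(v - m) for v in l]
    s = sum(e)
    return [v / s for v in e]

def run(x, w):
    # Differentiable execution: blend the two primitive ops by their weights.
    return w[0] * (x + 1) + w[1] * (2 * x)

lr = 0.5
data = [(x, 2 * x) for x in range(1, 6)]  # target behaviour: doubling

for _ in range(200):
    w = softmax(logits)
    # Manual gradient of mean squared error with respect to the logits,
    # using d(w_i)/d(logit_j) = w_i * (delta_ij - w_j).
    grad = [0.0, 0.0]
    for x, y in data:
        err = run(x, w) - y
        outs = [x + 1, 2 * x]
        for i in range(2):
            for j in range(2):
                dw = w[i] * ((1 if i == j else 0) - w[j])
                grad[j] += 2 * err * outs[i] * dw / len(data)
    logits = [l - lr * g for l, g in zip(logits, grad)]

w = softmax(logits)
print(w)  # the weight on the doubling op dominates after training
```

After training, the soft program has effectively "learned" to select the doubling instruction, mirroring the paper's point that correctness is enforced only on the target input distribution rather than on all inputs.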
Evaluating probabilistic programming languages for simulating quantum correlations
This article explores how probabilistic programming can be used to simulate
quantum correlations in an EPR experimental setting. Probabilistic programs are
based on standard probability which cannot produce quantum correlations. In
order to address this limitation, a hypergraph formalism was programmed which
both expresses the measurement contexts of the EPR experimental design as well
as associated constraints. Four contemporary open source probabilistic
programming frameworks were used to simulate an EPR experiment in order to shed
light on their relative effectiveness along both qualitative and quantitative
dimensions. We found that all four probabilistic languages successfully
simulated quantum correlations. Detailed analysis revealed that no language was
clearly superior across all dimensions; however, the comparison does highlight
aspects that can be considered when using probabilistic programs to simulate
experiments in quantum physics.
Comment: 24 pages, 8 figures, code is available at
https://github.com/askoj/bell-ppl
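The limitation the abstract refers to, that programs based on standard probability cannot produce quantum correlations, can be checked directly in a short sketch (not the paper's hypergraph code). Any classical probabilistic program corresponds to a local hidden-variable model, i.e. a mixture of deterministic ±1 outcome assignments; enumerating all such assignments shows the CHSH correlation value never exceeds 2, while quantum mechanics attains 2√2.

```python
import math
from itertools import product

# a0, a1: Alice's deterministic outcomes for her two measurement settings;
# b0, b1: Bob's. Enumerate all 16 deterministic strategies and take the
# maximum CHSH value; mixtures cannot exceed the deterministic maximum.
best_classical = max(
    a0 * b0 + a0 * b1 + a1 * b0 - a1 * b1
    for a0, a1, b0, b1 in product([-1, 1], repeat=4)
)

quantum = 2 * math.sqrt(2)  # Tsirelson bound, attained by the EPR singlet state
print(best_classical, quantum)  # 2 vs ~2.828
```

This gap is why the paper encodes measurement contexts and their constraints explicitly (via a hypergraph formalism) rather than sampling from a single classical joint distribution.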