Learning to Reason: Leveraging Neural Networks for Approximate DNF Counting
Weighted model counting (WMC) has emerged as a prevalent approach for
probabilistic inference. In its most general form, WMC is #P-hard. Weighted DNF
counting (weighted #DNF) is a special case for which approximations with
probabilistic guarantees can be obtained in O(nm) time, where n denotes the
number of variables and m the number of clauses of the input DNF; even so, this
is not scalable in practice. In this paper, we propose a neural model counting
approach for weighted #DNF that combines approximate model counting with deep
learning, and accurately approximates model counts in linear time when width is
bounded. We conduct experiments to validate our method, and show that our model
learns and generalizes very well to large-scale #DNF instances.

Comment: To appear in Proceedings of the Thirty-Fourth AAAI Conference on
Artificial Intelligence (AAAI-20). Code and data available at:
https://github.com/ralphabb/NeuralDNF
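The O(nm) guarantee the abstract refers to is that of Karp-Luby-style Monte Carlo estimation for DNF counting. A minimal unweighted sketch, assuming a clause encoding of our own (dicts mapping variable index to required truth value); this is an illustration of the classical estimator, not the paper's code:

```python
import random

def dnf_count_estimate(clauses, n, trials=100000, seed=0):
    """Karp-Luby estimator for the number of satisfying assignments
    of a DNF over n Boolean variables. Each clause is a dict
    {var_index: required_bool_value}."""
    rng = random.Random(seed)
    # |sat(C_i)| = 2^(n - width_i): the unconstrained variables are free.
    sizes = [2 ** (n - len(c)) for c in clauses]
    total = sum(sizes)
    hits = 0
    for _ in range(trials):
        # Sample a clause with probability proportional to |sat(C_i)|.
        i = rng.choices(range(len(clauses)), weights=sizes)[0]
        # Sample a uniform assignment satisfying clause i.
        assign = {v: rng.random() < 0.5 for v in range(n)}
        assign.update(clauses[i])
        # Count the sample only if i is the first clause it satisfies,
        # so each satisfying assignment is counted exactly once overall.
        first = next(j for j, c in enumerate(clauses)
                     if all(assign[v] == val for v, val in c.items()))
        if first == i:
            hits += 1
    return total * hits / trials
```

For (x0) OR (x1) over two variables the true count is 3, and the estimate concentrates around that value as `trials` grows.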
Symbolic Exact Inference for Discrete Probabilistic Programs
The computational burden of probabilistic inference remains a hurdle for
applying probabilistic programming languages to practical problems of interest.
In this work, we provide a semantic and algorithmic foundation for efficient
exact inference on discrete-valued finite-domain imperative probabilistic
programs. We leverage and generalize efficient inference procedures for
Bayesian networks, which exploit the structure of the network to decompose the
inference task, thereby avoiding full path enumeration. To do this, we first
compile probabilistic programs to a symbolic representation. Then we adapt
techniques from the probabilistic logic programming and artificial intelligence
communities in order to perform inference on the symbolic representation. We
formalize our approach, prove it sound, and experimentally validate it against
existing exact and approximate inference techniques. We show that our inference
approach is competitive with inference procedures specialized for Bayesian
networks, thereby expanding the class of probabilistic programs that can be
practically analyzed.
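For contrast, the naive baseline that such symbolic compilation aims to avoid is full path enumeration: sum the probability of every assignment to the program's coin flips. A toy sketch under assumed encodings of our own (the function and parameter names are hypothetical, not the paper's system):

```python
from itertools import product

def enumerate_posterior(flip_probs, predicate, query):
    """Exact inference for a tiny discrete program by full path
    enumeration: each path is one joint assignment to the flips.
    `predicate` encodes the observe/condition statement and `query`
    the returned Boolean expression."""
    num = den = 0.0
    for bits in product([False, True], repeat=len(flip_probs)):
        p = 1.0
        for b, q in zip(bits, flip_probs):
            p *= q if b else 1.0 - q
        if predicate(bits):          # condition on the observation
            den += p
            if query(bits):
                num += p
    return num / den

# P(x | x or y) with x ~ flip(0.5), y ~ flip(0.5):
post = enumerate_posterior([0.5, 0.5],
                           predicate=lambda b: b[0] or b[1],
                           query=lambda b: b[0])
# num = P(x) = 0.5, den = P(x or y) = 0.75, so post = 2/3
```

Enumeration is exponential in the number of random choices, which is exactly why structure-exploiting symbolic representations matter.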
Approximate inference of marginals using the IBIA framework
Exact inference of marginals in probabilistic graphical models (PGM) is known
to be intractable, necessitating the use of approximate methods. Most of the
existing variational techniques perform iterative message passing in loopy
graphs which is slow to converge for many benchmarks. In this paper, we propose
a new algorithm for marginal inference that is based on the incremental
build-infer-approximate (IBIA) paradigm. Our algorithm converts the PGM into a
sequence of linked clique tree forests (SLCTF) with bounded clique sizes, and
then uses a heuristic belief update algorithm to infer the marginals. For the
special case of Bayesian networks, we show that if the incremental build step
in IBIA uses the topological order of variables then (a) the prior marginals
are consistent in all CTFs in the SLCTF and (b) the posterior marginals are
consistent once all evidence variables are added to the SLCTF. In our approach,
the belief propagation step is non-iterative and the accuracy-complexity
trade-off is controlled using user-defined clique size bounds. Results for
several benchmark sets from recent UAI competitions show that our method gives
accuracy that is better than or comparable to existing variational and
sampling-based methods, with smaller runtimes.
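To illustrate the underlying inference task in the simplest structured case: on a chain-structured model the clique tree is itself a chain, and exact marginals follow from a single non-iterative forward pass of messages. This toy sketch (our own illustration, not the IBIA algorithm) computes all marginals of a Markov chain:

```python
def chain_marginals(prior, transitions):
    """Exact marginals of a Markov chain x1 -> x2 -> ... via one
    forward message pass; `prior` is the distribution of x1 and each
    element of `transitions` is a row-stochastic matrix T[i][j] =
    P(next = j | current = i)."""
    marg = [list(prior)]
    for T in transitions:
        prev = marg[-1]
        nxt = [sum(prev[i] * T[i][j] for i in range(len(prev)))
               for j in range(len(T[0]))]
        marg.append(nxt)
    return marg
```

On loopy graphs no such single pass exists, which is where iterative message passing (and its slow convergence) comes in; IBIA's contribution is to bound clique sizes while keeping belief updates non-iterative.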
Bayesian Optimization for Probabilistic Programs
We present the first general purpose framework for marginal maximum a
posteriori estimation of probabilistic program variables. By using a series of
code transformations, the evidence of any probabilistic program, and therefore
of any graphical model, can be optimized with respect to an arbitrary subset of
its sampled variables. To carry out this optimization, we develop the first
Bayesian optimization package to directly exploit the source code of its
target, leading to innovations in problem-independent hyperpriors, unbounded
optimization, and implicit constraint satisfaction; delivering significant
performance improvements over prominent existing packages. We present
applications of our method to a number of tasks including engineering design
and parameter optimization.
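The marginal MAP objective being optimized can be stated in a few lines: choose the queried variables to maximize the evidence after marginalizing out the remaining (nuisance) variables. A grid-search sketch of that objective, with hypothetical names of our own (the paper replaces the outer search with Bayesian optimization over the program's source):

```python
import math

def marginal_map(candidates, nuisance_vals, log_joint):
    """Marginal MAP by exhaustive search: return the theta that
    maximizes p(theta) = sum_z exp(log_joint(theta, z)), i.e. the
    evidence with the nuisance variable z marginalized out."""
    def evidence(theta):
        return sum(math.exp(log_joint(theta, z)) for z in nuisance_vals)
    return max(candidates, key=evidence)
```

Grid search scales poorly with dimension and evaluation cost, which motivates replacing it with a sample-efficient Bayesian optimization loop as the abstract describes.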