Component Caching in Hybrid Domains with Piecewise Polynomial Densities
Counting the models of a propositional formula is an important problem: for example, it serves as the backbone of probabilistic inference by weighted model counting. A key algorithmic insight is component caching (CC), in which disjoint components of a formula, generated dynamically during a DPLL search, are cached so that they only have to be solved once. In recent years, driven by SMT technology and probabilistic inference in hybrid domains, there has been increasing interest in counting the models of linear arithmetic sentences. To date, however, solvers for these are block-clause implementations, which are nonviable on large problem instances. In this paper, as a first step in extending CC to hybrid domains, we show how propositional CC systems can be leveraged when limited to piecewise polynomial densities. Our experiments demonstrate a large gap in performance when compared to existing approaches based on a variety of block-clause strategies.
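The component-caching idea this abstract describes can be illustrated on the purely propositional case. The sketch below is a minimal #SAT counter, not the paper's hybrid system: it splits a CNF into variable-disjoint components, counts each independently, and memoizes component counts so that repeated subformulas are solved once. The clause encoding (signed integers for literals) and all function names are illustrative assumptions.

```python
def count_models(clauses, variables):
    """#SAT via DPLL with component caching (minimal illustrative sketch).

    clauses: iterable of clauses, each a tuple of nonzero ints
             (positive = variable, negative = its negation).
    variables: iterable of the variable ids to count over.
    """
    def assign(cls, lit):
        # Set `lit` true: drop satisfied clauses, shorten the rest.
        out = []
        for c in cls:
            if lit in c:
                continue
            out.append(tuple(l for l in c if l != -lit))
        return out

    def split_components(cls):
        # Group clauses into variable-disjoint connected components.
        comps = []  # list of (clause_list, var_set)
        for c in cls:
            cvars = {abs(l) for l in c}
            merged, keep = [c], []
            for comp, vs in comps:
                if vs & cvars:
                    merged += comp
                    cvars |= vs
                else:
                    keep.append((comp, vs))
            comps = keep + [(merged, cvars)]
        return [comp for comp, _ in comps]

    cache = {}

    def solve(cls, vs):
        if any(len(c) == 0 for c in cls):
            return 0                      # conflict
        if not cls:
            return 2 ** len(vs)           # all remaining vars are free
        key = (frozenset(cls), frozenset(vs))
        if key in cache:
            return cache[key]             # component solved before
        comps = split_components(cls)
        if len(comps) > 1:
            # Disjoint components: counts multiply.
            total, covered = 1, set()
            for comp in comps:
                cvars = {abs(l) for c in comp for l in c}
                covered |= cvars
                total *= solve(comp, cvars)
            total *= 2 ** len(vs - covered)
        else:
            # Branch on some variable of the first clause.
            v = abs(cls[0][0])
            total = sum(solve(assign(cls, lit), vs - {v}) for lit in (v, -v))
        cache[key] = total
        return total

    return solve([tuple(sorted(c)) for c in clauses], set(variables))
```

For instance, `(x1 or x2) and (x3 or x4)` decomposes into two independent components with 3 models each, giving 9 models in total.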
Scaling up Probabilistic Inference in Linear and Non-Linear Hybrid Domains by Leveraging Knowledge Compilation.
Weighted model integration (WMI) extends weighted model counting (WMC) by providing a computational abstraction for probabilistic inference in mixed discrete-continuous domains. WMC has emerged as an assembly language for state-of-the-art reasoning in Bayesian networks, factor graphs, probabilistic programs and probabilistic databases. In this regard, WMI shows immense promise to be much more widely applicable, especially as many real-world applications involve attribute and feature spaces that are continuous and mixed. Nonetheless, state-of-the-art tools for WMI are limited and less mature than their propositional counterparts. In this work, we propose a new implementation regime that leverages propositional knowledge compilation for scaling up inference. In particular, we use sentential decision diagrams, a tractable representation of Boolean functions, as the underlying model counting and model enumeration scheme. Our regime performs competitively with state-of-the-art WMI systems but is also shown to handle a specific class of non-linear constraints over non-linear potentials.

Comment: In proceedings of ICAART, 2020. A version also appears in AAAI Workshop: Statistical Relational Artificial Intelligence (StarAI), 202
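To make the WMC abstraction concrete: a weighted model count sums, over the satisfying assignments of a formula, the product of per-literal weights. The brute-force sketch below shows the quantity such systems compute; the enumeration is exponential, whereas the paper compiles to sentential decision diagrams precisely to avoid this. All names and the weight encoding are illustrative assumptions, not the paper's API.

```python
from itertools import product

def wmc(clauses, weights, variables):
    """Brute-force weighted model counting over a CNF.

    clauses: clauses as tuples of signed ints (positive/negative literals).
    weights: dict mapping each literal (+v and -v) to a nonnegative weight.
    """
    total = 0.0
    vs = sorted(variables)
    for bits in product([False, True], repeat=len(vs)):
        model = dict(zip(vs, bits))
        # Keep only assignments satisfying every clause.
        if all(any(model[abs(l)] == (l > 0) for l in c) for c in clauses):
            weight = 1.0
            for v, b in model.items():
                weight *= weights[v if b else -v]
            total += weight
    return total
```

With weights that sum to 1 per variable (e.g. `w(1)=0.6, w(-1)=0.4`), the WMC of a formula is exactly the probability that the formula holds, which is how probabilistic inference reduces to counting.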
Probabilistic Inference in Hybrid Domains by Weighted Model Integration
Weighted model counting (WMC) on a propositional knowledge base is an effective and general approach to probabilistic inference in a variety of formalisms, including Bayesian and Markov networks. However, an inherent limitation of WMC is that it only admits the inference of discrete probability distributions. In this paper, we introduce a strict generalization of WMC called weighted model integration that is based on annotating Boolean and arithmetic constraints, and combinations thereof. This methodology is shown to capture discrete, continuous and hybrid Markov networks. We then consider the task of parameter learning for a fragment of the language. An empirical evaluation demonstrates the applicability and promise of the proposal.
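A toy instance of weighted model integration, under assumptions chosen only to keep the arithmetic simple: one Boolean b and one continuous x in [0, 1], with b tied to the theory atom (x >= 0.5), a polynomial density w(x) = x, and Boolean weights w(b) = 1, w(not b) = 2. The WMI is then a weighted sum of polynomial integrals over the feasible region of each total truth assignment; the model and all names are hypothetical.

```python
def poly_integral(coeffs, lo, hi):
    # Integrate c0 + c1*x + c2*x^2 + ... over [lo, hi] via the antiderivative.
    antideriv = lambda x: sum(c * x ** (i + 1) / (i + 1)
                              for i, c in enumerate(coeffs))
    return antideriv(hi) - antideriv(lo)

# Two total assignments: b true forces x in [0.5, 1]; b false forces x in [0, 0.5).
# WMI = w(b) * integral_{0.5}^{1} x dx + w(~b) * integral_{0}^{0.5} x dx
wmi = 1 * poly_integral([0, 1], 0.5, 1.0) + 2 * poly_integral([0, 1], 0.0, 0.5)
# = 1 * 0.375 + 2 * 0.125 = 0.625
```

Restricting densities to piecewise polynomials, as in the work above, keeps every such per-assignment integral exact, which is what lets propositional counting machinery carry over to the hybrid setting.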