Variational Bayes via Propositionalization
We propose a unified approach to VB (variational Bayes) in
symbolic-statistical modeling via propositionalization.
By propositionalization we mean, broadly, expressing and
computing probabilistic models such as BNs (Bayesian
networks) and PCFGs (probabilistic context-free grammars)
in terms of propositional logic, treating propositional
variables as binary random variables.
Our proposal is motivated by three observations. The
first is that PPC (propositionalized probability
computation), i.e. probability computation formalized in
a propositional setting, has turned out to be general and
efficient when variable values are sparsely
interdependent. Examples include (discrete) BNs, PCFGs
and, more generally, PRISM, a Turing-complete logic
programming language with EM learning ability that we
have been developing, which computes probabilities using
graphically represented AND/OR boolean formulas. The
efficiency of PPC is classically attested by the
Inside-Outside algorithm in the case of PCFGs, and more
recently by PPC approaches to BNs such as the one by
Darwiche et al. that exploits zero probabilities
(determinism) and CSI (context-specific independence).
Dechter et al. also showed that PPC is a general
computation scheme for BNs through their formulation of
AND/OR search spaces.
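To make the PPC idea concrete (a schematic recursion in
our own notation, not quoted from the work above): over an
AND/OR graph whose leaves carry the probabilities of basic
probabilistic choices, the probability of a node n with
children ch(n) is

    P(n) =
    \begin{cases}
      \prod_{c \in \mathrm{ch}(n)} P(c) & \text{if $n$ is an AND node}, \\
      \sum_{c \in \mathrm{ch}(n)} P(c) & \text{if $n$ is an OR node (exclusive alternatives)},
    \end{cases}

evaluated bottom-up by dynamic programming in time linear
in the size of the graph. The inside probabilities of the
Inside-Outside algorithm are exactly this recursion
instantiated on PCFG parse forests.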
Second, while VB has been around for some time as a
practically effective approach to Bayesian modeling, its
use is still largely restricted to simple models such as
BNs and HMMs (hidden Markov models), even though its
usefulness has been established through a variety of
applications, from model selection to prediction. On the
other hand, it has already been shown that VB can be
extended to PCFGs and implemented efficiently using
dynamic programming. Note that PCFGs are just one class of
PPC, and much more general PPC is realized by PRISM.
Accordingly, if VB is extended to PRISM's PPC, we obtain
VB for general probabilistic models, far wider than BNs
and PCFGs.
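To indicate the kind of update involved (the standard VB
treatment of multinomial parameters under Dirichlet
priors, written in our notation rather than taken from the
abstract): the variational posterior over each parameter
vector is again Dirichlet, with hyperparameters obtained
by adding expected usage counts to the prior, and the
E-step reruns the same dynamic programming with
digamma-adjusted pseudo-probabilities:

    \alpha^{*}_{i,k} = \alpha_{i,k} + \mathbb{E}_{q}[c_{i,k}], \qquad
    \tilde{\theta}_{i,k} = \exp\!\bigl(\psi(\alpha^{*}_{i,k}) - \psi\bigl(\sum_{k'} \alpha^{*}_{i,k'}\bigr)\bigr),

where c_{i,k} counts the uses of outcome k of rule (or
switch) i in an explanation and \psi is the digamma
function. Iterating these two steps is the VB analogue of
the EM iteration.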
The last observation is that once VB becomes available in
PRISM, it saves a great deal of time and effort. First, we
no longer have to derive a new VB algorithm from scratch
for each model and implement it; all we have to do is
write a probabilistic model at the predicate level. The
rest of the work is carried out automatically and in a
unified manner by the PRISM system, just as happens in the
case of EM learning. Deriving and implementing a VB
algorithm is a tedious, error-prone process, and ensuring
its correctness would be difficult beyond PCFGs without a
formal semantics. PRISM augmented with VB eliminates such
needs entirely and makes it easy to explore and test new
Bayesian models by helping the user cope with data
sparseness and avoid over-fitting.
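For concreteness, here is a minimal sketch of what
"writing a model at the predicate level" looks like in
PRISM, using its standard values/2 and msw/2 constructs;
the flag named in the final comment is our assumption
about how a VB-enabled PRISM might be switched into VB
mode, not a documented feature.

    % Declare a random switch 'coin' with outcomes head and tail.
    values(coin, [head, tail]).

    % A sequence of N independent tosses of the switch.
    tosses(0, []).
    tosses(N, [F|Fs]) :-
        N > 0,
        msw(coin, F),        % probabilistic choice of head or tail
        N1 is N - 1,
        tosses(N1, Fs).

Given observed goals, EM learning is a single built-in
call, e.g. learn([tosses(3, [head, head, tail])]); with VB
available, the same program would be reusable as-is, with
only the learning mode changed (hypothetically, something
like set_prism_flag(learn_mode, vb)).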
CHR(PRISM)-based Probabilistic Logic Learning
PRISM is an extension of Prolog with probabilistic predicates and built-in
support for expectation-maximization learning. Constraint Handling Rules (CHR)
is a high-level programming language based on multi-headed multiset rewrite
rules.
In this paper, we introduce a new probabilistic logic formalism, called
CHRiSM, based on a combination of CHR and PRISM. It can be used for high-level
rapid prototyping of complex statistical models by means of "chance rules". The
underlying PRISM system can then be used for several probabilistic inference
tasks, including probability computation and parameter learning. We define the
CHRiSM language in terms of syntax and operational semantics, and illustrate it
with examples. We define the notion of ambiguous programs and give a
distribution semantics for unambiguous programs. Next, we describe an
implementation of CHRiSM, based on CHR(PRISM). We discuss the relation between
CHRiSM and other probabilistic logic programming languages, in particular PCHR.
Finally, we identify potential application domains.
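To give the flavor of chance rules, here is an
illustrative CHRiSM-style sketch (CHR propagation rules
with a probabilistic disjunction in one body; the game
itself is our example, not quoted from the paper):

    % Each player probabilistically chooses a move; the branch
    % distribution is a parameter the PRISM back-end can learn.
    player(P) ==> rock(P) ; scissors(P) ; paper(P).

    % Ordinary (non-probabilistic) propagation rules decide the winner.
    rock(P1), scissors(P2)  ==> winner(P1).
    scissors(P1), paper(P2) ==> winner(P1).
    paper(P1), rock(P2)     ==> winner(P1).

From observations such as winner(tom) arising from the
query player(tom), player(jerry), the underlying PRISM
system can compute outcome probabilities or learn each
player's choice distribution.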
- …