312 research outputs found
Bayesian network learning with cutting planes
The problem of learning the structure of Bayesian networks from complete
discrete data with a limit on parent set size is considered. Learning is cast
explicitly as an optimisation problem where the goal is to find a BN structure
which maximises log marginal likelihood (BDe score). Integer programming,
specifically the SCIP framework, is used to solve this optimisation problem.
Acyclicity constraints are added to the integer program (IP) during solving in
the form of cutting planes. Finding good cutting planes is the key to the
success of the approach; the search for such cutting planes is carried out
using a sub-IP. Results show that this is a particularly fast method for exact
BN learning.
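The acyclicity cuts can be illustrated in miniature. The sketch below is not the paper's SCIP sub-IP: it takes an integral candidate "solution" that assigns one parent set to each node, finds a directed cycle, and emits the corresponding cluster constraint (every node cluster C must contain at least one node whose chosen parent set is disjoint from C), which any acyclic structure satisfies but the cycle violates. All names are illustrative, not from the paper's code.

```python
def find_cycle(parents):
    """Return a set of nodes forming a directed cycle, or None.

    parents maps each node to its chosen parent set (edges parent -> node).
    """
    colour = {v: 0 for v in parents}          # 0=unseen, 1=active, 2=done
    stack = []

    def dfs(v):
        colour[v] = 1
        stack.append(v)
        for p in parents[v]:
            if colour[p] == 1:                # back edge: cycle found
                return set(stack[stack.index(p):])
            if colour[p] == 0:
                cyc = dfs(p)
                if cyc:
                    return cyc
        stack.pop()
        colour[v] = 2
        return None

    for v in parents:
        if colour[v] == 0:
            cyc = dfs(v)
            if cyc:
                return cyc
    return None

def cluster_cut(cluster):
    """Human-readable form of the violated cluster (acyclicity) cut."""
    terms = " + ".join(
        "I(W->%s : W disjoint from C)" % v for v in sorted(cluster))
    return terms + " >= 1"

# Cyclic candidate structure: a <- b, b <- c, c <- a.
sol = {"a": {"b"}, "b": {"c"}, "c": {"a"}, "d": set()}
C = find_cycle(sol)
print(sorted(C))            # the offending cluster
print(cluster_cut(C))
```

In the actual method the candidate solutions are fractional LP solutions, so violated cluster cuts are found by solving a sub-IP rather than by cycle detection; the sketch only shows what a cut looks like once a cluster is identified.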
First-order integer programming for MAP problems
The problem of finding the most probable (MAP) model in SRL frameworks such as
Markov logic and ProbLog can, in principle, be solved by encoding it as a
`grounded-out' mixed integer program (MIP). However, useful first-order
structure disappears in this process, motivating the development of first-order
MIP approaches. Here we present mfoilp, one such approach. Since the syntax and
semantics of mfoilp are essentially the same as those of existing approaches, we focus
here mainly on implementation and algorithmic issues. We start with the
(conceptually) simple problem of using a logic program to generate a MIP
instance before considering more ambitious exploitation of first-order
representations.
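The "grounding-out" step that mfoilp takes as its starting point can be sketched as follows: each first-order clause is instantiated over a finite domain, and every ground clause becomes one linear constraint over 0/1 atom variables (a positive literal contributes x, a negative literal contributes 1 - x, and the sum must be at least 1). The clause representation and names below are assumptions for illustration, not mfoilp's actual syntax.

```python
from itertools import product

def ground_clause(variables, literals, domain):
    """Instantiate a first-order clause over a finite domain.

    literals: list of (sign, predicate, argument-variables).
    Each grounding yields one linear constraint on 0/1 atom variables:
        sum(x_pos) - sum(x_neg) >= 1 - (number of negative literals).
    Returned as (positive-atoms, negative-atoms, right-hand side).
    """
    out = []
    for binding in product(domain, repeat=len(variables)):
        env = dict(zip(variables, binding))
        pos, neg = [], []
        for sign, pred, args in literals:
            atom = "%s(%s)" % (pred, ",".join(env[a] for a in args))
            (pos if sign else neg).append(atom)
        out.append((pos, neg, 1 - len(neg)))
    return out

# Ground  smokes(X) & friends(X,Y) -> smokes(Y),
# i.e. the clause  ~smokes(X) v ~friends(X,Y) v smokes(Y).
rows = ground_clause(
    ["X", "Y"],
    [(False, "smokes", ["X"]),
     (False, "friends", ["X", "Y"]),
     (True, "smokes", ["Y"])],
    ["a", "b"])
for pos, neg, rhs in rows:
    print("+".join(pos), "-", " - ".join(neg), ">=", rhs)
```

A domain of size two already produces four ground constraints from this one clause; the combinatorial growth of such groundings is exactly why first-order MIP approaches that avoid full grounding are attractive.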
Finding Minimal Cost Herbrand Models with Branch-Cut-and-Price
Given (1) a set of clauses in some first-order language and (2) a cost
function mapping each ground atom in the Herbrand base to a non-negative real,
the problem of finding a minimal cost Herbrand model is to either find a
Herbrand model of the clauses which is guaranteed to minimise the sum of the
costs of true ground atoms, or establish that the clauses have no Herbrand
model. A branch-cut-and-price integer programming (IP) approach to solving this
problem is presented. Since the number of ground instantiations of clauses and
the size of the Herbrand base are both infinite in general, we add the
corresponding IP constraints and IP variables `on the fly' via `cutting' and
`pricing' respectively. In the special case of a finite Herbrand base we show
that adding all IP variables and constraints from the outset can be
advantageous, showing that a challenging Markov logic network MAP problem can
be solved in this way if encoded appropriately.
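The `cutting' half of the scheme can be shown on a toy instance: start with no clause constraints, find a minimum-cost truth assignment, then add the ground clauses that assignment violates and re-solve until no violations remain. Brute-force enumeration stands in for the IP solver here, and the `pricing' of IP variables needed for an infinite Herbrand base is omitted; atom names and costs are invented for the example.

```python
from itertools import product

atoms = ["p", "q", "r"]
cost = {"p": 1.0, "q": 2.0, "r": 5.0}
# Ground clauses as (positive-atoms, negative-atoms):
# satisfied when some positive atom is true or some negative atom is false.
all_clauses = [(["p", "q"], []),      # p v q
               (["r"], ["p"])]        # ~p v r

def satisfied(model, clause):
    pos, neg = clause
    return any(model[a] for a in pos) or any(not model[a] for a in neg)

def best_model(active):
    """Cheapest assignment satisfying the active clauses (brute force)."""
    best = None
    for bits in product([False, True], repeat=len(atoms)):
        m = dict(zip(atoms, bits))
        if all(satisfied(m, c) for c in active):
            c = sum(cost[a] for a in atoms if m[a])
            if best is None or c < best[0]:
                best = (c, m)
    return best[1]

active = []
while True:
    m = best_model(active)
    violated = [c for c in all_clauses
                if c not in active and not satisfied(m, c)]
    if not violated:
        break
    active.extend(violated)   # 'cutting': add only violated constraints

print(m)   # a minimal-cost Herbrand model of both clauses
```

On this instance the loop first adds `p v q` (violated by the all-false assignment), then `~p v r` (violated once p is made true), and terminates with q alone true at cost 2.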
Online Causal Structure Learning in the Presence of Latent Variables
We present two online causal structure learning algorithms which can track
changes in a causal structure and process data in a dynamic real-time manner.
Standard causal structure learning algorithms assume that causal structure does
not change during the data collection process, but in real-world scenarios it
often does. Therefore, it is inappropriate to handle such changes with
existing batch-learning approaches, and instead, a structure should be learned
in an online manner. The online causal structure learning algorithms we present
here can revise correlation values without reprocessing the entire dataset and
use an existing model to avoid relearning those causal links in the prior model
that still fit the data. The proposed algorithms are tested on synthetic and
real-world datasets, the latter being a seasonally adjusted commodity price
index dataset for the U.S. The online causal structure learning algorithms
outperformed standard FCI by a large margin in learning the changed causal
structure correctly and efficiently when latent variables were present.
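The ability to revise a correlation value without reprocessing the entire dataset can be sketched with running sums: keep n, the sums of x, y, x², y² and xy, update them per observation, and recompute the Pearson correlation in constant time. This is only an illustration of the incremental-update idea, assuming a simple streaming setting; the paper's algorithms build on FCI and do considerably more than this.

```python
import math

class OnlineCorrelation:
    """Pearson correlation maintained from running sums."""

    def __init__(self):
        self.n = self.sx = self.sy = 0.0
        self.sxx = self.syy = self.sxy = 0.0

    def update(self, x, y):
        """Fold in one observation; O(1), no stored history needed."""
        self.n += 1
        self.sx += x
        self.sy += y
        self.sxx += x * x
        self.syy += y * y
        self.sxy += x * y

    def value(self):
        """Current correlation of everything seen so far."""
        n = self.n
        cov = self.sxy - self.sx * self.sy / n
        vx = self.sxx - self.sx ** 2 / n
        vy = self.syy - self.sy ** 2 / n
        return cov / math.sqrt(vx * vy)

c = OnlineCorrelation()
for x, y in [(1, 2), (2, 4), (3, 6)]:
    c.update(x, y)
print(round(c.value(), 6))   # 1.0 for perfectly linear data
```

In a non-stationary stream the same sums can be maintained over a sliding window or with exponential decay, so that a change in the underlying structure shows up in the revised correlations without a pass over old data.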
- …