Merging the local and global approaches to probabilistic satisfiability
The probabilistic satisfiability problem is to verify the consistency of a set of probability values or intervals for logical propositions. The (tight) probabilistic entailment problem is to find the best bounds on the probability of an additional proposition. The local approach to these problems applies rules on small sets of logical sentences and probabilities to tighten given probability intervals. The global approach uses linear programming to find the best bounds. We show that merging these approaches is profitable to both: local solutions can be used to find global solutions more quickly through stabilized column generation, and global solutions can be used to confirm or refute the optimality of the local solutions found. As a result, the best bounds are found, together with their step-by-step justification.
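The global approach can be illustrated with a small linear program over possible worlds. The sketch below (a minimal illustration assuming SciPy, with made-up probability assignments P(a) = 0.7 and P(b) = 0.5 rather than values from the paper) computes tight entailment bounds on P(a AND b) by minimizing and maximizing its mass over all probability distributions consistent with the constraints:

```python
from itertools import product

from scipy.optimize import linprog

# Columns of the LP: the four possible worlds over propositions a and b.
worlds = list(product([0, 1], repeat=2))  # (a, b)

# Equality constraints: probabilities sum to 1, P(a) = 0.7, P(b) = 0.5.
A_eq = [
    [1.0, 1.0, 1.0, 1.0],           # total probability mass
    [float(a) for a, b in worlds],  # worlds where a holds
    [float(b) for a, b in worlds],  # worlds where b holds
]
b_eq = [1.0, 0.7, 0.5]

# Objective: total mass of worlds where both a and b hold, i.e. P(a AND b).
c = [float(a and b) for a, b in worlds]

lower = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, 1))
upper = linprog([-x for x in c], A_eq=A_eq, b_eq=b_eq, bounds=(0, 1))
print(round(lower.fun, 4), round(-upper.fun, 4))  # tight bounds: 0.2 0.5
```

The bounds agree with the Fréchet bounds max(0, 0.7 + 0.5 - 1) = 0.2 and min(0.7, 0.5) = 0.5; on larger problems the column set (the worlds) is generated lazily, which is where column generation enters.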
Parameter Learning of Logic Programs for Symbolic-Statistical Modeling
We propose a logical/mathematical framework for statistical parameter
learning of parameterized logic programs, i.e. definite clause programs
containing probabilistic facts with a parameterized distribution. It extends
the traditional least Herbrand model semantics in logic programming to
distribution semantics, a possible-world semantics with a probability
distribution that is unconditionally applicable to arbitrary logic programs
including ones for HMMs, PCFGs and Bayesian networks. We also propose a new EM
algorithm, the graphical EM algorithm, that runs for a class of parameterized
logic programs representing sequential decision processes where each decision
is exclusive and independent. It runs on a new data structure called support
graphs, which describe the logical relationships between observations and their
explanations, and learns parameters by computing inside and outside
probabilities generalized to logic programs. The complexity analysis shows that when
combined with OLDT search for all explanations for observations, the graphical
EM algorithm, despite its generality, has the same time complexity as the EM
algorithms developed independently in each research field: the Baum-Welch
algorithm for HMMs, the Inside-Outside algorithm for PCFGs, and the
corresponding algorithm for singly connected Bayesian networks. Learning
experiments with PCFGs on two corpora of moderate size indicate that the
graphical EM algorithm can significantly outperform the Inside-Outside
algorithm.
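The core of the distribution semantics can be sketched in a few lines. The example below is a minimal illustration with hypothetical probabilistic facts f1 and f2 and hypothetical clauses q :- f1 and q :- f2 (names and parameters are invented, not taken from the paper): the probability of a query is the total mass of the possible worlds whose selected facts make it true.

```python
from itertools import product

# Hypothetical probabilistic facts with their parameters.
facts = {"f1": 0.6, "f2": 0.3}

# Hypothetical definite clauses:  q :- f1.   q :- f2.
def query_holds(world):
    return world["f1"] or world["f2"]

# Sum the probability mass of the possible worlds in which the query succeeds.
p_query = 0.0
for values in product([True, False], repeat=len(facts)):
    world = dict(zip(facts, values))
    mass = 1.0
    for fact, p in facts.items():
        mass *= p if world[fact] else 1.0 - p
    if query_holds(world):
        p_query += mass

print(round(p_query, 4))  # 1 - 0.4 * 0.7 = 0.72
```

This brute-force enumeration is exponential in the number of facts; the support-graph data structure exists precisely to avoid it by sharing explanations, in the spirit of the inside-outside computation.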
Exploiting Uncertainty for Querying Inconsistent Description Logics Knowledge Bases
The necessity to manage inconsistency in Description Logics Knowledge
Bases~(KBs) has come to the fore with the increasing importance gained by the
Semantic Web, where information comes from different sources that constantly
change their content and may contain contradictory descriptions when considered
either alone or together. Classical reasoning algorithms do not handle
inconsistent KBs, forcing the debugging of the KB in order to remove the
inconsistency. In this paper, we exploit an existing probabilistic semantics
called DISPONTE to overcome this problem and allow queries even over
inconsistent KBs. We implemented our approach in the reasoners TRILL and BUNDLE
and empirically tested the validity of our proposal. Moreover, we formally
compare the presented approach with the repair semantics, one of the most
established semantics for DL reasoning tasks.
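The intuition behind attaching probabilities to axioms can be sketched as follows. This is a deliberately simplified illustration with two invented contradictory axioms and made-up probabilities; the renormalization over consistent worlds shown here is one simple strategy for tolerating inconsistency, not necessarily the mechanism DISPONTE, TRILL, or BUNDLE actually use.

```python
from itertools import product

# Two hypothetical probabilistic axioms that contradict each other:
#   ax1: Flies(tweety)      with probability 0.9
#   ax2: not Flies(tweety)  with probability 0.4
probs = {"ax1": 0.9, "ax2": 0.4}

def consistent(world):
    # A world selecting both contradictory axioms is inconsistent.
    return not (world["ax1"] and world["ax2"])

def entails_query(world):
    # Query Flies(tweety): entailed by consistent worlds containing ax1.
    return world["ax1"]

p_query = p_consistent = 0.0
for values in product([True, False], repeat=len(probs)):
    world = dict(zip(probs, values))
    mass = 1.0
    for ax, p in probs.items():
        mass *= p if world[ax] else 1.0 - p
    if consistent(world):
        p_consistent += mass
        if entails_query(world):
            p_query += mass

# Renormalize over the consistent worlds only.
print(p_query / p_consistent)  # 0.54 / 0.64 = 0.84375
```

The point of the sketch is that a query still receives a meaningful probability even though the axiom set as a whole is classically inconsistent, which is what debugging-based approaches cannot offer.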