Probabilistic Default Reasoning with Conditional Constraints
We propose a combination of probabilistic reasoning from conditional
constraints with approaches to default reasoning from conditional knowledge
bases. In detail, we generalize the notions of Pearl's entailment in system Z,
Lehmann's lexicographic entailment, and Geffner's conditional entailment to
conditional constraints. We give some examples that show that the new notions
of z-, lexicographic, and conditional entailment have similar properties like
their classical counterparts. Moreover, we show that the new notions of z-,
lexicographic, and conditional entailment are proper generalizations of both
their classical counterparts and the classical notion of logical entailment for
conditional constraints.

Comment: 8 pages; to appear in Proceedings of the Eighth International
Workshop on Nonmonotonic Reasoning, Special Session on Uncertainty Frameworks
in Nonmonotonic Reasoning, Breckenridge, Colorado, USA, 9-11 April 2000
The Good Old Davis-Putnam Procedure Helps Counting Models
As was shown recently, many important AI problems require counting the number
of models of propositional formulas. The problem of counting models of such
formulas is, according to present knowledge, computationally intractable in
the worst case. Based on the Davis-Putnam procedure, we present an algorithm, CDP,
that computes the exact number of models of a propositional CNF or DNF formula
F. Let m and n be the number of clauses and variables of F, respectively, and
and let p denote the probability that a literal l of F occurs in a clause C of
F; then the average running time of CDP is shown to be O(nm^d), where
d=-1/log(1-p). The practical performance of CDP has been estimated in a series
of experiments on a wide variety of CNF formulas.
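The abstract does not list CDP itself, but the Davis-Putnam-style splitting
that underlies exact model counting can be sketched briefly. The following is
a minimal illustrative Python sketch (function name and clause representation
are my own choices, not taken from the paper): a formula is a list of clauses,
each a frozenset of signed integers, and the count exploits the fact that an
empty clause set leaves all remaining variables free.

```python
def count_models(clauses, variables):
    """Count satisfying assignments of a CNF formula by splitting on a
    variable, in the spirit of the Davis-Putnam procedure.

    clauses:   list of frozensets of signed ints (k = variable k true,
               -k = variable k false)
    variables: set of variable indices not yet assigned
    """
    # No clauses left: every assignment of the remaining variables works.
    if not clauses:
        return 2 ** len(variables)
    # An empty clause means this branch is unsatisfiable.
    if any(len(c) == 0 for c in clauses):
        return 0
    # Split on some variable that occurs in the formula.
    v = abs(next(iter(next(iter(clauses)))))
    total = 0
    for lit in (v, -v):
        # Simplify: drop clauses satisfied by lit, strip the opposite literal.
        reduced = [c - {-lit} for c in clauses if lit not in c]
        total += count_models(reduced, variables - {v})
    return total
```

For example, the formula (x1 or x2) over two variables has three models, and
the contradictory pair {x1} and {not x1} has none. A real counter such as CDP
would add the engineering the paper's average-case analysis relies on (clause
indexing, unit propagation), which this sketch omits.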