    Encoding Markov Logic Networks in Possibilistic Logic

    Markov logic uses weighted formulas to compactly encode a probability distribution over possible worlds. Despite the use of logical formulas, Markov logic networks (MLNs) can be difficult to interpret, due to the often counter-intuitive meaning of their weights. To address this issue, we propose a method to construct a possibilistic logic theory that exactly captures what can be derived from a given MLN using maximum a posteriori (MAP) inference. Unfortunately, the size of this theory is exponential in general. We therefore also propose two methods which can derive compact theories that still capture MAP inference, but only for specific types of evidence. These theories can be used, among other things, to make explicit the hidden assumptions underlying an MLN or to explain the predictions it makes.
    Comment: Extended version of a paper appearing in UAI 201
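
    As a concrete illustration of the MAP inference the abstract refers to, here is a minimal brute-force sketch over a toy propositional MLN. The atoms, formulas, and weights below are invented for illustration and are not taken from the paper.

        from itertools import product

        # Worlds over three propositional atoms (illustrative names).
        atoms = ["bird", "flies", "penguin"]

        # Weighted formulas: each entry is (weight, truth function on a world).
        weighted_formulas = [
            (2.0, lambda w: (not w["bird"]) or w["flies"]),         # bird -> flies
            (4.0, lambda w: (not w["penguin"]) or not w["flies"]),  # penguin -> not flies
            (3.0, lambda w: (not w["penguin"]) or w["bird"]),       # penguin -> bird
        ]

        def score(world):
            """Sum of weights of satisfied formulas (log of the unnormalised probability)."""
            return sum(wgt for wgt, f in weighted_formulas if f(world))

        def map_inference(evidence):
            """Return the world(s) of maximal score among those consistent with the evidence."""
            best, argmax = float("-inf"), []
            for values in product([False, True], repeat=len(atoms)):
                world = dict(zip(atoms, values))
                if any(world[a] != v for a, v in evidence.items()):
                    continue
                s = score(world)
                if s > best:
                    best, argmax = s, [world]
                elif s == best:
                    argmax.append(world)
            return argmax

        # With the evidence that we observe a penguin, MAP inference concludes
        # it is a bird that does not fly (score 7.0 beats all alternatives).
        print(map_inference({"penguin": True}))

    A possibilistic encoding of such an MLN would aim to reproduce exactly these MAP conclusions with prioritised logical formulas instead of numerical weights.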

    Belief Revision with Uncertain Inputs in the Possibilistic Setting

    This paper discusses belief revision under uncertain inputs in the framework of possibility theory. Revision can be based on two possible definitions of the conditioning operation: one based on the minimum operator, which requires only a purely ordinal scale, and another based on the product, which requires a richer structure and is a particular case of Dempster's rule of conditioning. Moreover, revision under uncertain inputs can be understood in two different ways, depending on whether or not the input is viewed as a constraint to enforce. It is also shown that M.A. Williams' transmutations, originally defined in the setting of Spohn's ordinal conditional functions, can be captured in this framework, as well as Boutilier's natural revision.
    Comment: Appears in Proceedings of the Twelfth Conference on Uncertainty in Artificial Intelligence (UAI 1996)
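
    For reference, the two conditioning operations contrasted above are standardly written as follows; this is the usual possibility-theory notation from the literature, not copied from the paper.

        \[
        \pi(\omega \mid_{\min} \phi) =
        \begin{cases}
        1 & \text{if } \pi(\omega) = \Pi(\phi) \text{ and } \omega \models \phi,\\
        \pi(\omega) & \text{if } \pi(\omega) < \Pi(\phi) \text{ and } \omega \models \phi,\\
        0 & \text{if } \omega \not\models \phi,
        \end{cases}
        \qquad
        \pi(\omega \mid_{\times} \phi) =
        \begin{cases}
        \dfrac{\pi(\omega)}{\Pi(\phi)} & \text{if } \omega \models \phi,\\[4pt]
        0 & \text{otherwise,}
        \end{cases}
        \]
        \[
        \text{where } \Pi(\phi) = \max_{\omega \models \phi} \pi(\omega).
        \]

    The min-based form only promotes the most plausible models of the input, so it is meaningful on a purely ordinal scale, whereas the product-based form rescales possibility degrees, which requires the richer numerical structure the abstract mentions.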

    Syntactic Computation of Hybrid Possibilistic Conditioning under Uncertain Inputs

    We extend hybrid possibilistic conditioning to deal with inputs consisting of a set of triples, each composed of a propositional formula, the level at which the formula should be accepted, and the way in which its models should be revised. We characterize such conditioning using elementary operations on possibility distributions. We then address a difficult issue: the syntactic computation, using hybrid conditioning, of the revision of possibilistic knowledge bases made of weighted formulas. An important result is that hybrid possibilistic conditioning incurs no extra computational cost; in particular, the size of the revised possibilistic base is polynomial in the size of the initial base and of the input.
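
    To make the objects concrete, here is a minimal sketch of a possibilistic knowledge base as a set of weighted formulas, together with the possibility distribution it standardly induces: a world is fully possible if it satisfies every formula, and otherwise its possibility is one minus the highest weight it violates. The base and atom names are invented for illustration and do not come from the paper.

        from itertools import product

        atoms = ["p", "q"]

        # Each entry is (formula as a truth function on a world, certainty weight in (0, 1]).
        base = [
            (lambda w: w["p"] or w["q"], 0.8),   # (p or q, 0.8)
            (lambda w: not w["p"], 0.3),         # (not p, 0.3)
        ]

        def possibility(world):
            """Possibility of a world: 1 if no formula is violated, else 1 - max violated weight."""
            violated = [a for f, a in base if not f(world)]
            return 1.0 if not violated else 1.0 - max(violated)

        for values in product([False, True], repeat=len(atoms)):
            world = dict(zip(atoms, values))
            print(world, possibility(world))

    Syntactic revision methods like the one described above operate directly on such weighted bases, avoiding the exponential cost of manipulating the distribution over all worlds.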