
    Complexity of Non-Monotonic Logics

    Over the past few decades, non-monotonic reasoning has developed into one of the most important topics in computational logic and artificial intelligence. Different ways to introduce non-monotonic aspects into classical logic have been considered, e.g., extension with default rules, extension with modal belief operators, or modification of the semantics. In this survey we consider a logical formalism from each of these possibilities, namely Reiter's default logic, Moore's autoepistemic logic, and McCarthy's circumscription. Additionally, we consider abduction, where one is not interested in inferences from a given knowledge base but in computing possible explanations for an observation with respect to a given knowledge base. Complexity results for different reasoning tasks for propositional variants of these logics were already studied in the nineties. In recent years, however, a renewed interest in complexity issues can be observed. One current approach is to consider parameterized problems and identify reasonable parameters that allow for FPT algorithms. In another approach, the emphasis lies on identifying fragments, i.e., restrictions of the logical language, that allow more efficient algorithms for the most important reasoning tasks. In this survey we focus on this second aspect. We describe complexity results for fragments of logical languages obtained either by restricting the allowed set of operators (e.g., by forbidding negations one might consider only monotone formulae) or by considering only formulae in conjunctive normal form but with generalized clause types. The algorithmic problems we consider are suitable variants of satisfiability and implication in each of the logics, but also counting problems, where one is not only interested in the existence of certain objects (e.g., models of a formula) but asks for their number. Comment: To appear in the Bulletin of the EATCS.
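    As a hedged aside (not taken from the survey), the following sketch recalls, in common textbook notation, the shape of the objects the abstract refers to: a Reiter default rule, an autoepistemic formula with the belief operator $L$, and the abductive explanation task; the concrete symbols (bird, flies, $T$, $H$, $q$) are generic placeholders rather than the survey's own notation.

% A default rule: infer gamma when alpha is derivable and beta is consistent.
\[
  \frac{\alpha : \beta}{\gamma}
  \qquad\text{(if $\alpha$ holds and $\beta$ is consistent with what is known, conclude $\gamma$)}
\]
% Autoepistemic logic adds a modal belief operator L to propositional logic:
\[
  \mathit{bird} \wedge \neg L\,\neg\mathit{flies} \rightarrow \mathit{flies}
\]
% Abduction: given a knowledge base T, hypotheses H, and an observation q,
% an explanation is a set E of hypotheses such that
\[
  E \subseteq H, \qquad T \cup E \text{ is consistent}, \qquad T \cup E \models q .
\]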

    The Complexity of Reasoning for Fragments of Default Logic

    Default logic was introduced by Reiter in 1980. In 1992, Gottlob classified the complexity of the extension existence problem for propositional default logic as $\Sigma_2^p$-complete, and the complexity of the credulous and skeptical reasoning problems as $\Sigma_2^p$-complete, resp. $\Pi_2^p$-complete. Additionally, he investigated restrictions on the default rules, i.e., semi-normal default rules. Selman took a similar approach in 1992, considering disjunction-free and unary default rules. In this paper we systematically restrict the set of allowed propositional connectives. We give a complete complexity classification for all sets of Boolean functions in the sense of Post's lattice for all three common decision problems for propositional default logic. We show that the complexity is a hexachotomy ($\Sigma_2^p$-, $\Delta_2^p$-, NP-, P-, or NL-complete, or trivial) for the extension existence problem, while for the credulous and skeptical reasoning problems we obtain similar classifications without trivial cases. Comment: Corrected version.
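    As a concrete, hedged illustration of the decision problems mentioned above (not taken from the paper), the short Python sketch below brute-forces Reiter's fixed-point condition for a toy propositional default theory. Formulas are plain Python Boolean expressions, entailment is decided by enumerating all assignments, and the bird/penguin theory is an assumed example; the sketch only restates the definitions behind extension existence and credulous/skeptical reasoning, not an efficient algorithm.

from itertools import chain, combinations, product

VARS = ["bird", "penguin", "flies"]

def models(premises):
    """Enumerate all assignments over VARS that satisfy every premise."""
    for values in product([False, True], repeat=len(VARS)):
        env = dict(zip(VARS, values))
        if all(eval(p, {}, env) for p in premises):
            yield env

def entails(premises, formula):
    """premises |= formula: every model of the premises satisfies formula."""
    return all(eval(formula, {}, env) for env in models(premises))

def consistent(premises, formula):
    """premises plus formula have at least one common model."""
    return any(eval(formula, {}, env) for env in models(premises))

# A default theory (W, D): facts W and defaults (prerequisite, justification, conclusion).
W = ["bird", "penguin"]
D = [("bird", "flies", "flies"),               # birds normally fly
     ("penguin", "not flies", "not flies")]    # penguins normally do not fly

def is_extension(generating):
    """Reiter's characterization: the candidate E = Th(W + conclusions of the
    guessed defaults) is an extension iff rebuilding it in stages (consistency
    checked against E, derivability against the stage built so far) reproduces E."""
    E = W + [c for (_, _, c) in generating]
    stage, changed = list(W), True
    while changed:
        changed = False
        for (p, j, c) in D:
            if c not in stage and entails(stage, p) and consistent(E, j):
                stage.append(c)
                changed = True
    return all(entails(stage, f) for f in E) and all(entails(E, f) for f in stage)

# Extension existence: enumerate all candidate sets of generating defaults.
extensions = [list(gd)
              for gd in chain.from_iterable(combinations(D, r) for r in range(len(D) + 1))
              if is_extension(list(gd))]
for gd in extensions:
    print("extension generated by:", gd)

# Credulous vs. skeptical reasoning for the query "flies".
in_some = any(entails(W + [c for (_, _, c) in gd], "flies") for gd in extensions)
in_all  = all(entails(W + [c for (_, _, c) in gd], "flies") for gd in extensions)
print("credulously 'flies':", in_some, "  skeptically 'flies':", in_all)

    On this toy theory the enumeration finds two extensions (one applying each default), so "flies" holds credulously but not skeptically, which is the behaviour the reasoning problems formalize.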

    Boolean Operations, Joins, and the Extended Low Hierarchy

    We prove that the join of two sets may actually fall into a lower level of the extended low hierarchy than either of the sets. In particular, there exist sets that are not in the second level of the extended low hierarchy, EL_2, yet their join is in EL_2. That is, in terms of extended lowness, the join operator can lower complexity. Since in a strong intuitive sense the join does not lower complexity, our result suggests that the extended low hierarchy is unnatural as a complexity measure. We also study the closure properties of EL_2 and prove that EL_2 is not closed under certain Boolean operations. To this end, we establish the first known (and optimal) EL_2 lower bounds for certain notions generalizing Selman's P-selectivity, which may be regarded as an interesting result in its own right. Comment: 12 pages.
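    As a hedged reminder (standard textbook definitions, which may differ from the paper's exact notation), the join operator and Selman's P-selectivity mentioned above can be stated as follows.

% The (marked) join of two sets A and B of strings:
\[
  A \oplus B \;=\; \{\, 0x : x \in A \,\} \;\cup\; \{\, 1x : x \in B \,\}.
\]
% A set A is P-selective if there is a polynomial-time computable function f
% such that, for all x and y,
\[
  f(x,y) \in \{x,y\}
  \quad\text{and}\quad
  \bigl(\{x,y\} \cap A \neq \emptyset \;\Rightarrow\; f(x,y) \in A\bigr).
\]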