
    An encompassing framework for Paraconsistent Logic Programs

    We propose a framework which extends Antitonic Logic Programs [Damásio and Pereira, in: Proc. 6th Int. Conf. on Logic Programming and Nonmonotonic Reasoning, Springer, 2001, p. 748] to an arbitrary complete bilattice of truth-values, where belief and doubt are explicitly represented. Inspired by Ginsberg and Fitting's bilattice approaches, this framework allows a precise definition of important operators found in logic programming, such as explicit and default negation. In particular, it leads to a natural semantical integration of explicit and default negation through the Coherence Principle [Pereira and Alferes, in: European Conference on Artificial Intelligence, 1992, p. 102], according to which explicit negation entails default negation. We then define Coherent Answer Sets and the Paraconsistent Well-Founded Model semantics, generalizing many paraconsistent semantics for logic programs, in particular the Paraconsistent Well-Founded Semantics with eXplicit negation (WFSXp) [Alferes et al., J. Automated Reas. 14 (1) (1995) 93–147; Damásio, PhD thesis, 1996]. The framework extends Antitonic Logic Programs in most cases, and is general enough to capture Probabilistic Deductive Databases, Possibilistic Logic Programming, Hybrid Probabilistic Logic Programs, and Fuzzy Logic Programming. We thus obtain a powerful mathematical formalism for dealing simultaneously with default reasoning, paraconsistency, and uncertainty. Results are provided about how the semantical framework deals with inconsistent information and with its propagation by the rules of the program.
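    To make the bilattice vocabulary above concrete, here is a minimal Python sketch, not taken from the paper, of Belnap's FOUR (the smallest complete bilattice) with its two orderings and explicit negation; the class name, the belief/doubt encoding, and the `coherent` predicate are assumptions made purely for illustration.

```python
# A minimal sketch, not the paper's formalism: Belnap's four-valued
# bilattice FOUR, the smallest complete bilattice, with a value encoded
# as a (belief, doubt) pair. The `coherent` predicate below is our own
# naive reading of the Coherence Principle, invented for illustration.
from dataclasses import dataclass

@dataclass(frozen=True)
class B4:
    belief: bool
    doubt: bool

    def le_t(self, other):   # truth ordering: more belief, less doubt
        return self.belief <= other.belief and self.doubt >= other.doubt

    def le_k(self, other):   # knowledge ordering: more of both
        return self.belief <= other.belief and self.doubt <= other.doubt

    def neg(self):           # explicit negation swaps belief and doubt
        return B4(self.doubt, self.belief)

TRUE, FALSE = B4(True, False), B4(False, True)
BOTH, NONE = B4(True, True), B4(False, False)  # inconsistent / unknown

assert FALSE.le_t(TRUE) and NONE.le_k(BOTH)
assert TRUE.neg() == FALSE

def coherent(val_A, val_not_A):
    """Coherence Principle, naive reading: if explicit negation -A is
    believed, then A must at least be doubted (default negation holds)."""
    return (not val_not_A.belief) or val_A.doubt

assert coherent(FALSE, TRUE)      # -A believed and A doubted: fine
assert not coherent(TRUE, TRUE)   # -A believed but A undoubted: incoherent
```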

    The complexity of reasoning for fragments of default logic

    Default logic was introduced by Reiter in 1980. In 1992, Gottlob classified the complexity of the extension existence problem for propositional default logic as Σ₂ᵖ-complete, and the complexity of the credulous and skeptical reasoning problems as Σ₂ᵖ-complete and Π₂ᵖ-complete, respectively. Additionally, he investigated restrictions on the default rules, i.e., semi-normal default rules. Selman took a similar approach in 1992 with disjunction-free and unary default rules. In this article, we systematically restrict the set of allowed propositional connectives. We give a complete complexity classification, for all sets of Boolean functions in the sense of Post's lattice, for all three common decision problems for propositional default logic. We show that the complexity is a hexachotomy (Σ₂ᵖ-, Δ₂ᵖ-, NP-, P-, NL-complete, trivial) for the extension existence problem, while for the credulous and skeptical reasoning problems we obtain similar classifications without trivial cases.
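    As a concrete (and deliberately naive) illustration of the extension existence problem, the sketch below brute-forces the Reiter extensions of a tiny propositional default theory via the standard generating-defaults characterization; the bird/penguin theory and every identifier are invented for this sketch, and the exponential enumeration matches the intuition behind the Σ₂ᵖ-completeness mentioned above.

```python
# A deliberately naive brute-force sketch of Reiter's extension existence
# test for a tiny propositional default theory. Everything here (names,
# the bird/penguin theory, the formulas-as-functions encoding) is invented
# for illustration; the procedure is exponential on all fronts.
from itertools import product, chain, combinations

ATOMS = ["b", "p", "f"]  # bird, penguin, flies

def models(formulas):
    """All truth assignments over ATOMS satisfying every formula."""
    return [m for bits in product([False, True], repeat=len(ATOMS))
            for m in [dict(zip(ATOMS, bits))]
            if all(f(m) for f in formulas)]

def entails(base, goal):
    return all(goal(m) for m in models(base))

def consistent(base):
    return bool(models(base))

def is_extension(W, D, S):
    """Check that Cn(W + conclusions of S) is an extension generated by S."""
    E = list(W) + [concl for (_, _, concl) in S]
    # (1) S must be exactly the set of defaults applicable w.r.t. E:
    #     prerequisite entailed, justification consistent with E.
    for d in D:
        pre, just, _ = d
        applicable = entails(E, pre) and consistent(E + [just])
        if applicable != (d in S):
            return False
    # (2) Groundedness: the defaults in S must fire in some stepwise
    #     order, each prerequisite following from W and earlier conclusions.
    known, todo = list(W), list(S)
    while todo:
        fired = [d for d in todo if entails(known, d[0])]
        if not fired:
            return False
        known += [d[2] for d in fired]
        todo = [d for d in todo if d not in fired]
    return True

def extensions(W, D):
    candidates = chain.from_iterable(combinations(D, r) for r in range(len(D) + 1))
    return [list(S) for S in candidates if is_extension(W, D, list(S))]

# W: penguins are birds, penguins do not fly, Tweety is a bird.
W = [lambda m: not m["p"] or m["b"],
     lambda m: not m["p"] or not m["f"],
     lambda m: m["b"]]
# One default: b : f / f  ("birds normally fly").
D = [(lambda m: m["b"], lambda m: m["f"], lambda m: m["f"])]

print(len(extensions(W, D)))  # -> 1: the extension containing f
```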

    Reasoning by Cases in Structured Argumentation

    We extend the ASPIC+ framework for structured argumentation so as to allow applications of the reasoning by cases inference scheme for defeasible arguments. Given an argument with conclusion 'A or B', an argument based on A with conclusion C, and an argument based on B with conclusion C, we allow the construction of an argument with conclusion C. We show how our framework leads to different results than other approaches in non-monotonic logic for dealing with disjunctive information, such as disjunctive default theory or approaches based on the OR-rule (which allows one to derive a defeasible rule 'If (A or B) then C' from two defeasible rules 'If A then C' and 'If B then C'). We raise new questions regarding the subtleties of reasoning defeasibly with disjunctive information, and show that its formalization is more intricate than one would presume. Comment: Proceedings of SAC/KRR 201
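    The case rule itself is easy to state operationally. Below is a toy Python sketch, far simpler than real ASPIC+ arguments (which record sub-arguments, rules, and attack relations); all names and structures are invented for illustration.

```python
# A toy sketch of the reasoning-by-cases step described above, not the
# ASPIC+ machinery. All names and structures are invented for illustration.
from dataclasses import dataclass

@dataclass(frozen=True)
class Arg:
    premises: frozenset  # the assumptions the argument still rests on
    conclusion: str

def by_cases(disj, from_a, from_b, a, b):
    """From 'a or b', an argument for C assuming a, and an argument for C
    assuming b, build an argument for C with both cases discharged."""
    assert disj.conclusion == f"{a} v {b}"
    assert from_a.conclusion == from_b.conclusion
    kept = (disj.premises | from_a.premises | from_b.premises) - {a, b}
    return Arg(frozenset(kept), from_a.conclusion)

wet_if_rain = Arg(frozenset({"rain"}), "wet")
wet_if_sprinkler = Arg(frozenset({"sprinkler"}), "wet")
either = Arg(frozenset(), "rain v sprinkler")

print(by_cases(either, wet_if_rain, wet_if_sprinkler, "rain", "sprinkler"))
# -> Arg(premises=frozenset(), conclusion='wet')
```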

    Complexity of Non-Monotonic Logics

    Over the past few decades, non-monotonic reasoning has developed into one of the most important topics in computational logic and artificial intelligence. Different ways to introduce non-monotonic aspects into classical logic have been considered, e.g., extension with default rules, extension with modal belief operators, or modification of the semantics. In this survey we consider a logical formalism from each of these possibilities: Reiter's default logic, Moore's autoepistemic logic, and McCarthy's circumscription. Additionally, we consider abduction, where one is not interested in inferences from a given knowledge base but in computing possible explanations for an observation with respect to a given knowledge base. Complexity results for different reasoning tasks for propositional variants of these logics were already studied in the nineties. In recent years, however, a renewed interest in complexity issues can be observed. One current focal approach is to consider parameterized problems and identify reasonable parameters that allow for FPT algorithms. In another approach, the emphasis lies on identifying fragments, i.e., restrictions of the logical language, that allow more efficient algorithms for the most important reasoning tasks. In this survey we focus on this second aspect. We describe complexity results for fragments of logical languages obtained either by restricting the allowed set of operators (e.g., by forbidding negation one obtains only monotone formulae) or by considering only formulae in conjunctive normal form but with generalized clause types. The algorithmic problems we consider are suitable variants of satisfiability and implication in each of the logics, but also counting problems, where one is not only interested in the existence of certain objects (e.g., models of a formula) but asks for their number. Comment: To appear in Bulletin of the EATCS
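    As a small worked example of the operator-restriction idea: forbidding negation limits one to monotone formulae, and monotonicity of a Boolean function can be brute-force-checked for small arities. This Python sketch is our own illustration, not material from the survey.

```python
# Post's-lattice intuition in miniature: dropping negation leaves only
# monotone Boolean functions. Monotonicity of a small function can be
# brute-force-checked; this sketch is our illustration, not the survey's.
from itertools import product

def is_monotone(f, arity):
    """f is monotone iff x <= y pointwise implies f(x) <= f(y)."""
    points = list(product([0, 1], repeat=arity))
    return all(f(*x) <= f(*y)
               for x in points for y in points
               if all(xi <= yi for xi, yi in zip(x, y)))

print(is_monotone(lambda a, b: a and b, 2))  # True: AND needs no negation
print(is_monotone(lambda a, b: a != b, 2))   # False: XOR does
```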

    Complexity of Prioritized Default Logics

    In default reasoning, usually not all possible ways of resolving conflicts between default rules are acceptable. Criteria expressing acceptable ways of resolving the conflicts may be hardwired into the inference mechanism (for example, specificity in inheritance reasoning can be handled this way), or they may be given abstractly as an ordering on the default rules. In this article we investigate formalizations of the latter approach in Reiter's default logic. Our goal is to analyze and compare the computational properties of three such formalizations in terms of their computational complexity: the prioritized default logics of Baader and Hollunder, and of Brewka, and a prioritized default logic based on lexicographic comparison. The analysis locates the propositional variants of these logics on the second and third levels of the polynomial hierarchy, and identifies the boundary between tractable and intractable inference for restricted classes of prioritized default theories.
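    To fix intuitions on the third variant, here is a hedged Python sketch of the lexicographic idea: candidate extensions are compared by which defaults they apply, scanning a priority order from most to least preferred. This is a simplified reading invented for illustration, not the exact definition analyzed in the article.

```python
# A hedged sketch of the lexicographic idea only; the article's exact
# definitions (and those of Baader-Hollunder and Brewka) differ in
# detail. Extensions are compared by the defaults they apply, scanned
# from the most preferred default downwards.
def lex_prefer(applied_1, applied_2, priority):
    """Return the lexicographically preferred set of applied defaults,
    or None on a tie; `priority` lists names from highest to lowest."""
    for d in priority:
        in1, in2 = d in applied_1, d in applied_2
        if in1 != in2:
            return applied_1 if in1 else applied_2
    return None

# An extension applying the top-priority default beats one that only
# applies a lower-priority default.
print(lex_prefer({"d1"}, {"d2"}, ["d1", "d2"]))  # -> {'d1'}
```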

    The Complexity of Reasoning for Fragments of Default Logic

    Default logic was introduced by Reiter in 1980. In 1992, Gottlob classified the complexity of the extension existence problem for propositional default logic as Σ₂ᵖ-complete, and the complexity of the credulous and skeptical reasoning problems as Σ₂ᵖ-complete and Π₂ᵖ-complete, respectively. Additionally, he investigated restrictions on the default rules, i.e., semi-normal default rules. Selman took a similar approach in 1992 with disjunction-free and unary default rules. In this paper we systematically restrict the set of allowed propositional connectives. We give a complete complexity classification, for all sets of Boolean functions in the sense of Post's lattice, for all three common decision problems for propositional default logic. We show that the complexity is a hexachotomy (Σ₂ᵖ-, Δ₂ᵖ-, NP-, P-, NL-complete, trivial) for the extension existence problem, while for the credulous and skeptical reasoning problems we obtain similar classifications without trivial cases. Comment: Corrected version

    Nonmonotonic consequences in default domain theory

    Default domain theory is a framework for representing and reasoning about commonsense knowledge. Although this theory is motivated by ideas in Reiter's work on default logic, it is in some sense a dual framework: we make Reiter's default extension operator into a constructive method for building models, not theories. Domain theory, a well-established tool for representing partial information in the semantics of programming languages, is adopted as the basis for constructing partial models. This paper considers some of the laws of nonmonotonic consequence, due to Gabbay and to Kraus, Lehmann, and Magidor, in the light of default domain theory. We remark that in some cases Gabbay's law of cautious monotony is open to question. We consider an axiomatization of the nonmonotonic consequence relation on prime open sets in the Scott topology (the natural logic of a domain) which omits this law. We prove a representation theorem showing that such relations are in one-to-one correspondence with the consequence relations determined by extensions in Scott domains augmented with default sets. This means that defaults are very expressive: they can, in a sense, represent any reasonable nonmonotonic entailment. Results about what kinds of defaults determine cautious monotony are also discussed. In particular, we show that the property of unique extensions guarantees cautious monotony, and we give several classes of default structures which determine unique extensions.
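    For reference, the cautious monotony law that the paper calls into question is standardly rendered as the following inference rule (usual KLM notation, stated here from the general literature rather than quoted from the paper):

```latex
% Cautious Monotony (CM), in KLM notation: if \alpha defeasibly entails
% both \beta and \gamma, then \alpha together with \beta still
% defeasibly entails \gamma.
\[
  \frac{\alpha \mid\!\sim \beta \qquad \alpha \mid\!\sim \gamma}
       {\alpha \wedge \beta \mid\!\sim \gamma}
  \qquad (\mathrm{CM})
\]
```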