
    On the Concept of a Notational Variant

    In the study of modal and nonclassical logics, translations have frequently been employed as a way of measuring the inferential capabilities of a logic. It is sometimes claimed that two logics are “notational variants” if they are translationally equivalent. However, we will show that this cannot be quite right, since first-order logic and propositional logic are translationally equivalent. Others have claimed that for two logics to be notational variants, they must at least be compositionally intertranslatable. The definition of compositionality these accounts use, however, is too strong, as the standard translation from modal logic to first-order logic is not compositional in this sense. In light of this, we will explore a weaker version of this notion that we will call schematicity and show that there is no schematic translation either from first-order logic to propositional logic or from intuitionistic logic to classical logic.
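    The abstract's point about compositionality can be seen in the standard translation it refers to, given here in its usual formulation (ST_x maps a modal formula to a first-order formula with free variable x):

        \begin{align*}
        ST_x(p) &= P(x) \\
        ST_x(\neg\varphi) &= \neg ST_x(\varphi) \\
        ST_x(\varphi \wedge \psi) &= ST_x(\varphi) \wedge ST_x(\psi) \\
        ST_x(\Diamond\varphi) &= \exists y\, (R(x, y) \wedge ST_y(\varphi))
        \end{align*}

    The clause for the diamond shifts the free variable from x to y, so the translation of a compound formula is not obtained from the translations of its parts by one fixed operation; it is a schema parameterized by a variable, which is why it fails the strong notion of compositionality and motivates the weaker notion of schematicity.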

    Complexity of Non-Monotonic Logics

    Over the past few decades, non-monotonic reasoning has developed into one of the most important topics in computational logic and artificial intelligence. Different ways to introduce non-monotonic aspects to classical logic have been considered, e.g., extension with default rules, extension with modal belief operators, or modification of the semantics. In this survey we consider a logical formalism from each of these possibilities, namely Reiter's default logic, Moore's autoepistemic logic and McCarthy's circumscription. Additionally, we consider abduction, where one is not interested in inferences from a given knowledge base but in computing possible explanations for an observation with respect to a given knowledge base. Complexity results for different reasoning tasks for propositional variants of these logics were already studied in the nineties. In recent years, however, a renewed interest in complexity issues can be observed. One current focal approach is to consider parameterized problems and identify reasonable parameters that allow for FPT algorithms. In another approach, the emphasis lies on identifying fragments, i.e., restrictions of the logical language, that allow more efficient algorithms for the most important reasoning tasks. In this survey we focus on this second aspect. We describe complexity results for fragments of logical languages obtained either by restricting the allowed set of operators (e.g., by forbidding negation one might consider only monotone formulae) or by considering only formulae in conjunctive normal form but with generalized clause types. The algorithmic problems we consider are suitable variants of satisfiability and implication in each of the logics, but also counting problems, where one is not only interested in the existence of certain objects (e.g., models of a formula) but asks for their number. Comment: To appear in Bulletin of the EATCS.
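    For orientation, Reiter's extensions are standardly defined by a fixed point (a textbook formulation, not taken from this survey): for a default theory ⟨W, D⟩ with defaults written α:β/γ, a set E is an extension iff

        E = \bigcup_{i \ge 0} E_i, \qquad E_0 = W, \qquad E_{i+1} = \mathrm{Th}(E_i) \cup \{\gamma \mid \alpha{:}\beta/\gamma \in D,\ \alpha \in E_i,\ \neg\beta \notin E\}

    Note that the consistency check refers to the final set E rather than the stage E_i; this self-reference is what makes the definition non-constructive and is a source of the second-level complexity of the associated reasoning tasks.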

    Where Fail-Safe Default Logics Fail

    Reiter's original definition of default logic allows for the application of a default that contradicts a previously applied one. We call this condition failure. The possibility of generating failures has in the past been considered a semantical problem, and variants have been proposed to solve it. We show that it is instead a computational feature that is needed to encode some domains into default logic.
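    A minimal illustration of such a failure (our example, not the paper's): let W = ∅ and consider the defaults

        \delta_1 = \frac{\top : p}{q}, \qquad \delta_2 = \frac{q : \top}{\neg p}

    Applying δ₁ adds q, which makes δ₂ applicable; but the consequent ¬p of δ₂ contradicts the justification p of the previously applied δ₁, so the construction fails. In fact this default theory has no extension at all.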

    A Polynomial Translation of Logic Programs with Nested Expressions into Disjunctive Logic Programs: Preliminary Report

    Nested logic programs have recently been introduced in order to allow for arbitrarily nested formulas in the heads and the bodies of logic program rules under the answer sets semantics. Nested expressions can be formed using conjunction, disjunction, as well as the negation as failure operator in an unrestricted fashion. This provides a very flexible and compact framework for knowledge representation and reasoning. Previous results show that nested logic programs can be transformed into standard (unnested) disjunctive logic programs in an elementary way, applying the negation as failure operator to body literals only. This is of great practical relevance since it allows us to evaluate nested logic programs by means of off-the-shelf disjunctive logic programming systems, like DLV. However, it turns out that this straightforward transformation results in an exponential blow-up in the worst case, despite the fact that complexity results indicate that there is a polynomial translation between the two formalisms. In this paper, we take up this challenge and provide a polynomial translation of logic programs with nested expressions into disjunctive logic programs. Moreover, we show that this translation is modular and (strongly) faithful. We have implemented both the straightforward as well as our advanced transformation; the resulting compiler serves as a front-end to DLV and is publicly available on the Web. Comment: 10 pages; published in Proceedings of the 9th International Workshop on Non-Monotonic Reasoning.
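    The source of the blow-up, and the idea behind a polynomial alternative, can be sketched on bodies with nested disjunctions (a simplification that ignores negation as failure; the paper's actual translation is more refined):

        a \leftarrow (b_1 \vee c_1) \wedge \dots \wedge (b_n \vee c_n)

    Distributing disjunction over conjunction produces 2^n unnested rules, one for each choice of a disjunct per factor. A structure-preserving translation instead introduces a fresh atom d_i for each disjunction, with rules d_i ← b_i and d_i ← c_i and the single rule a ← d_1 ∧ … ∧ d_n, staying linear in the size of the input.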

    Reasoning about Minimal Belief and Negation as Failure

    We investigate the problem of reasoning in the propositional fragment of MBNF, the logic of minimal belief and negation as failure introduced by Lifschitz, which can be considered as a unifying framework for several nonmonotonic formalisms, including default logic, autoepistemic logic, circumscription, epistemic queries, and logic programming. We characterize the complexity and provide algorithms for reasoning in propositional MBNF. In particular, we show that entailment in propositional MBNF lies at the third level of the polynomial hierarchy, hence it is harder than reasoning in all the above-mentioned propositional formalisms for nonmonotonic reasoning. We also prove the exact correspondence between negation as failure in MBNF and negative introspection in Moore's autoepistemic logic.
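    To give a flavour of the formalism (roughly following Lifschitz's embedding of logic programs; presentations vary in detail): a program rule q ← p, not r is represented in MBNF as

        B\,p \wedge \mathbf{not}\,r \supset B\,q

    where B reads "is minimally believed" and not is the negation-as-failure modality whose correspondence with negative introspection is made precise in the paper.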

    The Complexity of Reasoning for Fragments of Default Logic

    Default logic was introduced by Reiter in 1980. In 1992, Gottlob classified the complexity of the extension existence problem for propositional default logic as $\Sigma_2^p$-complete, and the complexity of the credulous and skeptical reasoning problems as $\Sigma_2^p$-complete and $\Pi_2^p$-complete, respectively. Additionally, he investigated restrictions on the default rules, namely semi-normal default rules. In 1992, Selman took a similar approach with disjunction-free and unary default rules. In this paper we systematically restrict the set of allowed propositional connectives. We give a complete complexity classification for all sets of Boolean functions in the sense of Post's lattice for all three common decision problems for propositional default logic. We show that the complexity is a hexachotomy ($\Sigma_2^p$-, $\Delta_2^p$-, NP-, P-, NL-complete, trivial) for the extension existence problem, while for the credulous and skeptical reasoning problems we obtain similar classifications without trivial cases. Comment: Corrected version.
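    The three common decision problems referred to are standardly stated as follows (a textbook formulation, not specific to this paper):

        \begin{align*}
        \textsc{Ext} &: \text{given } \langle W, D \rangle, \text{ does the default theory have an extension?} \\
        \textsc{Cred} &: \text{given } \langle W, D \rangle \text{ and } \varphi, \text{ is } \varphi \text{ contained in at least one extension?} \\
        \textsc{Skep} &: \text{given } \langle W, D \rangle \text{ and } \varphi, \text{ is } \varphi \text{ contained in every extension?}
        \end{align*}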

    On the Relative Expressiveness of Argumentation Frameworks, Normal Logic Programs and Abstract Dialectical Frameworks

    We analyse the expressiveness of the two-valued semantics of abstract argumentation frameworks, normal logic programs and abstract dialectical frameworks. By expressiveness we mean the ability to encode a desired set of two-valued interpretations over a given propositional signature using only atoms from that signature. While the computational complexity of the two-valued model existence problem for all these languages is (almost) the same, we show that the languages form a neat hierarchy with respect to their expressiveness. Comment: Proceedings of the 15th International Workshop on Non-Monotonic Reasoning (NMR 2014).
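    The two-valued (stable) semantics for argumentation frameworks can be made concrete with a small brute-force enumerator (a sketch under the usual definitions; the function name and examples are illustrative, not from the paper):

        from itertools import combinations

        def stable_extensions(arguments, attacks):
            """Enumerate stable extensions of the framework (arguments, attacks).

            A set S is stable iff it is conflict-free (no attack between
            members of S) and attacks every argument outside S.
            """
            args = sorted(arguments)
            for r in range(len(args) + 1):
                for subset in combinations(args, r):
                    s = set(subset)
                    conflict_free = not any((a, b) in attacks for a in s for b in s)
                    attacks_outside = all(any((a, b) in attacks for a in s)
                                          for b in set(args) - s)
                    if conflict_free and attacks_outside:
                        yield s

        # A mutual attack has two stable extensions; an odd cycle has none,
        # so even the empty set of two-valued models is realizable.
        print(list(stable_extensions({"a", "b"}, {("a", "b"), ("b", "a")})))
        print(list(stable_extensions({"a", "b", "c"},
                                     {("a", "b"), ("b", "c"), ("c", "a")})))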

    On the Existence of Characterization Logics and Fundamental Properties of Argumentation Semantics

    Given the large variety of existing logical formalisms it is of utmost importance to select the most adequate one for a specific purpose, e.g. for representing the knowledge relevant for a particular application or for using the formalism as a modeling tool for problem solving. Awareness of the nature of a logical formalism, in other words, of its fundamental intrinsic properties, is indispensable and provides the basis of an informed choice. One such intrinsic property of logic-based knowledge representation languages is the context-dependency of pieces of knowledge. In classical propositional logic, for example, there is no such context-dependence: whenever two sets of formulas are equivalent in the sense of having the same models (ordinary equivalence), then they are mutually replaceable in arbitrary contexts (strong equivalence). However, a large number of commonly used formalisms are not like classical logic, which leads to a series of interesting developments. It turned out that sometimes, to characterize strong equivalence in a formalism L, we can use ordinary equivalence in another formalism L′: for example, strong equivalence in normal logic programs under stable models can be characterized by the standard semantics of the logic of here-and-there. Such results about the existence of characterizing logics have rightly been recognized as important for the study of concrete knowledge representation formalisms and raise a fundamental question: Does every formalism have one? In this thesis, we answer this question with a qualified “yes”. More precisely, we show that the important case of considering only finite knowledge bases guarantees the existence of a canonical characterizing formalism. Furthermore, we argue that those characterizing formalisms can be seen as classical, monotonic logics which are uniquely determined (up to isomorphism) with regard to their model theory.

    The other main part of this thesis is devoted to argumentation semantics, which play the flagship role in Dung’s abstract argumentation theory. Almost all of them are motivated by an easily understandable intuition of what should be acceptable in the light of conflicts. However, although these intuitions equip us with short and comprehensible formal definitions, it turned out that their intrinsic properties such as existence and uniqueness, expressibility, replaceability and verifiability are not that easily accessible. We review the mentioned properties for almost all semantics available in the literature. In doing so we follow two main axes: first, the distinction between extension-based and labelling-based versions, and second, the distinction between different kinds of argumentation frameworks, such as finite or unrestricted ones.
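    The context-dependence described above is the classic gap between ordinary and strong equivalence; a standard example from logic programming under stable models (well known in the literature, not taken from the thesis):

        P_1 = \{\, p \leftarrow \mathit{not}\; q \,\}, \qquad P_2 = \{\, p \,\}

    Both programs have the single stable model {p}, hence they are ordinarily equivalent. But in the context R = {q}, the program P_1 ∪ R has the stable model {q} (the rule is blocked), while P_2 ∪ R has {p, q}; so P_1 and P_2 are not strongly equivalent, and indeed the logic of here-and-there distinguishes them.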

    What Can You Say? Measuring the Expressive Power of Languages

    There are many different ways to talk about the world. Some ways of talking are more expressive than others—that is, they enable us to say more things about the world. But what exactly does this mean? When is one language able to express more about the world than another? In my dissertation, I systematically investigate different ways of answering this question and develop a formal theory of expressive power, translation, and notational variance. In doing so, I show how these investigations help to clarify the role that expressive power plays within debates in metaphysics, logic, and the philosophy of language.