
    On a notion of abduction and relevance for first-order logic clause sets

    I propose techniques to help explain entailment and non-entailment in first-order logic, relying respectively on deductive and abductive reasoning. First, given an unsatisfiable clause set, one can ask which clauses are necessary for any possible deduction (\emph{syntactically relevant}), usable in some deduction (\emph{syntactically semi-relevant}), or unusable (\emph{syntactically irrelevant}). I propose a first-order formalization of this notion and demonstrate how it lifts to explaining an entailment with respect to an axiom set defined in certain description logic fragments. The formalization is accompanied by a semantic characterization via \emph{conflict literals} (contradictory simple facts): from an unsatisfiable clause set, a pair of conflict literals is always deducible. A \emph{relevant} clause is necessary to derive any conflict literal pair, a \emph{semi-relevant} clause is useful in deriving some conflict literal pair, and an \emph{irrelevant} clause is useful in deriving none. This paints a fuller picture of why an entailment holds than the predominant notion of a minimal unsatisfiable set can offer. The need to test whether a clause is (syntactically) semi-relevant leads to a generalization of a well-known resolution strategy: resolution equipped with the set-of-support (SOS) strategy is refutationally complete on a clause set $N$ with SOS $M$ if and only if there is a resolution refutation from $N \cup M$ using a clause in $M$. This result non-trivially strengthens the original formulation. Second, abductive reasoning helps find extensions of a knowledge base that entail a missing consequence (called the observation). This is useful not only for repairing incomplete knowledge bases but also for explaining possibly unexpected observations. I focus in particular on TBox abduction in the description logic \EL, which is lightweight yet prevalent in practice, and which is still a first-order logic fragment via a model-preserving translation scheme. The solution space can be huge or even infinite, so different kinds of minimality notions can help separate the wheat from the chaff. I argue that the existing notions are insufficient and introduce \emph{connection minimality}. This criterion offers an interpretation of Occam's razor in which hypotheses are accepted only when they help obtain the entailment without arbitrarily using axioms unrelated to the problem at hand. In addition, I provide a sound and complete first-order technique to compute connection-minimal hypotheses. The key technique relies on prime implicates. While the negation of a single prime implicate can already serve as a first-order hypothesis, a connection-minimal hypothesis obeying the \EL syntactic restrictions (a set of simple concept inclusions) requires a combination of them. Termination is proved by bounding the term depth of the prime implicates and considering only those that are also subset-minimal. I also present an evaluation on ontologies from the medical domain, using a prototype implementation with SPASS as the prime implicate generation engine.
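    To make the three relevance notions concrete, here is a small propositional illustration of my own (the thesis works with first-order clauses, but the distinctions already show up at this level). Consider the unsatisfiable clause set

    \[ N \;=\; \{\; P,\;\; \neg P,\;\; P \lor Q,\;\; \neg Q,\;\; R \;\}. \]

    Every refutation must use $\neg P$, since the set without it is satisfiable, so $\neg P$ is syntactically relevant. Each of $P$, $P \lor Q$ and $\neg Q$ occurs in some refutation but can also be avoided (for instance, resolving $P \lor Q$ with $\neg P$ and the result with $\neg Q$ refutes the set without the unit clause $P$), so all three are syntactically semi-relevant. $R$ occurs in no refutation and is syntactically irrelevant. A single minimal unsatisfiable subset such as $\{P, \neg P\}$ exposes none of these distinctions.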

    Worst-case Optimal Query Answering for Greedy Sets of Existential Rules and Their Subclasses

    The need for an ontological layer on top of data, associated with advanced reasoning mechanisms able to exploit the semantics encoded in ontologies, has been acknowledged both in the database and knowledge representation communities. We focus in this paper on the ontological query answering problem, which consists of querying data while taking ontological knowledge into account. More specifically, we establish complexities of the conjunctive query entailment problem for classes of existential rules (also called tuple-generating dependencies, Datalog+/- rules, or forall-exists-rules). Our contribution is twofold. First, we introduce the class of greedy bounded-treewidth sets (gbts) of rules, which covers guarded rules and their most well-known generalizations. We provide a generic algorithm for query entailment under gbts, which is worst-case optimal for combined complexity with or without bounded predicate arity, as well as for data complexity and query complexity. Second, we classify several gbts classes whose complexity was unknown with respect to combined complexity (with both unbounded and bounded predicate arity) and data complexity, obtaining a comprehensive picture of the complexity of existential rule fragments based on diverse guardedness notions. Upper bounds are provided by showing that the proposed algorithm is optimal for all of them.
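    As a point of reference, here is a textbook-style existential rule of my own (not taken from the paper), a tuple-generating dependency stating that every person has a parent who is a person:

    \[ \forall x \, \big( \mathrm{Person}(x) \rightarrow \exists y \, ( \mathrm{hasParent}(x,y) \land \mathrm{Person}(y) ) \big) \]

    The rule is guarded because its body consists of a single atom containing all universally quantified variables. Applying it forever creates an unbounded chain of new parents, so the chase need not terminate; decidability for gbts and its relatives rests instead on the derived facts forming a structure of bounded treewidth that can be built greedily.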

    A Lightweight Defeasible Description Logic in Depth: Quantification in Rational Reasoning and Beyond

    Description Logics (DLs) are increasingly successful knowledge representation formalisms, useful for any application requiring implicit derivation of knowledge from explicitly known facts. A prominent example domain benefiting from these formalisms since the 1990s is the biomedical field. This area contributes a vast amount of facts and relations between low- and high-level concepts, such as the constitution of cells or interactions between studied illnesses, their symptoms and remedies. DLs are well-suited for handling large formal knowledge repositories and computing inferable coherences throughout such data, relying on their well-founded first-order semantics. In particular, DLs of reduced expressivity have proven tremendously worthwhile for handling large ontologies due to their computational tractability. In spite of these assets and prevailing influence, classical DLs are not well-suited to adequately model some of the most intuitive forms of reasoning. The capability for defeasible reasoning is imperative for any field subject to incomplete knowledge and the motivation to complete it with typical expectations. When such default expectations receive contradicting evidence, a defeasible formalism is able to retract previously drawn, conflicting conclusions. Common examples include human reasoning or a default characterisation of properties in biology, such as the normal arrangement of organs in the human body. Treatment of such defeasible knowledge must be aware of exceptional cases - such as a human suffering from the congenital condition situs inversus - and must therefore accommodate the ability to retract defeasible conclusions in a non-monotonic fashion. Specifically tailored non-monotonic semantics have been investigated continuously for DLs over the past 30 years. A particularly promising approach is rooted in the research by Kraus, Lehmann and Magidor on preferential (propositional) logics and Rational Closure (RC). The biggest advantages of RC are its good behaviour in terms of formal inference postulates and the efficient computation of defeasible entailments, relying on a tractable reduction to classical reasoning in the underlying formalism. A major contribution of this work is a reorganisation of the core of this reasoning method into an abstract framework formalisation. This framework is then easily instantiated to provide the reduction method for RC in DLs, as well as for more advanced closure operators such as Relevant or Lexicographic Closure. In spite of their practical aptitude, we discovered that all reduction approaches fail to provide any defeasible conclusions for elements that occur only in the relational neighbourhood of the inspected elements. More explicitly, a distinguishing advantage of DLs over propositional logic is the capability to model binary relations and to describe aspects of a related concept in terms of existential and universal quantification. Previous approaches to RC (and more advanced closures) are not able to derive typical behaviour for the concepts that occur within such quantification. The main contribution of this work is to introduce stronger semantics for the lightweight DL EL_bot with the capability to infer the expected entailments, while maintaining a close relation to the reduction method. We achieve this by introducing a new kind of first-order interpretation that allocates defeasible information on its elements directly.
    This allows comparing the level of typicality of such interpretations in terms of the defeasible information satisfied at elements in the relational neighbourhood. A typicality preference relation then provides the means to single out those sets of models with maximal typicality. Based on this notion, we introduce two types of nested rational semantics, a sceptical and a selective variant, each capable of deriving the entailments missing under RC for arbitrarily nested quantified concepts. As a proof of the versatility of our new semantics, we also show that the stronger Relevant Closure can be imbued with typical information in the successors of binary relations. An extensive investigation into the computational complexity of our new semantics shows that the sceptical nested variant comes at considerable additional effort, while the selective variant stays within the complexity of classical reasoning in the underlying DL, which remains tractable in our case.
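    A standard flavour of the quantification gap discussed above, in my own rendering (writing $\sqsubseteq_{\mathrm{def}}$ for a defeasible concept inclusion; this is not the thesis's notation):

    \[ \mathit{Penguin} \sqsubseteq \mathit{Bird}, \qquad \mathit{Penguin} \sqcap \mathit{Flies} \sqsubseteq \bot, \qquad \mathit{Bird} \sqsubseteq_{\mathrm{def}} \mathit{Flies} \]

    Rational Closure concludes that a typical $\mathit{Bird}$ flies and treats $\mathit{Penguin}$ as exceptional, but for a quantified concept such as $\exists \mathit{owns}.\mathit{Bird}$ it derives nothing about the owned bird: the reduction assigns typicality only to the element under inspection, never to its role successors. The nested semantics introduced here are designed to also yield, defeasibly, $\exists \mathit{owns}.\mathit{Bird} \sqsubseteq \exists \mathit{owns}.\mathit{Flies}$.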

    Chord sequence patterns in OWL

    This thesis addresses the representation of, and reasoning on, musical knowledge in the Semantic Web. The Semantic Web is an evolving extension of the World Wide Web that aims at describing information distributed on the web in a machine-processable form. Existing approaches to modelling musical knowledge in the context of the Semantic Web have focused on metadata. The description of musical content and reasoning on it, as well as the integration of content descriptions and metadata, are yet open challenges. This thesis discusses the possibilities of representing musical knowledge in the Web Ontology Language (OWL), focusing on chord sequence representation, and presents and evaluates a newly developed solution. The solution consists of two main components. Ontological modelling patterns for musical entities such as notes and chords are introduced in the MEO ontology. A sequence pattern language and ontology (SEQ) has been developed that can express patterns in a form resembling regular expressions. As MEO and SEQ patterns both rewrite to OWL, they can be combined freely. Reasoning tasks such as instance classification, retrieval and pattern subsumption are then executable by standard Semantic Web reasoners. The expressiveness of SEQ has been studied, in particular in relation to grammars. The complexity of reasoning on SEQ patterns has been studied theoretically and empirically, and optimisation methods have been developed. There is still great potential for improvement if specific reasoning algorithms were developed to exploit the sequential structure, but the development of such algorithms is outside the scope of this thesis. MEO and SEQ have also been evaluated in several musicological scenarios. It is shown how patterns that are characteristic of musical styles can be expressed and how chord sequence data can be classified, demonstrating the use of the language in web retrieval and as an integration layer for different chord patterns and corpora. Furthermore, possibilities of using SEQ patterns for harmonic analysis are explored using grammars for harmony; both a hybrid system and a translation of limited context-free grammars into SEQ patterns have been developed. Finally, a distributed scenario is evaluated where SEQ and MEO are used in connection with DBpedia, following the Linked Data approach. The results show that applications are already possible and will benefit in the future from improved quality and compatibility of data sources as the Semantic Web evolves.
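    To give a flavour of the encoding idea, here is a purely illustrative sketch (the concept and role names are my own assumptions, not the actual SEQ vocabulary): a jazz ii-V-I progression in C major could be captured as nested existential restrictions,

    \[ \mathit{TwoFiveOne} \;\equiv\; \exists \mathit{startsWith}.\big( \mathit{Dm7} \sqcap \exists \mathit{next}.( \mathit{G7} \sqcap \exists \mathit{next}.\mathit{Cmaj7} ) \big) \]

    so that instance classification reduces to standard OWL reasoning (any sequence individual whose chain of successors matches the restriction is classified under $\mathit{TwoFiveOne}$) and pattern subsumption reduces to concept subsumption.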

    Completing the Is-a Structure in Description Logics Ontologies


    Foundations of Fuzzy Logic and Semantic Web Languages

    This book is the first to combine coverage of fuzzy logic and Semantic Web languages. It provides in-depth insight into fuzzy Semantic Web languages for readers who are not experts in fuzzy set theory and fuzzy logic, and it helps researchers of non-Semantic Web languages gain a better understanding of the theoretical fundamentals of Semantic Web languages. The first part of the book covers all the theoretical and logical aspects of classical (two-valued) Semantic Web languages. The second part explains how to generalize these languages to cope with fuzzy set theory and fuzzy logic.
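    As a taste of the generalisation covered in the second part, consider a textbook-style fuzzy example of mine (not quoted from the book). In a fuzzy DL, assertions hold to a degree in $[0,1]$ and conjunction is interpreted by a t-norm:

    \[ \mathit{Tall}(\mathit{john}) \geq 0.8, \qquad \mathit{Heavy}(\mathit{john}) \geq 0.6 \]

    Under Gödel semantics, conjunction is the minimum, giving $(\mathit{Tall} \sqcap \mathit{Heavy})(\mathit{john}) \geq \min(0.8, 0.6) = 0.6$, while the Łukasiewicz t-norm $\max(0, x + y - 1)$ yields only the weaker bound $0.4$.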
