15 research outputs found

    Disproving in First-Order Logic with Definitions, Arithmetic and Finite Domains

    This thesis explores several methods which enable a first-order reasoner to conclude satisfiability of a formula modulo an arithmetic theory. The most general method requires restricting certain quantifiers to range over finite sets; such assumptions are common in the software verification setting. In addition, the use of first-order reasoning allows for an implicit representation of those finite sets, which can avoid scalability problems that affect other quantified reasoning methods. These new techniques form a useful complement to existing methods that are primarily aimed at proving validity. The Superposition calculus for hierarchic theory combinations provides a basis for reasoning modulo theories in a first-order setting. The recent account of ‘weak abstraction’ and related improvements make an implementation of the calculus practical. Also, for several logical theories of interest, Superposition is an effective decision procedure for the quantifier-free fragment. The first contribution is an implementation of that calculus (Beagle), including an optimized implementation of Cooper’s algorithm for quantifier elimination in the theory of linear integer arithmetic. This includes a novel means of extracting values for quantified variables in satisfiable integer problems. Beagle won an efficiency award at the CADE ATP System Competition CASC-J7, and won the arithmetic non-theorem category at CASC-25. This implementation is the starting point for solving the ‘disproving with theories’ problem. Some hypotheses can be disproved by showing that, together with the axioms, the hypothesis is unsatisfiable. Often this is relative to other axioms that enrich a base theory by defining new functions. In that case, the disproof is contingent on the satisfiability of the enrichment. Satisfiability in this context is undecidable. Instead, general characterizations of definition formulas which do not alter the satisfiability status of the main axioms are given. These general criteria apply to recursive definitions, definitions over lists, and definitions over arrays. This makes it possible to prove some non-theorems which are otherwise intractable, and justifies similar disproofs of non-linear arithmetic formulas. When the hypothesis is contingently true, disproof requires proving the existence of a model. If the Superposition calculus saturates a clause set, then a model exists, but only when the clause set satisfies a completeness criterion. This requires each instance of an uninterpreted, theory-sorted term to have a definition in terms of theory symbols. The second contribution is a procedure that creates such definitions, given that a subset of the quantifiers range over finite sets. Definitions are produced in a counter-example-driven way via a sequence of over- and under-approximations to the clause set. Two descriptions of the method are given: the first uses the component solver modularly, but has an inefficient counter-example heuristic; the second is more general, correcting many of the inefficiencies of the first, yet it requires tracking clauses through a proof. This latter method is shown to apply also to lists and to problems with unbounded quantifiers. Together, these tools give new ways of applying successful first-order reasoning methods to problems involving interpreted theories.
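
    The mention of Cooper's algorithm above can be made concrete with a small worked sketch. The Python fragment below is an illustrative toy, not Beagle's implementation: it decides sentences of the form "there exists an integer x that is strictly above every listed lower bound, strictly below every listed upper bound, and congruent to r modulo d for each listed pair (d, r)" by testing the finitely many candidate witnesses that Cooper's quantifier-elimination argument singles out; the function names are hypothetical.

    from math import lcm

    def exists_x(lowers, uppers, congruences):
        """Decide: exists integer x with a < x for all a in lowers,
        x < b for all b in uppers, and x == r (mod d) for all (d, r)."""
        delta = lcm(*[d for d, _ in congruences]) if congruences else 1

        def holds(x):
            return (all(a < x for a in lowers)
                    and all(x < b for b in uppers)
                    and all(x % d == r % d for d, r in congruences))

        if not lowers:
            # "Minus infinity" case: only upper bounds and congruences remain,
            # so it suffices to test delta consecutive values below every bound.
            base = (min(uppers) if uppers else 0) - delta
            return any(holds(base - t) for t in range(delta))

        # Otherwise a least solution lies at most delta above some lower bound.
        return any(holds(a + t) for a in lowers for t in range(1, delta + 1))

    # exists x. 3 < x < 7 and x even  ->  True (x = 4 or x = 6)
    print(exists_x([3], [7], [(2, 0)]))
    # exists x. 1 < x < 3 and 3 divides x  ->  False (only candidate is x = 2)
    print(exists_x([1], [3], [(3, 0)]))

    Beagle's procedure additionally extracts concrete witness values for quantified variables in satisfiable problems; the toy above only returns a truth value.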

    Elements of computability, decidability, and complexity (Third edition)

    These lecture notes are intended to introduce the reader to the basic notions of computability theory, decidability, and complexity. More information on these subjects can be found in classical books such as [Cut80, Dav58, Her69, HoU79, Rog67]. The results reported in these notes are taken from those books, and in various parts we closely follow their style of presentation. The reader is encouraged to consult those books to deepen his or her knowledge of these topics. Some parts of the chapter on complexity are taken from the lecture notes of a beautiful course given by Prof. Leslie Valiant at Edinburgh University, Scotland, in 1979. It was, indeed, a very stimulating and enjoyable course. For the notions of Predicate Calculus used in this book, the reader may refer to [Men87]. I would like to thank Dr. Maurizio Proietti at IASI-CNR (Roma, Italy), my colleagues, and my students at the University of Roma Tor Vergata and, in particular, Michele Martone. They have been a source of continuous inspiration and enthusiasm for me. Finally, I would like to thank Dr. Gioacchino Onorati and Lorenzo Costantini of the Aracne Publishing Company for their helpful cooperation.

    Proceedings of the Sixth International Workshop on Unification

    Swiss National Science Foundation; Austrian Federal Ministry of Science and Research; Deutsche Forschungsgemeinschaft (SFB 314); Christ Church, Oxford; Oxford University Computing Laboratory

    Pseudo-contractions as Gentle Repairs

    Updating a knowledge base to remove an unwanted consequence is a challenging task. Some of the original sentences must be either deleted or weakened in such a way that the sentence to be removed is no longer entailed by the resulting set. On the other hand, it is desirable that the existing knowledge be preserved as much as possible, minimising the loss of information. Several approaches to this problem can be found in the literature. In particular, when the knowledge is represented by an ontology, two different families of frameworks have been developed over the past decades, with numerous ideas in common but with little interaction between the communities: applications of AGM-like Belief Change and justification-based Ontology Repair. In this paper, we investigate the relationship between pseudo-contraction operations and gentle repairs. Both aim to avoid the complete deletion of sentences when replacing them with weaker versions is enough to prevent the entailment of the unwanted formula. We show the correspondence between concepts on both sides and investigate under which conditions they are equivalent. Furthermore, we propose a unified notation for the two approaches, which might contribute to the integration of the two areas.
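
    As a toy illustration of the problem described above (not the paper's pseudo-contraction or gentle-repair constructions), the Python sketch below computes the maximal subsets of a two-sentence propositional knowledge base that no longer entail an unwanted consequence; the formulas and the brute-force entailment check are hypothetical simplifications.

    from itertools import combinations, product

    def entails(kb, goal, atoms):
        # Brute-force propositional entailment over the listed atoms.
        for values in product([False, True], repeat=len(atoms)):
            env = dict(zip(atoms, values))
            if all(f(env) for f in kb) and not goal(env):
                return False
        return True

    # Knowledge base { p, p -> q }; unwanted consequence: q.
    p   = lambda e: e['p']
    p_q = lambda e: (not e['p']) or e['q']
    q   = lambda e: e['q']
    kb, atoms = [p, p_q], ['p', 'q']

    # Collect maximal subsets of the knowledge base that do not entail q.
    remainders = []
    for k in range(len(kb), -1, -1):
        for subset in combinations(kb, k):
            if not entails(list(subset), q, atoms):
                if not any(set(subset) <= set(r) for r in remainders):
                    remainders.append(subset)

    # Two maximal options survive: keep p (dropping p -> q), or keep p -> q.
    print([['p' if f is p else 'p->q' for f in r] for r in remainders])

    Pseudo-contractions and gentle repairs refine this picture by weakening sentences instead of discarding them outright, which is the correspondence the paper investigates.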

    Planning for behaviour-based robotic assembly: a logical framework


    An approximation and refinement approach to first-order automated reasoning

    With the goal of lifting model-based guidance from the propositional setting to first-order logic, I have developed an approximation-based theorem-proving approach built on counterexample-guided abstraction refinement. A given clause set is transformed into a simplified form in which satisfiability is decidable. This approximation extends the signature and preserves unsatisfiability: if the simplified clause set is satisfiable, so is the original clause set. A resolution refutation generated by a decision procedure on the simplified clause set can then either be lifted to a refutation of the original clause set, or it guides a refinement that excludes the previously found, unliftable refutation. In this way the approach is refutationally complete. The monadic shallow linear Horn fragment, which is the initial target of the approximation, is well known to be decidable. It was a long-standing open problem how to extend the fragment to the non-Horn case while preserving decidability, which would, for example, make it possible to express non-determinism in protocols. I have now proven decidability of the non-Horn monadic shallow linear fragment via ordered resolution. I further extend the clause language with a new type of constraints, called straight dismatching constraints. The extended clause language is motivated by an improved refinement step in the approximation-refinement framework. All needed operations on straight dismatching constraints take linear or linear-logarithmic time in the size of the constraint. Ordered resolution with straight dismatching constraints is sound and complete, and the monadic shallow linear fragment with straight dismatching constraints is decidable. I have implemented my approach on top of the SPASS theorem prover. On certain satisfiable problems, the implementation is able to beat established provers such as SPASS, iProver, and Vampire.
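
    The loop described in the first half of the abstract can be summarised by a generic sketch. The Python fragment below is hypothetical scaffolding, not the SPASS-based implementation: approximate, refute, lift and refine stand in for the calculus-specific operations the thesis develops.

    def approximation_refinement(clauses, approximate, refute, lift, refine):
        """Generic counterexample-guided abstraction-refinement loop."""
        while True:
            abstraction = approximate(clauses)  # simplified clause set with decidable satisfiability
            proof = refute(abstraction)         # run the decision procedure on the abstraction
            if proof is None:
                return "satisfiable"            # abstraction satisfiable => original satisfiable
            if lift(proof, clauses) is not None:
                return "unsatisfiable"          # the refutation replays on the original clauses
            clauses = refine(clauses, proof)    # exclude the unliftable refutation and retry

    Since first-order satisfiability is undecidable, the loop need not terminate on every satisfiable input; the completeness claim of the thesis is that unsatisfiable inputs are eventually refuted.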

    Programming with Specifications

    This thesis explores the use of specifications for the construction of correct programs. We go beyond their standard use as run-time assertions, and present algorithms, techniques and implementations for the tasks of 1) program verification, 2) declarative programming and 3) software synthesis. These results are made possible by our advances in the domains of decision procedure design and implementation. In the first part of this thesis, we present a decidability result for a class of logics that support user-defined recursive function definitions. Constraints in this class can encode expressive properties of recursive data structures, such as sortedness of a list, or balancing of a search tree. As a result, complex verification conditions can be stated concisely and solved entirely automatically. We also present a new decision procedure for a logic to reason about sets and constraints over their cardinalities. The key insight lies in a technique to decompose constraints according to mutual dependencies. Compared to previous techniques, our algorithm brings significant improvements in running times, and for the first time integrates reasoning about cardinalities within the popular DPLL(T) setting. We integrated our algorithmic advances into Leon, a static analyzer for functional programs. Leon can reason about constraints involving arbitrary recursive function definitions, and has the desirable theoretical property that it will always find counter-examples to assertions that do not hold. We illustrate the flexibility and efficiency of Leon through experimental evaluation, where we used it to prove detailed correctness properties of data structure implementations. We then illustrate how program specifications can be used as a high-level programming construct; we present Kaplan, an extension of Scala with first-class logical constraints. Kaplan allows programmers to create, manipulate and combine constraints as they would any other data structure. Our implementation of Kaplan illustrates how declarative programming can be incorporated into an existing mainstream programming language. Moreover, we examine techniques to transform, at compile-time, program specifications into efficient executable code. This approach of software synthesis combines the correctness benefits of declarative programming with the efficiency of imperative or functional programming.
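
    The set-and-cardinality reasoning mentioned above can be pictured with a toy reduction. The Python sketch below is a hypothetical, bounded illustration, not Leon's decision procedure and not the dependency-based decomposition the abstract describes: it merely shows the standard reduction of constraints over two sets to integer constraints on the sizes of the Venn regions "only A", "only B" and "A and B", solved here by small-range enumeration.

    from itertools import product

    def satisfiable(constraint, bound=5):
        # Enumerate small sizes for the three Venn regions; a real procedure
        # would hand the resulting integer constraints to an arithmetic solver.
        for only_a, only_b, both in product(range(bound + 1), repeat=3):
            card_a, card_b = only_a + both, only_b + both
            card_union, card_inter = only_a + only_b + both, both
            if constraint(card_a, card_b, card_union, card_inter):
                return True
        return False

    # |A u B| = |A| + |B| - |A n B| is valid: its negation has no small model.
    print(satisfiable(lambda a, b, u, i: u != a + b - i))                 # False
    # |A| = 2, |B| = 2, |A u B| = 3 is satisfiable (region sizes 1, 1, 1).
    print(satisfiable(lambda a, b, u, i: a == 2 and b == 2 and u == 3))   # True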

    Labelled superposition

    The work presented in this thesis consists of two parts. The first part presents a formalization of the splitting rule for case analysis in superposition and a proof of its correctness, as well as the integration into splitting of a novel approach to lemma learning, which allows the derivation of non-ground lemmas. The second part deals with the application of hierarchic superposition, and in particular superposition modulo linear arithmetic, SUP(LA), to the verification of timed and probabilistic timed systems. It contains the following three main contributions: firstly, a SUP(LA) decision procedure for reachability in timed automata, which is among the first decision procedures for free first-order logic modulo linear arithmetic; secondly, an automatic induction rule for SUP(LA) based on loop detection, whose application allows a generalization of the decidability result for timed automata to timed automata with unbounded integer variables; and thirdly, a formalism for modelling probabilistic timed systems with first-order background theories, as well as a novel approach for the analysis of such systems based on a reduction to model checking using SUP(LA) proof enumeration.
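
    The splitting rule formalised in the first part can be pictured with a small sketch. The Python fragment below is a hypothetical toy, not the formalization in the thesis: it checks whether a clause separates into two variable-disjoint parts C and D, in which case a superposition prover may branch on C and on D separately.

    def split(clause, vars_of):
        """clause: list of literals; vars_of(lit): set of variable names in lit."""
        first, rest = clause[0], clause[1:]
        part_c, seen = [first], set(vars_of(first))
        changed = True
        while changed:                            # pull literals sharing variables into C
            changed = False
            for lit in rest:
                if lit not in part_c and vars_of(lit) & seen:
                    part_c.append(lit)
                    seen |= vars_of(lit)
                    changed = True
        part_d = [lit for lit in rest if lit not in part_c]
        return (part_c, part_d) if part_d else None   # splittable only if D is nonempty

    vars_of = lambda lit: {c for c in lit if c in 'xyz'}
    print(split(['P(x)', 'Q(x)', 'R(y)'], vars_of))
    # (['P(x)', 'Q(x)'], ['R(y)'])  ->  branch on P(x) v Q(x), and on R(y)

    In the thesis, splitting is additionally combined with lemma learning, so that the case analysis can also yield non-ground lemmas.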