On the mechanisation of the logic of partial functions
PhD Thesis
It is well known that partial functions arise frequently in formal reasoning
about programs. A partial function may not yield a value for every member
of its domain. Terms that apply partial functions thus may not denote, and
coping with such terms is problematic in two-valued classical logic. A question
is raised: how can reasoning about logical formulae that can contain references
to terms that may fail to denote (partial terms) be conducted formally? Over
the years a number of approaches to coping with partial terms have been
documented. Some of these approaches attempt to stay within the realm
of two-valued classical logic, while others are based on non-classical logics.
However, as yet there is no consensus on which approach is the best one to
use. A comparison of numerous approaches to coping with partial terms is
presented based upon formal semantic definitions.
One approach to coping with partial terms that has received attention over
the years is the Logic of Partial Functions (LPF), which is the logic underlying
the Vienna Development Method. LPF is a non-classical three-valued logic
designed to cope with partial terms, where both terms and propositions may
fail to denote. Rather than using concrete undefined values, undefinedness
is treated as a "gap", that is, the absence of a defined value. LPF is based
upon Strong Kleene logic, in which the interpretations of the logical operators
are extended to cope with truth-value "gaps".
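The Strong Kleene treatment of gaps can be sketched in a few lines; here the Python value None stands in for the gap, and the function names are illustrative rather than taken from any LPF tool:

```python
# A minimal sketch of the Strong Kleene connectives, with None standing
# in for a truth-value "gap". The names not3/and3/or3 are illustrative.

def not3(p):
    """Negation: the gap is preserved."""
    return None if p is None else (not p)

def and3(p, q):
    """Conjunction: False dominates, so one False operand decides the
    result even if the other operand is undefined."""
    if p is False or q is False:
        return False
    if p is None or q is None:
        return None
    return True

def or3(p, q):
    """Disjunction: dually, True dominates."""
    if p is True or q is True:
        return True
    if p is None or q is None:
        return None
    return False
```

For example, `or3(True, None)` is True: a disjunction is defined as soon as one disjunct settles the outcome, which is how a formula such as x = 0 ∨ 1/x > 0 can receive a truth value even when 1/x fails to denote.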
Over the years a large body of research and engineering has gone into the
development of proof-based tool support for two-valued classical logic. This
has created a major obstacle that affects the adoption of LPF, since such proof
support cannot be carried over directly to LPF. Presently, there is a lack of
direct proof support for LPF.
An aim of this work is to investigate the applicability of mechanised (automated)
proof support for reasoning about logical formulae that can contain
references to partial terms in LPF. The investigation focuses on a basic but
fundamental proof procedure of two-valued classical logic, resolution, and the
associated technique of proof by contradiction. Since advanced proof techniques
are built on the foundation provided by these basic ones, examining their impact
in LPF is the natural starting point for investigating proof support for LPF.
The work highlights the issues that arise when applying these techniques in
LPF, and investigates the extent of the modifications needed to carry them
over. It thereby provides the essential foundation for research into adapting
advanced proof techniques to LPF.
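As a point of reference, the classical two-valued procedure in question can be sketched compactly; this is ordinary propositional resolution with refutation, not an LPF-adapted version, and the literal encoding is illustrative:

```python
# A compact sketch of classical propositional resolution: clauses are
# frozensets of literals (a positive int is an atom, its negation the
# corresponding negative int); deriving the empty clause from the
# negated goal completes a proof by contradiction.

def resolve(c1, c2):
    """All resolvents of two clauses on complementary literals."""
    out = []
    for lit in c1:
        if -lit in c2:
            out.append((c1 - {lit}) | (c2 - {-lit}))
    return out

def refute(clauses):
    """Saturate under resolution; True iff the empty clause is derived."""
    clauses = set(map(frozenset, clauses))
    while True:
        new = set()
        for c1 in clauses:
            for c2 in clauses:
                for r in resolve(c1, c2):
                    if not r:          # empty clause: contradiction found
                        return True
                    new.add(frozenset(r))
        if new <= clauses:             # saturated without contradiction
            return False
        clauses |= new

# Proving p from {p or q, not q} by refuting the added negation not p:
# refute([{1, 2}, {-2}, {-1}]) returns True
```

The correctness of this refutation step rests on the law of the excluded middle, which is exactly what becomes delicate once formulae may lack a truth value.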
Proving Well-Definedness of JML Specifications with KeY
Specification methods in formal program verification enable source code to be enhanced with formal annotations that specify the behaviour of a program. This is a popular way to subsequently prove that software is reliable and meets certain requirements, which is crucial for many applications and gains ever more importance in modern society. The annotations can be read as a contract, which can then be verified, guaranteeing that the specified program element, as a receiver, fulfils this contract with its caller. However, such functional contracts can be problematic for partial functions, e.g. division, as certain cases may be undefined, in this example division by zero. Modern programming languages such as Java handle undefined behaviour by throwing an exception. There are several approaches to handling potential undefinedness in specifications. In this thesis, we chose one which automatically generates formal proof obligations ensuring that undefined specification expressions will not be evaluated. Within this work, we elaborate on so-called Well-Definedness Checks, which deal with undefinedness occurring in specifications written in the modelling language JML/JML* in the KeY System, a formal software development tool providing mechanisms to deductively prove the aforementioned contracts. Advantages and limitations are discussed and, furthermore, precise definitions as well as a fully functional implementation within KeY are given. Our work covers the major part of the specification elements currently supported by KeY, including class invariants, model fields, method contracts, loop statements and block contracts. Checking the well-definedness of a specification forms a preliminary step before the actual proof and rejects undefined specifications. We further contribute by giving a choice between two different semantics, each with its own advantages and disadvantages. The thesis also includes an extensive case study analysing many examples and measuring the performance of the implemented Well-Definedness Checks.
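The core idea behind such checks, generating a condition under which a specification expression is defined, can be sketched generically; the tuple encoding and the `wd` function below are illustrative assumptions for this sketch, not KeY's actual API:

```python
# A sketch of a well-definedness check: traverse a specification
# expression and emit the condition under which it is defined.
# Expressions are tuples: ('div', a, b), ('and', a, b), ('neq', a, b),
# ('implies', a, b), ('lit', v), ('var', name).

def wd(e):
    """Well-definedness condition of expression e, itself an expression."""
    op = e[0]
    if op in ('lit', 'var'):
        return ('lit', True)              # atoms are always defined
    if op == 'div':
        a, b = e[1], e[2]
        # a/b is defined when both operands are and b is non-zero
        return conj(wd(a), conj(wd(b), ('neq', b, ('lit', 0))))
    if op == 'and':
        a, b = e[1], e[2]
        # short-circuit reading: b need only be defined when a holds
        return conj(wd(a), ('implies', a, wd(b)))
    if op in ('neq', 'implies'):
        return conj(wd(e[1]), wd(e[2]))
    raise ValueError(f'unknown operator {op!r}')

def conj(a, b):
    """Conjunction with trivial-case simplification."""
    if a == ('lit', True):
        return b
    if b == ('lit', True):
        return a
    return ('and', a, b)

# For x / y, the generated proof obligation is simply y != 0:
# wd(('div', ('var', 'x'), ('var', 'y')))
#   == ('neq', ('var', 'y'), ('lit', 0))
```

The `and` case above encodes a short-circuit reading; a symmetric (strong) reading would instead require both operands to be defined, which illustrates how the choice between two semantics mentioned in the abstract can play out.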
The Logic of the RAISE Specification Language
This paper describes the logic of the RAISE Specification Language, RSL. It explains the particular logic chosen for RAISE, and motivates this choice as suitable for a wide-spectrum language to be used for designs as well as initial specifications, and supporting imperative and concurrent specifications as well as applicative sequential ones. It also describes the logical definition of RSL, its axiomatic semantics, as well as the proof system for carrying out proofs.
A simple sequent calculus for partial functions
Usually, the extension of classical logic to a three-valued logic results in a complicated calculus, with side-conditions on the rules of logic in order to ensure consistency. One reason for the necessity of side-conditions is the presence of nonmonotonic operators. Another reason is the choice of consequence relation. Side-conditions severely violate the symmetry of the logic. By limiting the extension to monotonic cases and by choosing an appropriate consequence relation, a simple calculus for three-valued logic arises. The logic has strong correspondences to ordinary classical logic and, in particular, the symmetry of the Gentzen sequent calculus (LK) is preserved, leading to a simple proof of cut elimination.
Unifying Theories of Logics with Undefinedness
A relational approach to the question of how different logics relate formally is described. We consider three three-valued logics, as well as classical and semi-classical logic. A fundamental representation of three-valued predicates is developed in the Unifying Theories of Programming (UTP) framework of Hoare and He. On this foundation, the five logics are encoded semantically as UTP theories. Several fundamental relationships are revealed using theory-linking mechanisms, which corroborate results found in the literature, and which have direct applicability to the sound mixing of logics in order to prove facts. The initial development of the fundamental three-valued predicate model, on which the theories are based, is then applied to the novel systems-of-systems specification language CML, in order to reveal proof obligations which bridge a gap that exists between the semantics of CML and the existing semantics of one of its sub-languages, VDM. Finally, a detailed account is given of an envisioned model theory for our proposed structuring, which aims to lift the sentences of the five logics encoded to the second order, allowing them to range over elements of existing UTP theories of computation, such as designs and CSP processes. We explain how this would form a complete treatment of logic interplay that is expressed entirely inside UTP.
Mechanising an algebraic rely-guarantee refinement calculus
PhD Thesis
Despite rely-guarantee (RG) being a well-studied program logic established in the 1980s, it
was not until recently that researchers realised that rely and guarantee conditions could be
treated as independent programming constructs. This recent reformulation of RG paved the
way to algebraic characterisations which have helped to better understand the difficulties that
arise in the practical application of this development approach.
The primary focus of this thesis is to provide automated tool support for a rely-guarantee
refinement calculus proposed by Hayes et al., where rely and guarantee are defined as
independent commands. Our motivation is to investigate the application of an algebraic
approach to derive concrete examples using this calculus. In the course of this thesis, we
locate and fix a few issues involving the refinement language, its operational semantics and
preexisting proofs. Moreover, we extend the refinement calculus of Hayes et al. to cover
indexed parallel composition, non-atomic evaluation of expressions within specifications,
and assignment to indexed arrays. These extensions are illustrated via concrete examples.
Special attention is given to design decisions that simplify the application of the mechanised
theory. For example, we leave part of the design of the expression language in the
hands of the user, at the cost of requiring the user to define the notion of undefinedness
for unary and binary operators; we also formalise a notion of indexed parallelism that is
parametric in the type of the indexes, which is done deliberately to simplify the formalisation of
algorithms. Additionally, we use stratification to reduce the number of cases in simulation
proofs involving the operational semantics. Finally, we also use the algebra to discuss the
role of types in program derivation.
Proof-checking mathematical texts in controlled natural language
The research conducted for this thesis has been guided by the vision of a computer program that could check the correctness of mathematical proofs written in the language found in mathematical textbooks. Given that reliable processing of unrestricted natural language input is out of the reach of current technology, we focused on the attainable goal of using a controlled natural language (a subset of a natural language defined through a formal grammar) as the input language to such a program. We have developed a prototype of such a computer program, the Naproche system. This thesis is centered around the novel logical and linguistic theory needed for defining and motivating the controlled natural language and the proof checking algorithm of the Naproche system. This theory provides means for bridging the wide gap between natural and formal mathematical proofs. We explain how our system makes use of and extends existing linguistic formalisms in order to analyse the peculiarities of the language of mathematics. In this regard, we describe a phenomenon of this language not previously described by other logicians or linguists, implicit dynamic function introduction, exemplified by constructs of the form "for every x there is an f(x) such that ...". We show how this function introduction can lead to a paradox analogous to Russell's paradox. To tackle this problem, we developed a novel foundational theory of functions called Ackermann-like Function Theory, which is equiconsistent with ZFC (Zermelo-Fraenkel set theory with the Axiom of Choice) and can be used for imposing limitations on implicit dynamic function introduction in order to avoid this paradox.
We give a formal account of implicit dynamic function introduction by extending Dynamic Predicate Logic, a formalism developed by linguists to account for the dynamic nature of natural language quantification, to a novel formalism called Higher-Order Dynamic Predicate Logic, whose semantics is based on Ackermann-like Function Theory. Higher-Order Dynamic Predicate Logic also includes a formal account of the linguistic theory of presuppositions, which we use for clarifying and formally modelling the usage of potentially undefined terms (e.g. 1/x, which is undefined for x=0) and of definite descriptions (e.g. "the even prime number") in the language of mathematics. The semantics of the controlled natural language is defined through a translation from the controlled natural language into an extension of Higher-Order Dynamic Predicate Logic called Proof Text Logic. Proof Text Logic extends Higher-Order Dynamic Predicate Logic in two respects, which make it suitable for representing the content of mathematical texts: it contains features for representing complete texts rather than single assertions, and instead of being based on Ackermann-like Function Theory, it is based on a richer foundational theory called Class-Map-Tuple-Number Theory, which has not only maps/functions but also classes/sets, tuples, numbers and Booleans as primitives. The proof checking algorithm checks the deductive correctness of proof texts written in the controlled natural language of the Naproche system. Since the semantics of the controlled natural language is defined through a translation into the Proof Text Logic formalism, the proof checking algorithm is defined on Proof Text Logic input. The algorithm makes use of automated theorem provers for checking the correctness of single proof steps.
In this way, the proof steps in the input text do not need to be as fine-grained as in formal proof calculi, but may contain several reasoning steps at once, just as is usual in natural mathematical texts. The proof checking algorithm has to recognize implicit dynamic function introductions in the input text and has to take care of presuppositions of mathematical statements according to the principles of the formal account of presuppositions mentioned above. We prove two soundness and two completeness theorems for the proof checking algorithm: in each case one theorem compares the algorithm to the semantics of Proof Text Logic and one theorem compares it to the semantics of standard first-order predicate logic. As a case study for the theory developed in the thesis, we illustrate the working of the Naproche system on a controlled natural language adaptation of the beginning of Edmund Landau's Grundlagen der Analysis.
CASL for CafeOBJ Users
CASL is an expressive language for the algebraic specification of software requirements, design, and architecture. It has been developed by an open collaborative effort called CoFI (Common Framework Initiative for algebraic specification and development). CASL combines the best features of many previous mainstream algebraic specification languages, and it should provide a focus for future research and development in the use of algebraic techniques, as well as facilitating interoperability of existing and future tools. This paper presents CASL for users of the CafeOBJ framework, focusing on the relationship between the two languages. It first considers those constructs of CafeOBJ that have direct counterparts in CASL, and then (briefly) those that do not. It also motivates various CASL constructs that are not provided by CafeOBJ. Finally, it gives a concise overview of CASL, and illustrates how some CafeOBJ specifications may be expressed in CASL.