
    Kleene algebra with domain

    We propose Kleene algebra with domain (KAD), an extension of Kleene algebra with two equational axioms for a domain and a codomain operation, respectively. KAD considerably augments the expressiveness of Kleene algebra, in particular for the specification and analysis of state transition systems. We develop the basic calculus, discuss some related theories and present the most important models of KAD. We demonstrate applicability by two examples: first, an algebraic reconstruction of Noethericity and well-foundedness; second, an algebraic reconstruction of propositional Hoare logic. Comment: 40 pages
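    As an illustration of the kind of model the paper develops, the following is a minimal sketch (illustrative names, not the paper's notation) of the relational model: binary relations form a Kleene algebra, and the domain operation returns the subidentity on the states from which a relation is enabled. The two characteristic domain laws can then be checked on concrete relations.

        import qualified Data.Set as S

        type Rel a = S.Set (a, a)

        -- Relational composition (;)
        comp :: Ord a => Rel a -> Rel a -> Rel a
        comp r s = S.fromList [ (a, c) | (a, b)  <- S.toList r
                                       , (b', c) <- S.toList s, b == b' ]

        -- Domain as a test: the subidentity on the enabled states
        dom :: Ord a => Rel a -> Rel a
        dom r = S.fromList [ (a, a) | (a, _) <- S.toList r ]

        -- The two characteristic laws, checkable on concrete relations:
        law1 :: Ord a => Rel a -> Bool
        law1 r = comp (dom r) r == r                      -- d(x);x = x

        law2 :: Ord a => Rel a -> Rel a -> Bool
        law2 r s = dom (comp r s) == dom (comp r (dom s)) -- d(x;y) = d(x;d(y))

        -- e.g. law2 (S.fromList [(1,2)]) (S.fromList [(2,3),(4,5)]) == True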

    A Completeness Proof for A Regular Predicate Logic with Undefined Truth Value

    We provide a sound and complete proof system for an extension of Kleene's ternary logic to predicates. The concept of theory is extended with, for each function symbol, a formula that specifies when the function is defined. The notion of "is defined" is extended to terms and formulas via a straightforward recursive algorithm. The "is defined" formulas are constructed so that they themselves are always defined. The completeness proof relies on the Henkin construction. For each formula, precisely one of the formula, its negation, and the negation of its "is defined" formula is true on the constructed model. Many other ternary logics in the literature can be reduced to ours. Partial functions are ubiquitous in computer science and even in (in)equation solving in schools. Our work was motivated by an attempt to explain, precisely in terms of logic, typical informal methods of reasoning in such applications. Comment: 39 pages, 1 figure
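    For concreteness, here is a small sketch of the strong Kleene connectives that such a logic extends, together with an "is defined" operator that is two-valued by construction; the trichotomy check mirrors the property quoted above. The names K3 and delta are illustrative, not the paper's notation.

        data K3 = F | U | T deriving (Eq, Ord, Show)

        -- Under the order F < U < T, min and max give the strong Kleene tables
        kAnd, kOr :: K3 -> K3 -> K3
        kAnd = min
        kOr  = max

        kNot :: K3 -> K3
        kNot T = F
        kNot F = T
        kNot U = U

        -- "is defined": never U, i.e. defined by construction
        delta :: K3 -> K3
        delta U = F
        delta _ = T

        -- Exactly one of (v, not v, not (delta v)) is true, for every v:
        trichotomy :: K3 -> Bool
        trichotomy v = length (filter (== T) [v, kNot v, kNot (delta v)]) == 1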

    Expression Refinement

    This thesis presents a refinement calculus for expressions. The aim of refinement calculi is to make programming a mathematical activity and thereby improve the correctness of programs. To achieve this, a refinement calculus provides a formal language and a set of rules that allow transformations of the language's terms. To produce a correct program using a refinement calculus, the programmer writes a possibly non-algorithmic or inefficient term that nevertheless obviously describes the intended program. This term is the specification, and it is transformed into an efficient program by syntactic transformation, using the rules of the refinement calculus. This transformation is refinement.
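    As a toy illustration of the underlying idea (not the calculus developed in the thesis), a specification can be viewed as a relation between inputs and acceptable outputs, and a program refines it when every output it produces is acceptable; on a finite domain this can be checked directly.

        -- A specification relates each input to its acceptable outputs
        type Spec a b = a -> b -> Bool

        -- impl refines spec (on a finite test domain) iff every output
        -- impl produces is acceptable to spec
        refines :: Spec a b -> (a -> b) -> [a] -> Bool
        refines spec impl = all (\x -> spec x (impl x))

        -- Example spec: r is an integer square root of n
        sqrtSpec :: Spec Int Int
        sqrtSpec n r = r >= 0 && r * r <= n && n < (r + 1) * (r + 1)

        -- An obviously inefficient but obviously correct implementation
        isqrt :: Int -> Int
        isqrt n = last (takeWhile (\r -> r * r <= n) [0 ..])

        -- refines sqrtSpec isqrt [0 .. 100]  evaluates to True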

    Calculating correct compilers

    In this article we present a new approach to the problem of calculating compilers. In particular, we develop a simple but general technique that allows us to derive correct compilers from high-level semantics by systematic calculation, with all details of the implementation of the compilers falling naturally out of the calculation process. Our approach is based upon the use of standard equational reasoning techniques, and has been applied to calculate compilers for a wide range of language features and their combination, including arithmetic expressions, exceptions, state, various forms of lambda calculi, bounded and unbounded loops, non-determinism, and interrupts. All the calculations in the article have been formalised using the Coq proof assistant, which serves as a convenient interactive tool for developing and verifying the calculations.
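    For the simplest of the language features listed, arithmetic expressions, the end product of such a calculation looks like the following sketch (illustrative names; the article's own development is carried out in Coq): a stack-machine compiler together with the correctness equation that drives the derivation.

        data Expr = Val Int | Add Expr Expr

        eval :: Expr -> Int
        eval (Val n)   = n
        eval (Add x y) = eval x + eval y

        data Op = PUSH Int | ADD
        type Code  = [Op]
        type Stack = [Int]

        exec :: Code -> Stack -> Stack
        exec []           s           = s
        exec (PUSH n : c) s           = exec c (n : s)
        exec (ADD : c)    (m : n : s) = exec c (n + m : s)
        exec _            s           = s   -- stuck configurations unchanged

        comp :: Expr -> Code
        comp (Val n)   = [PUSH n]
        comp (Add x y) = comp x ++ comp y ++ [ADD]

        -- The correctness property from which comp is calculated:
        --   exec (comp e) s == eval e : s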

    Nominal Matching Logic


    The Entropic Dynamics approach to Quantum Mechanics

    Entropic Dynamics (ED) is a framework in which Quantum Mechanics is derived as an application of entropic methods of inference. In ED the dynamics of the probability distribution is driven by entropy subject to constraints that are codified into a quantity later identified as the phase of the wave function. The central challenge is to specify how those constraints are themselves updated. In this paper we review and extend the ED framework in several directions. A new version of ED is introduced in which particles follow smooth differentiable Brownian trajectories (as opposed to non-differentiable Brownian paths). To construct the ED we make use of the fact that the space of probabilities and phases has a natural symplectic structure (i.e., it is a phase space with Hamiltonian flows and Poisson brackets). Then, using an argument based on information geometry, a metric structure is introduced. It is shown that the ED that preserves the symplectic and metric structures -- which is a Hamilton-Killing flow in phase space -- is the linear Schrödinger equation. These developments allow us to discuss why wave functions are complex and the connections between the superposition principle, the single-valuedness of wave functions, and the quantization of electric charges. Finally, it is observed that Hilbert spaces are not necessary ingredients in this construction. They are a clever but merely optional trick that turns out to be convenient for practical calculations. Comment: 49 pages. Added a section comparing Entropic Dynamics with Quantum Bayesianism. Added references and corrected typos
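    The step from the (probability, phase) pair to the wave function rests on a standard Madelung-type change of variables; stated here as a general assumption about such constructions rather than a quotation from the paper, the density rho and phase Phi combine into a complex wave function, and the structure-preserving flow then takes the familiar linear form:

        % Madelung-type variables: \rho and \Phi combine into \Psi, and the
        % Hamilton-Killing flow is the linear Schroedinger equation.
        \[
          \Psi = \rho^{1/2}\, e^{i\Phi/\hbar},
          \qquad
          i\hbar\, \partial_t \Psi
            = -\frac{\hbar^2}{2m}\, \nabla^2 \Psi + V\, \Psi .
        \]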

    On the mechanisation of the logic of partial functions

    PhD Thesis. It is well known that partial functions arise frequently in formal reasoning about programs. A partial function may not yield a value for every member of its domain. Terms that apply partial functions thus may not denote, and coping with such terms is problematic in two-valued classical logic. This raises a question: how can reasoning about logical formulae that can contain references to terms that may fail to denote (partial terms) be conducted formally? Over the years a number of approaches to coping with partial terms have been documented. Some of these approaches attempt to stay within the realm of two-valued classical logic, while others are based on non-classical logics. However, as yet there is no consensus on which approach is best. A comparison of numerous approaches to coping with partial terms is presented, based upon formal semantic definitions. One approach that has received attention over the years is the Logic of Partial Functions (LPF), the logic underlying the Vienna Development Method. LPF is a non-classical three-valued logic designed to cope with partial terms, where both terms and propositions may fail to denote. As opposed to using concrete undefined values, undefinedness is treated as a "gap", that is, the absence of a defined value. LPF is based upon Strong Kleene logic, where the interpretations of the logical operators are extended to cope with truth-value "gaps". Over the years a large body of research and engineering has gone into the development of proof-based tool support for two-valued classical logic. This has created a major obstacle to the adoption of LPF, since such proof support cannot be carried over directly. Presently, there is a lack of direct proof support for LPF. An aim of this work is to investigate the applicability of mechanised (automated) proof support for reasoning in LPF about logical formulae that can contain references to partial terms. The focus of the investigation is the basic but fundamental two-valued classical proof procedure of resolution and the associated technique of proof by contradiction. Advanced proof techniques are built on the foundation provided by these basic techniques, so examining their behaviour in LPF is the essential starting point for investigating proof support for LPF. The work highlights the issues that arise when applying these techniques in LPF and investigates the extent of the modifications needed to carry them over. This provides the essential foundation on which to build research into adapting advanced proof techniques to LPF.
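    A concrete instance of the obstacle described above: in a Strong-Kleene-based logic such as LPF, the law of the excluded middle is not valid, which is exactly what blocks a direct transfer of proof by contradiction. The following self-contained sketch (illustrative names) makes the failure visible.

        data K3 = F | U | T deriving (Eq, Show)

        kOr :: K3 -> K3 -> K3
        kOr T _ = T
        kOr _ T = T
        kOr F F = F
        kOr _ _ = U

        kNot :: K3 -> K3
        kNot T = F
        kNot F = T
        kNot U = U

        excludedMiddle :: K3 -> K3
        excludedMiddle p = p `kOr` kNot p

        -- excludedMiddle T == T and excludedMiddle F == T, but
        -- excludedMiddle U == U: when p falls in a truth-value "gap",
        -- neither p nor its negation holds, so refuting the negation
        -- of a goal does not establish the goal itself.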

    Formalizing a hierarchical file system

    An abstract file system is defined here as a partial function from (absolute) paths to data. Such a file system determines the set of valid paths. It allows the file system to be read and written at a valid path, and it allows the system to be modified by the Unix operations for creation, removal, and moving of files and directories. We present abstract definitions (axioms) for these operations. This specification is refined towards a pointer implementation. The challenge is to have a natural abstraction function from the implementation to the specification, to define operations on the concrete store that behave in exactly the same way as the corresponding functions on the abstract store, and to prove these facts. To mitigate the problems attached to partial functions, we do this in two steps: first a refinement towards a pointer implementation with total functions, followed by one that allows partial functions. These two refinements are proved correct by means of a number of invariants. Indeed, the insights gained consist, on the one hand, of the invariants of the pointer implementation that are needed for the refinement functions, and on the other hand of the precise enabling conditions of the operations on the different levels of abstraction. Each of the three specification levels is enriched with a permission system for reading, writing, or executing, and the refinement relations between these permission systems are explored. Files and directories are distinguished from the outset, but this rarely affects our part of the specifications. All results have been verified with the proof assistant PVS: in particular, that the invariants are preserved by the operations, and that, where the invariants hold, the operations commute with the refinement functions.
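    At the most abstract of the three levels, the specification can be pictured as in the following sketch (illustrative names, using Haskell's Data.Map for the partial function; the paper's own development is in PVS): reads are partial, and each operation has an explicit enabling condition.

        import qualified Data.Map as M

        type Path = [String]          -- an absolute path as a component list
        type Data = String
        type FS   = M.Map Path Data   -- partial function from paths to data

        -- The set of valid paths is the domain, M.keys fs.

        readFS :: Path -> FS -> Maybe Data   -- partial: Nothing off the domain
        readFS = M.lookup

        writeFS :: Path -> Data -> FS -> FS  -- enabled only at an existing path
        writeFS p d fs
          | p `M.member` fs = M.insert p d fs
          | otherwise       = fs

        createFS :: Path -> Data -> FS -> FS -- enabled only at a fresh path
        createFS p d fs
          | p `M.member` fs = fs
          | otherwise       = M.insert p d fs

        removeFS :: Path -> FS -> FS
        removeFS = M.delete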