14 research outputs found

    Automatic Proof Checking and Proof Construction by Tactics

    In this note we compare two kinds of systems that verify the correctness of mathematical developments: proof checking and proof construction by tactics, and we propose to merge them into a single system.

    A Focused Sequent Calculus Framework for Proof Search in Pure Type Systems

    Basic proof-search tactics in logic and type theory can be seen as the root-first applications of rules in an appropriate sequent calculus, preferably without the redundancies generated by permutation of rules. This paper addresses the issues of defining such sequent calculi for Pure Type Systems (PTS, which were originally presented in natural deduction style) and then organizing their rules for effective proof-search. We introduce the idea of Pure Type Sequent Calculus with meta-variables (PTSCα), by enriching the syntax of a permutation-free sequent calculus for propositional logic due to Herbelin, which is strongly related to natural deduction and already well adapted to proof-search. The operational semantics is adapted from Herbelin's and is defined by a system of local rewrite rules as in cut-elimination, using explicit substitutions. We prove confluence for this system. Restricting our attention to PTSC, a type system for the ground terms of this system, we obtain the Subject Reduction property and show that each PTSC is logically equivalent to its corresponding PTS, and the former is strongly normalising iff the latter is. We show how to make the logical rules of PTSC into a syntax-directed system PS for proof-search, by incorporating the conversion rules as in syntax-directed presentations of the PTS rules for type-checking. Finally, we consider how to use the explicitly scoped meta-variables of PTSCα to represent partial proof-terms, and use them to analyse interactive proof construction. This sets up a framework PE in which we are able to study proof-search strategies, type inhabitant enumeration and (higher-order) unification.
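    As a rough illustration of reading "tactics as root-first applications of sequent rules" (this is not the PTSCα calculus of the paper; it is a toy, depth-bounded prover for the implicational fragment of intuitionistic propositional logic, and all names in it are invented for this sketch), one could write in Haskell:

    module ToySearch where

    -- Formulas of the implicational fragment: atoms and implications.
    data Formula
      = Atom String
      | Formula :-> Formula
      deriving (Eq, Show)

    infixr 5 :->

    -- | Try to prove  ctx |- goal  using at most `fuel` rule applications.
    -- Each clause applies one sequent rule from the conclusion upwards.
    prove :: Int -> [Formula] -> Formula -> Bool
    prove 0 _ _ = False
    -- Right implication rule: to prove a :-> b, assume a and prove b.
    prove fuel ctx (a :-> b) = prove (fuel - 1) (a : ctx) b
    -- Atomic goal: close by axiom, or use an implication hypothesis
    -- (a crude left rule; the fuel bound keeps the search finite).
    prove fuel ctx goal@(Atom _) =
      goal `elem` ctx || any usable ctx
      where
        usable (a :-> b) = prove (fuel - 1) ctx a
                        && prove (fuel - 1) (b : ctx) goal
        usable _         = False

    -- Example: the type of the K combinator, a -> b -> a, is provable.
    example :: Bool
    example = prove 10 [] (Atom "a" :-> Atom "b" :-> Atom "a")

    The point of permutation-free calculi such as Herbelin's is precisely to cut down the redundant rule interleavings that a naive search like this one explores.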

    Teaching Foundations of Computation and Deduction Through Literate Functional Programming and Type Theory Formalization

    We describe experiments in teaching fundamental informatics notions around mathematical structures for formal concepts, and effective algorithms to manipulate them. The major themes of lambda-calculus and type theory served as guides for the effective implementation of functional programming languages and higher-order proof assistants, appropriate for reflecting the theoretical material into effective tools to represent constructively the concepts and formally certify the proofs of their properties. Progressively, a literate programming and proving style replaced informal mathematics in the presentation of the material as executable course notes. The talk will evoke the various stages of (in)completion of the corresponding set of notes over the years, and tell how their elaboration proved to be essential to the discovery of fundamental results.

    Some Lambda Calculus and Type Theory Formalized

    "This paper is about our hobby." That is the first sentence of [MP93], the first report on our formal development of lambda calculus and type theory, written in autumn 1992. We have continued to pursue this hobby on and off ever since, and have developed a substantial body of formal knowledge, including Church-Rosser and standardizationtheorems for beta reduction, and the basic theory ofPure Type Systems (PTS) leading to the strengthening theorem and type checking algorithms for PTS. Some of this work is reported in [MP93, vBJMP94, Pol94b, Pol95]. In the present paper we survey this work, including some new proofs, and point out what we feel has been learned about the general issues of formalizing mathematics. On the technical side, we describe an abstract, and simplified, proof of standardization for beta reduction, not previously published, that doesnot mention redex positions or residuals. On the general issues, we emphasize the search for formal definitions that are convenient for formal proof and convincingly represent the intended informal concepts. The LEGO Proof Development System [LP92] was used to check the work in an implementation of the Extended Calculus of Constructions(ECC) with inductive types [Luo94]. LEGO is a refinement styleproof checker, publicly available by ftp and WWW, with a User's Manual [LP92] and a large collection of examples. Section 1.3 contains information on accessing the formal development described in this paper. Other interesting examples formalized in LEGO include program specification and data refinement [Luo91], strong normalization of System F [Alt93], synthetic domain theory [Reu95, Reu96], and operational semantics for imperative programs [Sch97]

    The Theory of LEGO

    LEGO is a computer program for interactive typechecking in the Extended Calculus of Constructions and two of its subsystems. LEGO also supports the extension of these three systems with inductive types. These type systems can be viewed as logics, and as meta languages for expressing logics, and LEGO is intended to be used for interactively constructing proofs in mathematical theories presented in these logics. I have developed LEGO over six years, starting from an implementation of the Calculus of Constructions by Gérard Huet. LEGO has been used for problems at the limits of our abilities to do formal mathematics. In this thesis I explain some aspects of the meta-theory of LEGO's type systems leading to a machine-checked proof that typechecking is decidable for all three type theories supported by LEGO, and to a verified algorithm for deciding their typing judgements, assuming only that they are normalizing. In order to do this, the theory of Pure Type Systems (PTS) is extended and formalized in LEGO. This extended example of a formally developed body of mathematics is described, both for its main theorems, and as a case study in formal mathematics. In many examples, I compare formal definitions and theorems with their informal counterparts, and with various alternative approaches, to study the meaning and use of mathematical language, and suggest clarifications in the informal usage. Having outlined a formal development far too large to be surveyed in detail by a human reader, I close with some thoughts on how the human mathematician's state of understanding and belief might be affected by possessing such a thing.

    Practical implementation of a dependently typed functional programming language

    Types express a program's meaning, and checking types ensures that a program has the intended meaning. In a dependently typed programming language types are predicated on values, leading to the possibility of expressing invariants of a program's behaviour in its type. Dependent types allow us to give more detailed meanings to programs, and hence be more confident of their correctness. This thesis considers the practical implementation of a dependently typed programming language, using the Epigram notation defined by McBride and McKinna. Epigram is a high level notation for dependently typed functional programming elaborating to a core type theory based on Luo's UTT, using Dybjer's inductive families and elimination rules to implement pattern matching. This gives us a rich framework for reasoning about programs. However, a naive implementation introduces several run-time overheads since the type system blurs the distinction between types and values; these overheads include the duplication of values, and the storage of redundant information and explicit proofs. A practical implementation of any programming language should be as efficient as possible; in this thesis we see how the apparent efficiency problems of dependently typed programming can be overcome and that in many cases the richer type information allows us to apply optimisations which are not directly available in traditional languages. I introduce three storage optimisations on inductive families; forcing, detagging and collapsing. I further introduce a compilation scheme from the core type theory to G-machine code, including a pattern matching compiler for elimination rules and a compilation scheme for efficient run-time implementation of Peano's natural numbers. We also see some low level optimisations for removal of identity functions, unused arguments and impossible case branches. As a result, we see that a dependent type theory is an effective base on which to build a feasible programming language.
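    As a rough, hedged sketch of where the redundant information comes from (this is Haskell with GADTs, not Epigram, and all names are invented for illustration), consider a length-indexed vector: its length index is fully determined by the spine of the vector, which is exactly the kind of duplication that optimisations such as forcing, detagging and collapsing remove.

    {-# LANGUAGE DataKinds, GADTs, KindSignatures, TypeFamilies #-}
    module Vec where

    data Nat = Z | S Nat

    -- Vectors indexed by their length.
    data Vec (n :: Nat) a where
      VNil  :: Vec 'Z a
      VCons :: a -> Vec n a -> Vec ('S n) a

    -- | The type rules out the empty vector, so no failure branch is
    -- needed; a dependently typed compiler can also drop the tag check.
    vhead :: Vec ('S n) a -> a
    vhead (VCons x _) = x

    type family Add (n :: Nat) (m :: Nat) :: Nat where
      Add 'Z     m = m
      Add ('S n) m = 'S (Add n m)

    -- | Appending tracks the sum of the lengths purely at the type
    -- level; the length indices never need to exist at run time.
    vappend :: Vec n a -> Vec m a -> Vec (Add n m) a
    vappend VNil         ys = ys
    vappend (VCons x xs) ys = VCons x (vappend xs ys)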

    Type checking and normalisation

    This thesis is about Martin-Löf's intuitionistic theory of types (type theory). Type theory is at the same time a formal system for mathematical proof and a dependently typed programming language. Dependent types are types which depend on data and therefore to type check dependently typed programming we need to perform computation (normalisation) in types. Implementations of type theory (usually some kind of automatic theorem prover or interpreter) have at their heart a type checker. Implementations of type checkers for type theory have at their heart a normaliser. In this thesis I consider type checking as it might form the basis of an implementation of type theory in the functional language Haskell and then normalisation in the more rigorous setting of the dependently typed languages Epigram and Agda. I investigate a method of proving normalisation called Big-Step Normalisation (BSN). I apply BSN to a number of calculi of increasing sophistication and provide machine checked proofs of meta theoretic properties.
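    As a rough illustration of the "normaliser at the heart of a type checker" (this is an untyped sketch in plain Haskell, not the typed, machine-checked Big-Step Normalisation developed in the thesis; every name below is invented), one can evaluate terms to closures and read the results back into normal forms:

    module Norm where

    -- Terms with de Bruijn indices.
    data Term
      = Var Int
      | Lam Term
      | App Term Term
      deriving (Eq, Show)

    -- Values: closures, or neutral terms (a variable, identified by its
    -- de Bruijn *level*, applied to already-evaluated arguments).
    data Val
      = VLam [Val] Term
      | VNeutral Int [Val]

    -- | Big-step evaluation in an environment of values.  May diverge
    -- on terms without a normal form.
    eval :: [Val] -> Term -> Val
    eval env (Var i)   = env !! i
    eval env (Lam t)   = VLam env t
    eval env (App f a) = case eval env f of
      VLam cenv body  -> eval (eval env a : cenv) body
      VNeutral v args -> VNeutral v (args ++ [eval env a])

    -- | Read a value back into a term in normal form; `depth` counts
    -- the binders passed so far.
    quote :: Int -> Val -> Term
    quote depth (VLam env body) =
      Lam (quote (depth + 1) (eval (VNeutral depth [] : env) body))
    quote depth (VNeutral v args) =
      foldl App (Var (depth - v - 1)) (map (quote depth) args)

    -- | Normalise a closed term, e.g. (\x. x) (\y. y)  to  \y. y.
    normalise :: Term -> Term
    normalise = quote 0 . eval []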

    The Coq Proof Assistant : Reference Manual : Version 7.2

    Coq is a proof assistant based on a higher-order logic. Coq allows one to handle mathematical assertions and to mechanically check proofs of these assertions. It helps to find formal proofs, and allows extraction of a certified program from the constructive proof of its formal specification. This document is the reference manual for version V7.2 of Coq, which is available from http://coq.inria.fr