
    Congruence types


    Proviola: A Tool for Proof Re-animation

    To improve on existing models of interaction with a proof assistant (PA), in particular for storage and replay of proofs, we introduce three related concepts, those of: a proof movie, consisting of frames which record both user input and the corresponding PA response; a camera, which films a user's interactive session with a PA as a movie; and a proviola, which replays a movie frame-by-frame to a third party. In this paper we describe the movie data structure and we discuss a prototype implementation of the camera and proviola based on the ProofWeb system. ProofWeb uncouples the interaction with a PA via a web-interface (the client) from the actual PA that resides on the server. Our camera films a movie by "listening" to the ProofWeb communication. The first reason for developing movies is to uncouple the reviewing of a formal proof from the PA used to develop it: the movie concept enables users to discuss small code fragments without the need to install the PA or to load a whole library into it. Other advantages include the possibility to develop a separate commentary track to discuss or explain the PA interaction. We assert that a combined camera+proviola provides a generic layer between a client (user) and a server (PA). Finally we claim that movies are the right type of data to be stored in an encyclopedia of formalized mathematics, based on our experience in filming the Coq standard library. Comment: Accepted for the 9th International Conference on Mathematical Knowledge Management (MKM 2010), 15 pages.
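
    The following is a minimal, hypothetical sketch of the movie idea described above, written in Haskell rather than as part of ProofWeb; the names Frame, Movie, record and replay are invented for illustration and are not the authors' implementation.

        -- Hypothetical sketch: a frame pairs one user command with the proof
        -- assistant's response; a movie is the recorded sequence of frames;
        -- the proviola replays them one by one.  Names are illustrative only.

        data Frame = Frame
          { command  :: String   -- what the user sent to the proof assistant
          , response :: String   -- what the proof assistant answered
          } deriving Show

        type Movie = [Frame]

        -- A "camera" would append a frame for every request/response pair it
        -- observes on the client/server connection.
        record :: Movie -> String -> String -> Movie
        record movie cmd resp = movie ++ [Frame cmd resp]

        -- A "proviola" replays the movie frame by frame, without needing the
        -- proof assistant or its libraries to be installed.
        replay :: Movie -> IO ()
        replay = mapM_ (\(Frame c r) -> putStrLn ("> " ++ c) >> putStrLn r)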

    A type system for Continuation Calculus

    Continuation Calculus (CC), introduced by Geron and Geuvers, is a simple foundational model for functional computation. It is closely related to lambda calculus and term rewriting, but it has no variable binding and no pattern matching. It is Turing complete and evaluation is deterministic. Notions like "call-by-value" and "call-by-name" computation are available by choosing appropriate function definitions: e.g. there is a call-by-value and a call-by-name addition function. In the present paper we extend CC with types, to be able to define data types in a canonical way, and functions over these data types, defined by iteration. Data type definitions follow the so-called "Scott encoding" of data, as opposed to the more familiar "Church encoding". The iteration scheme comes in two flavors: a call-by-value and a call-by-name iteration scheme. The call-by-value variant is a double negation variant of call-by-name iteration. The double negation translation allows one to move between call-by-name and call-by-value. Comment: In Proceedings CL&C 2014, arXiv:1409.259
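
    To illustrate the "Scott encoding" that the type system is built around, here is a small Haskell sketch of Scott-encoded natural numbers (plain Haskell standing in for Continuation Calculus; the definitions are standard textbook material, not taken from the paper).

        {-# LANGUAGE RankNTypes #-}
        -- A Scott numeral is its own case-analysis: it receives one branch
        -- for zero and one for successor, and simply selects the right one.

        newtype Nat = Nat { caseNat :: forall r. r -> (Nat -> r) -> r }

        zero :: Nat
        zero = Nat (\z _ -> z)

        suc :: Nat -> Nat
        suc n = Nat (\_ s -> s n)

        -- Unlike a Church numeral, a Scott numeral gives direct access to
        -- the predecessor without iterating over the whole number.
        pred' :: Nat -> Nat
        pred' n = caseNat n zero id

        -- Addition by recursion on the first argument; in the paper the
        -- analogous definitions come in call-by-name and call-by-value
        -- flavours.
        add :: Nat -> Nat -> Nat
        add m n = caseNat m n (\m' -> suc (add m' n))

        toInt :: Nat -> Int
        toInt n = caseNat n 0 (\m -> 1 + toInt m)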

    The logic and mathematics of occasion sentences

    The prime purpose of this paper is, first, to restore to discourse-bound occasion sentences their rightful central place in semantics and secondly, taking these as the basic propositional elements in the logical analysis of language, to contribute to the development of an adequate logic of occasion sentences and a mathematical (Boolean) foundation for such a logic, thus preparing the ground for more adequate semantic, logical and mathematical foundations of the study of natural language. Some of the insights elaborated in this paper have appeared in the literature over the past thirty years, and a number of new developments have resulted from them. The present paper aims at providing an integrated conceptual basis for this new development in semantics. In Section 1 it is argued that the reduction by translation of occasion sentences to eternal sentences, as proposed by Russell and Quine, is semantically and thus logically inadequate. Natural language is a system of occasion sentences, eternal sentences being merely boundary cases. The logic has fewer tasks than is standardly assumed, as it excludes semantic calculi, which depend crucially on information supplied by cognition and context and thus belong to cognitive psychology rather than to logic. For sentences to express a proposition and thus be interpretable and informative, they must first be properly anchored in context. A proposition has a truth value when it is, moreover, properly keyed in the world, i.e. is about a situation in the world. Section 2 deals with the logical properties of natural language. It argues that presuppositional phenomena require trivalence and presents the trivalent logic PPC3, with two kinds of falsity and two negations. It introduces the notion of Σ-space for a sentence A (or /A/, the set of situations in which A is true) as the basis of logical model theory, and the notion of /PA/ (the Σ-space of the presuppositions of A), functioning as a 'private' subuniverse for /A/. The trivalent Kleene calculus is reinterpreted as a logical account of vagueness, rather than of presupposition. PPC3 and the Kleene calculus are refinements of standard bivalent logic and can be combined into one logical system. In Section 3 the adequacy of PPC3 as a truth-functional model of presupposition is considered more closely and given a Boolean foundation. In a noncompositional extended Boolean algebra, three operators are defined: 1a for the conjoined presuppositions of a, ã for the complement of a within 1a, and â for the complement of 1a within Boolean 1. The logical properties of this extended Boolean algebra are axiomatically defined and proved for all possible models. Proofs are provided of the consistency and the completeness of the system. Section 4 is a provisional exploration of the possibility of using the results obtained for a new discourse-dependent account of the logic of modalities in natural language. The overall result is a modified and refined logical and model-theoretic machinery, which takes into account both the discourse-dependency of natural language sentences and the necessity of selecting a key in the world before a truth value can be assigned.
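
    As a rough illustration of trivalence, the following Haskell fragment implements the strong Kleene tables that the abstract contrasts with PPC3; PPC3's own tables, with two kinds of falsity and two negations, are given in the paper and are not reproduced here.

        -- Strong Kleene three-valued logic: false, undefined, true.
        data TV = F | U | T deriving (Eq, Ord, Show)

        notK :: TV -> TV
        notK T = F
        notK F = T
        notK U = U

        andK, orK :: TV -> TV -> TV
        andK = min   -- conjunction is the minimum in the order F < U < T
        orK  = max   -- disjunction is the maximum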

    Verifiable certificates for predicate subtyping

    Adding predicate subtyping to higher-order logic yields a very expressive language in which type-checking is undecidable, making the definition of a system of verifiable certificates challenging. This work presents a solution to this issue with a minimal formalization of predicate subtyping, named PVS-Core, together with a system of verifiable certificates for PVS-Core, named PVS-Cert. PVS-Cert is based on the introduction of proof terms and explicit coercions. Its design is similar to that of PTSs with dependent pairs, with the exception of the definition of conversion, which is based on a specific notion of reduction →β*, corresponding to β-reduction combined with the erasure of coercions. The use of this reduction instead of the more standard reduction →βσ allows us to establish a simple correspondence between PVS-Core and PVS-Cert. On the other hand, a type-checking algorithm is designed for PVS-Cert, built on proofs of type preservation of →βσ and strong normalization of both →βσ and →β*. Using these results, PVS-Cert judgements are used as verifiable certificates for predicate subtyping. In addition, the reduction →βσ is used to define a cut elimination procedure adapted to predicate subtyping. Its use to study the properties of predicate subtyping is illustrated with a proof of consistency.
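
    The following Haskell fragment is a hypothetical sketch of the coercion-erasure component of the reduction →β* described above; the term syntax and constructor names are invented for illustration and do not reproduce PVS-Cert.

        -- Terms carry explicit coercions (pairing a term with a proof of the
        -- predicate, or projecting the underlying term back out); erasing
        -- them yields a term of the coercion-free language.

        data Term
          = Var String
          | Lam String Term
          | App Term Term
          | Coerce Term Term      -- pair a term with a proof of the predicate
          | Uncoerce Term         -- project the underlying term back out
          deriving Show

        erase :: Term -> Term
        erase (Var x)      = Var x
        erase (Lam x b)    = Lam x (erase b)
        erase (App f a)    = App (erase f) (erase a)
        erase (Coerce t _) = erase t   -- drop the proof component
        erase (Uncoerce t) = erase t   -- drop the projection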

    From coinductive proofs to exact real arithmetic: theory and applications

    Based on a new coinductive characterization of continuous functions, we extract certified programs for exact real number computation from constructive proofs. The extracted programs construct and combine exact real number algorithms with respect to the binary signed digit representation of real numbers. The data type corresponding to the coinductive definition of continuous functions consists of finitely branching non-wellfounded trees describing when the algorithm writes and reads digits. We discuss several examples, including the extraction of programs for polynomials up to degree two and the definite integral of continuous maps.
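
    A small Haskell sketch of the two ingredients mentioned in the abstract, the signed digit representation and the read/write trees, under the assumption that digits range over {-1, 0, 1} and streams denote reals in [-1, 1]; it is not the extracted code.

        -- Signed digits -1, 0, 1 and digit streams for reals in [-1, 1].
        data Digit = M | Z | P deriving (Eq, Show)
        type Real01 = [Digit]

        -- Negation simply flips every digit of the stream.
        neg :: Real01 -> Real01
        neg = map flipD
          where flipD M = P
                flipD Z = Z
                flipD P = M

        -- A continuous map as a non-wellfounded tree: either write an output
        -- digit now, or read one input digit and continue in one of finitely
        -- many branches.
        data Fun = Write Digit Fun | Read (Digit -> Fun)

        -- Running such a tree against an input stream yields an output stream.
        run :: Fun -> Real01 -> Real01
        run (Write d f) xs       = d : run f xs
        run (Read k)    (x : xs) = run (k x) xs
        run (Read _)    []       = []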

    Taylor Series Revisited

    We propose a renovated approach to the use of Taylor expansions for providing polynomial approximations. We introduce a coinductive type scheme and finely-tuned operations that altogether constitute an algebra, where our multivariate Taylor expansions are first-class objects. As for applications, beyond providing classical expansions of integro-differential and algebraic expressions mixed with elementary functions, we demonstrate that solving ODEs and PDEs in a direct way, without external solvers, is also possible. We also discuss the possibility of computing certified errors within our scheme.
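
    As a point of comparison, here is a minimal Haskell sketch of univariate power series as lazy coefficient streams, in the spirit of the coinductive scheme described above; the paper's multivariate expansions and certified errors are not attempted here.

        -- Coefficients a0, a1, a2, ... of a formal power series.
        type Series = [Rational]

        addS :: Series -> Series -> Series
        addS = zipWith (+)

        mulS :: Series -> Series -> Series
        mulS (a:as) bbs@(b:bs) = a * b : addS (map (a *) bs) (mulS as bbs)
        mulS _ _ = []

        -- Formal integration with constant term c; thanks to laziness this
        -- lets us solve simple ODEs "directly", e.g. y' = y with y 0 = 1.
        integS :: Rational -> Series -> Series
        integS c as = c : zipWith (/) as [1 ..]

        expS :: Series
        expS = integS 1 expS     -- 1, 1, 1/2, 1/6, 1/24, ...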

    Bicategories in univalent foundations

    We develop bicategory theory in univalent foundations. Guided by the notion of univalence for (1-)categories studied by Ahrens, Kapulkin, and Shulman, we define and study univalent bicategories. To construct examples of those, we develop the notion of "displayed bicategories", an analog of displayed 1-categories introduced by Ahrens and Lumsdaine. Displayed bicategories allow us to construct univalent bicategories in a modular fashion. To demonstrate the applicability of this notion, we prove that several bicategories are univalent. Among these are the bicategory of univalent categories with families and the bicategory of pseudofunctors between univalent bicategories. Our work is formalized in the UniMath library of univalent mathematics.
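
    For orientation, the data of a displayed 1-category over a category C (the notion of Ahrens and Lumsdaine that displayed bicategories generalize) can be sketched as follows; this is only an analogy, not the paper's definition of displayed bicategories.

        % Data of a displayed category D over a category C; displayed
        % bicategories additionally place 2-cells over 2-cells.
        \begin{align*}
          D_{\mathrm{ob}}  &: \mathrm{ob}\,C \to \mathsf{Type}
            & \text{(objects over objects)}\\
          D_{\mathrm{mor}} &: D_{\mathrm{ob}}(a) \to D_{\mathrm{ob}}(b) \to (a \to_C b) \to \mathsf{Type}
            & \text{(morphisms over morphisms)}
        \end{align*}
        % together with identities and composition lying over those of C and
        % satisfying displayed versions of the category laws.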

    A Focused Sequent Calculus Framework for Proof Search in Pure Type Systems

    Basic proof-search tactics in logic and type theory can be seen as the root-first applications of rules in an appropriate sequent calculus, preferably without the redundancies generated by permutation of rules. This paper addresses the issues of defining such sequent calculi for Pure Type Systems (PTS, which were originally presented in natural deduction style) and then organizing their rules for effective proof-search. We introduce the idea of Pure Type Sequent Calculus with meta-variables (PTSCα), by enriching the syntax of a permutation-free sequent calculus for propositional logic due to Herbelin, which is strongly related to natural deduction and already well adapted to proof-search. The operational semantics is adapted from Herbelin's and is defined by a system of local rewrite rules as in cut-elimination, using explicit substitutions. We prove confluence for this system. Restricting our attention to PTSC, a type system for the ground terms of this system, we obtain the Subject Reduction property and show that each PTSC is logically equivalent to its corresponding PTS, and the former is strongly normalising iff the latter is. We show how to make the logical rules of PTSC into a syntax-directed system PS for proof-search, by incorporating the conversion rules as in syntax-directed presentations of the PTS rules for type-checking. Finally, we consider how to use the explicitly scoped meta-variables of PTSCα to represent partial proof-terms, and use them to analyse interactive proof construction. This sets up a framework PE in which we are able to study proof-search strategies, type inhabitant enumeration and (higher-order) unification.
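
    The "permutation-free" shape behind Herbelin-style calculi can be sketched in Haskell as terms whose heads carry explicit argument lists (spines); this untyped skeleton is illustrative only and is not PTSC itself.

        -- A head variable applied to an explicit list of arguments (a spine)
        -- corresponds directly to a natural deduction normal form.

        data Term  = Lam String Term        -- lambda abstraction
                   | Head String Spine      -- variable applied to a spine
                   deriving Show

        data Spine = Nil | Cons Term Spine  -- the list of arguments
                   deriving Show

        -- Translation to ordinary application-based syntax.
        data ND = NVar String | NLam String ND | NApp ND ND deriving Show

        toND :: Term -> ND
        toND (Lam x t)   = NLam x (toND t)
        toND (Head x sp) = go (NVar x) sp
          where go acc Nil         = acc
                go acc (Cons t ts) = go (NApp acc (toND t)) ts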

    Certified Exact Transcendental Real Number Computation in Coq

    Reasoning about real number expressions in a proof assistant is challenging. Several problems in theorem proving can be solved by using exact real number computation. I have implemented a library for reasoning and computing with complete metric spaces in the Coq proof assistant and used this library to build a constructive real number implementation including elementary real number functions and proofs of correctness. Using this library, I have created a tactic that automatically proves strict inequalities over closed elementary real number expressions by computation. Comment: This paper is to be part of the proceedings of the 21st International Conference on Theorem Proving in Higher Order Logics (TPHOLs 2008).
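
    The idea behind proving strict inequalities by computation can be sketched as follows, under the assumption that a real is given by rational approximations accurate to 2^-n; this Haskell fragment only illustrates the principle and is not the Coq library's tactic.

        -- A real number given as rational approximations: x n is within
        -- 2^-n of the value being represented.
        type CReal = Integer -> Rational

        -- Semi-decision procedure: returns True as soon as some precision
        -- witnesses x < y; diverges if x >= y (equality is never confirmed).
        lessThan :: CReal -> CReal -> Bool
        lessThan x y = go 0
          where
            go n
              | x n + eps < y n - eps = True        -- approximations separate
              | otherwise             = go (n + 1)
              where eps = 1 / 2 ^ n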