
    Hybrid type theory: a quartet in four movements

    This paper sings a song, one created by bringing together the work of four great names in the history of logic: Hans Reichenbach, Arthur Prior, Richard Montague, and Leon Henkin. Although the work of the first three of these authors has previously been combined, adding the ideas of Leon Henkin is what makes the combination work at the logical level. The present paper does not focus on the underlying technicalities (these can be found in Areces, Blackburn, Huertas, and Manzano [to appear]); rather, it focuses on the underlying instruments and the way they work together. We hope the reader will be tempted to sing along.

    On Equivalence and Canonical Forms in the LF Type Theory

    Decidability of definitional equality and conversion of terms into canonical form play a central role in the meta-theory of a type-theoretic logical framework. Most studies of definitional equality are based on a confluent, strongly normalizing notion of reduction. Coquand has considered a different approach, directly proving the correctness of a practical equivalence algorithm based on the shape of terms. Neither approach appears to scale well to richer languages with unit types or subtyping, and neither directly addresses the problem of conversion to canonical form. In this paper we present a new, type-directed equivalence algorithm for the LF type theory that overcomes the weaknesses of previous approaches. The algorithm is practical, scales to richer languages, and yields a new notion of canonical form sufficient for adequate encodings of logical systems. The algorithm is proved complete by a Kripke-style logical relations argument similar to that suggested by Coquand. Crucially, both the algorithm itself and the logical relations rely only on the shapes of types, ignoring dependencies on terms. Comment: 41 pages
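
    The key idea, driving the comparison by the shape of the type rather than by the terms themselves, can be illustrated in miniature. The Haskell sketch below does this for the simply typed lambda calculus; it is a deliberately small cousin of the paper's LF algorithm, not the algorithm itself, and every name in it is illustrative.

```haskell
-- Toy type-directed equivalence for the simply typed lambda calculus.
-- The type's shape drives the recursion: at function types we apply
-- both terms to a fresh variable (building in eta); at base type we
-- compare weak head normal forms structurally.

data Ty = Base | Arr Ty Ty deriving (Eq, Show)

-- Variables are integer names. We assume the Barendregt convention
-- (all bound names distinct and below the fresh-name counter passed
-- to equiv), so naive substitution does not capture.
data Tm = Var Int | Lam Int Tm | App Tm Tm deriving (Eq, Show)

subst :: Int -> Tm -> Tm -> Tm
subst x s (Var y)   = if x == y then s else Var y
subst x s (Lam y b) = if x == y then Lam y b else Lam y (subst x s b)
subst x s (App f a) = App (subst x s f) (subst x s a)

-- Weak head reduction (assumes well-typed, normalizing input).
whnf :: Tm -> Tm
whnf (App f a) = case whnf f of
  Lam x b -> whnf (subst x a b)
  f'      -> App f' a
whnf t = t

-- equiv n t1 t2 ty: decide t1 = t2 at type ty, with n a fresh-name
-- supply strictly above every name occurring in t1 and t2.
equiv :: Int -> Tm -> Tm -> Ty -> Bool
equiv n t1 t2 (Arr _ b) =
  equiv (n + 1) (App t1 (Var n)) (App t2 (Var n)) b
equiv n t1 t2 Base = structEq n (whnf t1) (whnf t2)

-- Structural comparison of weak head normal forms. For brevity the
-- arguments of neutral applications are compared at Base; a full
-- implementation recovers their type from the head's type.
structEq :: Int -> Tm -> Tm -> Bool
structEq _ (Var i)   (Var j)   = i == j
structEq n (App f a) (App g b) = structEq n f g && equiv n a b Base
structEq _ _         _         = False
```

    Because the function-type case eta-expands both sides before comparing, eta-equality holds by construction, which is one reason type-directed algorithms extend to unit types where purely reduction-based approaches struggle.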

    Guarded operations, refinement and simulation

    Simulation rules have long been used as an effective computational means to decide refinement relations in state-based formalisms. Here we investigate how they might be amended so as to decide the event-based notion of singleton failures refinement of abstract data types or processes whose operations have a "guarded" interpretation. As the results presented here and found elsewhere in the literature are so sensitive to the details of the definitions used, we have machine-checked our results.
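
    To give the flavour of simulation-based checking, here is a small hypothetical Haskell sketch that verifies ordinary strong simulation between finite labelled transition systems; it is illustrative only, and is not the singleton failures refinement or the guarded interpretation the paper studies.

```haskell
import Data.List (nub)

type State = Int
type Label = String
-- A finite labelled transition system as a list of transitions.
type LTS = [(State, Label, State)]

succs :: LTS -> State -> Label -> [State]
succs lts s a = [t | (s', a', t) <- lts, s' == s, a' == a]

outLabels :: LTS -> State -> [Label]
outLabels lts s = nub [a | (s', a, _) <- lts, s' == s]

-- isSimulation spec impl r: every implementation step from a pair in
-- r can be matched by a specification step that stays inside r. This
-- is the classic condition that simulation rules check step by step.
isSimulation :: LTS -> LTS -> [(State, State)] -> Bool
isSimulation spec impl r = all ok r
  where
    ok (c, a) =
      and [ any (\a' -> (c', a') `elem` r) (succs spec a lab)
          | lab <- outLabels impl c
          , c'  <- succs impl c lab ]
```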

    Analytic Tableaux for Simple Type Theory and its First-Order Fragment

    We study simple type theory with primitive equality (STT) and its first-order fragment EFO, which restricts equality and quantification to base types but retains lambda abstraction and higher-order variables. As a deductive system we employ a cut-free tableau calculus. We consider completeness, compactness, and existence of countable models. We prove these properties for STT with respect to Henkin models and for EFO with respect to standard models. We also show that the tableau system yields a decision procedure for three EFO fragments.
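
    The branching discipline of analytic tableaux is easiest to see for classical propositional logic. The Haskell toy below is in that spirit only, far below STT or EFO, but it shows the shape of such a decision procedure: rules decompose formulas until every branch either closes or yields a satisfying set of literals.

```haskell
-- Toy analytic tableau for classical propositional logic. Alpha
-- rules extend the current branch; beta rules split it; a branch
-- closes when it contains a contradictory pair.

data Form = Atom String | Neg Form | And Form Form | Or Form Form
  deriving (Eq, Show)

-- A branch is closed if it contains some f together with Neg f.
closed :: [Form] -> Bool
closed fs = any (\f -> Neg f `elem` fs) fs

-- sat lits todo: does some branch stay open? Each step removes one
-- connective, so the search terminates.
sat :: [Form] -> [Form] -> Bool
sat lits []     = not (closed lits)
sat lits (f:fs) = case f of
  Atom _        -> sat (f : lits) fs
  Neg (Atom _)  -> sat (f : lits) fs
  Neg (Neg g)   -> sat lits (g : fs)
  And g h       -> sat lits (g : h : fs)                   -- alpha
  Neg (Or g h)  -> sat lits (Neg g : Neg h : fs)           -- alpha
  Or g h        -> sat lits (g : fs) || sat lits (h : fs)  -- beta
  Neg (And g h) -> sat lits (Neg g : fs) || sat lits (Neg h : fs)

-- A formula is valid iff the tableau for its negation closes.
valid :: Form -> Bool
valid f = not (sat [] [Neg f])
```

    For example, valid (Or (Atom "p") (Neg (Atom "p"))) evaluates to True, because both branches of the tableau for the negation close.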

    A Case Study on Logical Relations using Contextual Types

    Proofs by logical relations play a key role to establish rich properties such as normalization or contextual equivalence. They are also challenging to mechanize. In this paper, we describe the completeness proof of algorithmic equality for simply typed lambda-terms by Crary where we reason about logically equivalent terms in the proof environment Beluga. There are three key aspects we rely upon: 1) we encode lambda-terms together with their operational semantics and algorithmic equality using higher-order abstract syntax 2) we directly encode the corresponding logical equivalence of well-typed lambda-terms using recursive types and higher-order functions 3) we exploit Beluga's support for contexts and the equational theory of simultaneous substitutions. This leads to a direct and compact mechanization, demonstrating Beluga's strength at formalizing logical relations proofs.Comment: In Proceedings LFMTP 2015, arXiv:1507.0759
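
    The first ingredient, higher-order abstract syntax, can be previewed in any functional language, though only as a shallow approximation of what Beluga provides. A sketch in Haskell:

```haskell
-- Toy higher-order abstract syntax: the object language's binder
-- reuses the meta language's functions, so alpha-renaming and
-- substitution come for free.
data Tm = Lam (Tm -> Tm) | App Tm Tm

-- Beta reduction is literally function application of the body.
whnf :: Tm -> Tm
whnf (App f a) = case whnf f of
  Lam body -> whnf (body a)
  t        -> App t a
whnf t = t

-- (\x. x) (\y. y) reduces to \y. y.
example :: Tm
example = whnf (App (Lam (\x -> x)) (Lam (\y -> y)))
```

    What this shallow embedding cannot do conveniently, namely analyse open terms under a context of assumptions, is precisely what Beluga's contextual types and first-class simultaneous substitutions are for.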

    Gödel's Notre Dame Course

    This is a companion to a paper by the authors entitled "Gödel's natural deduction", which presented and commented on the natural deduction system in Gödel's unpublished notes for the elementary logic course he gave at the University of Notre Dame in 1939. In that earlier paper, which was itself a companion to a paper that examined the links between some philosophical views ascribed to Gödel and general proof theory, one can find a brief summary of Gödel's notes for the Notre Dame course. In order to put the earlier paper in proper perspective, a more complete summary of these interesting notes, with comments concerning them, is given here. Comment: 18 pages, minor additions; arXiv admin note: text overlap with arXiv:1604.0307

    Refinement Types for Logical Frameworks and Their Interpretation as Proof Irrelevance

    Refinement types sharpen systems of simple and dependent types by offering expressive means to classify well-typed terms more precisely. We present a system of refinement types for LF in the style of recent formulations where only canonical forms are well-typed. Both the usual LF rules and the rules for type refinements are bidirectional, leading to a straightforward proof of decidability of typechecking even in the presence of intersection types. Because we insist on canonical forms, structural rules for subtyping can now be derived rather than being assumed as primitive. We illustrate the expressive power of our system with examples and validate its design by demonstrating a precise correspondence with traditional presentations of subtyping. Proof irrelevance provides a mechanism for selectively hiding the identities of terms in type theories. We show that LF refinement types can be interpreted as predicates using proof irrelevance, establishing a uniform relationship between two previously studied concepts in type theory. The interpretation and its correctness proof are surprisingly complex, lending support to the claim that refinement types are a fundamental construct rather than just a convenient surface syntax for certain uses of proof irrelevance.
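
    Bidirectionality, splitting typing into a checking judgment and a synthesis judgment, is what makes decidability arguments direct. The Haskell sketch below shows the pattern for the plain simply typed lambda calculus; it is a baseline illustration with none of LF's dependencies, refinements, or intersection types.

```haskell
-- Toy bidirectional typechecker: synthesis (synth) infers a type at
-- the heads of applications; checking (check) pushes a known type
-- into introduction forms such as lambdas.

data Ty = Base | Arr Ty Ty deriving (Eq, Show)

data Tm
  = Var String
  | Lam String Tm   -- checked against Arr a b
  | App Tm Tm       -- synthesizes from the head
  | Ann Tm Ty       -- annotation: switch from checking to synthesis
  deriving Show

type Ctx = [(String, Ty)]

synth :: Ctx -> Tm -> Maybe Ty
synth ctx (Var x)   = lookup x ctx
synth ctx (Ann t a) = if check ctx t a then Just a else Nothing
synth ctx (App f e) = case synth ctx f of
  Just (Arr a b) | check ctx e a -> Just b
  _                              -> Nothing
synth _   (Lam _ _) = Nothing  -- unannotated lambdas only check

check :: Ctx -> Tm -> Ty -> Bool
check ctx (Lam x b) (Arr a c) = check ((x, a) : ctx) b c
check ctx t a                 = synth ctx t == Just a
```

    For instance, synth [] (Ann (Lam "x" (Var "x")) (Arr Base Base)) returns Just (Arr Base Base), while the unannotated lambda on its own can only be checked, never synthesized. Keeping the two judgments separate is what the paper exploits when adding refinements and intersections.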

    A Logical Foundation for Environment Classifiers

    Taha and Nielsen have developed a multi-stage calculus λα with a sound type system using the notion of environment classifiers. These are special identifiers with which code fragments and variable declarations are annotated, and their scoping mechanism is used to ensure statically that certain code fragments are closed and safely runnable. In this paper, we investigate the Curry-Howard isomorphism for environment classifiers by developing a typed λ-calculus λ▷. It corresponds to a multi-modal logic that allows quantification by transition variables (a counterpart of classifiers), which range over (possibly empty) sequences of labeled transitions between possible worlds. This interpretation reduces the "run" construct, which has a special typing rule in λα, and the embedding of closed code into code fragments of other stages, which in λα is realized only by the cross-stage persistence operator, to mere special cases of classifier application. λ▷ enjoys not only basic properties, including subject reduction, confluence, and strong normalization, but also an important property of a multi-stage calculus: time-ordered normalization of full reduction. We then develop a big-step evaluation semantics for an ML-like language based on λ▷ with its type system and prove that the evaluation of a well-typed λ▷ program is properly staged. We also identify a fragment of the language where erasure evaluation is possible. Finally, we show that the proof system augmented with a classical axiom is sound and complete with respect to a Kripke semantics of the logic.
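
    As a very rough intuition for staging, and only that (this toy models neither λα's classifiers nor λ▷'s transition variables, which exist precisely to make running open code fragments safe), one can picture "code of type a" as a typed wrapper in Haskell:

```haskell
-- Hypothetical toy: a stage-1 fragment computing an 'a'. In Haskell
-- every value is closed, so 'run' is trivially safe; the classifier
-- machinery does the real work of tracking closedness for fragments
-- built with free variables.
newtype Code a = Code a

run :: Code a -> a
run (Code x) = x

-- Cross-stage persistence: lift a stage-0 value into later-stage code.
csp :: a -> Code a
csp = Code

-- Splice a fragment into a larger fragment.
splice :: Code a -> (a -> b) -> Code b
splice (Code x) f = Code (f x)

main :: IO ()
main = print (run (splice (csp (41 :: Int)) (+ 1)))  -- prints 42
```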