
    On Irrelevance and Algorithmic Equality in Predicative Type Theory

    Dependently typed programs contain an excessive number of static terms that are necessary to please the type checker but irrelevant for computation. To separate static and dynamic code, several static analyses and type systems have been put forward. We consider Pfenning's type theory with irrelevant quantification, which is compatible with a type-based notion of equality that respects eta-laws. We extend Pfenning's theory to universes and large eliminations and develop its meta-theory. Subject reduction, normalization and consistency are obtained by a Kripke model over the typed equality judgement. Finally, a type-directed equality algorithm is described whose completeness is proven by a second Kripke model. (Comment: 36 pages; supersedes the FoSSaCS 2011 paper of the first author, titled "Irrelevance in Type Theory with a Heterogeneous Equality Judgement".)
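
    To picture the distinction at stake, the following Haskell sketch shows an equality check that skips arguments in irrelevant positions. The toy syntax and all names are illustrative assumptions, not the paper's algorithm or calculus.

        data Relevance = Relevant | Irrelevant deriving (Eq, Show)

        -- De Bruijn terms whose binders and applications carry the
        -- relevance of the quantifier they belong to.
        data Term
          = Var Int
          | Lam Relevance Term
          | App Relevance Term Term
          deriving (Eq, Show)

        -- Equality that never inspects an irrelevantly applied argument,
        -- mirroring "irrelevant for computation": two irrelevant
        -- applications agree as soon as their heads do.
        eq :: Term -> Term -> Bool
        eq (Var i)              (Var j)               = i == j
        eq (Lam r b)            (Lam r' b')           = r == r' && eq b b'
        eq (App Irrelevant f _) (App Irrelevant f' _) = eq f f'   -- argument skipped
        eq (App Relevant f a)   (App Relevant f' a')  = eq f f' && eq a a'
        eq _ _ = False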

    On Equivalence and Canonical Forms in the LF Type Theory

    Decidability of definitional equality and conversion of terms into canonical form play a central role in the meta-theory of a type-theoretic logical framework. Most studies of definitional equality are based on a confluent, strongly normalizing notion of reduction. Coquand has considered a different approach, directly proving the correctness of a practical equivalence algorithm based on the shape of terms. Neither approach appears to scale well to richer languages with unit types or subtyping, and neither directly addresses the problem of conversion to canonical form. In this paper we present a new, type-directed equivalence algorithm for the LF type theory that overcomes the weaknesses of previous approaches. The algorithm is practical, scales to richer languages, and yields a new notion of canonical form sufficient for adequate encodings of logical systems. The algorithm is proved complete by a Kripke-style logical relations argument similar to that suggested by Coquand. Crucially, both the algorithm itself and the logical relations rely only on the shapes of types, ignoring dependencies on terms. (Comment: 41 pages.)
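
    The following Haskell sketch illustrates the general shape of such a type-directed check on a toy simply-typed calculus: at function type the algorithm tests both sides on a fresh variable (capturing eta), and only at base type does it compare weak head normal forms structurally. The syntax and helper names are assumptions, not the LF algorithm itself.

        data Ty = Base | Arrow Ty Ty deriving (Eq, Show)

        data Tm = Var String | Lam String Tm | App Tm Tm deriving (Eq, Show)

        -- Capture-avoiding substitution is elided: we assume binder names
        -- never clash with the fresh variables invented below.
        subst :: String -> Tm -> Tm -> Tm
        subst x s (Var y)   = if x == y then s else Var y
        subst x s (Lam y b) = if x == y then Lam y b else Lam y (subst x s b)
        subst x s (App f u) = App (subst x s f) (subst x s u)

        -- Weak head normalization: reduce only until the head is exposed.
        whnf :: Tm -> Tm
        whnf (App f u) = case whnf f of
          Lam x b -> whnf (subst x u b)
          f'      -> App f' u
        whnf t = t

        -- Equivalence directed by the shape of the type, not the terms:
        -- at function type, test both sides on a fresh variable (eta).
        equiv :: Int -> Ty -> Tm -> Tm -> Bool
        equiv n (Arrow _ b) s t =
          let x = Var ("v" ++ show n)        -- assumed fresh
          in equiv (n + 1) b (App s x) (App t x)
        equiv n Base s t = structEq n (whnf s) (whnf t)

        -- At base type, compare weak head normal forms structurally.
        -- (Comparing arguments at Base is a simplification; the real
        -- algorithm recovers the argument types from the head.)
        structEq :: Int -> Tm -> Tm -> Bool
        structEq _ (Var x)   (Var y)   = x == y
        structEq n (App f u) (App g v) = structEq n f g && equiv n Base u v
        structEq _ _ _ = False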

    Eta-Equivalence in Core Dependent Haskell


    Martin-Löf à la Coq

    We present an extensive mechanization of the meta-theory of Martin-Löf Type Theory (MLTT) in the Coq proof assistant. Our development builds on pre-existing work in Agda to show not only the decidability of conversion but also the decidability of type checking, using an approach guided by bidirectional type checking. From our proof of decidability, we obtain a certified and executable type checker for a full-fledged version of MLTT with support for Π, Σ, ℕ, and identity types, and one universe. Furthermore, our development does not rely on impredicativity, induction-recursion or any axiom beyond MLTT with a schema for indexed inductive types and a handful of predicative universes, narrowing the gap between the object theory and the meta-theory to a mere difference in universes. Finally, we explain our formalization choices, geared towards a modular development relying on Coq's features, e.g. the meta-programming facilities provided by tactics and universe polymorphism.
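
    As a rough illustration of the bidirectional discipline mentioned above, the following Haskell sketch splits type checking into synthesis (infer) and checking on a toy simply-typed language; it is an assumption-laden miniature, not the mechanized MLTT checker.

        data Ty = Nat | Fun Ty Ty deriving (Eq, Show)

        data Tm = Var String | Lam String Tm | App Tm Tm | Ann Tm Ty
          deriving Show

        type Ctx = [(String, Ty)]

        -- Synthesis: variables, annotations and applications produce a type.
        infer :: Ctx -> Tm -> Maybe Ty
        infer g (Var x)   = lookup x g
        infer g (Ann t a) = do check g t a; pure a
        infer g (App f u) = do
          Fun a b <- infer g f    -- the head must synthesize an arrow type
          check g u a             -- the argument is checked against the domain
          pure b
        infer _ _ = Nothing       -- a bare lambda does not synthesize

        -- Checking: a lambda is checked against an expected arrow type;
        -- anything else falls back to synthesis plus conversion (here: Eq).
        check :: Ctx -> Tm -> Ty -> Maybe ()
        check g (Lam x t) (Fun a b) = check ((x, a) : g) t b
        check g t a = do
          a' <- infer g t
          if a == a' then pure () else Nothing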

    Encoding of Predicate Subtyping with Proof Irrelevance in the λΠ-Calculus Modulo Theory

    The λΠ-calculus modulo theory is a logical framework in which various logics and type systems can be encoded, thus helping the cross-verification and interoperability of proof systems based on those logics and type systems. In this paper, we show how to encode predicate subtyping and proof irrelevance, two important features of the PVS proof assistant. We prove that this encoding is correct and that encoded proofs can be mechanically checked by Dedukti, a type checker for the λΠ-calculus modulo theory using rewriting.
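
    The core idea of proof irrelevance can be pictured in a few lines of Haskell: an inhabitant of a predicate subtype pairs a value with evidence, and equality ignores the evidence. The types below are illustrative assumptions, not the Dedukti encoding itself.

        -- An element of the predicate subtype {x : a | P x}: the carrier
        -- value together with some (opaque) evidence that P holds.
        data Subset a p = Subset { value :: a, proof :: p }

        -- Proof irrelevance: two inhabitants of the subtype are equal
        -- whenever their underlying values are, whatever proofs they carry.
        instance Eq a => Eq (Subset a p) where
          Subset x _ == Subset y _ = x == y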

    Coinductive Formal Reasoning in Exact Real Arithmetic

    In this article we present a method for formally proving the correctness of the lazy algorithms for computing homographic and quadratic transformations -- of which field operations are special cases -- on a representation of real numbers by coinductive streams. The algorithms work on coinductive streams of Möbius maps and form the basis of the Edalat--Potts exact real arithmetic. We use the machinery of the Coq proof assistant for coinductive types to present the formalisation. The formalised algorithms are only partially productive, i.e., they do not output provably infinite streams for all possible inputs. We show how to deal with this partiality in the presence of the syntactic restrictions posed by the constructive type theory of Coq. Furthermore we show that the type-theoretic techniques that we develop are compatible with the semantics of the algorithms as continuous maps on real numbers. The resulting Coq formalisation is available for public download. (Comment: 40 pages.)
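
    The following Haskell sketch shows the basic ingredients such a formalisation works with: Möbius maps as integer matrices, reals as lazy (coinductive) streams of such maps, and the absorption half of a homographic transformation. The emission step, whose partial productivity the article analyses, is deliberately elided, and all names are illustrative assumptions.

        -- A Möbius map with integer coefficients, acting as
        -- x ↦ (a·x + b) / (c·x + d).
        data Mobius = Mobius { ma, mb, mc, md :: Integer } deriving Show

        -- Composition of Möbius maps is 2x2 integer matrix multiplication.
        compose :: Mobius -> Mobius -> Mobius
        compose (Mobius a1 b1 c1 d1) (Mobius a2 b2 c2 d2) =
          Mobius (a1*a2 + b1*c2) (a1*b2 + b1*d2)
                 (c1*a2 + d1*c2) (c1*b2 + d1*d2)

        -- A real number as an infinite stream of Möbius maps; Haskell's
        -- lazy lists play the role of Coq's coinductive streams.
        type RealStream = [Mobius]

        -- The absorption half of the homographic algorithm: fold the next
        -- input map into the current state. Emitting output digits (and
        -- hence productivity) is the delicate part and is elided here.
        absorb :: Mobius -> RealStream -> (Mobius, RealStream)
        absorb state (m : ms) = (state `compose` m, ms)
        absorb state []       = (state, [])   -- unreachable for genuine streams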

    The Confluent Terminating Context-Free Substitutive Rewriting System for the lambda-Calculus with Surjective Pairing and Terminal Type

    For the lambda-calculus with surjective pairing and terminal type, Curien and Di Cosmo, inspired by Knuth-Bendix completion, introduced a confluent refinement of the naive rewriting system. Their system is a confluent (CR) rewriting system stable under contexts. They left the strong normalization (SN) of their rewriting system open. By Girard's reducibility method with a restricted reducibility theorem, we prove SN of their rewriting system, as well as SN of its extensions by polymorphism and by the terminal types arising from parametric polymorphism. We further extend their system by sum types and eta-like reductions, and prove SN. Finally, we compare their system to type-directed expansions.
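
    For orientation, the following Haskell sketch spells out the naive root-level rules that such a completed system refines: the projection reductions and surjective pairing (SP). The syntax and the step function are assumptions for illustration only.

        data Tm = Var String | Pair Tm Tm | Fst Tm | Snd Tm | UnitVal
          deriving (Eq, Show)

        -- One rewrite step at the root; closure under contexts is what the
        -- completed system must remain stable under. The terminal-type rule
        -- (any term of terminal type rewrites to UnitVal) needs typing
        -- information and is therefore omitted from this untyped sketch.
        step :: Tm -> Maybe Tm
        step (Fst (Pair s _)) = Just s          -- fst <s, t>        ~> s
        step (Snd (Pair _ t)) = Just t          -- snd <s, t>        ~> t
        step (Pair (Fst p) (Snd q))
          | p == q            = Just p          -- SP: <fst p, snd p> ~> p
        step _                = Nothing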

    A Two-Level Linear Dependent Type Theory

    We present a type theory combining both linearity and dependency by stratifying typing rules into a level for logics and a level for programs. The distinction between logics and programs decouples their semantics, allowing the type system to assume tight resource bounds. A natural notion of irrelevancy is established, whereby all proofs and types occurring inside programs are fully erasable without compromising their operational behavior. Through a heap-based operational semantics, we show that extracted programs always make computational progress and run memory-clean. Additionally, programs can be freely reflected into the logical level for conducting deep proofs in the style of standard dependent type theories. This enables one to write resource-safe programs and verify their correctness using a unified language.
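
    The erasability claim can be pictured with a small Haskell sketch: terms carry level tags, and erasure deletes every logical abstraction and application before execution. The two-level syntax is an assumption for illustration, not the paper's calculus.

        data Level = Logic | Program deriving (Eq, Show)

        -- Source terms whose binders and applications are tagged with the
        -- level of the argument they take.
        data Tm
          = Var String
          | Lam Level String Tm
          | App Level Tm Tm
          deriving Show

        -- Untyped target of erasure: only computational content remains.
        data ETm = EVar String | ELam String ETm | EApp ETm ETm deriving Show

        -- Erase logical abstractions and applications entirely.
        erase :: Tm -> ETm
        erase (Var x)           = EVar x
        erase (Lam Logic _ t)   = erase t             -- proof/type binder vanishes
        erase (Lam Program x t) = ELam x (erase t)
        erase (App Logic f _)   = erase f             -- logical argument vanishes
        erase (App Program f u) = EApp (erase f) (erase u)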