
    Metaconfluence of Calculi with Explicit Substitutions at a Distance

    Confluence is a key property of rewriting calculi: it guarantees uniqueness of normal forms when they exist. Metaconfluence is more general still, guaranteeing confluence on open/meta terms, i.e. terms with holes, called metavariables, that can be filled with other (open/meta) terms. The difficulty in dealing with open terms is that the structure of metaterms is only partially known, so some reduction rules become blocked by the metavariables. In this work, we establish metaconfluence for a family of calculi with explicit substitutions (ES) that enjoy preservation of strong normalization (PSN) and that act at a distance. To do so, we first extend the notion of reduction on metaterms in such a way that explicit substitutions are never structurally moved, i.e. they also act at a distance on metaterms. The resulting reduction relations are still rewriting systems, i.e. they include no equational axioms, thus providing for the first time an interesting family of lambda-calculi with explicit substitutions that enjoy both PSN and metaconfluence without requiring sophisticated notions of reduction modulo a set of equations.
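    For orientation, here is a minimal sketch of what "acting at a distance" typically means in such calculi, assuming a syntax t ::= x | λx.t | t u | t[x\u]: beta fires through an arbitrary list of explicit substitutions, and a substitution acts on its body in one step rather than being percolated through it. These rules are illustrative of the general style, not necessarily the exact family studied in the paper.

```latex
% Illustrative distance-style rules; L abbreviates a (possibly empty)
% list of explicit substitutions [x1\u1]...[xn\un].
\[
\begin{array}{rcl}
  (\lambda x.\,t)L\; u & \mapsto_{\mathsf{dB}} & t[x\backslash u]L
     \quad\text{(beta at a distance)}\\[2pt]
  t[x\backslash u] & \mapsto_{\mathsf{s}} & t\{x\backslash u\}
     \quad\text{(the substitution acts on $t$ directly, wherever $x$ occurs)}
\end{array}
\]
```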

    A calculus of multiary sequent terms

    Multiary sequent terms were originally introduced as a tool for proving termination of permutative conversions in cut-free sequent calculus. This work develops the language of multiary sequent terms into a term calculus for the computational (Curry-Howard) interpretation of a fragment of sequent calculus with cuts and cut-elimination rules. The system, named the generalised multiary lambda-calculus, is a rich extension of the lambda-calculus in which the computational content of the sequent calculus format is explained through an enlarged form of the application constructor. This constructor exhibits the features of multiarity (the ability to form lists of arguments) and generality (the ability to prescribe a kind of continuation). The system integrates in a modular way the multiary lambda-calculus and an isomorphic copy of the lambda-calculus with generalised application LambdaJ (in particular, natural deduction is captured internally up to isomorphism). In addition, the system: (i) comes with permutative conversion rules, whose role is to eliminate the new features of application; (ii) is equipped with reduction rules, either the mu-rule, typical of the multiary setting, or rules for cut-elimination, which enlarge the ordinary beta-rule. This paper establishes the meta-theory of the system, with emphasis on the role of the mu-rule, and includes a study of the interaction of reduction and permutative conversions. Fundação para a Ciência e a Tecnologia (FCT)
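    As a rough illustration of the enlarged application constructor, here is a hedged sketch of such a term grammar as a Haskell datatype; the constructor names and exact shape are ours for illustration, not the paper's concrete syntax.

```haskell
-- Sketch: an application node carries a head, a first argument plus a list
-- of further arguments (multiarity), and a continuation binding one
-- variable (generality).
type Var = String

data Term
  = V Var                          -- x
  | Lam Var Term                   -- \x. t
  | GApp Term Term [Term] Var Term -- t(u, us, (x) v)
  deriving Show

-- Ordinary application t u is recovered as the special case with no extra
-- arguments and the trivial continuation (x) x.
app :: Term -> Term -> Term
app t u = GApp t u [] "x" (V "x")
```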

    Atomic lambda-calculus: A typed lambda-calculus with explicit sharing


    A lambda-calculus that achieves full laziness with spine duplication


    An Intuitionistic Formula Hierarchy Based on High-School Identities

    We revisit the notion of intuitionistic equivalence and formal proof representations by adopting the view of formulas as exponential polynomials. After observing that most of the invertible proof rules of intuitionistic (minimal) propositional sequent calculi are formula (i.e. sequent) isomorphisms corresponding to the high-school identities, we show that one can obtain a more compact variant of a proof system, consisting of non-invertible proof rules only, in which the invertible proof rules have been replaced by a formula normalisation procedure. Moreover, for certain proof systems such as the G4ip sequent calculus of Vorob'ev, Hudelmaier, and Dyckhoff, it is even possible to see all of the non-invertible proof rules as strict inequalities between exponential polynomials; a careful combinatorial treatment is given in order to establish this fact. Finally, we extend the exponential polynomial analogy to the first-order quantifiers, showing that it gives rise to an intuitionistic hierarchy of formulas, resembling the classical arithmetical hierarchy, and the first that classifies formulas while preserving isomorphism.
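    For context, the correspondence the abstract alludes to reads conjunction as product, disjunction as sum, and implication A -> B as the exponential B^A; the familiar intuitionistic formula (type) isomorphisms then become high-school identities. This is the textbook correspondence, not necessarily the paper's exact presentation.

```latex
\[
\begin{array}{lcl@{\qquad}l}
  A \to (B \land C)  & \simeq & (A \to B) \land (A \to C)    & (bc)^{a} = b^{a}c^{a} \\
  (A \lor B) \to C   & \simeq & (A \to C) \land (B \to C)    & c^{a+b} = c^{a}c^{b} \\
  A \to (B \to C)    & \simeq & (A \land B) \to C            & (c^{b})^{a} = c^{ab} \\
  A \land (B \lor C) & \simeq & (A \land B) \lor (A \land C) & a(b+c) = ab + ac
\end{array}
\]
```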

    On the Formalisation of the Metatheory of the Lambda Calculus and Languages with Binders

    This work is about formal, machine-checked reasoning about languages with name binders. We start by considering the λ-calculus using the historical (first-order) syntax with only one sort of names for both bound and free variables. We first work on concrete terms, taking Stoughton's multiple substitution operation as the fundamental operation upon which α- and β-conversion are defined. Using this syntax we reach the main meta-theoretical results of the calculus, namely the substitution lemmas, the Church-Rosser theorem, and the Subject Reduction theorem for the system of assignment of simple types. In a second formalisation we reproduce the same results, this time using an approach in which α-conversion is defined using the simpler operation of name permutation. From this we derive induction and recursion principles that allow us to work by identifying terms up to α-conversion and to reproduce the so-called Barendregt variable convention [4]. Thus, we are able to mimic pencil-and-paper proofs inside the rigorous formal setting of a proof assistant. As a generalisation of the latter approach, we conclude by using generic programming techniques to define a framework for reasoning about generic structures with binders. We define a universe of regular datatypes with variable and binder information, and over these we define generic formation, elimination, and induction operations. We also introduce an α-equivalence relation based on the swapping operation, and derive an α-iteration/induction principle that captures Barendregt's variable convention. As an example, we show how to define the λ-calculus and System F in our universe, illustrating not only the reuse of the generic proofs but also how simple the development of new proofs becomes in these instances. All formalisations in this thesis were carried out in Constructive Type Theory and completely checked using the Agda proof assistant.
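    To make the permutation-based approach concrete, here is a hedged sketch of the name-swapping operation on raw lambda terms, written in Haskell rather than the thesis's Agda, with names of our own choosing. Swapping is a bijection on names, so, unlike capture-avoiding substitution, it needs no freshness side conditions, which is what makes it a convenient basis for defining α-equivalence.

```haskell
type Name = String

data Term = Var Name | App Term Term | Lam Name Term
  deriving (Eq, Show)

-- (a b) . t : exchange the names a and b everywhere in t,
-- including binding occurrences.
swap :: Name -> Name -> Term -> Term
swap a b = go
  where
    swapName n
      | n == a    = b
      | n == b    = a
      | otherwise = n
    go (Var x)   = Var (swapName x)
    go (App t u) = App (go t) (go u)
    go (Lam x t) = Lam (swapName x) (go t)
```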

    What is the meaning of proofs? A Fregean distinction in proof-theoretic semantics

    The origins of proof-theoretic semantics lie in the question of what constitutes the meaning of the logical connectives, and in its answer: the rules of inference that govern the use of the connective. But what if we go a step further and ask about the meaning of a proof as a whole? In this paper we address this question and lay out a framework to distinguish the sense and the denotation of proofs. Two questions are central here. First, if we have two (syntactically) different derivations, does this always lead to a difference in sense, and does it always lead to a difference in denotation? Second, how do different kinds of proof systems (here: natural deduction vs. sequent calculi) relate with respect to this distinction? Do the different forms of representing a proof necessarily correspond to a difference in how the inferential steps are given? In our framework it is possible to identify the denotation as well as the sense of proofs not only within one proof system but also across different kinds of proof systems. Thus, we give an account that distinguishes a mere syntactic divergence from a divergence in meaning, and a divergence in meaning from a divergence of proof objects, analogous to Frege's distinction for singular terms and sentences.
    Comment: Post-peer-review, pre-copyedit version; the published version is available open access under DOI: 10.1007/s10992-020-09577-

    Proof nets and the call-by-value λ-calculus

    This paper gives a detailed account of the relationship between (a variant of) the call-by-value lambda calculus and linear logic proof nets. The presentation is carefully tuned in order to realize an isomorphism between the two systems: every single rewriting step in the calculus maps to a single step on proof nets, and vice versa. In this way, we obtain an algebraic reformulation of proof nets. Moreover, we provide a simple correctness criterion for our proof nets, which employ boxes in an unusual way, and we identify a subcalculus that is shown to be as expressive as the full calculus.
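    For orientation, the call-by-value restriction of beta (Plotkin's beta_v) fires only when the argument is already a value; the paper itself works with a variant of this calculus adjusted so that its rewriting steps match proof-net steps one for one.

```latex
% Call-by-value beta: the argument must be a value (a variable or an
% abstraction) before the redex can fire.
\[
  v ::= x \mid \lambda x.\,t
  \qquad\qquad
  (\lambda x.\,t)\, v \;\to_{\beta_v}\; t\{x\backslash v\}
\]
```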

    Types and verification for infinite state systems

    Server-like or non-terminating programs are central to modern computing. A common requirement for these programs is that they always be available to produce a behaviour. One method of showing such availability is to endow a type theory with constraints demonstrating that a program will always either produce some behaviour or halt. Such a constraint is often called productivity. We introduce a type theory which can be used to type-check a polymorphic functional programming language similar to a fragment of the Haskell programming language. This allows placing constraints on program terms such that they will not type-check unless they are productive. We show that, using program transformation techniques, one can restructure some programs which are not provably productive in our type theory into programs which are manifestly productive. This allows greater programmer flexibility in the specification of such programs. We have mechanised some of these important results in the proof assistant Coq. We have also written a program transformation system for this term language in the programming language Haskell.
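    To make the productivity idea concrete, here is a hedged Haskell sketch (the names and definitions are ours for illustration, not the thesis's). A stream definition is manifestly productive when each recursive call sits directly under a constructor; the point of the program transformations described above is to bring definitions that are productive, but not manifestly so, into such a form.

```haskell
data Stream a = Cons a (Stream a)

-- Manifestly productive: the recursive call is guarded by a constructor.
from :: Integer -> Stream Integer
from n = Cons n (from (n + 1))

mapS :: (a -> b) -> Stream a -> Stream b
mapS f (Cons x xs) = Cons (f x) (mapS f xs)

-- Productive, but not manifestly so: the recursive occurrence of `nats`
-- is hidden under `mapS`, so a purely syntactic guardedness check rejects it.
nats :: Stream Integer
nats = Cons 0 (mapS (+ 1) nats)
```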