8 research outputs found

    Linear Logic and Strong Normalization

    Strong normalization for linear logic requires elaborate rewriting techniques. In this paper we give a new presentation of MELL proof nets, without any commutative cut-elimination rule. We show how this feature induces a compact and simple proof of strong normalization, via reducibility candidates. It is the first proof of strong normalization for MELL which does not rely on any form of confluence, and so it smoothly scales up to full linear logic. Moreover, it is an axiomatic proof: more generally, it holds for every set of rewriting rules satisfying three very natural requirements with respect to substitution, namely commutation with promotion, full composition, and Kesner's IE property. The insight indeed comes from the theory of explicit substitutions, and from looking at the exponentials as a substitution device.
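
    To convey the flavor of this substitution reading (a minimal sketch, not the paper's actual rewriting rules): in calculi with explicit substitutions, beta-reduction is decomposed into the creation of a substitution and separate rules that execute or discard it,

        \[
        (\lambda x.\,t)\,u \;\to\; t[x\backslash u]
        \qquad
        t[x\backslash u] \;\to\; t \quad (x \notin \mathrm{fv}(t))
        \]

    Under such a reading, dereliction, weakening, and contraction acting on a promotion box \(!u\) correspond to executing, erasing, and duplicating a delayed substitution.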

    A semantic account of strong normalization in Linear Logic

    We prove that, given two cut-free nets of linear logic, by means of their relational interpretations one can: 1) determine whether or not the net obtained by cutting the two nets is strongly normalizable; 2) if it is, compute the maximal length of the reduction sequences starting from that net.
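
    For context (a standard fact about the relational model, recalled here as a hedged reminder rather than as the paper's contribution): cutting two nets is interpreted by relational composition over the cut formula, which is why the interpretations of the two cut-free nets suffice to analyse their cut. In the simplest case of proofs \(\pi\) of \(A \vdash B\) and \(\rho\) of \(B \vdash C\),

        \[
        \llbracket \mathrm{cut}(\pi,\rho) \rrbracket
        \;=\;
        \{\, (a,c) \mid \exists b.\ (a,b) \in \llbracket \pi \rrbracket \text{ and } (b,c) \in \llbracket \rho \rrbracket \,\}
        \]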

    Observed communication semantics for classical processes

    Classical Linear Logic (CLL) has long inspired readings of its proofs as communicating processes. Wadler's CP calculus is one of these readings. Wadler gave CP an operational semantics by selecting a subset of the cut-elimination rules of CLL as reduction rules. This semantics has an appealing close connection to the logic, but it does not resolve the status of the other cut-elimination rules, and it does not admit an obvious notion of observational equivalence. We propose a new operational semantics for CP based on the idea of observing communication, and use this semantics to define an intuitively reasonable notion of observational equivalence. To reason about observational equivalence, we use the standard relational denotational semantics of CLL. We show that this denotational semantics is adequate for our operational semantics. This allows us to deduce that, for instance, all the cut-elimination rules of CLL are observational equivalences.
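
    As a hint of the process reading (a simplified principal cut reduction in the style of Wadler's CP; the notation here is approximate): an output \(x[y]\) meeting an input \(x(y)\) on a cut channel decomposes the cut into smaller cuts,

        \[
        (\nu x)\big(x[y].(P \mid Q) \;\mid\; x(y).R\big)
        \;\longrightarrow\;
        (\nu y)\big(P \mid (\nu x)(Q \mid R)\big)
        \]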

    Linear β-reduction

    Linear head reduction is a key tool for the analysis of reduction machines for the lambda-calculus and for game semantics. Its definition requires a notion of redex at a distance, named primary redex in the literature. Nevertheless, a clear and complete syntactic analysis of this rule is missing. We present here a general notion of beta-reduction at a distance and of linear reduction (i.e., not restricted to the head variable), and we analyse their relations and properties. This analysis rests on a variant of the so-called sigma-equivalence that is more suitable for the analysis of reduction machines, since it does not permute the positions of primary redexes along the spine. We finally show that, in the simply typed case, a proof of strong normalisation of linear reduction can be obtained by a trivial tuning of Gandy's proof of strong normalisation for beta-reduction.
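
    One common formalization of a redex at a distance (a sketch in the style of the linear substitution calculus, not necessarily the exact rule of the paper): the abstraction may be separated from its argument by a list \(L\) of explicit substitutions,

        \[
        (\lambda x.\,t)L\,u \;\to\; t[x\backslash u]L
        \]

    so that, for instance, \((\lambda x.\,x)[y\backslash s]\,u\) reduces directly to \(x[x\backslash u][y\backslash s]\), without first permuting the substitution out of the way.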

    Proof-Net as Graph, Taylor Expansion as Pullback

    We introduce a new graphical representation for multiplicative and exponential linear logic proof-structures, based only on standard labelled oriented graphs and standard notions of graph theory. The inductive structure of boxes is handled by means of a box-tree. Our proof-structures are canonical and allow for an elegant definition of their Taylor expansion by means of pullbacks.
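
    For intuition (the usual multiset formulation of the exponential case, independent of the pullback machinery the paper develops): the Taylor expansion of a box collects all ways of replacing it with finitely many copies of expansions of its content,

        \[
        \mathcal{T}(!P) \;=\; \big\{\, [\,p_1,\dots,p_n\,] \;\big|\; n \ge 0,\ p_i \in \mathcal{T}(P) \,\big\}
        \]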

    On the operational theory of the CPS-calculus: Towards a theoretical foundation for IRs

    The continuation-passing style translation often employed by compilers gives rise to a class of intermediate representation languages in which functions are no longer allowed to return. Though the primary use of these intermediate representations is to expose details about a program’s control flow, they may be equipped with an equational theory so as to be seen as specialized calculi, which in turn may be related to the original languages by means of a factorization theorem. In this paper, we explore Thielecke’s CPS-calculus, a small theory of continuations inspired by compiler implementations, and study its metatheory. We extend it with a sound reduction semantics that faithfully represents optimization rules used in actual compilers, and prove that it acts as a suitable theoretical foundation for the intermediate representations of Appel’s and Kennedy’s compilers by following the guidelines set out by Plotkin. Finally, we prove that the CPS-calculus is strongly normalizing in the simply typed setting, using a novel proof method for reasoning about reducibility at a distance, from which logical consistency follows. Taken together, these results close a gap in the existing literature, providing a formal theory for reasoning about intermediate representations.
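
    To make the setting concrete (an illustrative sketch of Plotkin's call-by-value CPS translation, the kind of translation that produces such intermediate representations; the datatype and helper names here are ours, not the paper's):

        -- Plotkin-style call-by-value CPS translation for the pure
        -- lambda-calculus; after translation, functions never "return",
        -- they only invoke their continuation.
        data Term = Var String | Lam String Term | App Term Term
          deriving Show

        -- | cps t k n translates t with continuation term k, drawing fresh
        --   variable names from the counter n (we assume source variables
        --   do not collide with the generated "k"/"v"/"w" names).
        cps :: Term -> Term -> Int -> (Term, Int)
        cps (Var x)   k n = (App k (Var x), n)
        cps (Lam x b) k n =
          let kv       = "k" ++ show n
              (b', n') = cps b (Var kv) (n + 1)
          in (App k (Lam x (Lam kv b')), n')   -- k (\x. \kv. [b] kv)
        cps (App f a) k n =
          let v         = "v" ++ show n
              w         = "w" ++ show (n + 1)
              (a', n')  = cps a (Lam w (App (App (Var v) (Var w)) k)) (n + 2)
              (f', n'') = cps f (Lam v a') n'
          in (f', n'')                         -- [f] (\v. [a] (\w. v w k))

    For instance, fst (cps (Lam "x" (Var "x")) (Var "halt") 0) yields halt (\x. \k0. k0 x): the identity function, rewritten to jump to its continuation rather than return.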

    Exponentials as Substitutions and the Cost of Cut Elimination in Linear Logic

    This paper introduces the exponential substitution calculus (ESC), a new presentation of cut elimination for IMELL, based on proof terms and building on the idea that exponentials can be seen as explicit substitutions. The idea in itself is not new, but here it is pushed to a new level, inspired by Accattoli and Kesner's linear substitution calculus (LSC). One of the key properties of the LSC is that it naturally models the sub-term property of abstract machines, which is the key ingredient for the study of reasonable time cost models for the λ-calculus. The new ESC is then used to design a cut-elimination strategy with the sub-term property, providing the first polynomial cost model for cut elimination with unconstrained exponentials. For the ESC, we also prove untyped confluence and typed strong normalization, showing that it is an alternative to proof nets for an advanced study of cut elimination.
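
    For orientation (a sketch of the LSC rules that inspire the ESC, not the ESC itself): the LSC fires beta at a distance and executes substitutions one occurrence at a time, which is what yields the sub-term property, since every copied \(u\) is a sub-term of the initial term:

        \[
        \begin{array}{rll}
        (\lambda x.\,t)L\,u &\to_{\mathrm{dB}}& t[x\backslash u]L\\
        C\langle x\rangle[x\backslash u] &\to_{\mathrm{ls}}& C\langle u\rangle[x\backslash u]\\
        t[x\backslash u] &\to_{\mathrm{gc}}& t \quad \text{if } x \notin \mathrm{fv}(t)
        \end{array}
        \]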

    Extending Implicit Computational Complexity and Abstract Machines to Languages with Control

    The Curry-Howard isomorphism is the idea that proofs in natural deduction can be put in correspondence with lambda terms in such a way that this correspondence is preserved by normalization. The concept can be extended from Intuitionistic Logic to other systems, such as Linear Logic. One of the nice consequences of this isomorphism is that we can reason about functional programs with formal tools typical of proof systems: such analysis can also cover quantitative properties of programs, such as the number of steps they take to terminate. Another is the possibility to describe the execution of these programs in terms of abstract machines. In 1990 Griffin proved that the correspondence can be extended to Classical Logic and control operators; that is, Classical Logic adds the possibility to manipulate continuations. In this thesis we examine how the ideas described above work in this larger context.
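
    A concrete taste of the classical extension (an illustrative example using the standard Control.Monad.Cont from Haskell's mtl, not the thesis's own machinery): Griffin observed that call/cc can be typed by Peirce's law ((A → B) → A) → A, and callCC below echoes that type up to the Cont monad.

        import Control.Monad (when)
        import Control.Monad.Cont

        -- callCC :: ((a -> Cont r b) -> Cont r a) -> Cont r a
        -- captures the current continuation; invoking it discards the
        -- rest of the computation, a distinctly "classical" behaviour.
        safeDiv :: Int -> Int -> Cont r (Maybe Int)
        safeDiv x y = callCC $ \exit -> do
          when (y == 0) (exit Nothing)   -- jump out via the continuation
          return (Just (x `div` y))

        main :: IO ()
        main = print (runCont (safeDiv 10 0) id)   -- prints Nothing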