
    Light Logics and the Call-by-Value Lambda Calculus

    The so-called light logics have been introduced as logical systems enjoying quite remarkable normalization properties. Designing a type assignment system for the pure lambda calculus from these logics, however, is problematic. In this paper we show that shifting from the usual call-by-name to the call-by-value lambda calculus allows regaining strong connections with the underlying logic. This is done in the context of Elementary Affine Logic (EAL), by designing a type system in natural deduction style that assigns EAL formulae to lambda terms. Comment: 28 pages
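    As an illustration of the call-by-name/call-by-value distinction the paper builds on (this is a minimal sketch, not the paper's EAL type system), here is an untyped lambda-term evaluator in Python under both strategies:

```python
# Terms are tagged tuples: ("var", name), ("lam", name, body), ("app", f, a).
# Substitution here is adequate for closed arguments, as in the demo below.

def subst(term, name, value):
    """Substitute value for free occurrences of name in term."""
    kind = term[0]
    if kind == "var":
        return value if term[1] == name else term
    if kind == "lam":
        if term[1] == name:
            return term  # the bound variable shadows name
        return ("lam", term[1], subst(term[2], name, value))
    return ("app", subst(term[1], name, value), subst(term[2], name, value))

def eval_cbv(term):
    """Call-by-value: reduce the argument before substituting it."""
    if term[0] == "app":
        f = eval_cbv(term[1])
        a = eval_cbv(term[2])          # argument evaluated first
        if f[0] == "lam":
            return eval_cbv(subst(f[2], f[1], a))
        return ("app", f, a)
    return term

def eval_cbn(term):
    """Call-by-name: substitute the unevaluated argument."""
    if term[0] == "app":
        f = eval_cbn(term[1])
        if f[0] == "lam":
            return eval_cbn(subst(f[2], f[1], term[2]))
        return ("app", f, eval_cbn(term[2]))
    return term

# (\x. \y. y) applied to the identity: both strategies agree on closed,
# terminating terms, but perform different reduction sequences.
identity = ("lam", "z", ("var", "z"))
term = ("app", ("lam", "x", ("lam", "y", ("var", "y"))), identity)
```

    The strategies differ in cost and termination behaviour in general; on this example both normalize to \y. y.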

    Parallel Beta Reduction Is Not Elementary Recursive

    We analyze the inherent complexity of implementing Lévy's notion of optimal evaluation for the λ-calculus, where similar redexes are contracted in one step via so-called parallel β-reduction. Optimal evaluation was finally realized by Lamping, who introduced a beautiful graph-reduction technology for sharing evaluation contexts dual to the sharing of values. His pioneering insights have been modified and improved in subsequent implementations of optimal reduction. We prove that the cost of parallel β-reduction is not bounded by any Kalmár-elementary recursive function. Not only do we establish that the parallel β-step cannot be a unit-cost operation, we demonstrate that the time complexity of implementing a sequence of n parallel β-steps is not bounded as O(2^n), O(2^{2^n}), O(2^{2^{2^n}}), or in general O(K_l(n)), where K_l(n) is a fixed stack of l 2's with an n on top. A key insight, essential to the establishment of this non-elementary lower bound, is that any simply typed λ-term can be reduced to normal form in a number of parallel β-steps that is only polynomial in the length of the explicitly typed term. The result follows from Statman's theorem that deciding equivalence of typed λ-terms is not elementary recursive. The main theorem gives a lower bound on the work that must be done by any technology that implements Lévy's notion of optimal reduction. However, in the significant case of Lamping's solution, we make some important remarks addressing how work done by β-reduction is translated into equivalent work carried out by his bookkeeping nodes. In particular, we identify the computational paradigms of superposition of values and of higher-order sharing, appealing to compelling analogies with quantum mechanics and SIMD parallelism.
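    The bound K_l(n), a stack of l 2's with an n on top, is easy to make concrete; a minimal illustrative Python sketch:

```python
def tower(l, n):
    """K_l(n): iterated exponentiation, a stack of l twos topped by n.
    E.g. K_2(3) = 2**(2**3) = 256. No elementary function dominates
    K_l(n) for all l, which is what the lower bound exploits."""
    for _ in range(l):
        n = 2 ** n
    return n
```

    For instance tower(3, 1) is 2^(2^(2^1)) = 16, and already tower(5, 1) has nearly twenty thousand decimal digits.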

    Soft Session Types

    We show how systems of session types can enforce interactions to be bounded for all typable processes. The type system we propose is based on Lafont's soft linear logic and is strongly inspired by recent work on session types as intuitionistic linear logic formulas. Our main result is the existence, for every typable process, of a polynomial bound on the length of any reduction sequence starting from it and on the size of any of its reducts. Comment: In Proceedings EXPRESS 2011, arXiv:1108.407

    Functional Geometry and the Traité de Lutherie

    We propose to design, implement, and experiment with a programming language for describing how to draw string instrument outlines: violins, violas, and especially violoncellos. Based on the historical reconstruction in François Denis's definitive monograph, Traité de Lutherie, which uses straightedge-and-compass constructions, the software can enhance insight into techniques of eighteenth-century design, provide an archival format for describing the properties of string instrument outlines, and generate instructions for producing highly accurate digital drawings for use in construction. Further, it can provide the foundation for a kind of computational art history, in which the language and associated software serve as a descriptive tool for analyzing the evolution of instrument designs over time. This work will be integrated with ongoing, active experience constructing violoncellos, connecting the historical and conceptual with the practical.
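    The core compass primitive behind straightedge-and-compass constructions is intersecting two circles; a small illustrative Python fragment (this is our own sketch, not the paper's language, and the name circle_intersections is hypothetical):

```python
import math

def circle_intersections(c1, r1, c2, r2):
    """Return the intersection points of two circles, the basic
    'compass' step from which outline constructions are composed."""
    (x1, y1), (x2, y2) = c1, c2
    d = math.hypot(x2 - x1, y2 - y1)
    if d == 0 or d > r1 + r2 or d < abs(r1 - r2):
        return []  # coincident centres, or circles too far/too nested
    a = (r1**2 - r2**2 + d**2) / (2 * d)       # distance to chord midpoint
    h = math.sqrt(max(r1**2 - a**2, 0.0))      # half-length of the chord
    mx = x1 + a * (x2 - x1) / d
    my = y1 + a * (y2 - y1) / d
    return [(mx + h * (y2 - y1) / d, my - h * (x2 - x1) / d),
            (mx - h * (y2 - y1) / d, my + h * (x2 - x1) / d)]

# A classic construction: the apex of an equilateral triangle on a unit
# base arises as the intersection of two unit circles centred on its ends.
pts = circle_intersections((0.0, 0.0), 1.0, (1.0, 0.0), 1.0)
```

    Such primitives compose into named construction steps, which is the kind of archival, replayable description the paper envisions.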

    Programming language foundations of computation theory


    A simple proof of a theorem of Statman

    In this note, we reprove a theorem of Statman that deciding the βη-equality of two first-order typable λ-terms is not elementary recursive [Sta79]. The basic idea of our proof, like that of Statman's, is the Henkin quantifier elimination procedure [Hen63]. However, our coding is much simpler and easier to understand. 1 Introduction A well-known theorem of Richard Statman states that if we have two λ-terms that are first-order typable, deciding whether the terms reduce to the same normal form is not Kalmár elementary: namely, it cannot be decided in f_k(n) steps for any fixed integer k ≥ 0, where n is the length of the two terms, f_0(n) = n, and f_{t+1}(n) = 2^{f_t(n)}. The theorem is often cited, but in contrast, its proof is not well understood. In this note, we give a simple proof of the theorem. The key idea that vastly simplifies the technical details of the proof is to use list iteration as a quantifier elimination procedure. 2 Preliminaries 2.1 Deciding truth of formulas in high..
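    The non-elementary growth of normal forms behind Statman's theorem can be observed directly with Church numerals; a minimal Python illustration (our own, not the note's coding):

```python
# Church numerals: n = \f.\x. f^n x.  A left-associated chain of k twos,
# "two two ... two", normalises to a tower of 2's of height k: 2, 4, 16,
# 65536, ...  So a short simply typed term can have a normal form of
# non-elementary size -- the phenomenon Statman's theorem rests on.

def church(n):
    """The Church numeral for n, as a Python closure."""
    return lambda f: lambda x: x if n == 0 else f(church(n - 1)(f)(x))

def to_int(c):
    """Read a Church numeral back as a Python int."""
    return c(lambda k: k + 1)(0)

two = church(2)
tower3 = two(two)(two)          # normalises to the numeral 2^(2^2) = 16
print(to_int(tower3))           # → 16
```

    Each additional `two` in the chain exponentiates the result once more, so term length grows linearly while the normal form grows as a tower.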

    A Constructive Logic of Multiple Subtyping

    We show how a higher-order logic, the calculus of constructions, can be used to give a simple, first-principles treatment of record calculi, polymorphism, and subtyping. The development follows the constructive idiom of extracting implementations of equationally specified programs from proofs of their termination, with a logic for reasoning about programs and a semantics that comes as a gratuity. In this framework, records are finitely specified functions whose equality is decidable over the domain, with types that are a particular kind of logical assertion. By proving that a record specification satisfies its type, we can extract its implementation. While program extraction serves as a sort of compiler, proof normalization serves as an interpreter; the latter ensures, in some sense, the coherence of the translation embedded in the former. This simple-minded approach lets us show, for example, that many inference rules found in record and object calculi can be derived --- they ..
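    The slogan "records are finitely specified functions" can be rendered as a toy Python sketch (the names record and has_type are ours, and width subtyping on label sets stands in for the paper's treatment):

```python
# A record is a function from a finite label set to values; a record
# type is a predicate (a logical assertion) on such functions.

def record(**fields):
    """Build a record as a function over its finite label domain."""
    labels = frozenset(fields)
    def r(label):
        if label not in labels:
            raise KeyError(label)
        return fields[label]
    r.labels = labels
    return r

def has_type(r, spec):
    """A record inhabits a type when every required label is present
    and its value satisfies that label's predicate.  Extra labels are
    permitted, which is exactly width subtyping."""
    return all(l in r.labels and pred(r(l)) for l, pred in spec.items())

point3d = record(x=1, y=2, z=3)
point_type = {"x": lambda v: isinstance(v, int),
              "y": lambda v: isinstance(v, int)}
```

    Here the three-field record inhabits the two-field type, so any context expecting a point accepts a 3-D point.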

    Quantifier Elimination and Parametric Polymorphism in Programming Languages

    We present a simple and easy-to-understand explanation of ML type inference and parametric polymorphism within the framework of type monomorphism, as in the first-order typed lambda calculus. We prove the equivalence of this system with the standard interpretation using type polymorphism, and extend the equivalence to include polymorphic fixpoints. The monomorphic interpretation gives a purely combinatorial understanding of the type inference problem, and is a classic instance of quantifier elimination, as well as an example of Gentzen-style cut elimination in the framework of the Curry-Howard propositions-as-types analogy. Supported by NSF Grant CCR-9017125, and grants from Texas Instruments and from the Tyson Foundation. 1 Introduction In his influential paper, "A theory of type polymorphism in programming," Robin Milner proposed an extension to the first-order typed λ-calculus which has become known as the core of the ML programming language [Mil78, HMT90]. The extension augment..
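    The combinatorial core of ML type inference is first-order unification of type terms; a minimal illustrative Python sketch (our own, with the occurs check omitted for brevity):

```python
# Types are type variables (strings) or constructor applications such as
# ("->", t1, t2) and ("int",).  A substitution maps variables to types.

def find(subst, t):
    """Chase a variable through the substitution to its representative."""
    while isinstance(t, str) and t in subst:
        t = subst[t]
    return t

def unify(t1, t2, subst):
    """Extend subst so that t1 and t2 become equal, or raise TypeError.
    A real inferencer would also perform the occurs check here."""
    t1, t2 = find(subst, t1), find(subst, t2)
    if t1 == t2:
        return subst
    if isinstance(t1, str):                  # t1 is a type variable
        return {**subst, t1: t2}
    if isinstance(t2, str):
        return {**subst, t2: t1}
    if t1[0] != t2[0] or len(t1) != len(t2):
        raise TypeError(f"cannot unify {t1} with {t2}")
    for a, b in zip(t1[1:], t2[1:]):
        subst = unify(a, b, subst)
    return subst

# Applying the identity (of type a -> a) to an int: unifying a -> a with
# int -> b forces a = int and b = int.
s = unify(("->", "a", "a"), ("->", ("int",), "b"), {})
```

    Eliminating the type variables by solving such equations is the quantifier-elimination reading the abstract describes.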