
    Program equivalence for a concurrent lambda calculus with futures

    Reasoning about the correctness of program transformations requires a notion of program equivalence. We present an observational semantics for the concurrent lambda calculus with futures, Lambda(fut), which formalizes the operational semantics of the programming language Alice ML. We show that natural program optimizations, as well as partial evaluation with respect to deterministic rules, are correct for Lambda(fut). This relies on a number of fundamental properties that we establish for our observational semantics.
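
    For orientation, observational (contextual) equivalences of this kind are typically defined by quantifying over all program contexts and comparing convergence behaviour; for a concurrent calculus this usually involves both may- and must-convergence. The following is only a generic sketch of that shape, not the precise definition used for Lambda(fut) in the paper:

```latex
e_1 \simeq e_2 \;\iff\; \forall C.\;
  \bigl( C[e_1]\downarrow \;\Leftrightarrow\; C[e_2]\downarrow \bigr)
  \;\wedge\;
  \bigl( C[e_1]\Downarrow \;\Leftrightarrow\; C[e_2]\Downarrow \bigr)
```

    where C ranges over program contexts, ↓ denotes may-convergence, and ⇓ denotes must-convergence.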

    A Simple Take on Typed Abstract Syntax in Haskell-like Languages

    We present a simple way to program typed abstract syntax in a language following a Hindley-Milner typing discipline, such as Haskell and ML, and we apply it to automate two proofs about normalization functions as embodied in type-directed partial evaluation for the simply typed lambda calculus: normalization functions (1) preserve types and (2) yield long beta-eta normal forms. Keywords: Type-directed partial evaluation, normalization functions, simply-typed lambda-calculus, higher-order abstract syntax, Haskell
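
    As a rough illustration of the general idea (not the paper's actual construction), here is a minimal Haskell sketch of typed abstract syntax within a Hindley-Milner discipline: object-level terms are built from combinators whose host-language types mirror the object-level types, so only well-typed object terms can be written. The names Exp, lam, and app are ours.

```haskell
-- Minimal sketch: typed object terms via combinators (names are illustrative).
newtype Exp a = Exp { eval :: a }        -- carrier: here simply the term's meaning

lam :: (Exp a -> Exp b) -> Exp (a -> b)  -- object-level abstraction (HOAS style)
lam f = Exp (\x -> eval (f (Exp x)))

app :: Exp (a -> b) -> Exp a -> Exp b    -- object-level application
app f x = Exp (eval f (eval x))

-- Only well-typed object terms type-check in the host language:
--   app (lam (\x -> x)) (Exp True)  :: Exp Bool   -- accepted
--   app (Exp True) (Exp 'c')                      -- rejected by the type checker
```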

    Continuation-passing Style Models Complete for Intuitionistic Logic

    A class of models is presented, in the form of continuation monads polymorphic for first-order individuals, that is sound and complete for minimal intuitionistic predicate logic. The proofs of soundness and completeness are constructive, and the computational content of their composition is, in particular, a beta-normalisation-by-evaluation program for the simply typed lambda calculus with sum types. Although the inspiration comes from Danvy's type-directed partial evaluator for the same lambda calculus, the essential use made there of delimited control operators (i.e., computational effects) is avoided. The role of polymorphism is crucial: dropping it yields a notion of model that is complete for classical predicate logic. The connection between our models and Kripke models is made through a strengthening of the Double-negation Shift schema.
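
    For readers unfamiliar with normalisation by evaluation, the following is a minimal Haskell sketch of the standard reify/reflect core for the simply typed lambda calculus, without sums and without the continuation monads or polymorphism the paper's models rely on; all names and representation choices are ours.

```haskell
-- Minimal sketch of the reify/reflect core of normalisation by evaluation.
data Ty   = Base | Arr Ty Ty
data Term = Var String | Lam String Term | App Term Term deriving Show
data Sem  = Syn Term | Fun (Int -> Sem -> Sem)   -- functions take a fresh-name level

-- reify: read a semantic value back as a long beta-eta normal term
reify :: Int -> Ty -> Sem -> Term
reify _ Base      (Syn t) = t
reify n (Arr a b) (Fun f) = Lam x (reify (n + 1) b (f (n + 1) (reflect a (Var x))))
  where x = "x" ++ show n
reify _ _         _       = error "ill-typed semantic value"

-- reflect: embed a neutral term into the semantics
reflect :: Ty -> Term -> Sem
reflect Base      t = Syn t
reflect (Arr a b) t = Fun (\n s -> reflect b (App t (reify n a s)))

-- Example: the semantic identity reifies to a long normal form:
--   reify 0 (Arr Base Base) (Fun (\_ s -> s))  ==>  Lam "x0" (Var "x0")
```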

    Eta-Expansion Does The Trick (Revised Version)

    Partial-evaluation folklore has it that massaging one's source programs can make them specialize better. In Jones, Gomard, and Sestoft's recent textbook, a whole chapter is dedicated to listing such "binding-time improvements": nonstandard use of continuation-passing style, eta-expansion, and a popular transformation called "The Trick". We provide a unified view of these binding-time improvements, from a typing perspective. Just as a proper treatment of product values in partial evaluation requires partially static values, a proper treatment of disjoint sums requires moving static contexts across dynamic case expressions. This requirement precisely accounts for the nonstandard use of continuation-passing style encountered in partial evaluation. Eta-expansion thus acts as a uniform binding-time coercion between values and contexts, be they of function type, product type, or disjoint-sum type. For the latter case, it enables "The Trick". In this article, we extend Gomard and Jones's partial evaluator for the lambda-calculus, lambda-Mix, with products and disjoint sums; we point out how eta-expansion for (finite) disjoint sums enables The Trick; we generalize our earlier work by identifying that eta-expansion can be obtained in the binding-time analysis simply by adding two coercion rules; and we specify and prove the correctness of our extension to lambda-Mix. Keywords: Partial evaluation, binding-time analysis, program specialization, binding-time improvement, eta-expansion, static reduction
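
    As a tiny illustration of the point about sums (our own sketch, not lambda-Mix code): eta-expanding a dynamic value of sum type moves a static context across the dynamic case expression, so the context is applied at specialization time to values whose injection tag is statically known.

```haskell
-- Illustrative residual (dynamic) syntax; all names are ours.
data Dyn = DVar String
         | DInl Dyn | DInr Dyn
         | DCase Dyn String Dyn String Dyn   -- case d of Inl x -> e1 | Inr y -> e2
         deriving Show

-- Eta-expansion for sums as a binding-time coercion: given a dynamic value d
-- and a static context k, build
--   case d of Inl x -> k (Inl x) | Inr y -> k (Inr y)
-- so that k sees values with a known tag, enabling "The Trick".
etaSum :: Dyn -> (Dyn -> Dyn) -> Dyn
etaSum d k = DCase d "x" (k (DInl (DVar "x"))) "y" (k (DInr (DVar "y")))
```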

    Lazy Evaluation and Delimited Control

    The call-by-need lambda calculus provides an equational framework for reasoning syntactically about lazy evaluation. This paper examines its operational characteristics. By a series of reasoning steps, we systematically unpack the standard-order reduction relation of the calculus and discover a novel abstract machine definition which, like the calculus, goes "under lambdas." We prove that machine evaluation is equivalent to standard-order evaluation. Unlike traditional abstract machines, delimited control plays a significant role in the machine's behavior. In particular, the machine replaces the manipulation of a heap using store-based effects with disciplined management of the evaluation stack using control-based effects. In short, state is replaced with control. To further articulate this observation, we present a simulation of call-by-need in a call-by-value language using delimited control operations.
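
    For contrast with the control-based machine described above, here is the conventional store-based picture it replaces: a heap of updatable thunks, each forced at most once. This is a generic sketch of heap-based call-by-need, not the paper's machine; all names are ours.

```haskell
import qualified Data.Map as M

data Term  = Var String | Lam String Term | App Term Term
data Value = Closure String Term Env
type Loc   = Int
type Env   = M.Map String Loc
data Cell  = Thunk Term Env | Done Value
type Heap  = M.Map Loc Cell

-- Store-based call-by-need: a variable forces its thunk once, then updates the heap.
eval :: Heap -> Env -> Term -> (Heap, Value)
eval h env (Lam x b) = (h, Closure x b env)
eval h env (Var x)   = case h M.! (env M.! x) of
  Done v    -> (h, v)                                     -- already forced: reuse
  Thunk t e -> let (h', v) = eval h e t                   -- force once ...
               in (M.insert (env M.! x) (Done v) h', v)   -- ... then update the heap
eval h env (App f a) =
  let (h', Closure x b e) = eval h env f
      l   = M.size h'                                     -- fresh location (sketch)
      h'' = M.insert l (Thunk a env) h'
  in  eval h'' (M.insert x l e) b

-- Example: eval M.empty M.empty (App (Lam "x" (Var "x")) (Lam "y" (Var "y")))
```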

    The exp-log normal form of types

    Lambda calculi with algebraic data types lie at the core of functional programming languages and proof assistants, but they conceal at least two fundamental theoretical problems, already in the presence of the simplest non-trivial data type, the sum type. First, we do not know of an explicit and implemented algorithm for deciding beta-eta-equality of terms, in spite of the first decidability results having been proven two decades ago. Second, it is not clear how to decide when two types are essentially the same, i.e. isomorphic, in spite of the meta-theoretic results on decidability of the isomorphism. In this paper, we present the exp-log normal form of types, derived from the representation of exponential polynomials via the unary exponential and logarithmic functions, to which any type built from arrows, products, and sums can be isomorphically mapped. The type normal form can be used as a simple heuristic for deciding type isomorphism, thanks to the fact that it is a systematic application of the high-school identities. We then show that the type normal form allows us to reduce the standard beta-eta equational theory of the lambda calculus to a specialized version of itself, while preserving the completeness of equality on terms. We end by describing an alternative representation of normal terms of the lambda calculus with sums, together with a Coq-implemented converter into/from our new term calculus. The difference from the only other previously implemented heuristic for deciding interesting instances of eta-equality, by Balat, Di Cosmo, and Fiore, is that we exploit the type information of terms substantially, which often allows us to obtain a canonical representation of terms without performing sophisticated term analyses.
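
    The flavour of such type rewriting can be conveyed by a rough Haskell sketch (our own simplification, not the paper's algorithm): reading a -> b as the exponential b^a, repeatedly apply high-school identities such as (b × c)^a ≅ b^a × c^a, c^(a+b) ≅ c^a × c^b, (c^b)^a ≅ c^(a×b), and distributivity of products over sums.

```haskell
-- Illustrative type syntax and rewriting; names and strategy are ours.
data Ty = Atom String | Ty :*: Ty | Ty :+: Ty | Ty :->: Ty
  deriving (Eq, Show)
infixr 2 :->:
infixr 3 :+:
infixr 4 :*:

-- One pass of isomorphism-preserving rewrites (high-school identities).
step :: Ty -> Ty
step (a :->: (b :*:  c)) = (a :->: b) :*: (a :->: c)   -- (b*c)^a  =  b^a * c^a
step (a :->: (b :->: c)) = (a :*: b) :->: c            -- (c^b)^a  =  c^(a*b)
step ((a :+: b) :->: c)  = (a :->: c) :*: (b :->: c)   -- c^(a+b)  =  c^a * c^b
step (a :*: (b :+: c))   = (a :*: b) :+: (a :*: c)     -- a*(b+c)  =  a*b + a*c
step ((a :+: b) :*: c)   = (a :*: c) :+: (b :*: c)
step (a :->: b)          = step a :->: step b
step (a :*: b)           = step a :*: step b
step (a :+: b)           = step a :+: step b
step t                   = t

-- Iterate to a fixed point.
normalTy :: Ty -> Ty
normalTy t = let t' = step t in if t' == t then t else normalTy t'

-- Example:
--   normalTy ((Atom "a" :+: Atom "b") :->: Atom "c")
--     ==>  (Atom "a" :->: Atom "c") :*: (Atom "b" :->: Atom "c")
```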

    An Invariant Cost Model for the Lambda Calculus

    We define a new cost model for the call-by-value lambda-calculus satisfying the invariance thesis. That is, under the proposed cost model, Turing machines and the call-by-value lambda-calculus can simulate each other within a polynomial time overhead. The model only relies on combinatorial properties of usual beta-reduction, without any reference to a specific machine or evaluator. In particular, the cost of a single beta reduction is proportional to the difference between the size of the redex and the size of the reduct. In this way, the total cost of normalizing a lambda term will take into account the size of all intermediate results (as well as the number of steps to normal form). Comment: 19 pages
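
    To make the measure concrete, here is a small sketch of one plausible reading of the abstract (our own reading, not the paper's exact definitions): with the usual size of lambda terms, a beta step from redex to reduct is charged roughly in proportion to how much the term grows or shrinks.

```haskell
-- Term size for the lambda calculus (de Bruijn indices); names are ours.
data Term = Var Int | Lam Term | App Term Term

size :: Term -> Int
size (Var _)   = 1
size (Lam t)   = 1 + size t
size (App t u) = 1 + size t + size u

-- One plausible reading of the abstract's cost of a single beta step:
stepCost :: Term -> Term -> Int
stepCost redex reduct = max 1 (abs (size reduct - size redex))

-- Summing step costs over a reduction sequence makes the total cost reflect
-- the sizes of all intermediate terms, not just the number of steps.
totalCost :: [Term] -> Int
totalCost ts = sum (zipWith stepCost ts (drop 1 ts))
```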

    Fourier Series Formalization in ACL2(r)

    We formalize some basic properties of Fourier series in the logic of ACL2(r), which is a variant of ACL2 that supports reasoning about the real and complex numbers by way of non-standard analysis. More specifically, we extend a framework for formally evaluating definite integrals of real-valued, continuous functions using the Second Fundamental Theorem of Calculus. Our extended framework is also applied to functions containing free arguments. Using this framework, we are able to prove the orthogonality relationships between trigonometric functions, which are the essential properties in Fourier series analysis. The sum rule for definite integrals of indexed sums is also formalized by applying the extended framework along with the First Fundamental Theorem of Calculus and the sum rule for differentiation. The Fourier coefficient formulas of periodic functions are then formalized from the orthogonality relations and the sum rule for integration. Consequently, the uniqueness of Fourier sums is a straightforward corollary. We also present our formalization of the sum rule for definite integrals of infinite series in ACL2(r). Part of this task is to prove the Dini Uniform Convergence Theorem and the continuity of a limit function under certain conditions. A key technique in our proofs of these theorems is to apply the overspill principle from non-standard analysis. Comment: In Proceedings ACL2 2015, arXiv:1509.0552
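
    For reference, the orthogonality relations and coefficient formulas the abstract refers to are the standard ones, stated here over [-π, π] for integers m, n ≥ 1 (the statements formalized in ACL2(r) may differ in presentation):

```latex
\int_{-\pi}^{\pi} \sin(mx)\cos(nx)\,dx = 0, \qquad
\int_{-\pi}^{\pi} \sin(mx)\sin(nx)\,dx
  = \int_{-\pi}^{\pi} \cos(mx)\cos(nx)\,dx = \pi\,\delta_{mn}
```

    and, for a 2π-periodic function f, the Fourier coefficients and series are

```latex
a_n = \frac{1}{\pi}\int_{-\pi}^{\pi} f(x)\cos(nx)\,dx, \qquad
b_n = \frac{1}{\pi}\int_{-\pi}^{\pi} f(x)\sin(nx)\,dx, \qquad
f(x) \sim \frac{a_0}{2} + \sum_{n \ge 1}\bigl(a_n\cos(nx) + b_n\sin(nx)\bigr)
```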

    Program transformation for functional circuit descriptions

    We model sequential synchronous circuits at the logical level by signal-processing programs in an extended lambda calculus Lpor with letrec, constructors, case, and parallel or (por), employing contextual equivalence. The model describes gates as (parallel) boolean operators and memory using a delay, which in turn is modeled as a shift of the list of signals, and it also permits constructive cycles thanks to the parallel or. It opens the possibility of a large set of program transformations that correctly transform the expressions, and thus the represented circuits, and it provides basic tools for equivalence testing and for optimizing circuits. A further application is the correct manipulation, by transformations, of software components combined with circuits. The main part of our work consists of proof methods for correct transformations of expressions in the lambda calculus Lpor, together with a proposal of the appropriate program transformations.
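
    As a small illustration of the modelling idea (our own sketch in plain Haskell, not the calculus Lpor itself, and without parallel or): a signal is the list of its values over clock ticks, gates are pointwise boolean operators, a delay shifts the list by one tick, and feedback corresponds to letrec.

```haskell
type Signal = [Bool]   -- the value of a wire at clock ticks 0, 1, 2, ...

-- Gates as pointwise boolean operators on signals.
andG, orG, xorG :: Signal -> Signal -> Signal
andG = zipWith (&&)
orG  = zipWith (||)
xorG = zipWith (/=)

notG :: Signal -> Signal
notG = map not

-- Memory: a one-tick delay, modelled as a shift of the list of signals.
delay :: Bool -> Signal -> Signal
delay v0 s = v0 : s

-- Feedback (letrec in the calculus): a toggle flip-flop.
toggle :: Signal -> Signal
toggle enable = out
  where out = delay False (xorG enable out)

-- take 6 (toggle (repeat True))  ==>  [False,True,False,True,False,True]
```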