    Full Abstraction for the Resource Lambda Calculus with Tests, through Taylor Expansion

    We study the semantics of a resource-sensitive extension of the lambda calculus in a canonical reflexive object of a category of sets and relations, a relational version of Scott's original model of the pure lambda calculus. This calculus is related to Boudol's resource calculus and is derived from Ehrhard and Regnier's differential extension of Linear Logic and of the lambda calculus. We extend it with new constructions, to be understood as implementing a very simple exception mechanism, and with a "must" parallel composition. These new operations allow us to associate a context of this calculus with any point of the model and to prove full abstraction for the finite sub-calculus in which ordinary lambda calculus application is not allowed. The result is then extended to the full calculus by means of a Taylor expansion formula. As an intermediate result we prove that the exception mechanism is not essential in the finite sub-calculus.
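
    For orientation, the Taylor expansion underlying this line of work (the clauses below are the standard ones due to Ehrhard and Regnier, not this paper's extension with tests and parallel composition) replaces each application by an infinite weighted sum of multilinear applications:

        \mathcal{T}(x) = x
        \mathcal{T}(\lambda x.\,M) = \lambda x.\,\mathcal{T}(M)
        \mathcal{T}(M\,N) = \sum_{n \ge 0} \frac{1}{n!}\; \mathcal{T}(M)\,[\underbrace{\mathcal{T}(N),\dots,\mathcal{T}(N)}_{n}]

    Here [t, ..., t] is a finite multiset (bag) of resource arguments: each summand may use its argument exactly as many times as the bag provides, which is what makes the expansion resource-sensitive.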

    Normalizing the Taylor expansion of non-deterministic λ-terms, via parallel reduction of resource vectors

    It has been known since Ehrhard and Regnier's seminal work on the Taylor expansion of λ-terms that this operation commutes with normalization: the expansion of a λ-term is always normalizable, and its normal form is the expansion of the Böhm tree of the term. We generalize this result to the non-uniform setting of the algebraic λ-calculus, i.e. the λ-calculus extended with linear combinations of terms. This requires us to tackle two difficulties: foremost is the fact that Ehrhard and Regnier's techniques rely heavily on the uniform, deterministic nature of the ordinary λ-calculus, and thus cannot be adapted; second is the absence of any satisfactory generic extension of the notion of Böhm tree in the presence of quantitative non-determinism, which is reflected by the fact that the Taylor expansion of an algebraic λ-term is not always normalizable. Our solution is to provide a fine-grained study of the dynamics of β-reduction under Taylor expansion, by introducing a notion of reduction on resource vectors, i.e. infinite linear combinations of resource λ-terms. The latter form the multilinear fragment of the differential λ-calculus, and resource vectors are the target of the Taylor expansion of λ-terms. We show that the reduction of resource vectors contains the image of any β-reduction step, from which we deduce that Taylor expansion and normalization commute on the nose. We moreover identify a class of algebraic λ-terms, encompassing both normalizable algebraic λ-terms and arbitrary ordinary λ-terms: the expansion of these is always normalizable, which guides the definition of a generalization of Böhm trees to this setting.
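
    A minimal sketch of the resource λ-terms mentioned above, in Haskell (representation and names are ours, for illustration only; α-conversion and capture are ignored to keep it short). Applications take a bag of arguments, and substitution is multilinear: each occurrence of the variable consumes exactly one bag element, the result being the formal sum of all ways to hand the elements out, with the empty sum as the zero vector.

        import Data.List (permutations)

        type Name = String

        -- Resource λ-terms: an application carries a finite multiset (bag) of arguments.
        data RTerm = Var Name | Lam Name RTerm | App RTerm [RTerm]
          deriving (Show, Eq)

        -- Free occurrences of x (occurrences under a rebinding λx do not count).
        occ :: Name -> RTerm -> Int
        occ x (Var y)     = if x == y then 1 else 0
        occ x (Lam y t)   = if x == y then 0 else occ x t
        occ x (App t bag) = occ x t + sum (map (occ x) bag)

        -- Multilinear substitution t<bag/x>: 0 ([]) on a count mismatch,
        -- otherwise one summand per way of assigning bag elements to occurrences.
        msubst :: Name -> [RTerm] -> RTerm -> [RTerm]
        msubst x bag t
          | occ x t /= length bag = []
          | otherwise             = [ fst (go us t) | us <- permutations bag ]
          where
            go us (Var y)
              | y == x    = (head us, tail us)   -- consume one resource
              | otherwise = (Var y, us)
            go us (Lam y b)
              | y == x    = (Lam y b, us)        -- x rebound: nothing to do below
              | otherwise = let (b', us') = go us b in (Lam y b', us')
            go us (App f as) =
              let (f', us1)  = go us f
                  (as', us2) = goBag us1 as
              in  (App f' as', us2)
            goBag us []       = ([], us)
            goBag us (a : as) =
              let (a', us1)  = go us a
                  (as', us2) = goBag us1 as
              in  (a' : as', us2)

        -- A root resource β-step: (λx.t)[u1,...,un] reduces to t<[u1,...,un]/x>.
        step :: RTerm -> [RTerm]
        step (App (Lam x t) bag) = msubst x bag t
        step t                   = [t]

    Lifting such steps pointwise to infinite linear combinations of RTerms is what the paper's reduction on resource vectors makes precise.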

    Call-by-value non-determinism in a linear logic type discipline

    We consider the call-by-value lambda-calculus extended with a may-convergent non-deterministic choice and a must-convergent parallel composition. Inspired by recent works on the relational semantics of linear logic and non-idempotent intersection types, we endow this calculus with a type system based on Girard's so-called second translation of intuitionistic logic into linear logic. We prove that a term is typable if and only if it is converging, and that its typing tree carries enough information to give a bound on the length of its lazy call-by-value reduction. Moreover, when the typing tree is minimal, this bound becomes the exact length of the reduction.
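
    As a point of reference, Girard's second ("boring") translation, which underlies call-by-value decompositions of this kind, is commonly written with the exponential wrapping the whole arrow (stated here only for orientation, not as this paper's exact clause):

        (A \Rightarrow B)^{v} \;=\; \mathord{!}\,(A^{v} \multimap B^{v})

    so that values receive !-types, matching the intuition that a call-by-value function is itself a duplicable value.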

    ASMs and Operational Algorithmic Completeness of Lambda Calculus

    We show that the lambda calculus is a computation model which can simulate, step by step, any sequential deterministic algorithm for any computable function over integers, words, or any other datatype. More formally, given an algorithm over a family of computable functions (taken as primitive tools, i.e., as oracle functions for the algorithm), for every constant K big enough, each computation step of the algorithm can be simulated by exactly K successive reductions in a natural extension of the lambda calculus with constants for the functions in the considered family. The proof is based on a fixed-point technique in the lambda calculus and on Gurevich's Sequential Thesis, which allows one to identify sequential deterministic algorithms with Abstract State Machines. This extends to algorithms for partial computable functions, in such a way that finite computations ending with exceptions are associated with finite reductions leading to terms with a particular, very simple feature.

    Comment: 37 pages
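
    The constant-cost flavour of the simulation can be glimpsed from the Turing fixed-point combinator, a standard device (not necessarily the paper's exact construction): it unfolds a recursively defined transition function in a fixed number of β-reductions, independently of the function.

        Θ = (λx.λf. f (x x f)) (λx.λf. f (x x f))

        Θ F  →β  (λf. f ((λx.λf. f (x x f)) (λx.λf. f (x x f)) f)) F  →β  F (Θ F)

    So Θ F reaches F (Θ F) in exactly two β-steps whatever F is; iterating a step function in this style therefore costs the same number of reductions per simulated algorithm step.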

    Taylor expansion for Call-By-Push-Value

    The connection between the Call-By-Push-Value lambda-calculus introduced by Levy and Linear Logic introduced by Girard has been widely explored through a denotational view reflecting the precise handling of resources in this language. We take a further step in this direction and apply the Taylor expansion introduced by Ehrhard and Regnier. We define a resource lambda-calculus whose terms can be used to approximate terms of Call-By-Push-Value. We show that this approximation is coherent with reduction and with the translations of the Call-By-Name and Call-By-Value strategies into Call-By-Push-Value.
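
    For readers unfamiliar with Levy's calculus, the value/computation split that Call-By-Push-Value rests on can be sketched as two mutually recursive syntactic categories (a Haskell sketch with illustrative constructor names, not the paper's definitions):

        -- Values are inert data; computations are the things that run.
        -- Thunk freezes a computation into a value; Force resumes it.
        data Val  = VVar String
                  | Thunk Comp
                  deriving Show

        data Comp = Force Val
                  | Lam String Comp        -- pop an argument, then run
                  | App Comp Val           -- push a value, then run
                  | Ret Val                -- "return V": a finished computation
                  | Bind Comp String Comp  -- "M to x. N": sequence computations
                  deriving Show

    Both call-by-name and call-by-value embed into this syntax, which is why an approximation result stated at the Call-By-Push-Value level covers both strategies at once.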

    A call-by-need lambda-calculus with locally bottom-avoiding choice: context lemma and correctness of transformations

    We present a higher-order call-by-need lambda calculus enriched with constructors, case-expressions, recursive letrec-expressions, a seq-operator for sequential evaluation, and a non-deterministic operator amb which is locally bottom-avoiding. We use a small-step operational semantics in the form of a normal order reduction. As equational theory we use contextual equivalence, i.e. terms are equal if, plugged into an arbitrary program context, their termination behaviour is the same. We use a combination of may- as well as must-convergence, which is appropriate for non-deterministic computations. We develop several proof tools for proving the correctness of program transformations. We provide a context lemma for may- as well as must-convergence which restricts the number of contexts that need to be examined for proving contextual equivalence. In combination with so-called complete sets of commuting and forking diagrams, we show that all the deterministic reduction rules, and also some additional transformations, preserve contextual equivalence. In contrast to other approaches, our syntax as well as our semantics does not make use of a heap for sharing expressions. Instead, we represent shared expressions explicitly via letrec-bindings.
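
    Bottom-avoidance means amb may discard a divergent argument as long as the other converges. A runnable Haskell approximation (using the async package's race; this is a concrete scheduler, not the paper's small-step semantics) is:

        import Control.Concurrent.Async (race)

        -- Run both computations concurrently and keep whichever finishes:
        -- if one diverges, the other's result is still delivered, which is
        -- the bottom-avoiding behaviour expected of amb.
        amb :: IO a -> IO a -> IO a
        amb m n = either id id <$> race m n

    For instance, amb (forever (return ())) (return 42) yields 42 even though its first argument loops forever.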

    On Constructor Rewrite Systems and the Lambda Calculus

    We prove that orthogonal constructor term rewrite systems and the lambda calculus with weak call-by-value reduction (i.e., where no reduction is allowed under the scope of a lambda-abstraction) can simulate each other with a linear overhead. In particular, weak call-by-value beta-reduction can be simulated by an orthogonal constructor term rewrite system in the same number of reduction steps. Conversely, each reduction in a term rewrite system can be simulated by a constant number of beta-reduction steps. This is relevant to implicit computational complexity, because the number of beta steps to normal form is polynomially related to the actual cost (that is, as performed on a Turing machine) of normalization under weak call-by-value reduction. Orthogonal constructor term rewrite systems and the lambda calculus are thus both polynomially related to Turing machines, taking their natural parameters as the notion of cost.

    Comment: 27 pages. arXiv admin note: substantial text overlap with arXiv:0904.412
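
    Haskell's own pattern-matching equations are a convenient way to read "orthogonal constructor term rewrite system": left-hand sides are built from constructors and do not overlap. A standard toy example (ours, not the paper's encoding) is addition on unary numerals:

        data Nat = Z | S Nat

        -- Two non-overlapping constructor rules, read as a rewrite system:
        --   add(Z, m)   -> m
        --   add(S n, m) -> S(add(n, m))
        add :: Nat -> Nat -> Nat
        add Z     m = m
        add (S n) m = S (add n m)

    In the paper's terms, once Nat and add are compiled to the lambda calculus, each application of such a rule costs only a constant number of weak call-by-value beta-steps, and conversely each beta-step can be mirrored by a single rewrite step.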