2 research outputs found

    An Invariant Cost Model for the Lambda Calculus

    We define a new cost model for the call-by-value lambda-calculus satisfying the invariance thesis. That is, under the proposed cost model, Turing machines and the call-by-value lambda-calculus can simulate each other within a polynomial time overhead. The model relies only on combinatorial properties of the usual beta-reduction, without any reference to a specific machine or evaluator. In particular, the cost of a single beta reduction is proportional to the difference between the size of the redex and the size of the reduct. In this way, the total cost of normalizing a lambda term takes into account the size of all intermediate results (as well as the number of steps to normal form).
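
    As a concrete illustration of this cost model, here is a minimal Python sketch. The term representation, the helper names, and the reading cost = max(1, size(reduct) - size(redex)) are assumptions made for illustration; they are not the paper's formal definitions.

        from dataclasses import dataclass

        # Minimal lambda-term representation: variables, abstractions, applications.
        @dataclass(frozen=True)
        class Var:
            name: str

        @dataclass(frozen=True)
        class Lam:
            param: str
            body: object

        @dataclass(frozen=True)
        class App:
            fun: object
            arg: object

        def size(t):
            # Number of nodes in the term's syntax tree.
            if isinstance(t, Var):
                return 1
            if isinstance(t, Lam):
                return 1 + size(t.body)
            return 1 + size(t.fun) + size(t.arg)

        def subst(t, x, s):
            # Substitution t[x := s]; assumes bound names are distinct from
            # the free names of s, so no variable capture can occur.
            if isinstance(t, Var):
                return s if t.name == x else t
            if isinstance(t, Lam):
                return t if t.param == x else Lam(t.param, subst(t.body, x, s))
            return App(subst(t.fun, x, s), subst(t.arg, x, s))

        def beta_step(redex):
            # Fire one beta redex (lam x. b) a and charge max(1, growth in size),
            # one plausible reading of "proportional to the difference between
            # the size of the redex and the size of the reduct".
            assert isinstance(redex, App) and isinstance(redex.fun, Lam)
            reduct = subst(redex.fun.body, redex.fun.param, redex.arg)
            return reduct, max(1, size(reduct) - size(redex))

        # A duplicating redex: (lam x. x x) applied to a growing argument.
        arg = Var("y")
        for _ in range(4):
            arg = App(arg, arg)       # argument doubles in size each round
            _, cost = beta_step(App(Lam("x", App(Var("x"), Var("x"))), arg))
            print(size(arg), cost)    # charged cost tracks the duplicated size

    Running the loop shows the charged cost growing linearly with the size of the duplicated argument: a single step that copies a large subterm is expensive, which is exactly the size-of-intermediate-results effect the model is designed to account for.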

    What is an Efficient Implementation of the lambda-calculus?

    We propose to measure the efficiency of any implementation of the lambda-calculus as a function of a new parameter nu, itself a function of any lambda-expression. Complexity is expressed here as a function of nu just as runtime is expressed as a function of the input size n in ordinary analysis of algorithms. This enables implementations to be compared for worst-case efficiency. We argue that any implementation must have complexity Omega(nu), i.e. a linear lower bound. Furthermore, we show that implementations based upon Turner Combinators or Hughes Super-combinators have complexities 2^Omega(nu), i.e. an exponential lower bound. It is open whether any implementation of polynomial complexity, nu^O(1), exists, although some implementations have been implicitly claimed to have this complexity.
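
    The flavour of the exponential behaviour behind the 2^Omega(nu) bound can be glimpsed with naive S/K/I bracket abstraction, a simpler relative of Turner's translation. The sketch below is an illustration under assumptions: the tuple encoding is ad hoc, and Turner's actual scheme adds optimizing combinators (B, C, and friends) that this naive version omits, so it only demonstrates the size blowup, not the paper's lower-bound argument.

        # Combinator constants are the strings 'S', 'K', 'I'; applications are
        # ('app', f, a); lambda terms also use ('var', x) and ('lam', x, body).

        def free_in(x, t):
            if isinstance(t, str):                  # combinator constant
                return False
            if t[0] == 'var':
                return t[1] == x
            if t[0] == 'lam':
                return t[1] != x and free_in(x, t[2])
            return free_in(x, t[1]) or free_in(x, t[2])

        def bracket(x, t):
            # Naive bracket abstraction [x]t for a lambda-free term t.
            if t == ('var', x):
                return 'I'
            if not free_in(x, t):
                return ('app', 'K', t)
            _, f, a = t                             # must be an application here
            return ('app', ('app', 'S', bracket(x, f)), bracket(x, a))

        def translate(t):
            # Eliminate every lambda, innermost first.
            if isinstance(t, str) or t[0] == 'var':
                return t
            if t[0] == 'lam':
                return bracket(t[1], translate(t[2]))
            return ('app', translate(t[1]), translate(t[2]))

        def size(t):
            if isinstance(t, str) or t[0] == 'var':
                return 1
            return 1 + size(t[1]) + size(t[2])

        # lam x1 ... lam xn . x1 x2 ... xn: translated size grows geometrically.
        for n in range(1, 8):
            body = ('var', 'x1')
            for i in range(2, n + 1):
                body = ('app', body, ('var', 'x%d' % i))
            term = body
            for i in range(n, 0, -1):
                term = ('lam', 'x%d' % i, term)
            print(n, size(translate(term)))

    Each abstraction pass roughly triples the term, since [x](M N) = S ([x]M) ([x]N) duplicates structure, so n nested lambdas yield a combinator term of size on the order of 3^n.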