
    An estimation for the lengths of reduction sequences of the λμρθ-calculus

    Since it was realized that the Curry-Howard isomorphism can be extended to classical logic as well, several calculi have appeared as candidates for encoding proofs in classical logic. One of the most extensively studied among them is Parigot's λμ-calculus. In this paper, based on the result Xi presented for the λ-calculus, we give an upper bound for the lengths of the reduction sequences in the λμ-calculus extended with the ρ- and θ-rules. Surprisingly, our results show that the new terms and the new rules do not add to the computational complexity of the calculus, despite the fact that a μ-abstraction is able to consume an unbounded number of arguments by virtue of the μ-rule.
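
    (For orientation, the rules in question are usually presented as follows; this is the standard formulation in our notation, not necessarily the paper's exact one:

        μ:  (μα.u) v  →  μα.u[[α]w := [α](w v)]    -- the abstraction absorbs its argument
        ρ:  [β]μα.u   →  u[α := β]                 -- renaming of μ-variables
        θ:  μα.[α]u   →  u,  if α not free in u    -- η-like simplification

    The μ-rule leaves the μ-abstraction in place, which is why it can go on consuming arguments.)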

    Derivation Lengths Classification of Gödel's T Extending Howard's Assignment

    Let T be Gödel's system of primitive recursive functionals of finite type in the lambda formulation. We define, by constructive means using recursion on nested multisets, a multivalued function I from the set of terms of T into the set of natural numbers such that whenever a term a reduces to a term b and a natural number I(a) is assigned to a, a natural number I(b) can be assigned to b with I(a) greater than I(b). The construction of I is based on Howard's 1970 ordinal assignment for T and Weiermann's 1996 treatment of T in the combinatory logic version. As a corollary we obtain an optimal derivation length classification for the lambda formulation of T and its fragments. Compared with Weiermann's 1996 exposition, this article yields solutions to several non-trivial problems arising from dealing with lambda terms instead of combinatory logic terms. It is expected that the methods developed here can be applied to other higher-order rewrite systems, resulting in new powerful termination orderings, since T is a paradigm for such systems.
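
    (To fix notation: the T in question is the simply typed λ-calculus with natural numbers and a primitive recursor at every finite type. A minimal Haskell transliteration of one common formulation of the recursor, illustrative only — the paper works with the formal term system, not with Haskell:

        import Numeric.Natural (Natural)

        -- recursor at an arbitrary type a: rec b s n computes by
        -- primitive recursion on n, with base case b and step s
        rec :: a -> (a -> Natural -> a) -> Natural -> a
        rec b _ 0 = b
        rec b s n = s (rec b s (n - 1)) (n - 1)

        -- example: addition defined by recursion on the second argument
        plus :: Natural -> Natural -> Natural
        plus m = rec m (\acc _ -> acc + 1)

    Reduction sequences of such terms are what the ordinal assignment I measures.)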

    Extracting Herbrand Disjunctions by Functional Interpretation

    Carrying out a suggestion by Kreisel, we adapt Gödel's functional interpretation to ordinary first-order predicate logic (PL) and thus devise an algorithm to extract Herbrand terms from PL-proofs. The extraction is carried out in an extension of PL to higher types. The algorithm consists of two main steps: first we extract a functional realizer; next we compute the β-normal form of the realizer, from which the Herbrand terms can be read off. Even though the extraction is carried out in the extended language, the terms are ordinary PL-terms. In contrast to approaches to Herbrand's theorem based on cut elimination or epsilon-elimination, this extraction technique is, except for the normalization step, of low polynomial complexity, fully modular, and furthermore allows an analysis of the structure of the Herbrand terms, in the spirit of Kreisel, already prior to the normalization step. It is expected that the implementation of functional interpretation in Schwichtenberg's MINLOG system can be adapted to yield an efficient Herbrand-term extraction tool.
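
    (As a reminder of the shape of the output, Herbrand's theorem in its simplest case: if PL proves ∃x A(x) with A quantifier-free, then there are terms t_1, ..., t_n such that A(t_1) ∨ ... ∨ A(t_n) is provable without quantifier inferences; the extracted Herbrand terms are exactly such t_i.)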

    The Bang Calculus Revisited

    Call-by-Push-Value (CBPV) is a programming paradigm subsuming both Call-by-Name (CBN) and Call-by-Value (CBV) semantics. The paradigm was recently modelled by means of the Bang Calculus, a term language connecting CBPV and Linear Logic. This paper presents a revisited version of the Bang Calculus, called λ!, enjoying some important properties missing in the original system. Indeed, the new calculus integrates commutative conversions to unblock value redexes while being confluent at the same time. A second contribution is related to non-idempotent types. We provide a quantitative type system for our λ!-calculus, and we show that the length of the (weak) reduction of a typed term to its normal form plus the size of this normal form is bounded by the size of its type derivation. We also explore the properties of this type system with respect to CBN/CBV translations. We keep the original CBN translation from the λ-calculus to the Bang Calculus, which preserves normal forms and is sound and complete with respect to the (quantitative) type system for CBN. However, in the case of CBV, we reformulate both the translation and the type system to restore two main properties: preservation of normal forms and completeness. Last but not least, the quantitative system is refined to a tight one, which transforms the previous upper bound on the length of reduction to normal form plus its size into two independent exact measures for them.
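
    (Schematically, and in our notation rather than the paper's: if Φ is a derivation of Γ ⊢ t : σ in the quantitative system and t reduces weakly in k steps to a normal form u, then k + |u| ≤ |Φ|, where |·| measures the size of a term or derivation; the tight refinement turns this single bound into two exact measures, one for k and one for |u|.)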

    A complete proof of the safety of Nöcker's strictness analysis

    This paper proves correctness of Nöcker's method of strictness analysis, implemented in the Clean compiler, which is an effective way of performing strictness analysis in lazy functional languages based on their operational semantics. We improve upon the work of Clark, Hankin and Hunt on the correctness of the abstract reduction rules. Our method fully considers the cycle detection rules, which are the main strength of Nöcker's strictness analysis. Our algorithm SAL is a reformulation of Nöcker's strictness analysis algorithm in a higher-order call-by-need lambda calculus with case, constructors, letrec, and seq, extended by set constants like Top or Inf denoting sets of expressions. It is also possible to define new set constants by recursive equations with a greatest fixpoint semantics. The operational semantics is a small-step semantics. Equality of expressions is defined by a contextual semantics that observes termination of expressions. Basically, SAL is a non-termination checker. The proof of its correctness, and hence of Nöcker's strictness analysis, is based mainly on an exact analysis of the lengths of normal order reduction sequences, the main measure being the number of 'essential' reductions in a normal order reduction sequence. Our tools and results provide new insights into call-by-need lambda calculi, the role of sharing in functional programming languages, and strictness analysis in general. The correctness result provides a foundation for Nöcker's strictness analysis in Clean, and also for its use in Haskell.
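
    (To recall what strictness analysis buys, a generic Haskell illustration, not taken from the paper: a function f is strict when f ⊥ = ⊥, and only then may a compiler force its argument eagerly without changing the meaning of the lazy program:

        -- strict: the result always needs x, so eager evaluation is safe
        f :: Int -> Int
        f x = x + 1

        -- not strict: forcing the argument of g could turn a
        -- terminating program into a diverging one
        g :: Int -> Int
        g _ = 0

        -- what the compiler may do once f is known to be strict:
        callEager :: Int -> Int
        callEager x = x `seq` f x

    Proving such analyses safe requires exactly the kind of contextual, operational reasoning the paper develops.)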

    Types as Resources for Classical Natural Deduction

    We define two resource-aware typing systems for the λμ-calculus based on non-idempotent intersection and union types. The non-idempotent approach provides very simple combinatorial arguments, based on decreasing measures of type derivations, to characterize head and strongly normalizing terms. Moreover, typability provides upper bounds for the length of head-reduction sequences and of maximal reduction sequences.
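
    (The combinatorial argument has the usual shape for non-idempotent types, sketched here in our own notation: each head-reduction step t →_h t' transforms a derivation Φ of t into a derivation Φ' of t' with |Φ'| < |Φ|, so the length of any head-reduction sequence starting from a typed term t is bounded by |Φ|.)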

    Asymptotically almost all λ-terms are strongly normalizing

    We present a quantitative analysis of various (syntactic and behavioral) properties of random λ-terms. Our main results are that asymptotically all terms are strongly normalizing and that any fixed closed term almost never appears in a random term. Surprisingly, in combinatory logic (the translation of the λ-calculus into combinators), the result is exactly opposite: we show that almost all terms are not strongly normalizing. This is due to the fact that any fixed combinator almost always appears in a random combinator.