16 research outputs found

    Matrix Interpretations on Polyhedral Domains

    Get PDF
    We refine matrix interpretations for proving termination and complexity bounds of term rewrite systems by restricting them to domains that satisfy a system of linear inequalities. Admissibility of such a restriction is shown by certificates whose validity can be expressed as a constraint program. This refinement is orthogonal to other features of matrix interpretations (complexity bounds, dependency pairs), but can be used to improve complexity bounds, and we discuss its relation to the usable rules criterion. We present an implementation and experiments.
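
    To make the refinement concrete, here is a small illustrative sketch in LaTeX; the particular matrix, vector, and polyhedral domain are our own invention, not taken from the paper:

        \[
        [f](\vec{x}) = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} \vec{x} + \begin{pmatrix} 1 \\ 0 \end{pmatrix},
        \qquad
        D = \{\, \vec{x} \in \mathbb{N}^2 \mid x_1 \ge x_2 \,\}.
        \]

    Admissibility then amounts to checking that every interpretation maps tuples over D back into D; since D is cut out by linear inequalities, this check is itself expressible as a constraint program.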

    On the enumeration of closures and environments with an application to random generation

    Get PDF
    Environments and closures are two of the main ingredients of evaluation in lambda-calculus. A closure is a pair consisting of a lambda-term and an environment, whereas an environment is a list of lambda-terms assigned to free variables. In this paper we investigate some dynamic aspects of evaluation in lambda-calculus by considering the quantitative, combinatorial properties of environments and closures. Focusing on two classes of environments and closures, namely the so-called plain and closed ones, we consider the problem of their asymptotic counting and effective random generation. We provide an asymptotic approximation of the number of both plain environments and closures of size n. Using the associated generating functions, we construct effective samplers for both classes of combinatorial structures. Finally, we discuss the related problem of asymptotic counting and random generation of closed environments and closures.
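
    As a minimal sketch of these two notions (assuming a de Bruijn representation; the datatype and size convention are illustrative choices, not the paper's exact definitions):

        -- Lambda-terms in de Bruijn notation.
        data Term = Var Int        -- de Bruijn index
                  | Abs Term       -- abstraction
                  | App Term Term  -- application

        -- An environment assigns a list of terms to the free variables;
        -- a closure pairs a term with an environment.
        type Env = [Term]
        data Closure = Closure Term Env

        -- One possible size function, in the spirit of combinatorial
        -- enumeration (conventions for the size of an index vary).
        sizeTerm :: Term -> Int
        sizeTerm (Var n)   = n + 1
        sizeTerm (Abs t)   = 1 + sizeTerm t
        sizeTerm (App t u) = 1 + sizeTerm t + sizeTerm u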

    A new coinductive confluence proof for infinitary lambda calculus

    Full text link
    We present a new, formal coinductive proof of confluence and normalisation of Böhm reduction in infinitary lambda calculus. The proof is simpler than previous proofs of this result, and its technique is new: it is not merely a coinductive reformulation of an earlier proof. We have formalised the proof in the Coq proof assistant.
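
    For intuition, here is a small Haskell sketch (not the paper's Coq development): laziness makes the datatype below coinductive, so its values can denote infinitary lambda-terms, and Bot plays the role of the fresh constant to which Böhm reduction rewrites root-active (meaningless) subterms:

        -- Potentially infinite lambda-terms in de Bruijn notation.
        data ITerm = IVar Int          -- de Bruijn index
                   | ILam ITerm        -- abstraction
                   | IApp ITerm ITerm  -- application
                   | Bot               -- the constant that Böhm reduction introduces

        -- An infinite term \x. x (x (x ...)), built by corecursion.
        spine :: ITerm
        spine = ILam body
          where body = IApp (IVar 0) body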

    Confluence of nearly orthogonal infinitary term rewriting systems

    Get PDF
    We give a relatively simple coinductive proof of confluence, modulo equivalence of root-active terms, of nearly orthogonal infinitary term rewriting systems. Nearly orthogonal systems allow certain root overlaps, but no non-root overlaps. Using a slightly more complicated method, we also show confluence modulo equivalence of hypercollapsing terms. The condition we impose on root overlaps is similar to the condition used by Toyama in the context of finitary rewriting.
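
    A classic system in this spirit (our illustration, not an example taken from the paper) is parallel-or, whose first two rules overlap at the root but nowhere below it, and whose only critical pair is trivial:

        \begin{aligned}
        \mathsf{por}(\mathsf{true}, x) &\to \mathsf{true}\\
        \mathsf{por}(x, \mathsf{true}) &\to \mathsf{true}\\
        \mathsf{por}(\mathsf{false}, \mathsf{false}) &\to \mathsf{false}
        \end{aligned}

    The overlapping term por(true, true) reduces to true by either rule, so the root overlap is harmless.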

    (Leftmost-Outermost) Beta Reduction is Invariant, Indeed

    Get PDF
    Slot and van Emde Boas' weak invariance thesis states that reasonable machines can simulate each other within a polynomial overhead in time. Is lambda-calculus a reasonable machine? Is there a way to measure the computational complexity of a lambda-term? This paper presents the first complete positive answer to this long-standing problem. Moreover, our answer is completely machine-independent and based on a standard notion in the theory of lambda-calculus: the length of a leftmost-outermost derivation to normal form is an invariant cost model. Such a theorem cannot be proved by directly relating lambda-calculus with Turing machines or random access machines, because of the size explosion problem: there are terms that in a linear number of steps produce an exponentially long output. The first step towards the solution is to shift to a notion of evaluation for which the length and the size of the output are linearly related. This is done by adopting the linear substitution calculus (LSC), a calculus of explicit substitutions modeled after linear logic proof nets and admitting a decomposition of leftmost-outermost derivations with the desired property. Thus, the LSC is invariant with respect to, say, random access machines. The second step is to show that the LSC is invariant with respect to the lambda-calculus. The size explosion problem seems to imply that this is not possible: having the same notions of normal form, evaluation in the LSC is exponentially longer than in the lambda-calculus. We solve this impasse by introducing a new form of shared normal form and shared reduction, deemed useful. Useful evaluation avoids those steps that only unshare the output without contributing to beta-redexes, i.e. the steps that cause the blow-up in size. The main technical contribution of the paper is the definition of useful reductions and the thorough analysis of their properties.
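
    The size explosion problem can be seen on a standard family of terms (a folklore example, stated here for illustration):

        \[
        t_0 := \lambda y.\, y,
        \qquad
        t_{n+1} := \lambda y.\, t_n\, (y\, y),
        \qquad
        t_n\, a \;\to_\beta^{\,n+1}\; \underbrace{a\, a\, \cdots\, a}_{2^n \text{ occurrences of } a}
        \]

    Here t_n has size linear in n, yet its normal form is a complete binary application tree with 2^n leaves, so counting beta steps can only be a reasonable cost model once evaluation is decomposed so that sharing of the output is taken into account.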

    Termination of canonical context-sensitive rewriting and productivity of rewrite systems

    Full text link
    Termination of programs, i.e., the absence of infinite computations, ensures the existence of normal forms for all initial expressions, thus providing an essential ingredient for the definition of a normalization semantics for functional programs. In lazy functional languages, though, infinite data structures are often delivered as the outcome of computations. For instance, the list of all prime numbers can be returned as a never-ending stream of numerical expressions or data structures. If such streams are allowed, requiring termination is hopeless. In this setting, the notion of productivity can be used to provide an account of computations with infinite data structures, as it "captures the idea of computability, of progress of infinite-list programs" (B.A. Sijtsma, On the Productivity of Recursive List Definitions, ACM Transactions on Programming Languages and Systems 11(4):633-649, 1989). However, in the realm of Term Rewriting Systems, which can be seen as (first-order, untyped, unconditional) functional programs, termination of Context-Sensitive Rewriting (CSR) has been shown to be equivalent to productivity of rewrite systems through appropriate transformations. In this way, tools for proving termination of CSR can be used to prove productivity. In term rewriting, CSR is the restriction of rewriting that arises when reductions are allowed on selected arguments of function symbols only. In this paper we show that well-known results about the computational power of CSR are useful to better understand the existing connections between productivity of rewrite systems and termination of CSR, and also to obtain more powerful techniques to prove productivity of rewrite systems. Partially supported by the EU (FEDER), Spanish MINECO TIN 2013-45732-C4-1-P, and GV PROMETEOII/2015/013. Lucas Alba, S. (2015). Termination of canonical context-sensitive rewriting and productivity of rewrite systems. Electronic Proceedings in Theoretical Computer Science 200:18-31. https://doi.org/10.4204/EPTCS.200.2
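
    A standard stream example illustrates the connection (our illustration; the paper's treatment is more general). With the replacement map mu below, rewriting is allowed in the first argument of cons but frozen in its second argument, so context-sensitive rewriting terminates after producing one more stream element, while unrestricted rewriting diverges:

        \begin{aligned}
        \mathsf{nats} &\to \mathsf{cons}(0, \mathsf{incr}(\mathsf{nats}))\\
        \mathsf{incr}(\mathsf{cons}(x, xs)) &\to \mathsf{cons}(\mathsf{s}(x), \mathsf{incr}(xs))
        \end{aligned}
        \qquad
        \mu(\mathsf{cons}) = \mu(\mathsf{incr}) = \{1\}

    Termination of CSR for such a system certifies productivity: every finite prefix of the stream of natural numbers is computable in finitely many steps.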

    Applications and extensions of context-sensitive rewriting

    Full text link
    Context-sensitive rewriting is a restriction of term rewriting which is obtained by imposing replacement restrictions on the arguments of function symbols. It has proven useful to analyze computational properties of programs written in sophisticated rewriting-based programming languages such as CafeOBJ, Haskell, Maude, OBJ*, etc. Also, a number of extensions (e.g., to conditional rewriting or constrained equational systems) and generalizations (e.g., controlled rewriting or forbidden patterns) of context-sensitive rewriting have been proposed. In this paper, we provide an overview of these applications and related issues. (C) 2021 Elsevier Inc. All rights reserved. Partially supported by the EU (FEDER), and projects RTI2018-094403-B-C32 and PROMETEO/2019/098. Lucas Alba, S. (2021). Applications and extensions of context-sensitive rewriting. Journal of Logical and Algebraic Methods in Programming 121:1-33. https://doi.org/10.1016/j.jlamp.2021.100680
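
    The replacement-restriction idea has a simple operational reading. The following Haskell sketch (datatype and names are our own, purely illustrative) computes the positions at which context-sensitive rewriting is allowed to contract a redex:

        -- First-order terms: variables and function symbols with arguments.
        data Term = Var String | Fun String [Term]

        -- A replacement map assigns to each function symbol the (1-based)
        -- argument positions in which rewriting is allowed.
        type ReplacementMap = String -> [Int]

        -- Active positions: the root is always active; an argument is
        -- explored only if the replacement map allows that position.
        activePositions :: ReplacementMap -> Term -> [[Int]]
        activePositions _  (Var _)    = [[]]
        activePositions mu (Fun f ts) =
          [] : [ i : p | (i, t) <- zip [1 ..] ts
                       , i `elem` mu f
                       , p <- activePositions mu t ]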

    Expressing Ecumenical Systems in the λΠ-Calculus Modulo Theory

    Get PDF
    Systems in which classical and intuitionistic logics coexist are called ecumenical. Such a system allows for interoperability and hybridization between classical and constructive propositions and proofs. We study Ecumenical STT, a theory expressed in the logical framework of the λΠ-calculus modulo theory. We prove soundness and conservativity of four subtheories of Ecumenical STT with respect to constructive and classical predicate logic and simple type theory. We also prove the weak normalization of well-typed terms and thus the consistency of Ecumenical STT.
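
    As a hedged illustration of ecumenicity in Prawitz-style systems (our example, not a claim made in the abstract): classical and intuitionistic connectives coexist in one language, and classical principles hold only for the classical variants, e.g.

        \[
        \vdash A \vee_c \neg A
        \qquad\text{whereas}\qquad
        \not\vdash A \vee_i \neg A
        \]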
