
    Termination of canonical context-sensitive rewriting and productivity of rewrite systems

    Termination of programs, i.e., the absence of infinite computations, ensures the existence of normal forms for all initial expressions, thus providing an essential ingredient for the definition of a normalization semantics for functional programs. In lazy functional languages, though, infinite data structures are often delivered as the outcome of computations. For instance, the list of all prime numbers can be returned as a never-ending stream of numerical expressions or data structures. If such streams are allowed, requiring termination is hopeless. In this setting, the notion of productivity can be used to provide an account of computations with infinite data structures, as it "captures the idea of computability, of progress of infinite-list programs" (B.A. Sijtsma, On the Productivity of Recursive List Definitions, ACM Transactions on Programming Languages and Systems 11(4):633-649, 1989). However, in the realm of Term Rewriting Systems, which can be seen as (first-order, untyped, unconditional) functional programs, termination of Context-Sensitive Rewriting (CSR) has been shown to be equivalent, through appropriate transformations, to productivity of rewrite systems. In this way, tools for proving termination of CSR can be used to prove productivity. In term rewriting, CSR is the restriction of rewriting that arises when reductions are allowed on selected arguments of function symbols only. In this paper we show that well-known results about the computational power of CSR are useful to better understand the existing connections between productivity of rewrite systems and termination of CSR, and also to obtain more powerful techniques for proving productivity of rewrite systems. Partially supported by the EU (FEDER), Spanish MINECO TIN 2013-45732-C4-1-P, and GV PROMETEOII/2015/013. Lucas Alba, S. (2015). Termination of canonical context-sensitive rewriting and productivity of rewrite systems. Electronic Proceedings in Theoretical Computer Science 200:18-31. https://doi.org/10.4204/EPTCS.200.2
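
    As a concrete illustration of the productivity notion the abstract appeals to (my own minimal Haskell sketch, not part of the paper), the list of all primes can be written as an infinite lazy list: the program never terminates as a whole, yet every finite prefix is computable in finite time.

```haskell
-- Productivity, informally: the whole list is infinite, but any finite
-- prefix is delivered in finite time.
primes :: [Integer]
primes = sieve [2 ..]
  where
    -- Trial-division sieve: emit the head, then filter its multiples out
    -- of the (infinite) tail before recursing.
    sieve (p : xs) = p : sieve [x | x <- xs, x `mod` p /= 0]
    sieve []       = []  -- unreachable on an infinite input; silences the
                         -- incomplete-pattern warning

main :: IO ()
main = print (take 10 primes)  -- [2,3,5,7,11,13,17,19,23,29]
```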

    Well-definedness of Streams by Transformation and Termination

    Streams are infinite sequences over a given data type. A stream specification is a set of equations intended to define a stream. We propose a transformation from such a stream specification to a term rewriting system (TRS) in such a way that termination of the resulting TRS implies that the stream specification is well-defined, that is, admits a unique solution. As a consequence, proving well-definedness of several interesting stream specifications can be done fully automatically using present powerful tools for proving TRS termination. In order to increase the power of this approach, we investigate transformations that preserve semantics and well-definedness. We give examples for which the above-mentioned technique applies to the transformed specification while it fails for the original one.
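
    A toy Haskell rendering of what "well-defined" means for a stream specification (an assumed example, not taken from the paper): the specification for zeros below has exactly one solution, while the equation s = tl s does not.

```haskell
data Stream a = a :> Stream a
infixr 5 :>

hd :: Stream a -> a
hd (x :> _) = x

tl :: Stream a -> Stream a
tl (_ :> xs) = xs

-- Well-defined: the equation "zeros = 0 :> zeros" has exactly one solution,
-- the stream 0,0,0,...
zeros :: Stream Integer
zeros = 0 :> zeros

-- Not well-defined as a specification: read equationally, "s = tl s" is
-- satisfied by every constant stream, so it has no unique solution.
-- (Operationally, Haskell's least-fixed-point reading makes it diverge.)
badSpec :: Stream Integer
badSpec = tl badSpec
```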

    Inductive and Coinductive Components of Corecursive Functions in Coq

    In Constructive Type Theory, recursive and corecursive definitions are subject to syntactic restrictions which guarantee termination for recursive functions and productivity for corecursive functions. However, many terminating and productive functions do not pass the syntactic tests. Bove proposed in her thesis an elegant reformulation of the method of accessibility predicates that widens the range of terminating recursive functions formalisable in Constructive Type Theory. In this paper, we pursue the same goal for productive corecursive functions. Notably, our method of formalisation of coinductive definitions of productive functions in Coq requires not only the use of ad-hoc predicates, but also a systematic algorithm that separates the inductive and coinductive parts of functions. Comment: In Coalgebraic Methods in Computer Science (2008).
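
    The following Haskell sketch (an assumed example; the paper works in Coq) illustrates the idea of splitting a corecursive definition into an inductive part and a coinductive part, using stream filtering, whose productivity hinges on the assumption that matching elements keep occurring.

```haskell
data Stream a = a :> Stream a
infixr 5 :>

-- Inductive component: a search for the next element satisfying p. It is
-- terminating only under the assumption that such an element eventually
-- occurs; in Coq this is where an ad-hoc (accessibility-style) predicate
-- would be needed, while Haskell's laziness hides the obligation.
findNext :: (a -> Bool) -> Stream a -> (a, Stream a)
findNext p (x :> xs)
  | p x       = (x, xs)
  | otherwise = findNext p xs

-- Coinductive component: once the search returns, the corecursive call is
-- guarded by the (:>) constructor.
filterS :: (a -> Bool) -> Stream a -> Stream a
filterS p s = let (x, rest) = findNext p s in x :> filterS p rest
```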

    Beating the Productivity Checker Using Embedded Languages

    Some total languages, like Agda and Coq, allow the use of guarded corecursion to construct infinite values and proofs. Guarded corecursion is a form of recursion in which arbitrary recursive calls are allowed, as long as they are guarded by a coinductive constructor. Guardedness ensures that programs are productive, i.e. that every finite prefix of an infinite value can be computed in finite time. However, many productive programs are not guarded, and it can be nontrivial to put them in guarded form. This paper gives a method for turning a productive program into a guarded program. The method amounts to defining a problem-specific language as a data type, writing the program in the problem-specific language, and writing a guarded interpreter for this language. Comment: In Proceedings PAR 2010, arXiv:1012.455
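
    A Haskell approximation of the technique (names such as Prog, whnf and run are mine, not the paper's): the operation that breaks guardedness becomes a constructor of a problem-specific program type, the definition is written as a value of that type, and a two-stage interpreter first exposes the next element and then emits it under the stream constructor, so its corecursive call is guarded.

```haskell
data Stream a = a :> Stream a
infixr 5 :>

-- The problem-specific language: the stream constructor plus the one
-- operation (pointwise addition) that broke guardedness.
data Prog a
  = Cons a (Prog a)
  | ZipPlus (Prog a) (Prog a)

-- "Tail" on programs, needed to write the Fibonacci definition below.
progTail :: Prog a -> Prog a
progTail (Cons _ p)    = p
progTail (ZipPlus p q) = ZipPlus (progTail p) (progTail q)

-- Fibonacci written in the embedded language; the direct definition
-- "fib = 0 :> 1 :> zipWith (+) fib (tail fib)" is productive but would be
-- rejected by a guardedness checker in a total language.
fibP :: Prog Integer
fibP = Cons 0 (Cons 1 (ZipPlus fibP (progTail fibP)))

-- Step 1: expose the next element of a program (a weak head normal form).
whnf :: Num a => Prog a -> (a, Prog a)
whnf (Cons x p)    = (x, p)
whnf (ZipPlus p q) = let (x, p') = whnf p
                         (y, q') = whnf q
                     in (x + y, ZipPlus p' q')

-- Step 2: the interpreter, whose corecursive call is guarded by (:>).
run :: Num a => Prog a -> Stream a
run p = let (x, p') = whnf p in x :> run p'
```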

    Coinductive Formal Reasoning in Exact Real Arithmetic

    In this article we present a method for formally proving the correctness of the lazy algorithms for computing homographic and quadratic transformations (of which field operations are special cases) on a representation of real numbers by coinductive streams. The algorithms work on coinductive streams of Möbius maps and form the basis of the Edalat-Potts exact real arithmetic. We use the machinery of the Coq proof assistant for coinductive types to present the formalisation. The formalised algorithms are only partially productive, i.e., they do not output provably infinite streams for all possible inputs. We show how to deal with this partiality in the presence of syntactic restrictions posed by the constructive type theory of Coq. Furthermore we show that the type theoretic techniques that we develop are compatible with the semantics of the algorithms as continuous maps on real numbers. The resulting Coq formalisation is available for public download. Comment: 40 pages
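
    A much-simplified Haskell sketch of the ingredients mentioned above (my own toy formulation, not the Coq development): Möbius maps as integer matrices, their composition, and the absorb/emit steps around which a homographic algorithm, and its partial productivity, revolve.

```haskell
-- A Möbius map x |-> (a*x + b) / (c*x + d), stored as four integers.
data Mobius = M Integer Integer Integer Integer

apply :: Mobius -> Rational -> Rational
apply (M a b c d) x =
  (fromInteger a * x + fromInteger b) / (fromInteger c * x + fromInteger d)

-- Composition of Möbius maps is 2x2 integer matrix multiplication.
compose :: Mobius -> Mobius -> Mobius
compose (M a b c d) (M a' b' c' d') =
  M (a * a' + b * c') (a * b' + b * d') (c * a' + d * c') (c * b' + d * d')

-- When digits of the input and output streams are themselves Möbius maps,
-- absorbing an input digit refines the state on the right, and emitting an
-- output digit strips its inverse off on the left. Deciding *when* a digit
-- may safely be emitted is where the (partial) productivity argument lives.
absorb :: Mobius -> Mobius -> Mobius
absorb state digit = state `compose` digit

emit :: Mobius -> Mobius -> Mobius
emit digitInv state = digitInv `compose` state
```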

    Operational Semantics of Resolution and Productivity in Horn Clause Logic

    This paper presents a study of operational and type-theoretic properties of different resolution strategies in Horn clause logic. We distinguish four different kinds of resolution: resolution by unification (SLD-resolution), resolution by term-matching, the recently introduced structural resolution, and partial (or lazy) resolution. We express them all uniformly as abstract reduction systems, which allows us to undertake a thorough comparative analysis of their properties. To match this small-step semantics, we propose to take Howard's System H as a type-theoretic semantic counterpart. Using System H, we interpret Horn formulas as types, and a derivation for a given formula as the proof term inhabiting the type given by the formula. We prove soundness of these abstract reduction systems relative to System H, and we show completeness of SLD-resolution and structural resolution relative to System H. We identify conditions under which structural resolution is operationally equivalent to SLD-resolution. We show a correspondence between term-matching resolution for Horn clause programs without existential variables and term rewriting. Comment: Journal Formal Aspects of Computing, 201
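
    A small Haskell sketch (my own toy formulation, not the paper's abstract reduction systems) of what makes term-matching resolution rewriting-like: the clause head must match the goal by instantiating clause variables only, never goal variables, and a resolution step then replaces the goal by the instantiated clause body.

```haskell
data Term = Var String | Fun String [Term] deriving (Eq, Show)

type Subst = [(String, Term)]

-- One-sided matching: find sigma with apply sigma pattern == target.
match :: Term -> Term -> Maybe Subst
match pat tgt = go pat tgt []
  where
    go (Var v) t s = case lookup v s of
      Nothing -> Just ((v, t) : s)
      Just t' -> if t' == t then Just s else Nothing
    go (Fun f ps) (Fun g ts) s
      | f == g && length ps == length ts = goList (zip ps ts) s
    go _ _ _ = Nothing

    goList [] s              = Just s
    goList ((p, t) : rest) s = go p t s >>= goList rest

apply :: Subst -> Term -> Term
apply s (Var v)    = maybe (Var v) id (lookup v s)
apply s (Fun f ts) = Fun f (map (apply s) ts)

-- A clause "head :- body" with no existential variables; a term-matching
-- resolution step replaces a matched goal atom by the instantiated body,
-- exactly like one rewrite step.
data Clause = Term :- [Term]

step :: Clause -> Term -> Maybe [Term]
step (h :- body) goal = do
  sigma <- match h goal
  pure (map (apply sigma) body)
```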

    Foundational Extensible Corecursion

    This paper presents a formalized framework for defining corecursive functions safely in a total setting, based on corecursion up-to and relational parametricity. The end product is a general corecursor that allows corecursive (and even recursive) calls under well-behaved operations, including constructors. Corecursive functions that are well behaved can be registered as such, thereby increasing the corecursor's expressiveness. The metatheory is formalized in the Isabelle proof assistant and forms the core of a prototype tool. The corecursor is derived from first principles, without requiring new axioms or extensions of the logic.
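
    In Haskell, where laziness stands in for a productivity guarantee, the shape of definition such a corecursor admits looks like the following sketch (an assumed example, not the Isabelle formalisation): the corecursive call sits under a well-behaved operation, here mapS, rather than directly under the stream constructor.

```haskell
data Stream a = a :> Stream a
infixr 5 :>

mapS :: (a -> b) -> Stream a -> Stream b
mapS f (x :> xs) = f x :> mapS f xs

-- The corecursive occurrence of nats is under mapS, so a plain guardedness
-- check rejects this definition; a corecursor that knows mapS preserves
-- productivity (a well-behaved operation) accepts it.
nats :: Stream Integer
nats = 0 :> mapS (+ 1) nats

takeS :: Int -> Stream a -> [a]
takeS n (x :> xs)
  | n <= 0    = []
  | otherwise = x : takeS (n - 1) xs
-- takeS 5 nats == [0,1,2,3,4]
```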

    Implicit complexity for coinductive data: a characterization of corecurrence

    We propose a framework for reasoning about programs that manipulate coinductive data as well as inductive data. Our approach is based on using equational programs, which support a seamless combination of computation and reasoning, and on using productivity (fairness) as the fundamental assertion, rather than bisimulation. The latter is expressible in terms of the former. As an application of this framework, we give an implicit characterization of corecurrence: a function is definable using corecurrence iff its productivity is provable using coinduction for formulas in which data-predicates do not occur negatively. This is an analog, albeit in weaker form, of a characterization of recurrence (i.e. primitive recursion) in [Leivant, Unipolar induction, TCS 318, 2004]. Comment: In Proceedings DICE 2011, arXiv:1201.034
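
    Read in its most familiar guise (an assumed illustration, not the paper's equational-program formulation), corecurrence is the scheme embodied by the stream unfold below: a function is definable by corecurrence roughly when it arises from such an unfold, just as a function is primitive recursive when it arises from the corresponding fold on inductive data.

```haskell
data Stream a = a :> Stream a
infixr 5 :>

-- Primitive corecursion: from a step function on states, produce a stream.
unfoldS :: (s -> (a, s)) -> s -> Stream a
unfoldS step seed =
  let (x, seed') = step seed
  in x :> unfoldS step seed'

-- Example: the stream of natural numbers from a counter state.
natsFrom :: Integer -> Stream Integer
natsFrom = unfoldS (\n -> (n, n + 1))
```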

    The Guarded Lambda-Calculus: Programming and Reasoning with Guarded Recursion for Coinductive Types

    We present the guarded lambda-calculus, an extension of the simply typed lambda-calculus with guarded recursive and coinductive types. The use of guarded recursive types ensures the productivity of well-typed programs. Guarded recursive types may be transformed into coinductive types by a type-former inspired by modal logic and Atkey-McBride clock quantification, allowing the typing of acausal functions. We give a call-by-name operational semantics for the calculus, and define an adequate denotational semantics in the topos of trees. The adequacy proof entails that the evaluation of a program always terminates. We introduce a program logic with Löb induction for reasoning about the contextual equivalence of programs. We demonstrate the expressiveness of the calculus by showing the definability of solutions to Rutten's behavioural differential equations. Comment: Accepted to Logical Methods in Computer Science special issue on the 18th International Conference on Foundations of Software Science and Computation Structures (FoSSaCS 2015).
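
    A Haskell rendering of a behavioural differential equation in Rutten's sense (my own illustrative example): a stream is specified by its initial value and its stream derivative. The guarded lambda-calculus guarantees by typing that such specifications have productive solutions; Haskell simply relies on laziness.

```haskell
data Stream a = a :> Stream a
infixr 5 :>

-- Pointwise sum, itself given by a behavioural differential equation:
-- (s + t)(0) = s(0) + t(0),   (s + t)' = s' + t'
plusS :: Num a => Stream a -> Stream a -> Stream a
plusS (x :> xs) (y :> ys) = (x + y) :> plusS xs ys

-- The equation  x(0) = 1,  x' = x + x  defines the stream of powers of 2:
-- 1, 2, 4, 8, ...
pows :: Stream Integer
pows = 1 :> plusS pows pows
```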