
    Complexity Bounds for Ordinal-Based Termination

    'What more than its truth do we know if we have a proof of a theorem in a given formal system?' We examine Kreisel's question in the particular context of program termination proofs, with an eye to deriving complexity bounds on program running times. Our main tools for this are length function theorems, which provide complexity bounds on the use of well quasi orders. We illustrate how to prove such theorems in the simple yet until now untreated case of ordinals. We show how to apply this new theorem to derive complexity bounds on programs when they are proven to terminate thanks to a ranking function into some ordinal. Comment: Invited talk at the 8th International Workshop on Reachability Problems (RP 2014, 22-24 September 2014, Oxford).
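
    To make the ranking-function idea concrete, here is a minimal sketch (invented for illustration, not taken from the paper): a small loop whose state, read as the pair (x, y) under lexicographic comparison, encodes an ordinal ω·x + y below ω²; the rank strictly decreases at every iteration, which is the shape of termination argument whose complexity consequences the paper analyses. The loop and all names are hypothetical.

        # Illustrative sketch: a ranking function into omega^2, encoded as
        # lexicographically ordered pairs (x, y) ~ omega*x + y. The assert
        # checks that the rank strictly decreases on every iteration.

        def rank(x: int, y: int) -> tuple:
            return (x, y)                 # compared lexicographically, i.e. as omega*x + y

        def program(x: int, y: int) -> int:
            steps = 0
            prev = rank(x, y)
            while x > 0 or y > 0:
                if y > 0:
                    y -= 1                # (x, y) drops to (x, y - 1)
                else:
                    x -= 1
                    y = x * x             # (x, 0) drops to (x - 1, (x - 1)**2)
                assert rank(x, y) < prev  # lexicographic (ordinal) decrease
                prev = rank(x, y)
                steps += 1
            return steps

        if __name__ == "__main__":
            print(program(3, 2))          # terminates; the ordinal rank bounds the run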

    Formalizing Knuth-Bendix Orders and Knuth-Bendix Completion

    We present extensions of our Isabelle Formalization of Rewriting that cover two historically related concepts: the Knuth-Bendix order and the Knuth-Bendix completion procedure. The former, besides being the first development of its kind in a proof assistant, is based on a generalized version of the Knuth-Bendix order. We compare our version to variants from the literature and show all properties required to certify termination proofs of TRSs. The latter comprises the formalization of important facts that are related to completion, like Birkhoff's theorem, the critical pair theorem, and a soundness proof of completion, showing that the strict encompassment condition is superfluous for finite runs. As a result, we are able to certify completion proofs.
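
    As a rough illustration of the kind of order being formalized (a simplified sketch, not the formalization itself): the comparison below implements a Knuth-Bendix order restricted to ground terms, ignoring variables, the variable condition and the admissibility requirements that the Isabelle development handles in full generality. Term representation, weights and precedence are invented for the example.

        # Simplified Knuth-Bendix order on ground terms (illustrative only):
        # compare total weight first, then root precedence, then arguments
        # lexicographically. Terms are (symbol, [subterms]).

        WEIGHT = {"f": 1, "g": 1, "a": 1, "b": 1}
        PREC   = {"a": 0, "b": 1, "g": 2, "f": 3}    # larger number = larger in precedence

        def weight(t):
            sym, args = t
            return WEIGHT[sym] + sum(weight(s) for s in args)

        def kbo_greater(s, t):
            """True if s > t in this ground-term restriction of KBO."""
            ws, wt = weight(s), weight(t)
            if ws != wt:
                return ws > wt
            (f, ss), (g, ts) = s, t
            if PREC[f] != PREC[g]:
                return PREC[f] > PREC[g]
            for si, ti in zip(ss, ts):   # same symbol, hence same arity
                if si != ti:
                    return kbo_greater(si, ti)
            return False                 # terms are equal

        a, b = ("a", []), ("b", [])
        print(kbo_greater(("f", [a, b]), ("g", [("g", [a])])))   # equal weight 3, f > g: True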

    Proof Theory at Work: Complexity Analysis of Term Rewrite Systems

    This thesis is concerned with investigations into the "complexity of term rewriting systems". Moreover, the majority of the presented work deals with the "automation" of such a complexity analysis. The aim of this introduction is to present the main ideas in an easily accessible fashion, so as to make the presented results accessible to a general audience. Necessarily, some technical points are stated in an over-simplified way. Comment: Cumulative Habilitation Thesis, submitted to the University of Innsbruck.
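
    For readers unfamiliar with the central measure of this line of work, the standard definition of derivational complexity is recalled below in common notation (not quoted from the thesis): dh gives the length of a longest derivation from a term, and dc takes the maximum over all start terms of bounded size.

        % Derivational complexity of a terminating TRS R (standard definition).
        \[
          \mathrm{dh}_{\mathcal{R}}(t) \;=\; \max\{\, n \mid t = t_0 \to_{\mathcal{R}} t_1 \to_{\mathcal{R}} \cdots \to_{\mathcal{R}} t_n \,\}
        \]
        \[
          \mathrm{dc}_{\mathcal{R}}(n) \;=\; \max\{\, \mathrm{dh}_{\mathcal{R}}(t) \mid |t| \le n \,\}
        \]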

    Effective termination techniques

    An important property of term rewriting systems is termination: the guarantee that every rewrite sequence is finite. This thesis is concerned with orderings used for proving termination, in particular the Knuth-Bendix and polynomial orderings. First, two methods for generating termination orderings are enhanced. The Knuth-Bendix ordering algorithm incrementally generates numeric and symbolic constraints that are sufficient for the termination of the rewrite system being constructed. The KB ordering algorithm requires an efficient linear constraint solver that detects the nature of degeneracy in the solution space, and for this a revised method of complete description is presented that eliminates the space redundancy that crippled previous implementations. Polynomial orderings are more powerful than Knuth-Bendix orderings, but are usually much harder to generate. Rewrite systems consisting of only a handful of rules can overwhelm existing search techniques due to the combinatorial complexity; a genetic algorithm is applied with some success. Second, a subset of the family of polynomial orderings is analysed. The polynomial orderings on terms over two unary function symbols are fully resolved into simpler orderings. Thus it is shown that most of the complexity of polynomial orderings is redundant. The order type (logical invariant), either r or A (numeric invariant), and the precedence are calculated for each polynomial ordering. The invariants correspond in a natural way to the parameters of the orderings, and so the tabulated results can be used to convert easily between polynomial orderings and more tangible orderings. The orderings of order type are two of the recursive path orderings. All of the other polynomial orderings are of order type ω or ω², and each can be expressed as a lexicographic combination of r (weight), A (matrix), and lexicographic (dictionary) orderings. The thesis concludes by showing how the analysis extends to arbitrary monadic terms, and discussing possible developments for the future.
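
    To illustrate the kind of polynomial ordering analysed in the second part (a toy sketch with invented coefficients, not an example from the thesis): two unary symbols f and g are interpreted by the linear polynomials [f](x) = 2x + 1 and [g](x) = x + 2, and monadic terms are compared through their interpretations.

        # Toy polynomial ordering on monadic terms over two unary symbols.
        # A term is written as a string, e.g. "fg" stands for f(g(x)).

        INTERP = {"f": lambda x: 2 * x + 1,
                  "g": lambda x: x + 2}

        def value(term, x):
            for sym in reversed(term):         # innermost symbol applies first
                x = INTERP[sym](x)
            return x

        def poly_greater(s, t, samples=range(50)):
            """Sanity check that [s](x) > [t](x) on sample points (the linear
            interpretations make the general inequality easy to verify by hand)."""
            return all(value(s, x) > value(t, x) for x in samples)

        print(poly_greater("fg", "gf"))        # [f(g(x))] = 2x + 5 > 2x + 3 = [g(f(x))]: True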

    Simplification orders in term rewriting

    The topic of this thesis is the application of proof-theoretic methods to term rewriting systems whose termination can be proved by means of a simplification order. Optimal bounds on derivation lengths are given for the general case and for the case of termination proofs via a Knuth-Bendix order (KBO). In addition, the order types of KBOs are completely classified and the functions computable under KBO are presented. A further focus is the investigation of the lengths of reduction sequences that arise for simply terminating term rewriting systems satisfying certain growth conditions.
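
    For orientation, the underlying notion can be stated as follows (a standard formulation in common notation, not quoted from the thesis): a simplification order is a strict order on terms that is closed under contexts and substitutions and has the subterm property.

        % Simplification order > (standard notion, common notation):
        \[
          s > t \;\Longrightarrow\; C[s\sigma] > C[t\sigma]
          \qquad\text{(closure under contexts $C$ and substitutions $\sigma$)}
        \]
        \[
          f(\dots, t, \dots) > t
          \qquad\text{(subterm property)}
        \]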

    Polynomial Interpretations over the Natural, Rational and Real Numbers Revisited

    Polynomial interpretations are a useful technique for proving termination of term rewrite systems. They come in various flavors: polynomial interpretations with real, rational and integer coefficients. Concerning their relative termination proving power, Lucas proved in 2006 that there are rewrite systems that can be shown polynomially terminating by polynomial interpretations with real (algebraic) coefficients, but cannot be shown polynomially terminating using polynomials with rational coefficients only. He also proved the corresponding statement regarding the use of rational coefficients versus integer coefficients. In this article we extend these results, thereby giving the full picture of the relationship between the aforementioned variants of polynomial interpretations. In particular, we show that polynomial interpretations with real or rational coefficients do not subsume polynomial interpretations with integer coefficients. Our results also hold for incremental termination proofs with polynomial interpretations. Comment: 28 pages; special issue of RTA 201
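
    A toy example of the kind of interpretation being compared (invented here, not taken from the article): over the natural numbers, interpreting [s](x) = x + 1 and [f](x) = 3x orients the rule f(s(x)) -> s(s(f(x))), since the left-hand side is interpreted strictly above the right-hand side for every x.

        % Toy integer interpretation orienting f(s(x)) -> s(s(f(x))).
        \[
          [f(s(x))] \;=\; 3(x + 1) \;=\; 3x + 3 \;>\; 3x + 2 \;=\; [s(s(f(x)))]
          \qquad\text{for all } x \in \mathbb{N}.
        \]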

    The Derivational Complexity Induced by the Dependency Pair Method

    We study the derivational complexity induced by the dependency pair method, enhanced with standard refinements. We obtain upper bounds on the derivational complexity induced by the dependency pair method in terms of the derivational complexity of the base techniques employed. In particular, we show that the derivational complexity induced by the dependency pair method based on some direct technique, possibly refined by argument filtering, the usable rules criterion, or dependency graphs, is primitive recursive in the derivational complexity induced by the direct method. This implies that the derivational complexity induced by a standard application of the dependency pair method based on traditional termination orders like KBO, LPO, and MPO is exactly the same as if those orders were applied as the only termination technique.
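
    As a reminder of the construction whose complexity is being analysed, the sketch below extracts dependency pairs from a small example TRS (the standard construction in simplified form; the concrete rules and the term encoding are invented for illustration).

        # Dependency pair extraction (simplified): for every rule l -> r and
        # every subterm u of r whose root symbol is defined, emit l# -> u#.
        # Terms are (symbol, [subterms]); variables are ("VAR", name).

        def subterms(t):
            if t[0] == "VAR":
                return
            yield t
            for s in t[1]:
                yield from subterms(s)

        def dependency_pairs(rules):
            defined = {l[0] for l, _ in rules}
            mark = lambda t: (t[0] + "#", t[1])
            return [(mark(l), mark(u))
                    for l, r in rules
                    for u in subterms(r)
                    if u[0] in defined]

        # Example TRS:  add(0, y) -> y   and   add(s(x), y) -> s(add(x, y))
        x, y = ("VAR", "x"), ("VAR", "y")
        rules = [(("add", [("0", []), y]), y),
                 (("add", [("s", [x]), y]), ("s", [("add", [x, y])]))]
        print(dependency_pairs(rules))   # the single pair  add#(s(x), y) -> add#(x, y)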

    Proof diagrams and term rewriting with applications to computational algebra

    In this thesis, lessons learned from the use of computer algebra systems and machine-assisted theorem provers are developed in order to give an insight into both the problems and their solutions. Many algorithms in computational algebra and automated deduction (for example Gröbner basis computations and Knuth-Bendix completion) tend to produce redundant facts and can contain more than one proof of any particular fact. This thesis introduces proof diagrams in order to compare and contrast the proofs of facts which such procedures generate. Proof diagrams make it possible to analyse the effect of heuristics which can be used to guide implementations of such algorithms. An extended version of an inference system for Knuth-Bendix completion is introduced, and this extension is shown to characterise the applicability of critical pair criteria, which are heuristics used in completion. We investigate a number of executions of a completion procedure by analysing the associated proof diagrams. This leads to a better understanding of the heuristics used to control these examples. Derived rules of inference are also investigated in this thesis, in the formalism of proof diagrams. Rewrite rules for proof diagrams are defined; this is motivated by the notion of a transformation tactic in the Nuprl proof development system. A method to automatically extract 'useful' derived inference rules is also discussed. 'Off the shelf' theorem provers, such as the Larch Prover and Otter, are compared to specialised programs from computational group theory. This analysis makes it possible to see where methods from automated deduction can improve on the tools which group theorists currently use. Problems which can be attacked with theorem provers but not with currently used specialised programs are also indicated. Tietze transformations, from group theory, are discussed. They make it possible to link ideas used in Knuth-Bendix completion programs and group presentation simplification programs, and provide heuristics for more efficient and effective implementations of these programs.
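
    For readers unfamiliar with the critical pairs mentioned above, a standard textbook example (not taken from the thesis): the rule f(f(x)) -> g(x) overlaps with a renamed copy of itself at the inner f, and the overlapped term can be rewritten in two different ways.

        % Textbook critical pair of f(f(x)) -> g(x) with a renamed copy f(f(y)) -> g(y),
        % overlapping at the inner occurrence of f via the unifier x = f(y).
        \[
          g(f(y)) \;\leftarrow\; f(f(f(y))) \;\rightarrow\; f(g(y)),
          \qquad\text{giving the critical pair } \langle g(f(y)),\, f(g(y)) \rangle .
        \]

    Completion repairs such divergences by orienting or joining the resulting pairs, and critical pair criteria decide which of them can safely be skipped.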

    Polygraphs: From Rewriting to Higher Categories

    Polygraphs are a higher-dimensional generalization of the notion of directed graph. Taking these as a unifying concept, this monograph revisits the theory of rewriting in the context of strict higher categories, adopting the abstract point of view offered by homotopical algebra. The first half explores the theory of polygraphs in low dimensions and its applications to the computation of the coherence of algebraic structures. It is meant to be progressive, with few requirements on the reader's background apart from basic category theory, and is illustrated with algorithmic computations on algebraic structures. The second half introduces and studies the general notion of n-polygraph and develops the homotopy theory of these objects. It constructs the folk model structure on the category of strict higher categories and exhibits polygraphs as cofibrant objects. This allows the coherence results developed in the first half to be extended to higher-dimensional structures.