
    A Divergence Critic for Inductive Proof

    Inductive theorem provers often diverge. This paper describes a simple critic, a computer program which monitors the construction of inductive proofs, attempting to identify diverging proof attempts. Divergence is recognized by means of a "difference matching" procedure. The critic then proposes lemmas and generalizations which "ripple" these differences away so that the proof can go through without divergence. The critic enables the theorem prover Spike to prove many theorems completely automatically from the definitions alone.
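
    As a rough illustration of the divergence pattern such a critic watches for, here is a minimal Haskell sketch; it assumes a much cruder test than the paper's difference-matching procedure, merely flagging a proof attempt when an earlier subgoal reappears, wrapped in extra structure, inside a later one.

```haskell
-- Minimal sketch, not the paper's critic: detect an earlier subgoal
-- reappearing inside a later one, a crude sign of an accumulating
-- context that a real difference matcher would locate precisely so it
-- could be "rippled" away by a speculated lemma or generalization.

data Term = Var String | App String [Term]
  deriving (Eq, Show)

-- All subterms of a term, including the term itself.
subterms :: Term -> [Term]
subterms t@(Var _)    = [t]
subterms t@(App _ ts) = t : concatMap subterms ts

-- Crude divergence signal: the old goal occurs strictly inside the new one.
diverging :: Term -> Term -> Bool
diverging oldGoal newGoal =
  newGoal /= oldGoal && oldGoal `elem` subterms newGoal

-- Successive subgoals in the style of inducting with
--   rev (x:xs) = rev xs ++ [x],
-- where each step wraps the previous goal in one more "++ [x]" context.
goal0, goal1 :: Term
goal0 = App "rev" [Var "xs"]
goal1 = App "++" [goal0, App "cons" [Var "x", App "nil" []]]

main :: IO ()
main = print (diverging goal0 goal1)   -- True
```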

    Proof diagrams and term rewriting with applications to computational algebra

    In this thesis, lessons learned from the use of computer algebra systems and machine-assisted theorem provers are developed in order to give insight into both the problems and their solutions. Many algorithms in computational algebra and automated deduction (for example Gröbner basis computations and Knuth-Bendix completion) tend to produce redundant facts and can contain more than one proof of any particular fact. This thesis introduces proof diagrams in order to compare and contrast the proofs of facts which such procedures generate. Proof diagrams make it possible to analyse the effect of heuristics which can be used to guide implementations of such algorithms. An extended version of an inference system for Knuth-Bendix completion is introduced. It is possible to see that this extension characterises the applicability of critical pair criteria, which are heuristics used in completion. We investigate a number of executions of a completion procedure by analysing the associated proof diagrams. This leads to a better understanding of the heuristics used to control these examples. Derived rules of inference are also investigated in this thesis. This is done in the formalism of proof diagrams. Rewrite rules for proof diagrams are defined: this is motivated by the notion of a transformation tactic in the Nuprl proof development system. A method to automatically extract 'useful' derived inference rules is also discussed. 'Off-the-shelf' theorem provers, such as the Larch Prover and Otter, are compared to specialised programs from computational group theory. This analysis makes it possible to see where methods from automated deduction can improve on the tools which group theorists currently use. Problems which can be attacked with theorem provers but not with currently used specialised programs are also indicated. Tietze transformations, from group theory, are discussed. This makes it possible to link ideas used in Knuth-Bendix completion programs and group presentation simplification programs. Tietze transformations provide heuristics for more efficient and effective implementations of these programs.
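
    To make the redundancy point concrete, the following hedged Haskell sketch (my own toy code, not the thesis's proof-diagram formalism) builds a ground rewrite system in which the same normal form is reached by two distinct rewrite proofs, exactly the kind of duplication proof diagrams are introduced to compare.

```haskell
-- Toy illustration, not the thesis's formalism: a ground rewrite system
-- in which one fact has two distinct rewrite proofs.

data Term = V String | F String [Term] deriving (Eq, Show)

type Rule = (Term, Term)   -- ground rules, so matching is just equality

-- All single rewrite steps applicable anywhere inside a term.
step :: [Rule] -> Term -> [Term]
step rules t = here ++ inside
  where
    here = [ rhs | (lhs, rhs) <- rules, lhs == t ]
    inside = case t of
      V _      -> []
      F f args -> [ F f args' | args' <- oneChanged args ]
    oneChanged []       = []
    oneChanged (a : as) = [ a' : as | a' <- step rules a ]
                       ++ [ a : as' | as' <- oneChanged as ]

-- Every maximal rewrite sequence: a "proof" that the start term equals
-- its normal form.
proofs :: [Rule] -> Term -> [[Term]]
proofs rules t = case step rules t of
  []    -> [[t]]
  succs -> [ t : rest | t' <- succs, rest <- proofs rules t' ]

-- plus(0,a) -> a and plus(0,b) -> b inside pair(_,_): the two redexes
-- can be contracted in either order, giving two different proofs.
rules :: [Rule]
rules = [ (F "plus" [F "0" [], V "a"], V "a")
        , (F "plus" [F "0" [], V "b"], V "b") ]

main :: IO ()
main = mapM_ print (proofs rules (F "pair" [ F "plus" [F "0" [], V "a"]
                                           , F "plus" [F "0" [], V "b"] ]))
```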

    Solving Equality Reasoning Problems with a Connection Graph Theorem Prover

    The integration of a Knuth-Bendix completion algorithm into a paramodulation theorem prover on the basis of a connection graph resolution procedure is presented. The Knuth-Bendix completion idea is compared to a decomposition approach, and some ideas for handling conditional equations are discussed. This paper is not intended to present new material on term rewriting; instead, it is a plea for the use of completion ideas in automated deduction. It records our experience with an actual implementation of a hybrid system, in which a completion procedure was embedded into a connection graph theorem prover, the MKRP system, with satisfactory results.

    REST: Integrating Term Rewriting with Program Verification (Extended Version)

    We introduce REST, a novel term rewriting technique for theorem proving that uses online termination checking and can be integrated with existing program verifiers. REST enables flexible but terminating term rewriting for theorem proving by: (1) exploiting newly-introduced term orderings that are more permissive than standard rewrite simplification orderings; (2) dynamically and iteratively selecting orderings based on the path of rewrites taken so far; and (3) integrating external oracles that allow steps that cannot be justified with rewrite rules. Our REST approach is designed around an easily implementable core algorithm, parameterizable by choices of term orderings and their implementations; in this way our approach can be readily integrated into existing tools. We implemented REST as a Haskell library and incorporated it into Liquid Haskell's evaluation strategy, extending Liquid Haskell with rewriting rules. We evaluated our REST implementation by comparing it against both existing rewriting techniques and E-matching and by showing that it can be used to supplant manual lemma application in many existing Liquid Haskell proofs.
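
    The following hedged Haskell sketch illustrates only the core idea, under my own simplifications (it is not the REST library's actual API): rewrite paths are explored while the set of candidate orderings is narrowed to those that have accepted every step taken so far, and a step is taken only while at least one candidate still justifies it.

```haskell
-- Hedged sketch of the core idea, not the REST library API: explore
-- rewrite paths, keeping along each path only the candidate orderings
-- that have accepted every step taken so far.

type Accepts t = t -> t -> Bool   -- "this ordering allows rewriting t1 to t2"

explore
  :: (t -> [t])     -- rewrite successors of a term
  -> [Accepts t]    -- candidate orderings still alive on this path
  -> Int            -- fuel, standing in for real termination control
  -> t
  -> [[t]]          -- rewrite paths explored from the start term
explore _    _    0 t = [[t]]
explore next ords n t =
  [t] : [ t : path
        | t' <- next t
        , let ords' = [ o | o <- ords, o t t' ]
        , not (null ords')
        , path <- explore next ords' (n - 1) t' ]

-- Tiny instantiation with integer "terms": one candidate only accepts
-- strictly decreasing steps, the other also tolerates an increase of one
-- (purely for illustration; it is not itself a well-founded ordering).
main :: IO ()
main = mapM_ print (explore next [(>), \a b -> b <= a + 1] 3 (10 :: Int))
  where next n = [n - 1 | n > 0] ++ [n + 1 | n < 12]
```

    In the actual approach the candidate orderings are well-founded and chosen from the path of rewrites taken so far, and an external oracle may admit steps that no rewrite rule justifies.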

    Reduction relations for monoid semirings

    In this paper we study rewriting techniques for monoid semirings. Based on disjoint and non-disjoint representations of the elements of monoid semirings, we define two different reduction relations. We prove that in both cases the reduction relation describes the congruence that is induced by the underlying set of equations, and we study the termination and confluence properties of the reduction relations.
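
    For orientation, here is a generic sketch (in notation that may differ from the paper's definitions): an element of a monoid semiring S[M] is a finite formal sum of monoid elements with coefficients in S, and a reduction step induced by an oriented equation replaces an occurrence of its left-hand side inside a summand.

```latex
% Generic notation; the paper's actual reduction relations may differ.
% An element of the monoid semiring S[M]:
\[
  f \;=\; \sum_{i=1}^{k} s_i \, m_i , \qquad s_i \in S,\ m_i \in M .
\]
% Given an oriented equation  l = r  with  l \in M  and  r \in S[M],
% one plausible reduction step rewrites an occurrence of  l  in a summand:
\[
  s\,(u\, l\, v) + g \;\longrightarrow\; s\,(u \, r \, v) + g ,
  \qquad s \in S,\quad u, v \in M,\quad g \in S[M].
\]
```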

    Effective termination techniques

    An important property of term rewriting systems is termination: the guarantee that every rewrite sequence is finite. This thesis is concerned with orderings used for proving termination, in particular the Knuth-Bendix and polynomial orderings. First, two methods for generating termination orderings are enhanced. The Knuth-Bendix ordering algorithm incrementally generates numeric and symbolic constraints that are sufficient for the termination of the rewrite system being constructed. The KB ordering algorithm requires an efficient linear constraint solver that detects the nature of degeneracy in the solution space, and for this a revised method of complete description is presented that eliminates the space redundancy that crippled previous implementations. Polynomial orderings are more powerful than Knuth-Bendix orderings, but are usually much harder to generate. Rewrite systems consisting of only a handful of rules can overwhelm existing search techniques due to the combinatorial complexity. A genetic algorithm is applied with some success. Second, a subset of the family of polynomial orderings is analysed. The polynomial orderings on terms in two unary function symbols are fully resolved into simpler orderings. Thus it is shown that most of the complexity of polynomial orderings is redundant. The order type (logical invariant), either r or A (numeric invariant), and precedence are calculated for each polynomial ordering. The invariants correspond in a natural way to the parameters of the orderings, and so the tabulated results can be used to convert easily between polynomial orderings and more tangible orderings. The orderings of order type are two of the recursive path orderings. All of the other polynomial orderings are of order type ω or ω², and each can be expressed as a lexicographic combination of r (weight), A (matrix), and lexicographic (dictionary) orderings. The thesis concludes by showing how the analysis extends to arbitrary monadic terms, and discussing possible developments for the future.
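
    As a hedged illustration of what a polynomial termination proof for monadic terms involves (with an assumed linear interpretation, not the thesis's construction), the following Haskell sketch interprets two unary symbols by linear polynomials over the naturals and checks that a rewrite rule is strictly decreasing under that interpretation.

```haskell
-- Hedged sketch with assumed coefficients, not the thesis's method:
-- terms over two unary symbols f and g, interpreted by linear
-- polynomials over the naturals, and a check that a rule decreases.

data MTerm = X | F MTerm | G MTerm deriving Show

-- Interpret f(x) = 2x + 1 and g(x) = x + 2 (an assumed choice),
-- then evaluate a term at a given value of the variable x.
interp :: MTerm -> Integer -> Integer
interp X     x = x
interp (F t) x = 2 * interp t x + 1
interp (G t) x = interp t x + 2

-- A rule l -> r is decreasing if [l](x) > [r](x) for all x >= 0.
-- For linear interpretations comparing coefficients suffices, but here
-- we just sample a few points as a cheap sanity check.
decreasingOn :: [Integer] -> MTerm -> MTerm -> Bool
decreasingOn xs l r = and [ interp l x > interp r x | x <- xs ]

main :: IO ()
main = print (decreasingOn [0 .. 10] (F (G X)) (G (F X)))
  -- f(g(x)) = 2(x+2)+1 = 2x+5 and g(f(x)) = (2x+1)+2 = 2x+3, so True.
```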

    Verification techniques for LOTOS
