100 research outputs found

    CoLoR: a Coq library on well-founded rewrite relations and its application to the automated verification of termination certificates

    Termination is an important property of programs; it is notably required for programs formulated in proof assistants. It is a very active subject of research in the Turing-complete formalism of term rewriting systems, where many methods and tools have been developed over the years to address this problem. Ensuring the reliability of those tools is therefore an important issue. In this paper we present a library formalizing important results of the theory of well-founded (rewrite) relations in the proof assistant Coq. We also present its application to the automated verification of termination certificates, as produced by termination tools.
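
    The core notion such a formalization rests on is well-foundedness of the rewrite relation, usually phrased through an accessibility predicate. The sketch below is purely illustrative and written in Lean 4 rather than Coq (CoLoR itself is a Coq library); the names Acc' and WellFounded' are local to the example.

```lean
-- Minimal illustrative sketch (Lean 4, not the Coq CoLoR library):
-- well-foundedness of a relation via accessibility, the notion on which
-- termination arguments for rewrite relations are built.

-- An element x is accessible for r if every r-predecessor of x is accessible.
inductive Acc' {α : Type} (r : α → α → Prop) : α → Prop where
  | intro (x : α) (h : ∀ y, r y x → Acc' r y) : Acc' r x

-- A relation is well-founded if every element is accessible.  Reading "r y x"
-- as "x rewrites to y", well-foundedness rules out infinite rewrite sequences,
-- i.e. it expresses termination.
def WellFounded' {α : Type} (r : α → α → Prop) : Prop :=
  ∀ x, Acc' r x

-- A trivial instance: the empty relation is well-founded.
theorem wellFounded_empty {α : Type} : WellFounded' (fun _ _ : α => False) :=
  fun x => Acc'.intro x (fun _ h => False.elim h)
```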

    Extending Nunchaku to Dependent Type Theory

    Nunchaku is a new higher-order counterexample generator based on a sequence of transformations from polymorphic higher-order logic to first-order logic. Unlike its predecessor Nitpick for Isabelle, it is designed as a stand-alone tool, with frontends for various proof assistants. In this short paper, we present some ideas to extend Nunchaku with partial support for dependent types and type classes, to make frontends for Coq and other systems based on dependent type theory more useful. Comment: In Proceedings HaTT 2016, arXiv:1606.0542

    Compiling and securing cryptographic protocols

    Protocol narrations are widely used in security as semi-formal notations to specify conversations between roles. We define a translation from a protocol narration to the sequences of operations to be performed by each role. Unlike previous works, we reduce this compilation process to well-known decision problems in formal protocol analysis. This allows one to define a natural notion of prudent translation and to reuse many known results from the literature in order to cover more crypto-primitives. In particular, this work is the first to show how to compile protocols parameterised by the properties of the available operations. Comment: A short version was submitted to IP
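
    Concretely, a narration is an ordered list of "A -> B : message" lines, and compiling it means extracting, for each role, its own ordered sequence of send and receive actions. The toy Python sketch below (a hypothetical three-message narration, not the paper's prudent translation) only performs this naive splitting; the real difficulty the paper addresses, deciding how each role can check and construct the messages from its current knowledge, is not modelled here.

```python
# Toy illustration of compiling a protocol narration into per-role action
# sequences.  The narration and message terms are hypothetical; this naive
# splitting ignores what each role can actually verify or build.

from collections import defaultdict

# A narration: an ordered list of (sender, receiver, message) triples,
# written here in Alice-and-Bob style, e.g. A -> B : {Na, A}_pk(B).
narration = [
    ("A", "B", "enc(pair(Na, A), pk(B))"),
    ("B", "A", "enc(pair(Na, Nb), pk(A))"),
    ("A", "B", "enc(Nb, pk(B))"),
]

def compile_roles(narration):
    """Derive, for each role, its ordered sequence of send/receive actions."""
    roles = defaultdict(list)
    for sender, receiver, message in narration:
        roles[sender].append(("send", receiver, message))
        roles[receiver].append(("recv", sender, message))
    return dict(roles)

if __name__ == "__main__":
    for role, actions in compile_roles(narration).items():
        print(role, actions)
```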

    A Tour on Ecumenical Systems

    Ecumenism can be understood as a pursuit of unity, where diverse thoughts, ideas, or points of view coexist harmoniously. In logic, ecumenical systems refer, in a broad sense, to proof systems for combining logics. One captivating area of research over the past few decades has been the exploration of seamlessly merging classical and intuitionistic connectives, allowing them to coexist peacefully. In this paper, we will embark on a journey through ecumenical systems, drawing inspiration from Prawitz' seminal work [35]. We will begin by elucidating Prawitz' concept of “ecumenism” and present a pure sequent calculus version of his system. Building upon this foundation, we will expand our discussion to incorporate alethic modalities, leveraging Simpson's meta-logical characterization. This will enable us to propose several proof systems for ecumenical modal logics. We will conclude our tour with some discussion towards a term calculus proposal for the implicational propositional fragment of the ecumenical logic, the quest for automation using a framework based on rewriting logic, and an ecumenical view of proof-theoretic semantics.
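
    As a rough illustration of what this coexistence means: in Prawitz-style ecumenical systems, negation, conjunction, absurdity and the universal quantifier are shared, the intuitionistic implication, disjunction and existential keep their constructive meaning, and the classical connectives are equivalent to their negative translations, typically as follows (notation varies across papers):

```latex
% Equivalences commonly associated with the classical connectives in
% Prawitz-style ecumenical systems (illustrative; notation varies by paper).
A \to_c B \;\equiv\; \neg (A \wedge \neg B), \qquad
A \vee_c B \;\equiv\; \neg (\neg A \wedge \neg B), \qquad
\exists_c x\, A \;\equiv\; \neg \forall x\, \neg A
```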

    New results on rewrite-based satisfiability procedures

    Program analysis and verification require decision procedures to reason on theories of data structures. Many problems can be reduced to the satisfiability of sets of ground literals in a theory T. If a sound and complete inference system for first-order logic is guaranteed to terminate on T-satisfiability problems, any theorem-proving strategy with that system and a fair search plan is a T-satisfiability procedure. We prove termination of a rewrite-based first-order engine on the theories of records, integer offsets, integer offsets modulo and lists. We give a modularity theorem stating sufficient conditions for termination on a combination of theories, given termination on each. The above theories, as well as others, satisfy these conditions. We introduce several sets of benchmarks on these theories and their combinations, including both parametric synthetic benchmarks to test scalability, and real-world problems to test performance on huge sets of literals. We compare the rewrite-based theorem prover E with the validity checkers CVC and CVC Lite. Contrary to the folklore that a general-purpose prover cannot compete with reasoners with built-in theories, the experiments are overall favorable to the theorem prover, showing that the rewriting approach is not only elegant and conceptually simple but also has important practical implications. Comment: To appear in the ACM Transactions on Computational Logic, 49 pages
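
    As a concrete example of the kind of problem being decided, consider the theory of lists à la Shostak, one of the theories treated in this line of work; a T-satisfiability problem is then a finite set of ground literals, such as the (unsatisfiable) one sketched below. The precise axiomatizations used in the paper may differ in detail.

```latex
% One common presentation of the list axioms (\`a la Shostak):
\mathrm{car}(\mathrm{cons}(x, y)) = x, \qquad
\mathrm{cdr}(\mathrm{cons}(x, y)) = y, \qquad
\mathrm{cons}(\mathrm{car}(x), \mathrm{cdr}(x)) = x

% A sample T-satisfiability problem: a set of ground literals, unsatisfiable
% here because car(cons(a, b)) is equal to a in the theory.
\{\, \mathrm{cons}(a, b) = c, \;\; \mathrm{car}(c) \neq a \,\}
```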

    Wrapping Computer Algebra is Surprisingly Successful for Non-Linear SMT

    We report on a prototypical tool for Satisfiability Modulo Theories solving for quantifier-free formulas in Non-linear Real Arithmetic or, more precisely, real closed fields, which uses a computer algebra system as the main component. This is complemented with two heuristic techniques, also stemming from computer algebra, viz. interval constraint propagation and subtropical satisfiability. Our key idea is to make optimal use of existing knowledge and work in the symbolic computation community, reusing available methods and implementations to the greatest possible extent. Experimental results show that our approach is surprisingly efficient in practice.
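
    The self-contained Python sketch below illustrates, in a much simplified form, the flavour of one of the two heuristics mentioned, interval constraint propagation: a polynomial constraint is evaluated in interval arithmetic over candidate boxes, and boxes on which the constraint provably cannot hold are discarded. The polynomial, the boxes and the pruning-only strategy are illustrative; the tool described above delegates the complete decision procedure to a computer algebra system.

```python
# Simplified illustration of interval-based pruning for a non-linear real
# constraint p(x, y) <= 0, with p(x, y) = x*y + x^2 - 2 chosen arbitrarily.

from typing import List, Tuple

Interval = Tuple[float, float]
Box = Tuple[Interval, Interval]

def iadd(a: Interval, b: Interval) -> Interval:
    """Interval addition."""
    return (a[0] + b[0], a[1] + b[1])

def imul(a: Interval, b: Interval) -> Interval:
    """Interval multiplication: enclose all products of points from a and b."""
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

def p_range(x: Interval, y: Interval) -> Interval:
    """Interval enclosure of p(x, y) = x*y + x^2 - 2 over the given box."""
    return iadd(iadd(imul(x, y), imul(x, x)), (-2.0, -2.0))

def prune(boxes: List[Box]) -> List[Box]:
    """Keep only the boxes on which p(x, y) <= 0 may still be satisfiable."""
    return [(x, y) for (x, y) in boxes if p_range(x, y)[0] <= 0.0]

if __name__ == "__main__":
    boxes = [((0.0, 1.0), (0.0, 1.0)), ((2.0, 3.0), (2.0, 3.0))]
    # The second box is discarded: on it, p(x, y) >= 2*2 + 2*2 - 2 = 6 > 0.
    print(prune(boxes))
```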

    Fourteenth Biennial Status Report: March 2017 - February 2019


    Deduction modulo theory

    This paper is a survey on Deduction modulo theory.
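
    For readers to whom the term is new: in deduction modulo theory, part of the theory is presented as a congruence on terms and propositions generated by rewrite rules, and the inference rules of the proof system are applied modulo that congruence, so purely computational steps disappear from proofs. Schematically (illustrative presentation, notation varies):

```latex
% Axiom rule of a sequent calculus modulo the congruence \equiv_{\mathcal{R}}
% generated by a set \mathcal{R} of rewrite rules (illustrative).
\frac{}{\;\Gamma, A \vdash_{\mathcal{R}} B\;}\;\text{(axiom)}
\qquad \text{if } A \equiv_{\mathcal{R}} B
```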

    Strict General Setting for Building Decision Procedures into Theorem Provers

    The efficient and flexible incorporation of decision procedures into theorem provers is very important for their successful use. There are several approaches for combining and augmenting decision procedures; some of them support handling uninterpreted functions, congruence closure, lemma invocation, etc. In this paper we present a variant of one general setting for building decision procedures into theorem provers (the gs framework [18]). That setting is based on macro inference rules motivated by techniques used in different approaches. The general setting enables a simple description of different combination/augmentation schemes. In this paper, we further develop and extend this setting by imposing an ordering on the macro inference rules. That ordering leads to a “strict setting”. It makes implementing and using variants of well-known or new schemes within this framework a very easy task, even for a non-expert user. This setting also enables easy comparison of different combination/augmentation schemes and the combination of their ideas.
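
    The abstract is deliberately generic about the macro inference rules themselves, so the toy Python sketch below only illustrates the organizational idea behind a strict (ordered) setting: rules are tried in a fixed priority order and the first applicable one transforms the current proof state. It is not the gs framework of [18]; all names are hypothetical.

```python
# Toy illustration of an imposed ordering on "macro inference rules": rules are
# tried strictly in priority order; the first applicable one rewrites the state.
# This is a generic sketch, not the gs framework referred to above.

from typing import Callable, List, Optional, Tuple

State = dict   # e.g. the current set of literals plus bookkeeping (hypothetical)
Rule = Tuple[str, Callable[[State], Optional[State]]]  # (name, partial transformer)

def run(rules: List[Rule], state: State, max_steps: int = 1000) -> State:
    """Repeatedly apply the highest-priority applicable rule until none applies."""
    for _ in range(max_steps):
        for _name, apply_rule in rules:      # strict ordering: earlier rules first
            new_state = apply_rule(state)
            if new_state is not None:        # the rule was applicable
                state = new_state
                break
        else:                                # no rule applied: a normal form
            return state
    return state
```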
