
    Homeomorphic Embedding for Online Termination of Symbolic Methods

    Well-quasi orders in general, and homeomorphic embedding in particular, have gained popularity as a means of ensuring the termination of techniques for program analysis, specialisation, transformation, and verification. In this paper we survey and discuss this use of homeomorphic embedding and clarify the advantages of such an approach over one using well-founded orders. We also discuss various extensions of the homeomorphic embedding relation. We conclude with a study of homeomorphic embedding in the context of metaprogramming, presenting some new (positive and negative) results and open problems.
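
    As a concrete illustration of the relation discussed above, the following Python sketch implements a standard variant of the homeomorphic embedding check and the resulting termination "whistle"; the term representation and the treatment of variables are simplifying assumptions for exposition, not taken from the paper.

```python
# A minimal sketch of the homeomorphic embedding check used as a
# termination "whistle". Terms are represented as nested tuples
# ("f", arg1, ..., argn); strings stand for variables. This simple
# variant treats all variables as embedding each other.

def embeds(s, t):
    """Return True if term s is homeomorphically embedded in term t."""
    s_is_var = isinstance(s, str)
    t_is_var = isinstance(t, str)
    if s_is_var and t_is_var:          # variable rule
        return True
    if not t_is_var:
        # diving: s is embedded in some argument of t
        if any(embeds(s, arg) for arg in t[1:]):
            return True
        # coupling: same functor, arguments embed pairwise
        if (not s_is_var and s[0] == t[0] and len(s) == len(t)
                and all(embeds(si, ti) for si, ti in zip(s[1:], t[1:]))):
            return True
    return False

def whistle(history, new_term):
    """Online termination test: stop if new_term embeds an earlier term."""
    return any(embeds(old, new_term) for old in history)

# f(X) is embedded in f(g(X)), so unfolding would be stopped here:
assert embeds(("f", "X"), ("f", ("g", "X")))
assert not embeds(("f", ("g", "X")), ("f", "X"))
```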

    Proving Correctness of Imperative Programs by Linearizing Constrained Horn Clauses

    We present a method for verifying the correctness of imperative programs which is based on the automated transformation of their specifications. Given a program prog, we consider a partial correctness specification of the form {φ} prog {ψ}, where the assertions φ and ψ are predicates defined by a set Spec of possibly recursive Horn clauses with linear arithmetic (LA) constraints in their premise (also called constrained Horn clauses). The verification method consists in constructing a set PC of constrained Horn clauses whose satisfiability implies that {φ} prog {ψ} is valid. We highlight some limitations of state-of-the-art constrained Horn clause solving methods, here called LA-solving methods, which prove the satisfiability of the clauses by looking for linear arithmetic interpretations of the predicates. In particular, we prove that there exist specifications that cannot be proved valid by any of those LA-solving methods. These specifications require the proof of satisfiability of a set PC of constrained Horn clauses that contain nonlinear clauses (that is, clauses with more than one atom in their premise). Then, we present a transformation, called linearization, that converts PC into a set of linear clauses (that is, clauses with at most one atom in their premise). We show that several specifications that could not be proved valid by LA-solving methods can be proved valid after linearization. We also present a strategy for performing linearization in an automatic way and we report on some experimental results obtained by using a preliminary implementation of our method. Comment: To appear in Theory and Practice of Logic Programming (TPLP), Proceedings of ICLP 201
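
    To make the linear/nonlinear distinction concrete, here is a minimal Python sketch of one way constrained Horn clauses could be represented; the representation and the linearity check are illustrative assumptions, not the paper's transformation.

```python
# A minimal sketch (not the paper's algorithm) of how constrained Horn
# clauses might be represented, and of the linear/nonlinear distinction
# the abstract refers to: a clause is linear if its premise contains at
# most one atom (besides the arithmetic constraint).

from dataclasses import dataclass, field

@dataclass
class Clause:
    head: str                      # e.g. "p(X,Y)"; "false" for queries
    constraint: str                # linear arithmetic constraint, e.g. "X >= Y+1"
    body_atoms: list = field(default_factory=list)  # predicate atoms in the premise

def is_linear(clause: Clause) -> bool:
    return len(clause.body_atoms) <= 1

# A nonlinear clause: two atoms in the premise.
c = Clause(head="p(X,Z)", constraint="X >= 0",
           body_atoms=["p(X,Y)", "q(Y,Z)"])
print(is_linear(c))   # False -> a linearization step would be needed
```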

    A System for Deduction-based Formal Verification of Workflow-oriented Software Models

    The work concerns formal verification of workflow-oriented software models using a deductive approach. The formal correctness of a model's behaviour is considered. Manually building logical specifications, here understood as sets of temporal logic formulas, is a significant obstacle for inexperienced users of the deductive approach. A system, and its architecture, for the deduction-based verification of workflow-oriented models is proposed. The inference process is based on the semantic tableaux method, which has some advantages over traditional deduction strategies. An algorithm for the automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, the standard and dominant notation for modelling business processes. The main idea of the approach is to consider patterns, defined in terms of temporal logic, as a kind of (logical) primitives which enable the transformation of models into temporal logic formulas constituting a logical specification. Automation of the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of applying it in practice when logical specifications are built manually. This approach goes some way towards supporting, and hopefully enhancing our understanding of, the deduction-based formal verification of workflow-oriented models. Comment: International Journal of Applied Mathematics and Computer Science
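
    The following Python sketch illustrates the general idea of treating workflow patterns as logical primitives that expand into temporal logic formulas; the concrete LTL formulas and pattern names are assumptions made for illustration, not the paper's pattern definitions.

```python
# A hedged sketch of generating a logical specification from workflow
# patterns: each pattern is mapped to temporal-logic formulas over its
# task names. The formulas below (plain LTL syntax with G, F, U) are
# illustrative assumptions, not the paper's exact pattern definitions.

def sequence_pattern(a: str, b: str) -> list:
    """Task b eventually follows task a, and b is not reached before a."""
    return [f"G({a} -> F {b})", f"(!{b}) U {a}"]

def exclusive_choice_pattern(cond: str, a: str, b: str) -> list:
    """XOR split: the condition selects exactly one of the two branches."""
    return [f"G(({cond} -> F {a}) & (!{cond} -> F {b}))",
            f"G(!({a} & {b}))"]

# Specification for a tiny workflow: receive -> check, then either
# approve or reject depending on the outcome 'ok'.
spec = sequence_pattern("receive", "check") + \
       exclusive_choice_pattern("ok", "approve", "reject")
for formula in spec:
    print(formula)
```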

    Automating Security Analysis: Symbolic Equivalence of Constraint Systems

    We consider security properties of cryptographic protocols that are either trace properties (such as confidentiality or authenticity) or equivalence properties (such as anonymity or strong secrecy). Infinite sets of possible traces are symbolically represented using deducibility constraints. We give a new algorithm that decides trace equivalence for traces represented by such constraints, in the case of signatures, symmetric and asymmetric encryption. Our algorithm is implemented and performs well on typical benchmarks. This is the first implemented algorithm deciding symbolic trace equivalence.
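
    As a rough illustration of the symbolic representation mentioned above, the sketch below models a deducibility constraint as a pair of attacker knowledge and a target variable; this encoding is assumed for exposition and is not the paper's constraint-solving algorithm.

```python
# An illustrative (assumed) representation of a deducibility constraint
# system: each constraint states that a message variable must be
# deducible from the attacker's knowledge at a given point of the trace.

from dataclasses import dataclass

@dataclass
class DeducibilityConstraint:
    knowledge: tuple     # terms the attacker has observed so far
    target: str          # variable that must be deducible from them

# The attacker has seen the key k and senc(s, k); the variable X used in
# the next protocol step must be deducible from that knowledge.
system = [
    DeducibilityConstraint(knowledge=("k", "senc(s,k)"), target="X"),
]

# Deciding symbolic trace equivalence then amounts to checking that two
# such constraint systems admit matching attacker solutions.
print(system[0])
```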

    Thread-Modular Static Analysis for Relaxed Memory Models

    We propose a memory-model-aware static program analysis method for accurately analyzing the behavior of concurrent software running on processors with weak consistency models such as x86-TSO, SPARC-PSO, and SPARC-RMO. At the center of our method is a unified framework for deciding the feasibility of inter-thread interferences, to avoid propagating spurious data flows during static analysis and thus boost the performance of the static analyzer. We formulate the checking of interference feasibility as a set of Datalog rules which are both efficiently solvable and general enough to capture a range of hardware-level memory models. Compared to existing techniques, our method can significantly reduce the number of bogus alarms as well as unsound proofs. We implemented the method and evaluated it on a large set of multithreaded C programs. Our experiments show that the method significantly outperforms state-of-the-art techniques in terms of accuracy, with only moderate run-time overhead. Comment: revised version of the ESEC/FSE 2017 paper
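
    The sketch below shows, in Python, what evaluating a Datalog-style feasibility rule over base relations could look like; the relations (write, read, may_alias) and the single rule are simplified stand-ins, since the paper's rules also encode the underlying memory model.

```python
# An illustrative sketch of phrasing an interference check as a
# Datalog-style rule evaluated over base facts. Each fact is a tuple
# (thread, statement, variable) for writes and reads, plus a may-alias
# relation between variables.

write = {("t1", "s1", "x"), ("t2", "s3", "y")}
read = {("t2", "s4", "x"), ("t1", "s2", "y")}
may_alias = {("x", "x"), ("y", "y")}

def derive_interferences(write, read, may_alias):
    """interferes(W, R) :- write(Tw, W, V1), read(Tr, R, V2),
                           may_alias(V1, V2), Tw != Tr."""
    result = set()
    for (tw, w, v1) in write:
        for (tr, r, v2) in read:
            if tw != tr and (v1, v2) in may_alias:
                result.add((w, r))
    return result

print(derive_interferences(write, read, may_alias))
# {('s1', 's4'), ('s3', 's2')}
```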

    A Polyvariant Binding-Time Analysis for Off-line Partial Deduction

    We study the notion of binding-time analysis for logic programs. We formalise the unfolding aspect of an on-line partial deduction system as a Prolog program. Using abstract interpretation, we collect information about the run-time behaviour of the program. We use this information to make the control decisions about unfolding at analysis time and to turn the on-line system into an off-line system. We report on some initial experiments. Comment: 19 pages (including appendix). The paper (without appendix) appeared in Programming Languages and Systems, Proceedings of the European Symposium on Programming (ESOP'98), Part of ETAPS'98 (Chris Hankin, ed.), LNCS, vol. 1381, 1998, pp. 27-4
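
    To illustrate the binding-time idea, the Python sketch below classifies call arguments as static or dynamic and uses that annotation to decide between unfolding and memoisation at analysis time; the decision rule is an illustrative assumption rather than the analysis described in the paper.

```python
# A hedged sketch of the binding-time idea: call arguments are classified
# as static (known at specialisation time) or dynamic, and that annotation
# drives the analysis-time decision to unfold a call or to memoise it.
# The decision rule below is illustrative, not the paper's analysis.

STATIC, DYNAMIC = "static", "dynamic"

def annotate_call(predicate, arg_binding_times, unfold_safe_preds):
    """Return 'unfold' or 'memo' for a call, given its binding times."""
    all_static = all(bt == STATIC for bt in arg_binding_times)
    # Unfold only calls whose unfolding is known to terminate on fully
    # static input; everything else is memoised and specialised separately.
    if all_static and predicate in unfold_safe_preds:
        return "unfold"
    return "memo"

# append/3 with a static first argument but dynamic remaining arguments:
print(annotate_call("append", [STATIC, DYNAMIC, DYNAMIC], {"append"}))  # memo
print(annotate_call("append", [STATIC, STATIC, STATIC], {"append"}))    # unfold
```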

    The Ecce and Logen Partial Evaluators and their Web Interfaces

    We present Ecce and Logen, two partial evaluators for Prolog using the online and offline approaches, respectively. We briefly present the foundations of these tools and discuss various applications. We also present new implementations of these tools, carried out in Ciao Prolog. In addition to a command-line interface, new user-friendly web interfaces were developed. These enable non-expert users to specialise logic programs using a web browser, without the need for a local installation.