10 research outputs found

    On the Complexity of Deciding Behavioural Equivalences and Preorders. A Survey

    This paper gives an overview of the computational complexity of all the equivalences in the linear/branching time hierarchy [vG90a] and the preorders in the corresponding hierarchy of preorders. We consider finite-state or regular processes as well as infinite-state BPA [BK84b] processes. A distinction which turns out to be important for finite-state processes is that between simulation-like equivalences/preorders and trace-like equivalences and preorders. Here we survey various known complexity results for these relations. For regular processes, all simulation-like equivalences and preorders are decidable in polynomial time, whereas all trace-like equivalences and preorders are PSPACE-complete. We also consider interesting special classes of regular processes such as deterministic, determinate, unary, locally unary, and tree-like processes, and survey the known complexity results in these special cases. For infinite-state processes the results are quite different. For the class of context-free processes, or BPA processes, any preorder or equivalence beyond bisimulation is undecidable, but bisimulation equivalence is decidable in polynomial time for normed BPA processes and is known to be elementarily decidable in the general case. For the class of BPP processes, all preorders and equivalences apart from bisimilarity are undecidable. However, bisimilarity is decidable in this case, and is known to be decidable in polynomial time for normed BPP processes.
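
    The polynomial-time result for simulation-like relations on regular processes can be made concrete with the textbook greatest-fixpoint computation of bisimilarity on a finite labelled transition system. The sketch below is only illustrative; the function name and the example LTS are our own assumptions, not taken from the survey:

        # Naive greatest-fixpoint computation of bisimilarity on a finite LTS.
        # Start from the full relation and remove pairs violating the transfer
        # condition until nothing changes; this runs in polynomial time.
        def bisimilar(states, labels, trans, p, q):
            def post(s, a):
                return {t for (x, l, t) in trans if x == s and l == a}

            rel = {(s, t) for s in states for t in states}
            changed = True
            while changed:
                changed = False
                for (s, t) in list(rel):
                    forth = all(any((s2, t2) in rel for t2 in post(t, a))
                                for a in labels for s2 in post(s, a))
                    back = all(any((s2, t2) in rel for s2 in post(s, a))
                               for a in labels for t2 in post(t, a))
                    if not (forth and back):
                        rel.discard((s, t))
                        changed = True
            return (p, q) in rel

        # a.(b + c) versus a.b + a.c: trace equivalent but not bisimilar.
        T = {("p0", "a", "p1"), ("p1", "b", "p2"), ("p1", "c", "p3"),
             ("q0", "a", "q1"), ("q0", "a", "q2"), ("q1", "b", "q3"), ("q2", "c", "q4")}
        S = {s for (x, _, y) in T for s in (x, y)}
        print(bisimilar(S, {"a", "b", "c"}, T, "p0", "q0"))   # False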

    Model-Checking Process Equivalences

    Process equivalences are formal methods that relate programs and systems which, informally, behave in the same way. Since there is no unique notion of what it means for two dynamic systems to display the same behaviour, there is a multitude of formal process equivalences, ranging from bisimulation to trace equivalence, categorised in the linear-time branching-time spectrum. We present a logical framework based on an expressive modal fixpoint logic which is capable of defining many process equivalence relations: for each such equivalence there is a fixed formula which is satisfied by a pair of processes if and only if they are equivalent with respect to this relation. We explain how to do model checking, even symbolically, for a significant fragment of this logic that captures many process equivalences. This allows model checking technology to be used for process equivalence checking. We show how partial evaluation can be used to obtain decision procedures for process equivalences from the generic model checking scheme. Comment: In Proceedings GandALF 2012, arXiv:1210.202
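
    To make the idea of one generic scheme instantiated per equivalence tangible, here is a minimal sketch (our own, with assumed names, not the paper's actual fixpoint-logic framework) of how several spectrum equivalences can be phrased as greatest fixpoints of a refinement operator on relations and computed by one shared iteration:

        # Generic greatest-fixpoint solver for deflationary, monotone operators
        # on relations; each equivalence plugs in its own refinement step.
        def gfp(universe, refine):
            rel = set(universe)
            while True:
                nxt = refine(rel)
                if nxt == rel:
                    return rel
                rel = nxt

        # Refinement step for (strong) simulation between two processes, given
        # their successor functions post1/post2 and a common set of labels.
        def simulation_step(post1, post2, labels):
            def step(rel):
                return {(s, t) for (s, t) in rel
                        if all(any((s2, t2) in rel for t2 in post2(t, a))
                               for a in labels for s2 in post1(s, a))}
            return step

    The bisimulation instance adds the symmetric clause to the same step function; trace-like equivalences do not fit this simple relational pattern directly, which mirrors their higher complexity on finite-state systems.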

    Techniques for solving Boolean equation systems

    Boolean equation systems are ordered sequences of Boolean equations decorated with least and greatest fixpoint operators. Boolean equation systems provide a useful framework for formal verification because various specification and verification problems, for instance μ-calculus model checking, can be represented as the problem of solving Boolean equation systems. The general problem of solving a Boolean equation system is a computationally hard task, and no polynomial-time solution technique for the problem has been discovered so far. In this thesis, techniques for finding solutions to Boolean equation systems are studied and new methods for solving such systems are devised. The thesis presents a general framework that allows for dividing Boolean equation systems into individual blocks and solving these blocks in isolation with special techniques. Three special techniques are presented, namely: (i) new specialized algorithms for disjunctive and conjunctive form Boolean equation systems, (ii) a new encoding of a general form Boolean equation system into answer set programming, and (iii) new encodings of general form Boolean equation systems into satisfiability problems. The approaches (ii) and (iii) are motivated by the recent success of answer set programming solvers and satisfiability solvers in formal verification. First, the thesis presents especially fast solution algorithms for the disjunctive and conjunctive classes of Boolean equation systems. These special algorithms are useful because many practically relevant model checking problems can be represented as Boolean equation systems that are disjunctive or conjunctive. The new algorithms have been implemented and the performance of the algorithms has been compared experimentally on communication protocol verification examples. Second, the thesis gives a translation of the problem of solving a general form Boolean equation system into the problem of finding a stable model of a logic program. The translation makes it possible to use implementations of answer set programming solvers to solve Boolean equation systems. Experimental tests have been performed using the presented approach and these experiments indicate the usefulness of answer set programming in this problem domain. Third, the thesis presents reductions from the problem of solving general form Boolean equation systems to the satisfiability problems of difference logic and propositional logic. The reductions make it possible to use implementations of satisfiability solvers to solve Boolean equation systems. The presented reductions have been implemented and it is shown via experiments that the new approach leads to practically efficient methods for solving general Boolean equation systems.

    Boolean equation systems are Boolean equations equipped with fixpoint operators. They provide a useful framework for computer-aided verification, since many specification and verification problems can be expressed with such fixpoint equations. This thesis develops new methods for solving Boolean equation systems. A general framework for solving Boolean equation systems is presented that simplifies computing a solution by dividing the equation systems into simpler subproblems. Three new methods for solving Boolean equation systems are presented. New algorithms are developed for solving conjunctive and disjunctive Boolean equation systems, together with their implementations and experimental results on their performance. The thesis develops a translation from solving a Boolean equation system to finding a stable model of a logic program, together with experimental results on the viability of the method. The translation makes it possible to use implementations of logic programming environments for solving Boolean equation systems. The experimental results show the efficiency of a constraint-based logic programming environment in solving Boolean equation systems. The thesis also develops translations from solving a Boolean equation system to the satisfiability problems of difference logic and propositional logic. These translations make it possible to use satisfiability solvers for solving Boolean equation systems. The experimental results show the efficiency of the presented methods in solving Boolean equation systems.
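
    As a small illustration of what solving such a system involves (a sketch of our own, not one of the thesis's algorithms), a single-block Boolean equation system with monotone right-hand sides can be solved by plain Kleene iteration, starting from false for a least-fixpoint block and from true for a greatest-fixpoint block:

        # Solve a single-block Boolean equation system by Kleene iteration.
        # equations: dict mapping a variable to its monotone right-hand side,
        # given as a callable over the current valuation.
        def solve_block(equations, sign):
            val = {x: (sign == "nu") for x in equations}   # bottom for mu, top for nu
            changed = True
            while changed:
                changed = False
                for x, rhs in equations.items():
                    v = rhs(val)
                    if v != val[x]:
                        val[x] = v
                        changed = True
            return val

        # X =mu (Y or Z),  Y =mu (X and Z),  Z =mu true
        eqs = {"X": lambda v: v["Y"] or v["Z"],
               "Y": lambda v: v["X"] and v["Z"],
               "Z": lambda v: True}
        print(solve_block(eqs, "mu"))   # least solution: all three variables are true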

    Abstract Dependency Graphs for Model Verification


    Tools and Algorithms for the Construction and Analysis of Systems

    This open access two-volume set constitutes the proceedings of the 27th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2021, which was held during March 27 – April 1, 2021, as part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2021. The conference was planned to take place in Luxembourg and changed to an online format due to the COVID-19 pandemic. The 41 full papers presented in the proceedings were carefully reviewed and selected from 141 submissions. The volumes also contain 7 tool papers, 6 tool demo papers, and 9 SV-COMP competition papers. The papers are organized in topical sections as follows: Part I: Game Theory; SMT Verification; Probabilities; Timed Systems; Neural Networks; Analysis of Network Communication. Part II: Verification Techniques (not SMT); Case Studies; Proof Generation/Validation; Tool Papers; Tool Demo Papers; SV-COMP Tool Competition Papers.

    Pseudo-contractions as Gentle Repairs

    Updating a knowledge base to remove an unwanted consequence is a challenging task. Some of the original sentences must be either deleted or weakened in such a way that the sentence to be removed is no longer entailed by the resulting set. On the other hand, it is desirable that the existing knowledge be preserved as much as possible, minimising the loss of information. Several approaches to this problem can be found in the literature. In particular, when the knowledge is represented by an ontology, two different families of frameworks have been developed over the past decades with numerous ideas in common but with little interaction between the communities: applications of AGM-like Belief Change and justification-based Ontology Repair. In this paper, we investigate the relationship between pseudo-contraction operations and gentle repairs. Both aim to avoid the complete deletion of sentences when replacing them with weaker versions is enough to prevent the entailment of the unwanted formula. We show the correspondence between concepts on both sides and investigate under which conditions they are equivalent. Furthermore, we propose a unified notation for the two approaches, which might contribute to the integration of the two areas.
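
    As a toy illustration of the problem being addressed (our own simplification with assumed names, not the paper's pseudo-contraction or gentle-repair operators, which weaken sentences rather than merely delete them), one can contract a small Horn knowledge base by keeping a largest subset that no longer entails the unwanted atom:

        # Forward-chaining entailment over Horn rules; a rule is (body, head),
        # and a fact is a rule with an empty body.
        from itertools import combinations

        def entails(kb, goal):
            facts = set()
            changed = True
            while changed:
                changed = False
                for body, head in kb:
                    if head not in facts and all(b in facts for b in body):
                        facts.add(head)
                        changed = True
            return goal in facts

        # Keep as many sentences as possible while giving up the goal.
        def contract(kb, goal):
            kb = list(kb)
            for k in range(len(kb), -1, -1):
                for subset in combinations(kb, k):
                    if not entails(subset, goal):
                        return list(subset)
            return []

        kb = [((), "a"), (("a",), "b"), (("b",), "c")]   # a;  a -> b;  b -> c
        print(contract(kb, "c"))   # keeps two of the three sentences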

    Les preuves de protocoles cryptographiques revisitées

    With the rise of the Internet, the use of cryptographic protocols has become ubiquitous. Considering the criticality and complexity of these protocols, there is an important need for formal verification. In order to obtain formal proofs of cryptographic protocols, two main attacker models exist: the symbolic model and the computational model. The symbolic model defines the attacker's capabilities as a fixed set of rules. On the other hand, the computational model describes only the attacker's limitations by stating that it may break some hard problems. While the former is quite abstract and convenient for automating proofs, the latter offers much stronger guarantees. There is a gap between the guarantees offered by these two models, due to the fact that the symbolic model defines what the adversary may do while the computational model describes what it may not do. In 2012, Bana and Comon devised a new symbolic model in which the attacker's limitations are axiomatised. In addition, provided that the computational semantics of the axioms follows from the cryptographic hypotheses, proving security in this symbolic model yields security in the computational model. The possibility of automating proofs in this model (and finding axioms general enough to prove a large class of protocols) was left open in the original paper. In this thesis we provide an efficient decision procedure for a general class of axioms. In addition we propose a tool (SCARY) implementing this decision procedure. Experimental results of our tool show that the axioms we designed for modelling the security of encryption are general enough to prove a large class of protocols.

    With the spread of the Internet, the use of cryptographic protocols has become ubiquitous. Given their complexity and their critical nature, formal verification of cryptographic protocols is necessary. Two main models exist for proving protocols. The symbolic model defines the attacker's capabilities as a fixed set of rules, while the computational model only forbids the attacker from solving certain hard problems. The symbolic model is very abstract and generally allows proofs to be automated, while the computational model provides stronger guarantees. The gap between the guarantees offered by these two models is due to the fact that the symbolic model describes the adversary's capabilities whereas the computational model describes its limitations. In 2012, Bana and Comon proposed a new symbolic model in which the attacker's limitations are axiomatised. Moreover, if the computational semantics of the axioms follows from the cryptographic hypotheses, security in this symbolic model provides computational guarantees. Automating proofs in this new model (and devising axioms general enough to prove a large number of protocols) is a question left open by Bana and Comon's article. In this thesis we propose an efficient decision procedure for a large class of axioms. Furthermore, we have implemented this procedure in a tool (SCARY). Our experimental results show that our axioms modelling the security of encryption are general enough to prove a large class of protocols.
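
    A minimal sketch of the symbolic attacker model mentioned above (the term representation, rules and names are our own assumptions, not the thesis's model): the attacker's capabilities are a fixed set of deduction rules, and what it can learn is the closure of its knowledge under those rules.

        # Terms: atoms are strings; ("pair", u, v) and ("enc", m, k) are composites.
        def analyse(knowledge):
            """Decomposition closure: projections, plus decryption with a known key."""
            k = set(knowledge)
            changed = True
            while changed:
                changed = False
                for t in list(k):
                    if isinstance(t, tuple) and t[0] == "pair":
                        for u in (t[1], t[2]):
                            if u not in k:
                                k.add(u)
                                changed = True
                    elif isinstance(t, tuple) and t[0] == "enc" and t[2] in k and t[1] not in k:
                        k.add(t[1])
                        changed = True
            return k

        def derivable(knowledge, goal):
            """Can the attacker build 'goal' by pairing/encrypting analysed terms?"""
            k = analyse(knowledge)
            def build(t):
                if t in k:
                    return True
                if isinstance(t, tuple) and t[0] in ("pair", "enc"):
                    return build(t[1]) and build(t[2])
                return False
            return build(goal)

        k0 = {("enc", "secret", "k1"), ("pair", "k1", "nonce")}
        print(derivable(k0, "secret"))                  # True: project k1, then decrypt
        print(derivable(k0, ("pair", "secret", "k2")))  # False: k2 stays unknown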

    Computational complexity theory and the philosophy of mathematics

    Computational complexity theory is a subfield of computer science originating in computability theory and the study of algorithms for solving practical mathematical problems. Amongst its aims is classifying problems by their degree of difficulty — i.e., how hard they are to solve computationally. This paper highlights the significance of complexity theory relative to questions traditionally asked by philosophers of mathematics, while also attempting to isolate some new ones — e.g., about the notion of feasibility in mathematics, the P≠NP problem and why it has proven hard to resolve, and the role of non-classical modes of computation and proof.
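
    The feasibility contrast behind the P versus NP question can be illustrated with a small sketch of our own (not from the paper): verifying a proposed satisfying assignment for a CNF formula is cheap, while the obvious search over all assignments grows exponentially with the number of variables.

        from itertools import product

        # Clauses are lists of non-zero integers; literal n means variable n,
        # literal -n means its negation.
        def check(cnf, assignment):
            """Polynomial-time verification of a candidate assignment."""
            return all(any(assignment[abs(l)] == (l > 0) for l in clause) for clause in cnf)

        def brute_force(cnf, variables):
            """Worst-case exponential search over 2^n assignments."""
            for values in product([False, True], repeat=len(variables)):
                assignment = dict(zip(variables, values))
                if check(cnf, assignment):
                    return assignment
            return None

        # (x1 or not x2) and (x2 or x3) and (not x1 or not x3)
        print(brute_force([[1, -2], [2, 3], [-1, -3]], [1, 2, 3]))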

    Annales Mathematicae et Informaticae 2008
