
    Equational Characterization of Covariant-Contravariant Simulation and Conformance Simulation Semantics

    Covariant-contravariant simulation and conformance simulation generalize plain simulation and try to capture the fact that it is not always the case that "the larger the number of behaviours, the better". We have previously studied their logical characterizations, and in this paper we present the axiomatizations of the preorders defined by the new simulation relations and of their induced equivalences. The interest of our results lies in the fact that the axiomatizations help us to understand the new simulations better, clarifying in particular the role of the contravariant characteristics and their interplay with the covariant ones; moreover, the axiomatizations provide us with a powerful tool to (algebraically) prove results about the corresponding semantics. We also consider our results interesting from a metatheoretical point of view: the fact that covariant-contravariant simulation equivalence is ground axiomatizable when no action exhibits both a covariant and a contravariant behaviour, but becomes non-axiomatizable as soon as actions of that kind coexist with either covariant or contravariant actions, offers a new, subtle example of the narrow border separating axiomatizable and non-axiomatizable semantics. We expect that by studying such examples we will be able to develop a general theory separating axiomatizable and non-axiomatizable semantics. Comment: In Proceedings SOS 2010, arXiv:1008.190
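
    For readers unfamiliar with the notion, the following is a sketch of the usual definition of covariant-contravariant simulation (the partition notation A^r, A^l, A^{bi} is ours, not the abstract's): given processes over an action set A = A^r \cup A^l \cup A^{bi} of covariant, contravariant and bivariant actions, a relation R is a covariant-contravariant simulation when, for every (p, q) \in R,

        \forall a \in A^r \cup A^{bi}:\; p \xrightarrow{a} p' \;\Rightarrow\; \exists q'.\; q \xrightarrow{a} q' \wedge (p', q') \in R,
        \forall a \in A^l \cup A^{bi}:\; q \xrightarrow{a} q' \;\Rightarrow\; \exists p'.\; p \xrightarrow{a} p' \wedge (p', q') \in R.

    Covariant actions are thus simulated in the usual direction, contravariant actions in the reverse direction, and bivariant actions in both, which is what makes "more behaviour" better only for the covariant part.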

    Dynamic Congruence vs. Progressing Bisimulation for CCS

    Weak Observational Congruence (woc) defined on CCS agents is not a bisimulation since it does not require two states reached by bisimilar computations of woc agents to be still woc, e.g. \alpha.\tau.\beta.nil and \alpha.\beta.nil are woc but \tau.\beta.nil and \beta.nil are not. This fact prevents us from characterizing CCS semantics (when \tau is considered invisible) as a final algebra, since the semantic function would induce an equivalence over the agents that is both a congruence and a bisimulation. In this paper we introduce a new behavioural equivalence for CCS agents, which is the coarsest among those bisimulations which are also congruences. We call it Dynamic Observational Congruence because it expresses a natural notion of equivalence for concurrent systems required to simulate each other in the presence of dynamic, i.e. run-time, (re)configurations. We provide an algebraic characterization of Dynamic Congruence in terms of a universal property of finality. Furthermore, we introduce Progressing Bisimulation, which forces processes to simulate each other performing explicit steps. We provide an algebraic characterization of it in terms of finality, two logical characterizations via modal logic in the style of HML, and a complete axiomatization for finite agents (consisting of the axioms for Strong Observational Congruence and two of Milner's three \tau-laws). Finally, we prove that Dynamic Congruence and Progressing Bisimulation coincide for CCS agents.
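
    For reference, Milner's three \tau-laws for observational congruence are usually stated as follows (the abstract does not specify which two of them are retained in the axiomatization of Progressing Bisimulation):

        a.\tau.P = a.P
        P + \tau.P = \tau.P
        a.(P + \tau.Q) = a.(P + \tau.Q) + a.Q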

    Graphical Reasoning in Compact Closed Categories for Quantum Computation

    Compact closed categories provide a foundational formalism for a variety of important domains, including quantum computation. These categories have a natural visualisation as a form of graphs. We present a formalism for equational reasoning about such graphs and develop this into a generic proof system with a fixed logical kernel for equational reasoning about compact closed categories. Automating this reasoning process is motivated by the slow and error-prone nature of manual graph manipulation. A salient feature of our system is that it provides a formal and declarative account of derived results that can include `ellipses'-style notation. We illustrate the framework by instantiating it for a graphical language of quantum computation and show how this can be used to perform symbolic computation. Comment: 21 pages, 9 figures. This is the journal version of the paper published at AIS
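
    As a brief reminder of the equational theory being visualised (our notation, under one of the common orientation conventions): a compact closed category equips every object A with a dual A^* and morphisms \eta_A : I \to A^* \otimes A and \epsilon_A : A \otimes A^* \to I satisfying the "snake" (yanking) equations

        (\epsilon_A \otimes 1_A) \circ (1_A \otimes \eta_A) = 1_A,
        (1_{A^*} \otimes \epsilon_A) \circ (\eta_A \otimes 1_{A^*}) = 1_{A^*}.

    Graphically these say that a wire bent into an S-shape can be pulled straight, which is exactly the kind of rewrite such a proof system automates.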

    A Survey of Symbolic Methods in Computational Analysis of Cryptographic Systems

    Since the 1980s, two approaches have been developed for analyzing security protocols. One of the approaches relies on a computational model that considers issues of complexity and probability. This approach captures a strong notion of security, guaranteed against all probabilistic polynomial-time attacks. The other approach relies on a symbolic model of protocol executions in which cryptographic primitives are treated as black boxes. Since the seminal work of Dolev and Yao, it has been realized that this latter approach enables significantly simpler and often automated proofs. However, the guarantees that it offers have been quite unclear. For more than twenty years the two approaches have coexisted but evolved mostly independently. Recently, significant research efforts have attempted to develop paradigms for cryptographic systems analysis that combine the best of both worlds. Two broad directions have been followed. {\em Computational soundness} aims to establish sufficient conditions under which results obtained using symbolic models imply security under computational models. The {\em direct approach} aims to apply the principles and the techniques developed in the context of symbolic models directly to computational ones. In this paper we survey existing results along both of these directions. Our goal is to provide a rather complete summary that can act as a quick reference for researchers who want to contribute to the field, want to make use of existing results, or just want to get a better picture of what results already exist.
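
    To make the contrast concrete, here is a minimal sketch of the symbolic (Dolev-Yao style) view: messages are terms, and the attacker's knowledge is closed under deduction rules such as

        \frac{u \quad v}{\langle u, v \rangle} \qquad \frac{\langle u, v \rangle}{u} \qquad \frac{\langle u, v \rangle}{v} \qquad \frac{u \quad k}{\{u\}_k} \qquad \frac{\{u\}_k \quad k}{u}

    i.e. pairing, projections, symmetric encryption and decryption, with no notion of probability or computational cost; the computational model instead quantifies over all probabilistic polynomial-time adversaries. (This rule set is illustrative, not taken from the surveyed papers.)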

    Quantum Picturalism

    The quantum mechanical formalism doesn't support our intuition, nor does it elucidate the key concepts that govern the behaviour of the entities that are subject to the laws of quantum physics. The arrays of complex numbers are kin to the arrays of 0s and 1s of the early days of computer programming practice. In this review we present steps towards a diagrammatic `high-level' alternative for the Hilbert space formalism, one which appeals to our intuition. It allows for intuitive reasoning about interacting quantum systems, and trivialises many otherwise involved and tedious computations. It clearly exposes limitations such as the no-cloning theorem, and phenomena such as quantum teleportation. As a logic, it supports `automation'. It allows for a wider variety of underlying theories, and can be easily modified, having the potential to provide the required stepping stone towards a deeper conceptual understanding of quantum theory, as well as its unification with other physical theories. Specific applications discussed here are purely diagrammatic proofs of several quantum computational schemes, as well as an analysis of the structural origin of quantum non-locality. The underlying mathematical foundation of this high-level diagrammatic formalism relies on so-called monoidal categories, a product of a fairly recent development in mathematics. These monoidal categories provide a natural foundation not only for physical theories, but also for proof theory, logic, programming languages, biology, cooking, ... The challenge is to discover the necessary additional pieces of structure that allow us to predict genuine quantum phenomena. Comment: Commissioned paper for Contemporary Physics, 31 pages, 84 pictures, some colo
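
    A one-line glimpse of the algebra underneath the diagrams (standard monoidal-category material, not specific to this review): sequential composition \circ and parallel composition \otimes interact through the interchange law

        (g_1 \otimes g_2) \circ (f_1 \otimes f_2) = (g_1 \circ f_1) \otimes (g_2 \circ f_2),

    which is what makes a diagram well defined independently of the order in which its boxes are read off; only the connectivity of the wires matters.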

    Key Substitution in the Symbolic Analysis of Cryptographic Protocols (extended version)

    Key substitution vulnerable signature schemes are signature schemes that permit an intruder, given a public verification key and a signed message, to compute a pair of signature and verification keys such that the message appears to be signed with the new signature key. A digital signature scheme is said to be vulnerable to the destructive exclusive ownership (DEO) property if it is computationally feasible for an intruder, given a public verification key and a pair of a message and its valid signature relative to the given public key, to compute a pair of signature and verification keys and a new message such that the given signature appears to be valid for the new message relative to the new verification key. In this paper, we prove decidability of the insecurity problem for cryptographic protocols where the signature schemes employed in the concrete realisation have these two properties.
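
    Schematically, and using a hypothetical verification predicate Verify not named in the abstract, DEO says that from a triple (pk, m, \sigma) with Verify(pk, m, \sigma) = true the intruder can feasibly compute (sk', pk', m') such that

        m' \neq m \quad\text{and}\quad Verify(pk', m', \sigma) = true,

    i.e. the same signature \sigma is "re-owned" under a fresh key pair for a different message; plain key substitution is the analogous property with the message unchanged.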

    Contradiction-tolerant process algebra with propositional signals

    In a previous paper, an ACP-style process algebra was proposed in which propositions are used as the visible part of the state of processes and as state conditions under which processes may proceed. This process algebra, called ACPps, is built on classical propositional logic. In this paper, we present a version of ACPps built on a paraconsistent propositional logic which is essentially the same as CLuNs. There are many systems that would have to deal with self-contradictory states if no special measures were taken. For a number of these systems, it is conceivable that accepting self-contradictory states and dealing with them in a way based on a paraconsistent logic is an alternative to taking special measures. The presented version of ACPps is suitable for the description and analysis of systems that deal with self-contradictory states in a way based on the above-mentioned paraconsistent logic. Comment: 25 pages; 26 pages, occurrences of wrong symbol for bisimulation equivalence replaced; 26 pages, Proposition 1 added; 27 pages, explanation of the phrase 'in contradiction' added to section 2 and presentation of the completeness result in section 2 improved; 27 pages, uniqueness result in section 2 revised; 27 pages, last paragraph of section 8 revise
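
    The key logical point, stated schematically (our gloss, not the paper's notation): in a paraconsistent logic such as CLuNs the classical explosion rule

        \phi, \neg\phi \vdash \psi \quad (ex contradictione quodlibet)

    is not valid, so a process whose propositional state entails both \phi and \neg\phi can still be reasoned about meaningfully instead of trivialising every state condition.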

    Relating two standard notions of secrecy

    Two styles of definitions are usually considered to express that a security protocol preserves the confidentiality of a piece of data s. Reachability-based secrecy means that s should never be disclosed, while equivalence-based secrecy states that two executions of a protocol with distinct instances for s should be indistinguishable to an attacker. Although the second formulation ensures a higher level of security and is closer to cryptographic notions of secrecy, decidability results and automatic tools have so far mainly focused on the first definition. This paper initiates a systematic investigation of the situations where syntactic secrecy entails strong secrecy. We show that in the passive case, reachability-based secrecy actually implies equivalence-based secrecy for digital signatures, symmetric and asymmetric encryption, provided that the primitives are probabilistic. For active adversaries, we provide sufficient (and rather tight) conditions on the protocol for this implication to hold. Comment: 29 pages, published in LMC
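
    Schematically (our notation): writing P(m) for the protocol run with s instantiated to m, and S for the set of messages the attacker observes, reachability-based (syntactic) secrecy requires that s is never deducible,

        S \nvdash s,

    whereas equivalence-based (strong) secrecy requires observational equivalence of differently instantiated runs,

        P(m_1) \approx P(m_2) \quad \text{for all } m_1, m_2.

    The paper's contribution is identifying when the first, weaker-looking property already guarantees the second.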

    No entailing laws, but enablement in the evolution of the biosphere

    Biological evolution is a complex blend of ever-changing structural stability, variability and emergence of new phenotypes, niches, ecosystems. We wish to argue that the evolution of life marks the end of a physics world view of law-entailed dynamics. Our considerations depend upon discussing the variability of the very "contexts of life": the interactions between organisms, biological niches and ecosystems. These are ever-changing, intrinsically indeterminate and even unprestatable: we do not know ahead of time the "niches" which constitute the boundary conditions on selection. More generally, because the "phase space" (the space of possibilities) is mathematically unprestatable, no laws of motion can be formulated for evolution. We call this radical emergence, from life to life. The purpose of this paper is to integrate variation and diversity into a sound conceptual frame and to situate unpredictability at a novel theoretical level, that of the very phase space. Our argument will be carried out in close comparison with physics and the mathematical construction of phase spaces in that discipline. The role of (theoretical) symmetries as invariant-preserving transformations will allow us to understand the nature of physical phase spaces and to stress the differences required for sound biological theorizing. In this frame, we discuss the novel notion of "enablement". This will restrict causal analyses to differential cases (a difference that causes a difference). Mutations or other causal differences will allow us to stress that "non-conservation principles" are at the core of evolution, in contrast to physical dynamics, which is largely based on conservation principles as symmetries. Critical transitions, the main locus of symmetry changes in physics, will be discussed, leading to "extended criticality" as a conceptual frame for a better understanding of the living state of matter.

    A Distribution Law for CCS and a New Congruence Result for the pi-calculus

    We give an axiomatisation of strong bisimilarity on a small fragment of CCS that does not feature the sum operator. This axiomatisation is then used to derive congruence of strong bisimilarity in the finite pi-calculus in the absence of sum. To our knowledge, this is the only nontrivial subcalculus of the pi-calculus that includes the full output prefix and for which strong bisimilarity is a congruence. Comment: 20 page
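
    For reference (standard definitions, not specific to this paper): a symmetric relation R is a strong bisimulation if, whenever (P, Q) \in R,

        P \xrightarrow{\alpha} P' \;\Rightarrow\; \exists Q'.\; Q \xrightarrow{\alpha} Q' \wedge (P', Q') \in R;

    strong bisimilarity \sim is the union of all such relations, and it is a congruence for a calculus when P \sim Q implies C[P] \sim C[Q] for every context C, which is the property established here for the sum-free finite pi-calculus.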