
    Towards automated verification of Splice in μCRL

    A considerable fragment of the coordination architecture Splice, including Ethernet, is specified in the process-algebraic language μCRL. This specification is used to generate transition systems for a number of simple Splice applications, which are verified by model checking using the Cæsar/Aldébaran tool set. For these cases the properties of deadlock freeness, soundness, and weak completeness are proven. The primary result reported is a detailed formal model of Splice that makes automated verification possible. In practice, however, it is feasible to generate a transition system only for very simple Splice applications. Nevertheless, model checking applied to a large number of small applications, or scenarios, can be used to gather evidence for the validity of properties. Such evidence is more general than testing in that it considers all possible system traces for a given scenario instead of just one trace. For applications with a high degree of non-determinism this can be an interesting advantage.
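    The deadlock-freeness check mentioned above can be sketched as an exhaustive exploration of a labelled transition system. This is a minimal illustration, not the μCRL/Cæsar-Aldébaran tooling itself; the state names and transitions below are invented, not taken from the Splice model.

```python
# Deadlock freedom: every reachable non-terminal state must have at least
# one outgoing transition. `transitions` maps state -> {(label, next_state)}.
def reachable_deadlocks(initial, transitions, terminals=frozenset()):
    seen, stack, deadlocks = {initial}, [initial], []
    while stack:
        s = stack.pop()
        succs = transitions.get(s, set())
        if not succs and s not in terminals:
            deadlocks.append(s)          # reachable state with no successors
        for _, t in succs:
            if t not in seen:
                seen.add(t)
                stack.append(t)
    return deadlocks

# Toy LTS: a writer posts to a shared data space, a reader consumes.
lts = {
    "idle":   {("write", "posted")},
    "posted": {("read", "idle"), ("overwrite", "posted")},
}
print(reachable_deadlocks("idle", lts))  # [] -> deadlock-free
```

    Because the search visits every reachable state, a single run covers all traces of the scenario, which is the advantage over testing noted in the abstract.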

    Proof obligations for monomorphicity

    In certain applications of formal methods to the development of correct software, one wants the requirement specification to be monomorphic, i.e. every two term-generated models of it are isomorphic. Consequently, the question arises how to guarantee monomorphicity (which is not decidable in general). In this paper we show that the task of proving monomorphicity of a specification can be reduced to proving certain properties of procedures (with non-deterministic constructs). This task can then be dealt with directly in the KIV system (Karlsruhe Interactive Verifier), which was originally designed for software verification. We prove the correctness and completeness of our method.
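    To make the definition concrete: the signature {zero, succ} without freeness axioms is not monomorphic, since Z/3 and Z/4 are both term-generated models but are not isomorphic. The brute-force isomorphism check below is only a small finite illustration of the definition, not the paper's proof-obligation method.

```python
from itertools import permutations

def isomorphic(m1, m2):
    """Each model is (carrier, zero, succ). Search for a bijection h with
    h(zero1) = zero2 and h(succ1(x)) = succ2(h(x)) for all x."""
    c1, z1, s1 = m1
    c2, z2, s2 = m2
    if len(c1) != len(c2):
        return False
    for perm in permutations(c2):
        h = dict(zip(c1, perm))
        if h[z1] == z2 and all(h[s1(x)] == s2(h[x]) for x in c1):
            return True
    return False

mod3 = (list(range(3)), 0, lambda x: (x + 1) % 3)  # term-generated model Z/3
mod4 = (list(range(4)), 0, lambda x: (x + 1) % 4)  # term-generated model Z/4
print(isomorphic(mod3, mod3))  # True
print(isomorphic(mod3, mod4))  # False -> the spec is not monomorphic
```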

    Formal model and policy specification of usage control

    The recent usage control model (UCON) is a foundation for next-generation access control models with the distinguishing properties of decision continuity and attribute mutability. A usage control decision is determined by combining authorizations, obligations, and conditions, presented as the UCON_ABC core models by Park and Sandhu. Based on these core aspects, we develop a formal model and logical specification of UCON with an extension of Lamport's temporal logic of actions (TLA). The building blocks of this model include: (1) a set of sequences of system states based on the attributes of subjects, objects, and the system, (2) authorization predicates based on subject and object attributes, (3) usage control actions to update attributes and the accessing status of a usage process, (4) obligation actions, and (5) condition predicates based on system attributes. A usage control policy is defined as a set of temporal logic formulas that are satisfied as the system state changes. A fixed set of scheme rules is defined to specify general UCON policies with the properties of soundness and completeness. We show the flexibility and expressive capability of this formal model by specifying the core models of UCON and some applications. © 2005 ACM

    Data Minimisation in Communication Protocols: A Formal Analysis Framework and Application to Identity Management

    With the growing amount of personal information exchanged over the Internet, privacy is becoming more and more of a concern for users. One of the key principles in protecting privacy is data minimisation. This principle requires that only the minimum amount of information necessary to accomplish a certain goal is collected and processed. "Privacy-enhancing" communication protocols have been proposed to guarantee data minimisation in a wide range of applications. However, there is currently no satisfactory way to assess and compare the privacy they offer in a precise way: existing analyses are either too informal and high-level, or specific to one particular system. In this work, we propose a general formal framework to analyse and compare communication protocols with respect to privacy by data minimisation. Privacy requirements are formalised independently of a particular protocol in terms of the knowledge of (coalitions of) actors in a three-layer model of personal information. These requirements are then verified automatically for particular protocols by computing this knowledge from a description of their communication. We validate our framework in an identity management (IdM) case study. As IdM systems are used more and more to satisfy the increasing need for reliable on-line identification and authentication, privacy is becoming an increasingly critical issue. We use our framework to analyse and compare four identity management systems. Finally, we discuss the completeness and (re)usability of the proposed framework.
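    The core idea of computing actor knowledge from observed communication can be sketched as follows. The protocols and data items are invented examples, and real detectability analyses are far richer than flat set union, but the comparison principle is the same: the smaller the derivable knowledge, the better the data minimisation.

```python
def knowledge(observations):
    """observations: actor -> list of observed messages (sets of data items).
    An actor's knowledge is everything appearing in its observations."""
    return {actor: set().union(*msgs) if msgs else set()
            for actor, msgs in observations.items()}

def coalition_knowledge(know, actors):
    """A coalition pools the knowledge of its members."""
    return set().union(*(know[a] for a in actors))

# Protocol A sends the full profile to the provider; protocol B only a pseudonym.
proto_a = {"provider": [{"name", "birthdate", "address"}], "idp": [{"name"}]}
proto_b = {"provider": [{"pseudonym"}], "idp": [{"name"}]}

ka, kb = knowledge(proto_a), knowledge(proto_b)
print(ka["provider"] - kb["provider"])  # items protocol A over-collects
print(coalition_knowledge(kb, ["provider", "idp"]))
```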

    EvoAlloy: An Evolutionary Approach For Analyzing Alloy Specifications

    Using mathematical notations and logical reasoning, formal methods precisely define a program's specifications, from which we can instantiate valid instances of a system. With these techniques, we can perform a variety of analysis tasks to verify system dependability and rigorously prove the correctness of system properties. While there exist well-designed automated verification tools, including ones considered lightweight, they still lack strong adoption in practice. The essence of the problem is that, when applied to large real-world applications, they are not scalable or applicable due to the expense of the thorough verification process. In this thesis, I present a new approach and demonstrate how to relax the completeness guarantee without much loss, since soundness is maintained. I have extended a widely applied lightweight analysis, Alloy, with a genetic algorithm. Our new tool, EvoAlloy, works at the level of finite relations generated by Kodkod and evolves the chromosomes based on feedback that includes the failed constraints. Through a feasibility study, I show that my approach can successfully find solutions to a set of specifications beyond the scope where the traditional Alloy Analyzer fails. While EvoAlloy takes longer to solve small problems, the scalability provided by the genetic extension shows its potential to handle larger specifications. My future vision is that when specifications are small I can maintain both soundness and completeness, but when this fails, EvoAlloy can switch to its genetic algorithm. Adviser: Hamid Bagher
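    The search strategy described above can be illustrated in miniature. This is not EvoAlloy's Kodkod integration; the constraints and encoding are invented. Chromosomes are candidate instances encoded as bit vectors, fitness counts satisfied constraints, and selection plus mutation drive the population toward a satisfying instance.

```python
import random

random.seed(1)
N = 16  # size of the candidate-instance encoding

# Stand-ins for relational constraints over the encoded instance.
constraints = [
    lambda bits: bits[0] == 1,        # a required tuple is in the relation
    lambda bits: sum(bits) >= 4,      # lower bound on relation size
    lambda bits: bits[3] == bits[7],  # a symmetry-style constraint
]

def fitness(bits):
    return sum(c(bits) for c in constraints)  # number of satisfied constraints

def mutate(bits, rate=0.1):
    return [b ^ (random.random() < rate) for b in bits]  # flip bits at random

population = [[random.randint(0, 1) for _ in range(N)] for _ in range(20)]
for _ in range(200):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == len(constraints):
        break  # a satisfying instance was found
    survivors = population[:10]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(10)]

best = max(population, key=fitness)
print(fitness(best) == len(constraints))
```

    The sound-but-incomplete trade-off from the abstract is visible here: any instance the search returns satisfies all constraints, but failure to converge does not prove the specification unsatisfiable.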

    Complete Deductive Systems for Probability Logic with Application in Harsanyi Type Spaces

    Thesis (PhD) - Indiana University, Mathematics, 2007
    These days, the study of probabilistic systems is very popular not only in theoretical computer science but also in economics. There is a surprising concurrence between game theory and probabilistic programming. J.C. Harsanyi introduced the notion of type spaces to give an implicit description of beliefs in games with incomplete information played by Bayesian players. Type functions on type spaces are the same as the stochastic kernels that are used to interpret probabilistic programs. In addition to this semantic approach to interactive epistemology, a syntactic approach was proposed by R.J. Aumann. It is of foundational importance to develop a deductive logic for his probabilistic belief logic. In the first part of the dissertation, we develop a sound and complete probability logic Σ+ for type spaces in a formal propositional language with operators L_r^i, which mean "agent i's belief is at least r", where the index r is a rational number between 0 and 1. A crucial infinitary inference rule in the system Σ+ captures the Archimedean property of the indices. By the Fourier-Motzkin elimination method from linear programming, we prove Professor Moss's conjecture that the infinitary rule can be replaced by a finitary one. More importantly, our proof of completeness is in the Henkin-Kripke style. Also, we show through a probabilistic system with parameterized indices that it is decidable whether a formula φ is derivable from the system Σ+. The second part is on its strong completeness. It is well known that Σ+ is not strongly complete, i.e. a set of formulas in the language may be finitely satisfiable but not satisfiable. We show that even finitely satisfiable sets of formulas that are closed under the Archimedean rule are not necessarily satisfiable. From these results, we develop a theory about probability logic that parallels the relationship between explicit and implicit descriptions of belief types in game theory. Moreover, we use a linear system about probabilities over trees to prove that there is no strong completeness even for probability logic with finite indices. We conclude that the lack of strong completeness does not depend on the non-Archimedean property of the indices but rather on the use of explicit probabilities in the syntax. We show the completeness and some properties of the probability logic for Harsanyi type spaces. By adding knowledge operators to our language, we devise a sound and complete axiomatization for Aumann's semantic knowledge-belief systems. Its applications in labelled Markov processes and semantics for programs are also discussed.
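    The semantics of the modality L_r^i can be illustrated on a small finite type space (the space below is invented): at a state w, "agent i's belief in φ is at least r" holds when the probability that i's type at w assigns to the φ-states is at least r.

```python
from fractions import Fraction as F

states = ["w1", "w2", "w3"]
# type_fn[i][w] is agent i's belief (a distribution over states) at state w.
type_fn = {
    "i": {
        "w1": {"w1": F(1, 2), "w2": F(1, 2), "w3": F(0)},
        "w2": {"w1": F(1, 2), "w2": F(1, 2), "w3": F(0)},
        "w3": {"w1": F(0),    "w2": F(0),    "w3": F(1)},
    }
}

def believes_at_least(agent, r, phi_states, world):
    """Semantics of L_r^i phi at `world`: the measure of phi is at least r."""
    mu = type_fn[agent][world]
    return sum(mu[w] for w in phi_states) >= r

phi = {"w1", "w2"}  # the set of states where phi holds
print(believes_at_least("i", F(1, 2), phi, "w1"))  # True: measure is 1
print(believes_at_least("i", F(1, 2), phi, "w3"))  # False: measure is 0
```

    Restricting r to rationals, as in the abstract, keeps the language countable while the Archimedean rule handles the limit behaviour of the indices.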