    Game Refinement Relations and Metrics

    We consider two-player games played over finite state spaces for an infinite number of rounds. At each state, the players simultaneously choose moves; the moves determine a successor state. It is often advantageous for players to choose probability distributions over moves, rather than single moves. Given a goal, for example, reaching a target state, the question of winning is thus a probabilistic one: what is the maximal probability of winning from a given state? On these game structures, two fundamental notions are those of equivalences and metrics. Given a set of winning conditions, two states are equivalent if the players can win the same games with the same probability from both states. Metrics provide a bound on the difference in the probabilities of winning across states, capturing a quantitative notion of state similarity. We introduce equivalences and metrics for two-player game structures, and we show that they characterize the difference in probability of winning games whose goals are expressed in the quantitative mu-calculus. The quantitative mu-calculus can express a large set of goals, including reachability, safety, and omega-regular properties. Thus, we claim that our relations and metrics provide the canonical extensions to games of the classical notion of bisimulation for transition systems. We develop our results both for equivalences and metrics, which generalize bisimulation, and for asymmetric versions, which generalize simulation.
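
    As a compact restatement of the characterization claimed above (the notation here is ours, not quoted from the paper), the metric can be written as the largest difference, over goals expressed in the quantitative mu-calculus, between the values the two states assign to a formula:

        d(s, t) \;=\; \sup_{\varphi \in q\mu} \bigl|\, [\![\varphi]\!](s) - [\![\varphi]\!](t) \,\bigr|

    where $[\![\varphi]\!](s)$ is the value of formula $\varphi$ at state $s$ (for a reachability goal, the maximal probability of winning from $s$). The equivalence is then the kernel of the metric: $s \equiv t$ exactly when $d(s, t) = 0$.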

    The Spectrum of Strong Behavioral Equivalences for Nondeterministic and Probabilistic Processes

    We present a spectrum of trace-based, testing, and bisimulation equivalences for nondeterministic and probabilistic processes whose activities are all observable. For every equivalence under study, we examine the discriminating power of three variants stemming from three approaches that differ in the way probabilities of events are compared when nondeterministic choices are resolved via deterministic schedulers. We show that the first approach, which compares two resolutions relative to the probability distributions of all considered events, results in a fragment of the spectrum compatible with the spectrum of behavioral equivalences for fully probabilistic processes. In contrast, the second approach, which compares the probabilities of the events of a resolution with the probabilities of the same events in possibly different resolutions, gives rise to another fragment composed of coarser equivalences that exhibits several analogies with the spectrum of behavioral equivalences for fully nondeterministic processes. Finally, the third approach, which only compares the extremal probabilities of each event stemming from the different resolutions, yields even coarser equivalences that, however, give rise to a hierarchy similar to that stemming from the second approach.
    Comment: In Proceedings QAPL 2013, arXiv:1306.241
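
    The third, coarsest comparison scheme is easy to make concrete. The sketch below (a toy model with names of our choosing, not code from the paper) enumerates the deterministic schedulers of a small nondeterministic and probabilistic process and compares two states only by the minimal and maximal probability of a single reachability event:

        # Minimal sketch of the extremal-probability comparison (the third approach),
        # assuming an acyclic process; model and names are illustrative only.
        from itertools import product

        # state -> list of probabilistic choices, each a dict successor -> probability;
        # "ok" marks the success event, states absent from PROC are terminal.
        PROC = {
            "s": [{"a": 0.5, "b": 0.5}, {"ok": 1.0}],
            "a": [{"ok": 1.0}],
            "b": [{"dead": 1.0}],
            "t": [{"ok": 0.5, "dead": 0.5}, {"ok": 1.0}],
        }

        def schedulers(proc):
            """Enumerate deterministic schedulers: one fixed choice per state."""
            branching = [q for q in proc if len(proc[q]) > 1]
            for picks in product(*(range(len(proc[q])) for q in branching)):
                yield dict(zip(branching, picks))

        def reach_prob(proc, state, sched, target="ok"):
            """Probability of reaching `target` from `state` under `sched`."""
            if state == target:
                return 1.0
            if state not in proc:
                return 0.0
            choice = proc[state][sched.get(state, 0)]
            return sum(p * reach_prob(proc, nxt, sched, target)
                       for nxt, p in choice.items())

        def extremal(proc, state):
            probs = [reach_prob(proc, state, sig) for sig in schedulers(proc)]
            return min(probs), max(probs)

        # The extremal comparison looks only at these two numbers per event,
        # which is why it is coarser than matching resolutions pairwise.
        print(extremal(PROC, "s"))  # (0.5, 1.0)
        print(extremal(PROC, "t"))  # (0.5, 1.0)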

    Interface Simulation Distances

    The classical (boolean) notion of refinement for behavioral interfaces of system components is the alternating refinement preorder. In this paper, we define a distance for interfaces, called interface simulation distance. It makes the alternating refinement preorder quantitative by, intuitively, tolerating errors (while counting them) in the alternating simulation game. We show that the interface simulation distance satisfies the triangle inequality, that the distance between two interfaces does not increase under parallel composition with a third interface, and that the distance between two interfaces can be bounded from above and below by distances between abstractions of the two interfaces. We illustrate the framework, and the properties of the distances under composition of interfaces, with two case studies.
    Comment: In Proceedings GandALF 2012, arXiv:1210.202
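
    A rough sketch of how counting errors in a simulation game yields a distance, under assumptions of ours: plain simulation instead of the paper's alternating simulation over interfaces, unit cost per mismatched label, and a fixed discount factor so the fixpoint iteration converges:

        # Discounted, error-counting simulation distance between two labeled
        # transition systems (an illustration, not the paper's exact definition).
        LAMBDA = 0.5  # discount on future errors (assumed)

        # state -> list of (label, successor)
        SYS1 = {"p0": [("req", "p1")], "p1": [("ack", "p0")]}
        SYS2 = {"q0": [("req", "q1")], "q1": [("nack", "q0")]}

        def sim_distance(sys1, sys2, iters=50):
            """Iterate d(s,t) = max over s-moves of min over t-moves of
            (0 if labels match else 1) + LAMBDA * d(s', t')."""
            d = {(s, t): 0.0 for s in sys1 for t in sys2}
            for _ in range(iters):
                d = {(s, t): max(
                        min((0.0 if a == b else 1.0) + LAMBDA * d[(s2, t2)]
                            for b, t2 in sys2[t])
                        for a, s2 in sys1[s])
                     for s in sys1 for t in sys2}
            return d

        # One mismatch ("ack" vs "nack") every second step, discounted:
        # d = 0.5 * (1 + 0.5 * d), i.e. d = 2/3.
        print(sim_distance(SYS1, SYS2)[("p0", "q0")])  # ~0.6667

    The discount makes the iteration a contraction, so a fixed number of rounds suffices for convergence in this toy.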

    The earlier the better: a theory of timed actor interfaces

    Programming embedded and cyber-physical systems requires attention not only to functional behavior and correctness, but also to non-functional aspects, specifically timing and performance constraints. A structured, compositional, model-based approach based on stepwise refinement and abstraction techniques can support the development process, increase its quality, and reduce development time through automation of synthesis, analysis, or verification. For this purpose, we introduce a general theory of timed actor interfaces. Our theory supports a notion of refinement based on the principle of worst-case design that permeates the world of performance-critical systems. This is in contrast with classical behavioral and functional refinements based on restricting or enlarging sets of behaviors. An important feature of our refinement is that it allows time-deterministic abstractions to be made of time-non-deterministic systems, improving efficiency and reducing the complexity of formal analysis. We also show how our theory relates to, and can be used to reconcile, a number of existing time and performance models, and how their established theories can be exploited to represent and analyze interface specifications and refinement steps.
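
    The "earlier the better" principle can be illustrated in a few lines of code (an illustration of ours, not the paper's formalism): under worst-case design, an implementation refines a specification if it produces at least as many output events, each no later than the specification's worst-case bound:

        # Toy "earlier is better" refinement check on worst-case output timestamps.
        def refines(impl_times, spec_times):
            """impl refines spec: at least as many events, each no later."""
            if len(impl_times) < len(spec_times):
                return False
            return all(ti <= ts for ti, ts in zip(impl_times, spec_times))

        # Worst-case timestamps (e.g. seconds) of successive output events:
        spec = [1.0, 2.0, 3.0]        # a time-deterministic abstraction
        impl = [0.8, 1.9, 2.5, 3.4]   # worst case of a time-non-deterministic actor

        print(refines(impl, spec))  # True: every required event arrives no later

    This also hints at why time-deterministic abstractions are convenient: the specification reduces to a single worst-case trace rather than a set of behaviors.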

    Weighted Modal Transition Systems

    Specification theories as a tool in model-driven development processes of component-based software systems have recently attracted considerable attention. Current specification theories are however qualitative in nature, and therefore fragile in the sense that the inevitable approximation of systems by models, combined with the fundamental unpredictability of hardware platforms, makes it difficult to transfer conclusions about the behavior, based on models, to the actual system. Hence this approach is arguably unsuited for modern software systems. We propose here the first specification theory which makes it possible to capture quantitative aspects during the refinement and implementation process, thus alleviating the problems of the qualitative setting. Our proposed quantitative specification framework uses weighted modal transition systems as a formal model of specifications. These are labeled transition systems with the additional feature that they can model optional behavior which may or may not be implemented by the system. Satisfaction and refinement are lifted from the well-known qualitative to our quantitative setting by introducing a notion of distances between weighted modal transition systems. We show that quantitative versions of parallel composition as well as quotient (the dual to parallel composition) inherit the properties from the Boolean setting.
    Comment: Submitted to Formal Methods in System Design
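
    To make the idea of a distance between weighted modal transition systems concrete, here is a sketch under assumptions of ours (a single weight per transition, a fixed discount, and a simple fixpoint), not the paper's exact metric. May-steps of the implementation must be allowed by the specification, must-steps of the specification must be realized by the implementation, and weight mismatches accumulate:

        # Toy refinement distance for weighted modal transition systems.
        LAMBDA = 0.5          # discount on later mismatches (assumed)
        INF = float("inf")    # distance when a required step has no counterpart

        # state -> {"may": [(action, weight, successor)], "must": [...]}
        SPEC = {"s": {"may": [("send", 2.0, "s")], "must": [("send", 2.0, "s")]}}
        IMPL = {"i": {"may": [("send", 3.0, "i")], "must": [("send", 3.0, "i")]}}

        def refinement_distance(impl, spec, iters=50):
            d = {(i, s): 0.0 for i in impl for s in spec}
            for _ in range(iters):
                new = {}
                for i in impl:
                    for s in spec:
                        # every may-step of the implementation is allowed by the spec
                        may = max((min((abs(w - w2) + LAMBDA * d[(i2, s2)]
                                        for a2, w2, s2 in spec[s]["may"] if a2 == a),
                                       default=INF)
                                   for a, w, i2 in impl[i]["may"]), default=0.0)
                        # every must-step of the spec is realized by the implementation
                        must = max((min((abs(w - w2) + LAMBDA * d[(i2, s2)]
                                         for a2, w2, i2 in impl[i]["must"] if a2 == a),
                                        default=INF)
                                    for a, w, s2 in spec[s]["must"]), default=0.0)
                        new[(i, s)] = max(may, must)
                d = new
            return d

        # Weight slack of 1 at every step, discounted: d = 1 + 0.5 * d, i.e. d = 2.
        print(refinement_distance(IMPL, SPEC)[("i", "s")])  # ~2.0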

    Revisiting bisimilarity and its modal logic for nondeterministic and probabilistic processes

    We consider PML, the probabilistic version of Hennessy-Milner logic introduced by Larsen and Skou to characterize bisimilarity over probabilistic processes without internal nondeterminism. We provide two different interpretations for PML by considering nondeterministic and probabilistic processes as models, and we exhibit two new bisimulation-based equivalences that are in full agreement with those interpretations. Our new equivalences include as coarsest congruences the two bisimilarities for nondeterministic and probabilistic processes proposed by Segala and Lynch. The latter equivalences are instead in agreement with two versions of Hennessy-Milner logic extended with an additional probabilistic operator interpreted over state distributions rather than over individual states. Thus, our new interpretations of PML and the corresponding new bisimilarities offer a uniform framework for reasoning about processes that are purely nondeterministic or reactive probabilistic, or that mix nondeterminism and probability in an alternating or non-alternating way.
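
    For the reactive probabilistic processes PML was introduced for, Larsen-Skou bisimilarity can be computed by partition refinement: split states until, for every action, states in the same block send equal probability mass to each block. A minimal sketch (toy model and names are ours):

        # Naive partition refinement for Larsen-Skou probabilistic bisimilarity.
        # state -> action -> distribution over successor states
        PROC = {
            "u": {"a": {"u1": 0.5, "u2": 0.5}},
            "v": {"a": {"v1": 1.0}},
            "u1": {}, "u2": {}, "v1": {},
        }

        def class_prob(dist, block):
            return sum(p for s, p in dist.items() if s in block)

        def signature(state, partition):
            """Per action, the probability mass sent to each block."""
            return tuple(sorted(
                (act, tuple(class_prob(dist, block) for block in partition))
                for act, dist in PROC[state].items()))

        def bisimilarity(states):
            partition = [frozenset(states)]
            while True:
                groups = {}
                for bi, block in enumerate(partition):
                    for s in block:  # split each block by signature
                        groups.setdefault((bi, signature(s, partition)), set()).add(s)
                new = [frozenset(g) for g in groups.values()]
                if len(new) == len(partition):  # no block was split: fixpoint
                    return new
                partition = new

        # u1, u2, v1 are terminal, hence one class; u and v then both move with
        # probability 1 into that class on "a", so u and v are bisimilar too.
        print(bisimilarity(list(PROC)))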
