
    Computing Nash Equilibrium in Wireless Ad Hoc Networks: A Simulation-Based Approach

    This paper studies the problem of computing Nash equilibria in wireless networks modeled by Weighted Timed Automata. This formalism comes with a logic that can describe complex features such as timed energy constraints. Our contribution is a method for solving this problem using Statistical Model Checking. The method has been implemented in the UPPAAL model checker and applied to the analysis of the Aloha CSMA/CD and IEEE 802.15.4 CSMA/CA protocols.
    Comment: In Proceedings IWIGP 2012, arXiv:1202.422
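
    The paper itself performs the analysis with UPPAAL's statistical model checking engine. Purely as a hypothetical illustration of the underlying idea, the Python sketch below estimates the probability that a run of a toy slotted-ALOHA model satisfies a bounded-reachability property by Monte Carlo simulation, sizing the sample with the Chernoff-Hoeffding bound; the model, parameters, and property are invented and not taken from the paper.

```python
import math
import random

def simulate_aloha(p=0.5, horizon=20):
    """One run of a toy slotted-ALOHA model with two nodes: a slot
    succeeds iff exactly one node transmits in it (invented model)."""
    for _ in range(horizon):
        transmitters = sum(random.random() < p for _ in range(2))
        if transmitters == 1:
            return True   # property holds: success within the horizon
    return False

def smc_estimate(run, eps=0.01, delta=0.01):
    """Estimate P(run() is True) to within +/- eps with confidence
    1 - delta, using the Chernoff-Hoeffding sample-size bound."""
    n = math.ceil(math.log(2 / delta) / (2 * eps * eps))
    return sum(run() for _ in range(n)) / n, n

prob, runs = smc_estimate(simulate_aloha)
print(f"P(success within 20 slots) ~ {prob:.3f} after {runs} runs")
```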

    LIPIcs

    The monitoring of event frequencies can be used to recognize behavioral anomalies, to identify trends, and to deduce or discard hypotheses about the underlying system. For example, the performance of a web server may be monitored based on the ratio of the total count of requests from the least and most active clients. Exact frequency monitoring, however, can be prohibitively expensive; in the above example it would require as many counters as there are clients. In this paper, we propose the efficient probabilistic monitoring of common frequency properties, including the mode (i.e., the most common event) and the median of an event sequence. We define a logic to express composite frequency properties as a combination of atomic frequency properties. Our main contribution is an algorithm that, under suitable probabilistic assumptions, can be used to monitor these important frequency properties with four counters, independent of the number of different events. Our algorithm samples longer and longer subwords of an infinite event sequence. We prove the almost-sure convergence of our algorithm by generalizing ergodic theory from increasing-length prefixes to increasing-length subwords of an infinite sequence. A similar algorithm could be used to learn a connected Markov chain of a given structure from observing its outputs, to arbitrary precision, for a given confidence.
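
    The paper's algorithm is more refined (it handles the mode and median and comes with almost-sure convergence guarantees); as a loose, hypothetical illustration of monitoring an atomic frequency property with a constant number of counters, the sketch below compares the frequencies of two fixed events over subwords of doubling length, using exactly four counters regardless of the alphabet size.

```python
import random

def monitor_more_frequent(stream, a, b):
    """Monitor whether event `a` occurs more often than event `b`.

    Uses four counters, independent of the alphabet size: the current
    subword length, a position counter, and one occurrence counter per
    compared event. Subword lengths double, so later, longer samples
    dominate; under suitable probabilistic assumptions the verdicts
    converge to the true comparison."""
    length, pos, count_a, count_b = 1, 0, 0, 0
    for event in stream:
        pos += 1
        count_a += (event == a)
        count_b += (event == b)
        if pos == length:
            yield count_a > count_b          # verdict for this subword
            length, pos, count_a, count_b = 2 * length, 0, 0, 0

# Example: a biased stream over an arbitrary alphabet (invented data).
def biased_stream():
    while True:
        yield random.choices("xyzw", weights=[4, 2, 1, 1])[0]

verdicts = monitor_more_frequent(biased_stream(), "x", "y")
for _ in range(12):
    print(next(verdicts))
```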

    Probabilistic Model Checking for Energy Analysis in Software Product Lines

    In a software product line (SPL), a collection of software products is defined by their commonalities in terms of features rather than by specifying all products explicitly one by one. Several verification techniques have been adapted to establish temporal properties of SPLs. Symbolic and family-based model checking have proven successful in tackling the combinatorial blow-up that arises when reasoning about many feature combinations. However, most formal verification approaches for SPLs presented in the literature focus on static SPLs, where the features of a product are fixed and cannot be changed at runtime. This is in contrast to dynamic SPLs, which allow the feature combination of a product to be adapted after deployment. The main contribution of this paper is a compositional modeling framework for dynamic SPLs that supports probabilistic and nondeterministic choices and allows for quantitative analysis. We specify runtime feature changes within an automata-based coordination component, which enables reasoning about strategies for triggering dynamic feature changes so as to optimize various quantitative objectives, e.g., energy or monetary costs and reliability. Our framework admits a natural and conceptually simple translation into the input language of the prominent probabilistic model checker PRISM. This facilitates the application of PRISM's powerful symbolic engine to the operational behavior of dynamic SPLs and their family-based analysis against various quantitative queries. We demonstrate the feasibility of our approach with a case study of an energy-aware bonding network device.
    Comment: 14 pages, 11 figures
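
    The paper translates its models into PRISM's input language. As a language-agnostic stand-in, the hypothetical sketch below models a tiny dynamic product line as a Markov decision process whose states pair the remaining workload with the currently active feature, and computes the minimal expected energy by value iteration; every feature, probability, and cost is invented for illustration.

```python
# Toy dynamic product line: a device processes jobs and may switch its
# active feature ("eco" vs. "perf") at runtime, trading energy for speed.
FEATURES = {"eco": (0.4, 1.0), "perf": (0.8, 3.0)}  # (P(job done), energy/step)
SWITCH_COST = 0.5   # energy penalty for a dynamic feature change (invented)

def min_expected_energy(jobs, sweeps=2000):
    """Value iteration: V[k, f] = minimal expected energy to finish k
    remaining jobs when feature f is currently active."""
    V = {(k, f): 0.0 for k in range(jobs + 1) for f in FEATURES}
    for _ in range(sweeps):
        for k in range(1, jobs + 1):
            for f in FEATURES:
                # Action: run the next step under feature g (switch if g != f).
                V[(k, f)] = min(
                    (SWITCH_COST if g != f else 0.0) + e
                    + p * V[(k - 1, g)] + (1 - p) * V[(k, g)]
                    for g, (p, e) in FEATURES.items()
                )
    return V

V = min_expected_energy(3)
print({state: round(v, 2) for state, v in V.items()})
```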

    Tableaux for Policy Synthesis for MDPs with PCTL* Constraints

    Markov decision processes (MDPs) are the standard formalism for modelling sequential decision making in stochastic environments. Policy synthesis addresses the problem of how to control or limit the decisions an agent makes so that a given specification is met. In this paper we consider PCTL*, the probabilistic counterpart of CTL*, as the specification language. Because the policy synthesis problem for PCTL* is undecidable in general, we restrict attention to policies whose execution-history memory is finitely bounded a priori. Surprisingly, no algorithm for policy synthesis in this natural and expressive framework has been developed so far. We close this gap and describe a tableau-based algorithm that, given an MDP and a PCTL* specification, derives in a non-deterministic way a system of (possibly nonlinear) equalities and inequalities. The solutions of this system, if any, describe the desired (stochastic) policies. Our main result is the correctness of our method, i.e., soundness, completeness, and termination.
    Comment: This is a long version of a conference paper published at TABLEAUX 2017. It contains proofs of the main results and fixes a bug; see the footnote on page 1 for details.
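
    To see what such a constraint system looks like, consider a hypothetical MDP with one decision state s: a memoryless stochastic policy plays action a with probability p (which reaches the goal with probability 0.5 and otherwise stays in s) and action b with probability 1 - p (which reaches the goal with probability 0.7 and otherwise fails). The reachability probability x then satisfies the nonlinear equation x = p(0.5 + 0.5x) + (1 - p) * 0.7, and the specification P>=0.9[F goal] adds the inequality x >= 0.9; the sketch below solves this tiny, invented system numerically.

```python
# Constraint system for the toy MDP (hypothetical example):
#   x = p * (0.5 + 0.5 * x) + (1 - p) * 0.7     (nonlinear: p and x multiply)
#   x >= 0.9,  0 <= p <= 1
# Solving the equation for x gives x(p) = (0.7 - 0.2 * p) / (1 - 0.5 * p).

def reach_prob(p):
    return (0.7 - 0.2 * p) / (1 - 0.5 * p)

TOL = 1e-9   # guard against floating-point error at the boundary
feasible = [p / 100 for p in range(101) if reach_prob(p / 100) >= 0.9 - TOL]
print(f"P>=0.9[F goal] holds for p in [{feasible[0]:.2f}, {feasible[-1]:.2f}]")
```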

    Taming denumerable Markov decision processes with decisiveness

    Decisiveness has proven to be an elegant concept for denumerable Markov chains: it is general enough to encompass several natural classes of denumerable Markov chains, and it is a sufficient condition for simple qualitative and approximate quantitative model checking algorithms to exist. In this paper, we explore how to extend the notion of decisiveness to Markov decision processes. Compared to Markov chains, the extra non-determinism can be resolved in an adversarial or a cooperative way, yielding two natural notions of decisiveness. We then explore whether these notions yield model checking procedures for the infimum and supremum probabilities of reachability properties.
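
    For Markov chains, decisiveness means that almost every run either reaches the target or reaches a state from which the target is unreachable, which is exactly what makes a simple unfolding-based approximation scheme terminate. The hypothetical sketch below applies that scheme to a decisive, denumerable chain (a random walk on the positive integers with an absorbing failure branch) to bracket a reachability probability within epsilon; the chain is invented for illustration.

```python
# Approximate P(reach 0) in a decisive, denumerable Markov chain:
# from state n >= 1, go to n+1 w.p. 0.3, to n-1 w.p. 0.3, and to an
# absorbing FAIL state (from which 0 is unreachable) w.p. 0.4.
# Decisiveness guarantees the undecided mass vanishes, so the
# unfolding terminates with a sound interval of width below eps.
from collections import defaultdict

def approx_reach(start=3, eps=1e-6):
    dist = {start: 1.0}          # mass on transient (undecided) states
    reached = 0.0                # mass already in the target {0}
    while sum(dist.values()) > eps:
        nxt = defaultdict(float)
        for n, mass in dist.items():
            nxt[n + 1] += 0.3 * mass
            if n - 1 == 0:
                reached += 0.3 * mass
            else:
                nxt[n - 1] += 0.3 * mass
            # the remaining 0.4 * mass is absorbed in FAIL
        dist = nxt
    pending = sum(dist.values())
    return reached, reached + pending    # lower/upper bounds on P(reach 0)

lo, hi = approx_reach()
print(f"P(reach 0) in [{lo:.6f}, {hi:.6f}]")
```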

    UML-based specification, validation, and log-file based verification of the Orion Pad Abort Software

    This paper describes the first end-to-end application of a novel lightweight formal specification, validation, and verification technique. The technique is novel in two respects. First, it uses an intuitive, familiar, and diagrammatic notation for formal specification, a notation that is Turing equivalent and supports the capture of real-life requirements. Second, the technique includes a computer-aided approach for validating the correctness of requirements early in the development process, allowing sufficient time for the correction of ambiguous and underspecified requirements. In the verification phase, the technique is based on offline verification using log files. This approach scales well and is applicable to almost every mission-critical system, including real-time systems. The paper describes the application of this technique to the specification, validation, and verification of the Pad Abort subsystem of NASA's Orion mission.
    Approved for public release; distribution is unlimited.
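
    As a hypothetical illustration of the log-file-based verification step (not the actual Orion tooling, event names, or log format), the sketch below checks a bounded-response requirement, "every ABORT_CMD entry must be followed by a MOTOR_IGNITE entry within 500 ms", over a timestamped log.

```python
# Offline check of a bounded-response property over a timestamped log.
# Log format (invented): "<timestamp_ms> <event_name>" per line.

def check_response(lines, trigger="ABORT_CMD", response="MOTOR_IGNITE",
                   deadline_ms=500):
    violations, pending = [], []          # pending: open trigger timestamps
    for line in lines:
        ts_str, event = line.split(maxsplit=1)
        ts = int(ts_str)
        # Triggers whose deadline has already expired are violations.
        violations += [t for t in pending if ts - t > deadline_ms]
        pending = [t for t in pending if ts - t <= deadline_ms]
        if event.strip() == response:
            pending.clear()               # all open triggers answered in time
        elif event.strip() == trigger:
            pending.append(ts)
    violations += pending                 # triggers never answered at all
    return violations

log = ["100 ABORT_CMD", "350 TELEMETRY", "550 MOTOR_IGNITE", "900 ABORT_CMD"]
print(check_response(log))               # -> [900]: unanswered abort command
```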