
    Practical applications of probabilistic model checking to communication protocols

    Probabilistic model checking is a formal verification technique for the analysis of systems that exhibit stochastic behaviour. It has been successfully employed in an extremely wide array of application domains including, for example, communication and multimedia protocols, security and power management. In this chapter we focus on the applicability of these techniques to the analysis of communication protocols. An analysis of the performance of such systems must successfully incorporate several crucial aspects, including concurrency between multiple components, real-time constraints and randomisation. Probabilistic model checking, in particular using probabilistic timed automata, is well suited to such an analysis. We provide an overview of this area, with emphasis on an industrially relevant case study: the IEEE 802.3 (CSMA/CD) protocol. We also discuss two contrasting approaches to the implementation of probabilistic model checking, namely those based on numerical computation and those based on discrete-event simulation. Using results from the two tools PRISM and APMC, we summarise the advantages, disadvantages and trade-offs associated with these techniques
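
    A minimal sketch, in Python rather than the PRISM or APMC input languages, of the two contrasting implementation styles mentioned above, applied to a toy retry protocol instead of the CSMA/CD case study: the "numerical" route solves the linear equations of a small discrete-time Markov chain (as numerical engines do), while the "statistical" route estimates the same probability by discrete-event simulation (as APMC does). The model, its parameters and the state names are assumptions made purely for illustration.

    import random
    import numpy as np

    # Toy DTMC: a sender makes up to 3 attempts; each attempt fails
    # independently with probability q. We ask for the probability of
    # eventually reaching the "success" state (exact answer: 1 - q**3).
    q = 0.4
    STATES = ["try1", "try2", "try3", "success", "failure"]
    P = np.array([
        #  try1 try2 try3 succ  fail
        [0.0,   q, 0.0, 1 - q, 0.0],   # try1
        [0.0, 0.0,   q, 1 - q, 0.0],   # try2
        [0.0, 0.0, 0.0, 1 - q,   q],   # try3
        [0.0, 0.0, 0.0,   1.0, 0.0],   # success (absorbing)
        [0.0, 0.0, 0.0,   0.0, 1.0],   # failure (absorbing)
    ])

    # Numerical computation: solve (I - A) x = b over the transient states,
    # where A restricts P to the transient states and b holds the one-step
    # probabilities of jumping directly to "success".
    transient = [0, 1, 2]
    A = P[np.ix_(transient, transient)]
    b = P[np.ix_(transient, [3])].ravel()
    x = np.linalg.solve(np.eye(len(transient)) - A, b)
    print("numerical: ", x[0])

    # Discrete-event simulation: sample many executions and count successes.
    def simulate():
        for _ in range(3):
            if random.random() > q:      # this attempt succeeded
                return True
        return False

    n = 100_000
    print("simulated: ", sum(simulate() for _ in range(n)) / n)
    print("exact:     ", 1 - q ** 3)

    The numerical answer is exact up to floating-point error, while the simulated estimate carries a statistical error that shrinks with the number of samples, which is essentially the trade-off between the two tools that the chapter summarises.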

    Evaluating the reliability of NAND multiplexing with PRISM

    Probabilistic-model checking is a formal verification technique for analyzing the reliability and performance of systems exhibiting stochastic behavior. In this paper, we demonstrate the applicability of this approach and, in particular, the probabilistic-model-checking tool PRISM to the evaluation of reliability and redundancy of defect-tolerant systems in the field of computer-aided design. We illustrate the technique with an example due to von Neumann, namely NAND multiplexing. We show how, having constructed a model of a defect-tolerant system incorporating probabilistic assumptions about its defects, it is straightforward to compute a range of reliability measures and investigate how they are affected by slight variations in the behavior of the system. This allows a designer to evaluate, for example, the tradeoff between redundancy and reliability in the design. We also highlight errors in analytically computed reliability bounds, recently published for the same case study
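
    A hedged Monte Carlo sketch of the NAND multiplexing construction itself (the paper instead builds a PRISM model and computes such reliability measures exactly): each logical signal is carried by a bundle of N wires, a stage randomly pairs the two input bundles and feeds every pair to a NAND gate that inverts its output with probability eps, and two restorative stages follow the executive stage. The bundle sizes, gate fault probability and 90% correctness threshold below are illustrative assumptions.

    import random

    def nand_stage(bundle_a, bundle_b, eps):
        """One multiplexing stage: random pairing plus unreliable NAND gates."""
        random.shuffle(bundle_a)
        random.shuffle(bundle_b)
        out = []
        for a, b in zip(bundle_a, bundle_b):
            v = not (a and b)                  # ideal NAND
            if random.random() < eps:          # gate fault flips the output
                v = not v
            out.append(v)
        return out

    def multiplexed_nand(x, y, n, eps):
        """NAND of inputs x, y using bundles of n wires, with restoration."""
        z = nand_stage([x] * n, [y] * n, eps)      # executive stage
        for _ in range(2):                         # restorative stages: NAND of a
            z = nand_stage(list(z), list(z), eps)  # bundle with itself acts as NOT
        return z

    def error_probability(x, y, n, eps, threshold=0.9, trials=2000):
        """Estimate P(fewer than `threshold` of the output wires are correct)."""
        correct = not (x and y)
        failures = 0
        for _ in range(trials):
            z = multiplexed_nand(x, y, n, eps)
            if sum(v == correct for v in z) / n < threshold:
                failures += 1
        return failures / trials

    if __name__ == "__main__":
        for n in (10, 20, 40):
            print(f"N={n:3d}  P(error) ~ {error_probability(True, True, n, 0.01):.3f}")

    Increasing the bundle size N drives the estimated error probability down for a fixed gate fault rate, which is the redundancy/reliability trade-off the paper evaluates, there with exact probabilities rather than sampling.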

    A Proposed Mechanism for Bluetooth Low Energy Network by Adjusting Network Parameter

    Bluetooth Low Energy (BLE) evolved from classic Bluetooth technology to enable short-range communication in a variety of systems and services. BLE has several advantages over classic Bluetooth, including low power consumption and low-cost deployment. To date, only a small number of studies have attempted to improve the BLE device discovery procedure, and these still have limitations. In particular, earlier work assumed that advertising PDUs are processed immediately as long as they are received successfully by a scanner. In practice, however, BLE devices may suffer many collisions due to contention among neighbours, especially in crowded environments. As the number of BLE devices increases, the delays of both connection set-up and device discovery grow exponentially, which can degrade the user experience in terms of time or energy consumption. In this paper, an enhanced mechanism is proposed that enables BLE advertisers and scanners to learn the level of network contention and adjust their parameters accordingly, in order to achieve low discovery latency. Extensive simulations demonstrate the effectiveness of the proposed mechanism in reducing unexpectedly long latencies in crowded BLE networks.
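
    The sketch below is not the paper's algorithm, only a toy round-based simulation illustrating the general idea of contention-aware parameter adjustment: each advertiser picks a random offset inside every scan window, a transmission is lost if another one starts within ADV_MS of it, and each device widens or narrows its advertising interval according to the collision rate it observes. All constants and the adaptation rule are assumptions.

    import random

    ADV_MS = 1.0        # assumed duration of one advertising event
    WINDOW_MS = 100.0   # assumed scan-window length
    ROUNDS = 2000

    class Advertiser:
        def __init__(self):
            self.interval_ms = 100.0    # current advertising interval
            self.sent = 0
            self.collided = 0

        def p_transmit(self):
            # Probability of advertising inside one window (toy approximation).
            return min(1.0, WINDOW_MS / self.interval_ms)

        def adapt(self):
            if self.sent < 20:
                return
            rate = self.collided / self.sent
            # Assumed rule: back off under heavy contention, speed up otherwise.
            self.interval_ms *= 1.5 if rate > 0.2 else 0.9
            self.interval_ms = min(max(self.interval_ms, 20.0), 2000.0)
            self.sent = self.collided = 0

    def run(n_devices):
        devs = [Advertiser() for _ in range(n_devices)]
        for _ in range(ROUNDS):
            txs = [(random.uniform(0, WINDOW_MS), d)
                   for d in devs if random.random() < d.p_transmit()]
            txs.sort(key=lambda td: td[0])
            for i, (t, d) in enumerate(txs):
                d.sent += 1
                if (i > 0 and t - txs[i - 1][0] < ADV_MS) or \
                   (i + 1 < len(txs) and txs[i + 1][0] - t < ADV_MS):
                    d.collided += 1
            for d in devs:
                d.adapt()
        return sum(d.interval_ms for d in devs) / n_devices

    if __name__ == "__main__":
        for n in (5, 20, 80):
            print(f"{n:3d} devices -> mean interval {run(n):.0f} ms")

    With more devices the learned intervals settle at larger values, trading a little latency per device for far fewer collisions, which is the kind of behaviour the paper's mechanism aims for.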

    Quantitative Verification: Formal Guarantees for Timeliness, Reliability and Performance

    Computerised systems appear in almost all aspects of our daily lives, often in safety-critical scenarios such as embedded control systems in cars and aircraft or medical devices such as pacemakers and sensors. We are thus increasingly reliant on these systems working correctly, despite often operating in unpredictable or unreliable environments. Designers of such devices need ways to guarantee that they will operate in a reliable and efficient manner. Quantitative verification is a technique for analysing quantitative aspects of a system's design, such as timeliness, reliability or performance. It applies formal methods, based on a rigorous analysis of a mathematical model of the system, to automatically prove certain precisely specified properties, e.g. ``the airbag will always deploy within 20 milliseconds after a crash'' or ``the probability of both sensors failing simultaneously is less than 0.001''. The ability to formally guarantee quantitative properties of this kind is beneficial across a wide range of application domains. For example, in safety-critical systems, it may be essential to establish credible bounds on the probability with which certain failures or combinations of failures can occur. In embedded control systems, it is often important to comply with strict constraints on timing or resources. More generally, being able to derive guarantees on precisely specified levels of performance or efficiency is a valuable tool in the design of, for example, wireless networking protocols, robotic systems or power management algorithms, to name but a few. This report gives a short introduction to quantitative verification, focusing in particular on a widely used technique called model checking, and its generalisation to the analysis of quantitative aspects of a system such as timing, probabilistic behaviour or resource usage. The intended audience is industrial designers and developers of systems such as those highlighted above who could benefit from the application of quantitative verification, but lack expertise in formal verification or modelling.
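
    As a toy illustration of the second property quoted above, the check below assumes the two sensors fail independently with probability p each per mission; a real quantitative verification would establish such a bound against a full stochastic model of the system rather than a one-line calculation.

    # Assumed per-sensor failure probability; independence is also an assumption.
    p = 0.03
    p_both = p * p                   # probability both sensors fail together
    print(p_both, p_both < 0.001)    # ~0.0009, True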

    Analysis of a gossip protocol in PRISM

    Symbolic Magnifying Lens Abstraction in Markov Decision Processes

    In this paper, we combine abstraction-refinement and symbolic techniques to fight the state-space explosion problem when model checking Markov decision processes (MDPs). The abstraction-refinement technique, called "magnifying-lens abstraction" (MLA), partitions the state-space into regions and computes upper and lower bounds for reachability and safety properties on the regions, rather than the states. To compute such bounds, MLA iterates over the regions, analyzing the concrete states of each region in turn - as if one was sliding a magnifying lens across the system to view the states. The algorithm adaptively refines the regions, using smaller regions where more detail is required, until the difference between the bounds is below a specified accuracy. The symbolic technique is based on multi-terminal binary decision diagrams (MTBDDs), which have been used extensively to provide compact encodings of probabilistic models. We introduce a symbolic version of the MLA algorithm, called "symbolic MLA", which combines the power of both techniques when verifying MDPs. An implementation of symbolic MLA in the probabilistic model checker PRISM and experimental results to illustrate the advantages of our approach are presented.
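
    A much-simplified, non-symbolic sketch of the magnify step in Python, on a small discrete-time Markov chain rather than an MDP, with the region refinement and MTBDD encoding omitted: each region keeps a lower bound L and an upper bound U on the reachability probabilities of its states, and sweeping a region re-solves its concrete states while successors outside the region are summarised by the other regions' current bounds. The chain, the partition and the seeding of the failure state's bound are assumptions for illustration.

    # Toy chain: states 0..5, target {5}; P[s] maps successors to probabilities.
    P = {
        0: {1: 0.5, 2: 0.5},
        1: {3: 0.7, 0: 0.3},
        2: {3: 0.2, 4: 0.8},
        3: {5: 0.9, 4: 0.1},
        4: {4: 1.0},          # absorbing failure state
        5: {5: 1.0},          # absorbing target state
    }
    TARGET = {5}
    REGIONS = [{0, 1}, {2, 3}, {4}, {5}]

    def region_of(s):
        return next(i for i, r in enumerate(REGIONS) if s in r)

    def magnify(region, outside, start):
        """Value-iterate over one region's concrete states; successors outside
        the region are summarised by `outside`, one bound per region. Starting
        from 0 keeps every iterate a lower bound, starting from 1 an upper one."""
        val = {s: 1.0 if s in TARGET else start for s in region}
        for _ in range(200):
            for s in region:
                if s in TARGET:
                    continue
                val[s] = sum(p * (val[t] if t in region else outside[region_of(t)])
                             for t, p in P[s].items())
        return val

    L = [1.0 if r <= TARGET else 0.0 for r in REGIONS]
    U = [1.0] * len(REGIONS)
    U[region_of(4)] = 0.0   # state 4 can never reach the target; real tools find
                            # such probability-0 states with a graph precomputation

    for _ in range(10):                       # sweep the lens over the regions
        for i, r in enumerate(REGIONS):
            L[i] = max(L[i], min(magnify(r, L, 0.0).values()))
            U[i] = min(U[i], max(magnify(r, U, 1.0).values()))

    for i, r in enumerate(REGIONS):
        print(f"region {sorted(r)}: bounds [{L[i]:.3f}, {U[i]:.3f}]")

    MLA would now split the regions whose gap U - L is still above the required accuracy and repeat; the symbolic version described in the paper represents the regions, bounds and transition function with MTBDDs instead of explicit dictionaries.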

    Design and analysis of DNA strand displacement devices using probabilistic model checking

    Designing correct, robust DNA devices is difficult because of the many possibilities for unwanted interference between molecules in the system. DNA strand displacement has been proposed as a design paradigm for DNA devices, and the DNA strand displacement (DSD) programming language has been developed as a means of formally programming and analysing these devices to check for unwanted interference. We demonstrate, for the first time, the use of probabilistic verification techniques to analyse the correctness, reliability and performance of DNA devices during the design phase. We use the probabilistic model checker prism, in combination with the DSD language, to design and debug DNA strand displacement components and to investigate their kinetics. We show how our techniques can be used to identify design flaws and to evaluate the merits of contrasting design decisions, even on devices comprising relatively few inputs. We then demonstrate the use of these components to construct a DNA strand displacement device for approximate majority voting. Finally, we discuss some of the challenges and possible directions for applying these methods to more complex designs
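
    As a hedged companion sketch (the paper performs its analysis with prism on models generated from the DSD language, not with plain simulation), the code below runs a Gillespie stochastic simulation of the approximate-majority chemical reaction network that the final strand displacement device implements, X + Y -> 2B, B + X -> 2X, B + Y -> 2Y, with all rate constants assumed equal to 1, and estimates how often the initial majority wins.

    import random

    def gillespie_am(x, y, b=0, k=1.0, t_end=50.0):
        """Stochastic simulation of the approximate-majority reactions."""
        t = 0.0
        while t < t_end:
            a = [k * x * y, k * b * x, k * b * y]   # reaction propensities
            a0 = sum(a)
            if a0 == 0:
                break                               # no reaction can fire
            t += random.expovariate(a0)             # time to the next reaction
            u = random.uniform(0, a0)
            if u < a[0]:                            # X + Y -> 2B
                x, y, b = x - 1, y - 1, b + 2
            elif u < a[0] + a[1]:                   # B + X -> 2X
                b, x = b - 1, x + 1
            else:                                   # B + Y -> 2Y
                b, y = b - 1, y + 1
        return x, y, b

    trials = 1000
    x_wins = 0
    for _ in range(trials):
        x, y, b = gillespie_am(30, 20)
        if y == 0 and b == 0 and x > 0:             # consensus on the X value
            x_wins += 1
    print(f"initial X majority wins in {x_wins / trials:.1%} of runs")

    Roughly speaking, the paper poses the same kind of question, such as the probability of reaching the correct consensus, as a probabilistic temporal logic query over the continuous-time Markov chain that prism builds from the DSD model, rather than estimating it by sampling.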

    On Content Models for Proximity Services

    Proximity Services (ProSe in the 3GPP terminology), first introduced in Release 12 of the 3GPP specifications, is a Device-to-Device (D2D) technology that allows two devices to detect each other and communicate directly, without traversing the base station or core network. In other words, it is a technology ultimately oriented towards direct connections between two devices. In this article, we promote the idea that proximity services are more than just support for a direct connection (in effect, a search for direct-connection candidates). The paper discusses content models (that is, information dissemination models) based on proximity data; from this perspective, a direct connection is simply one of the possible options for disseminating information.

    LPcomS: Towards a Low Power Wireless Smart-Shoe System for Gait Analysis in People with Disabilities

    Gait analysis using smart sensor technology is an important medical diagnostic process and has many applications in rehabilitation, therapy and exercise training. In this thesis, we present a low power wireless smart-shoe system (LPcomS) to analyze different functional postures and characteristics of gait while walking. We have designed and implemented a smart-shoe with a Bluetooth communication module to unobtrusively collect data using a smartphone in any environment. With a shoe insole equipped with four pressure sensors, foot pressure data are collected and used to obtain an accurate gait pattern for a patient. With our proposed portable sensing system and an effective low power communication algorithm, the smart-shoe system enables detailed gait analysis. Experimentation and verification are conducted on multiple subjects with different gaits, including free gait. The sensor outputs, together with the gait analysis obtained from the experiments, are presented in this thesis.
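
    A minimal sketch, not the thesis's actual firmware or algorithm, of how four insole pressure readings might be reduced to a gait-phase label, with a packet sent over Bluetooth only when the label changes as one simple power-saving rule; the sensor positions, threshold and packet handling below are assumptions.

    from dataclasses import dataclass
    from typing import List

    CONTACT_THRESHOLD = 0.15   # assumed normalised pressure indicating contact

    @dataclass
    class InsoleSample:
        heel: float
        midfoot: float
        meta: float     # metatarsal region
        toe: float

    def gait_phase(s: InsoleSample) -> str:
        contact = [v > CONTACT_THRESHOLD
                   for v in (s.heel, s.midfoot, s.meta, s.toe)]
        if not any(contact):
            return "swing"
        if contact[0] and not contact[3]:
            return "heel-strike"
        if contact[3] and not contact[0]:
            return "toe-off"
        return "mid-stance"

    def transmit(label: str) -> None:
        # Placeholder for the Bluetooth send; printing stands in for the radio.
        print("TX:", label)

    def process(stream: List[InsoleSample]) -> None:
        last = None
        for sample in stream:
            label = gait_phase(sample)
            if label != last:          # send only on phase change to save power
                transmit(label)
                last = label

    if __name__ == "__main__":
        demo = [InsoleSample(0.8, 0.3, 0.1, 0.0),   # heel strike
                InsoleSample(0.5, 0.6, 0.5, 0.2),   # mid-stance
                InsoleSample(0.0, 0.2, 0.6, 0.7),   # toe off
                InsoleSample(0.0, 0.0, 0.0, 0.0)]   # swing
        process(demo)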