
    Hybrid performance modelling of opportunistic networks

    We demonstrate the modelling of opportunistic networks using the process algebra stochastic HYPE. Network traffic is modelled as continuous flows, contact between nodes in the network is modelled stochastically, and instantaneous decisions are modelled as discrete events. Our model describes a network of stationary video sensors with a mobile ferry which collects data from the sensors and delivers it to the base station. We consider different mobility models and different buffer sizes for the ferries. This case study illustrates the flexibility and expressive power of stochastic HYPE. We also discuss the software that enables us to describe stochastic HYPE models and simulate them. Comment: In Proceedings QAPL 2012, arXiv:1207.055
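
    As an informal illustration only (not stochastic HYPE itself), the following Python sketch combines the three ingredients the abstract describes: continuous data flows at the sensors, exponentially distributed ferry contacts, and discrete pick-up and delivery events. All rates, buffer sizes and the time horizon are assumed values chosen for the example.

        import random

        # Illustrative parameters (assumptions, not taken from the paper)
        SENSOR_RATE = 0.5    # data generated per time unit at each sensor (continuous flow)
        CONTACT_RATE = 0.1   # rate of ferry contacts (exponential inter-contact times)
        FERRY_BUFFER = 20.0  # ferry buffer capacity
        HORIZON = 200.0      # simulated time

        def simulate(num_sensors=3, seed=0):
            rng = random.Random(seed)
            t = 0.0
            sensor_buf = [0.0] * num_sensors   # continuous state: data stored at each sensor
            ferry_buf = 0.0                    # continuous state: data carried by the ferry
            delivered = 0.0
            while t < HORIZON:
                dt = rng.expovariate(CONTACT_RATE)     # stochastic: time until the next contact
                t += dt
                for i in range(num_sensors):           # integrate the continuous flows over dt
                    sensor_buf[i] += SENSOR_RATE * dt
                node = rng.randrange(num_sensors + 1)  # contact a sensor or the base station
                if node < num_sensors:                 # discrete event: pick up data
                    taken = min(FERRY_BUFFER - ferry_buf, sensor_buf[node])
                    sensor_buf[node] -= taken
                    ferry_buf += taken
                else:                                  # discrete event: deliver to the base station
                    delivered += ferry_buf
                    ferry_buf = 0.0
            return delivered

        print("data delivered:", simulate())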

    Compositional Falsification of Cyber-Physical Systems with Machine Learning Components

    Cyber-physical systems (CPS), such as automotive systems, are starting to include sophisticated machine learning (ML) components. Their correctness, therefore, depends on properties of the inner ML modules. While learning algorithms aim to generalize from examples, they are only as good as the examples provided, and recent efforts have shown that they can produce inconsistent output under small adversarial perturbations. This raises the question: can the output from learning components lead to a failure of the entire CPS? In this work, we address this question by formulating it as a problem of falsifying signal temporal logic (STL) specifications for CPS with ML components. We propose a compositional falsification framework where a temporal logic falsifier and a machine learning analyzer cooperate with the aim of finding falsifying executions of the considered model. The efficacy of the proposed technique is shown on an automatic emergency braking system model with a perception component based on deep neural networks.
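
    To make the falsification idea concrete, here is a toy Python sketch under assumed dynamics: a braking controller acts on a perceived distance (a stand-in for the ML perception component), robustness of the STL-style safety property G(distance > 0) is taken as the minimum distance along the trace, and a random-search falsifier looks for a perception error that drives robustness negative. This is not the paper's framework or model, only a minimal illustration.

        import random

        def simulate(perception_error, dt=0.1, steps=100):
            """Toy braking model; returns the trace of true distances to an obstacle."""
            distance, speed = 50.0, 20.0                  # metres, metres/second (assumed values)
            trace = []
            for _ in range(steps):
                perceived = distance + perception_error   # perception modelled as an additive error
                if perceived < 30.0:                      # controller brakes on the *perceived* distance
                    speed = max(0.0, speed - 8.0 * dt)    # constant braking deceleration
                distance -= speed * dt
                trace.append(distance)
            return trace

        def robustness(trace):
            """Robustness of G(distance > 0): the minimum distance over the trace."""
            return min(trace)

        def falsify(trials=1000, seed=1):
            """Random-search falsifier over the perception error."""
            rng = random.Random(seed)
            for _ in range(trials):
                err = rng.uniform(0.0, 20.0)              # overestimating the distance delays braking
                rob = robustness(simulate(err))
                if rob < 0.0:
                    return err, rob                       # falsifying execution found
            return None

        print(falsify())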

    Compositional synthesis of temporal fault trees from state machines

    Dependability analysis of a dynamic system composed of several complex, interrelated components raises two main problems. First, it is difficult to represent in a single coherent and complete picture how the system and its constituent parts behave in conditions of failure. Second, the analysis can be unmanageable due to a considerable number of failure events, which increases with the number of components involved. To remedy these problems, in this paper we outline an analysis approach that converts failure behavioural models (state machines) to temporal fault trees (TFTs), which can then be analysed using Pandora -- a recent technique for introducing temporal logic to fault trees. The approach is compositional and potentially more scalable, as it relies on the synthesis of large system TFTs from smaller component TFTs. We show, by using a Generic Triple Redundant (GTR) system, how the approach enables a more accurate and full analysis of an increasingly complex system.
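
    As a rough illustration of what a temporal fault tree adds over a static one, the Python sketch below evaluates a tiny tree over known basic-event failure times. The gate semantics (AND, OR, priority-AND) are an informal reading chosen for the example, not Pandora's formal definitions, and the example system is invented.

        def gate_and(times):
            """Output fails once every input has failed."""
            return max(times) if all(t is not None for t in times) else None

        def gate_or(times):
            """Output fails as soon as any input fails."""
            failed = [t for t in times if t is not None]
            return min(failed) if failed else None

        def gate_pand(times):
            """Priority-AND: all inputs fail, strictly in left-to-right order."""
            if any(t is None for t in times):
                return None
            return times[-1] if all(a < b for a, b in zip(times, times[1:])) else None

        # Invented example: the top event needs the primary to fail before the backup
        # (an ordered, temporal condition), or the power supply to fail outright.
        def top_event(primary, backup, power):
            return gate_or([gate_pand([primary, backup]), power])

        print(top_event(primary=2.0, backup=5.0, power=None))   # 5.0: failures in the critical order
        print(top_event(primary=5.0, backup=2.0, power=None))   # None: same failures, wrong order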

    Model checking medium access control for sensor networks

    We describe verification of S-MAC, a medium access control protocol designed for wireless sensor networks, by means of the PRISM model checker. The S-MAC protocol is built on top of the IEEE 802.11 standard for wireless ad hoc networks and, as such, it uses the same randomised backoff procedure as a means to avoid collision. In order to minimise energy consumption, in S-MAC, nodes are periodically put into a sleep state. Synchronisation of the sleeping schedules is necessary for the nodes to be able to communicate. Intuitively, energy saving obtained through a periodic sleep mechanism will be at the expense of performance. In previous work on S-MAC verification, a combination of analytical techniques and simulation has been used to confirm the correctness of this intuition for a simplified (abstract) version of the protocol in which the initial schedule coordination phase is assumed correct. We show how we have used the PRISM model checker to verify the behaviour of S-MAC and compare it to that of IEEE 802.11.
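
    The randomised backoff mechanism mentioned above can be illustrated with a small Monte Carlo estimate in Python: two contending nodes each pick a random backoff slot from a contention window, and a collision occurs when they pick the same slot. The window size and the two-node setting are assumptions for the example; a probabilistic model checker such as PRISM computes such probabilities exactly from a model rather than by sampling.

        import random

        def collision_probability(window=16, trials=100_000, seed=0):
            """Estimate the probability that two nodes choose the same backoff slot."""
            rng = random.Random(seed)
            collisions = sum(
                rng.randrange(window) == rng.randrange(window) for _ in range(trials)
            )
            return collisions / trials

        # For a 16-slot window the exact value is 1/16 = 0.0625; the estimate should be close.
        print(collision_probability())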

    Quantitative Verification: Formal Guarantees for Timeliness, Reliability and Performance

    Computerised systems appear in almost all aspects of our daily lives, often in safety-critical scenarios such as embedded control systems in cars and aircraft or medical devices such as pacemakers and sensors. We are thus increasingly reliant on these systems working correctly, despite often operating in unpredictable or unreliable environments. Designers of such devices need ways to guarantee that they will operate in a reliable and efficient manner. Quantitative verification is a technique for analysing quantitative aspects of a system's design, such as timeliness, reliability or performance. It applies formal methods, based on a rigorous analysis of a mathematical model of the system, to automatically prove certain precisely specified properties, e.g. "the airbag will always deploy within 20 milliseconds after a crash" or "the probability of both sensors failing simultaneously is less than 0.001". The ability to formally guarantee quantitative properties of this kind is beneficial across a wide range of application domains. For example, in safety-critical systems, it may be essential to establish credible bounds on the probability with which certain failures or combinations of failures can occur. In embedded control systems, it is often important to comply with strict constraints on timing or resources. More generally, being able to derive guarantees on precisely specified levels of performance or efficiency is a valuable tool in the design of, for example, wireless networking protocols, robotic systems or power management algorithms, to name but a few. This report gives a short introduction to quantitative verification, focusing in particular on a widely used technique called model checking, and its generalisation to the analysis of quantitative aspects of a system such as timing, probabilistic behaviour or resource usage. The intended audience is industrial designers and developers of systems such as those highlighted above who could benefit from the application of quantitative verification, but lack expertise in formal verification or modelling.
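
    A minimal Python sketch of the kind of computation behind such guarantees: the probability of reaching a failure state within a bounded number of steps in a small discrete-time Markov chain, obtained by backward iteration. The chain and its transition probabilities are invented for illustration; a tool like PRISM performs this sort of analysis exactly on much larger models written in its own modelling language.

        # Illustrative three-state Markov chain (all numbers are assumptions).
        TRANSITIONS = {
            "ok":       {"ok": 0.95, "degraded": 0.04, "fail": 0.01},
            "degraded": {"ok": 0.50, "degraded": 0.40, "fail": 0.10},
            "fail":     {"fail": 1.0},
        }

        def prob_fail_within(k, state="ok"):
            """P(reach 'fail' within k steps from 'state'), by backward recursion."""
            p = {s: (1.0 if s == "fail" else 0.0) for s in TRANSITIONS}
            for _ in range(k):
                p = {
                    s: 1.0 if s == "fail"
                    else sum(pr * p[t] for t, pr in TRANSITIONS[s].items())
                    for s in TRANSITIONS
                }
            return p[state]

        # Check a requirement of the form "the probability of failure within 10 steps is below a bound".
        print(prob_fail_within(10))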

    Optimal modelling and experimentation for the improved sustainability of microfluidic chemical technology design

    Optimization of the dynamics and control of chemical processes holds the promise of improved sustainability for chemical technology by minimizing resource wastage. Anecdotally, chemical plants may be substantially over-designed, say by 35-50%, because designers take account of uncertainties by providing greater flexibility. Once the plant is commissioned, techniques of nonlinear dynamics analysis can be used by process systems engineers to recoup some of this overdesign by optimization of the plant operation through tighter control. At the design stage, coupling experimentation with data assimilation into the model, whilst using the partially informed, semi-empirical model to predict from parametric sensitivity studies which experiments to run, should optimally improve the model. This approach has been demonstrated for optimal experimentation, but limited to a differential algebraic model of the process. Typically, such models for online monitoring have been limited to low dimensions. Recently it has been demonstrated that inverse methods such as data assimilation can be applied to PDE systems with algebraic constraints, a substantially more complicated parameter estimation using finite element multiphysics modelling. Parametric sensitivity can be used from such semi-empirical models to predict the optimum placement of sensors to be used to collect data that optimally informs the model for a microfluidic sensor system. This coupled optimum modelling and experiment procedure is ambitious in the scale of the modelling problem, as well as in the scale of the application - a microfluidic device. In general, microfluidic devices are sufficiently easy to fabricate, control, and monitor that they form an ideal platform for developing high-dimensional spatio-temporal models for simultaneous coupling with experimentation. As chemical microreactors already promise low raw-materials wastage through tight control of reagent contacting, improved design techniques should be able to augment optimal control systems to achieve very low resource wastage. In this paper, we discuss how the paradigm for optimal modelling and experimentation should be developed and foreshadow the exploitation of this methodology for the development of chemical microreactors and microfluidic sensors for online monitoring of chemical processes. Improvement in both of these areas promises to improve the sustainability of chemical processes through innovative technology.
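
    The role of parametric sensitivity in deciding where (or when) to measure can be sketched very simply in Python for a first-order decay model c(t) = c0*exp(-k*t): rank candidate measurement times by the magnitude of dc/dk, estimated by finite differences. The model and numbers are assumptions standing in for the paper's PDE/finite-element setting.

        import math

        def concentration(t, k, c0=1.0):
            """First-order decay model c(t) = c0 * exp(-k * t) (illustrative stand-in)."""
            return c0 * math.exp(-k * t)

        def sensitivity(t, k, dk=1e-6):
            """Finite-difference estimate of d c(t) / d k."""
            return (concentration(t, k + dk) - concentration(t, k - dk)) / (2 * dk)

        k_nominal = 0.5
        candidate_times = [0.5, 1.0, 2.0, 4.0, 8.0]
        ranked = sorted(candidate_times, key=lambda t: abs(sensitivity(t, k_nominal)), reverse=True)
        print("most informative measurement time:", ranked[0])   # t = 1/k maximises |dc/dk| here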

    Probabilistic Bisimulations for PCTL Model Checking of Interval MDPs

    Verification of PCTL properties of MDPs with convex uncertainties has been investigated recently by Puggelli et al. However, model checking algorithms typically suffer from state space explosion. In this paper, we address probabilistic bisimulation to reduce the size of such an MDP while preserving the PCTL properties it satisfies. We discuss the different interpretations of uncertainty in the models that have been studied in the literature, which result in two different definitions of bisimulation. We give algorithms to compute the quotients of these bisimulations in time polynomial in the size of the model and exponential in the uncertain branching. Finally, we show by a case study that large models in practice can have small branching and that a substantial state space reduction can be achieved by our approach. Comment: In Proceedings SynCoP 2014, arXiv:1403.784
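
    A much-simplified Python sketch of the underlying idea, for an ordinary labelled Markov chain rather than an interval MDP: partition refinement groups states that carry the same label and send the same probability mass into every current block. The states, labels and probabilities are invented; the interval/uncertainty case requires the algorithms described in the paper.

        LABEL = {"s0": "init", "s1": "work", "s2": "work", "s3": "done"}
        PROB = {
            "s0": {"s1": 0.5, "s2": 0.5},
            "s1": {"s3": 1.0},
            "s2": {"s3": 1.0},
            "s3": {"s3": 1.0},
        }

        def signature(state, partition):
            """The state's label plus the probability mass it sends into each block."""
            mass = {}
            for target, p in PROB[state].items():
                block = next(i for i, b in enumerate(partition) if target in b)
                mass[block] = mass.get(block, 0.0) + p
            return (LABEL[state], tuple(sorted(mass.items())))

        def bisimulation_quotient(states):
            partition = [set(states)]
            while True:
                refined = []
                for block in partition:
                    groups = {}
                    for s in block:
                        groups.setdefault(signature(s, partition), set()).add(s)
                    refined.extend(groups.values())
                if len(refined) == len(partition):   # no block was split: fixed point reached
                    return refined
                partition = refined

        print(bisimulation_quotient(list(PROB)))   # s1 and s2 collapse into one block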

    Self-Evaluation Applied Mathematics 2003-2008 University of Twente

    This report contains the self-study for the research assessment of the Department of Applied Mathematics (AM) of the Faculty of Electrical Engineering, Mathematics and Computer Science (EEMCS) at the University of Twente (UT). The report provides the information for the Research Assessment Committee for Applied Mathematics, dealing with mathematical sciences at the three universities of technology in the Netherlands. It describes the state of affairs pertaining to the period 1 January 2003 to 31 December 2008