
    Quantitative Verification: Formal Guarantees for Timeliness, Reliability and Performance

    Computerised systems appear in almost all aspects of our daily lives, often in safety-critical scenarios such as embedded control systems in cars and aircraft or medical devices such as pacemakers and sensors. We are thus increasingly reliant on these systems working correctly, despite often operating in unpredictable or unreliable environments. Designers of such devices need ways to guarantee that they will operate in a reliable and efficient manner. Quantitative verification is a technique for analysing quantitative aspects of a system's design, such as timeliness, reliability or performance. It applies formal methods, based on a rigorous analysis of a mathematical model of the system, to automatically prove certain precisely specified properties, e.g. "the airbag will always deploy within 20 milliseconds after a crash" or "the probability of both sensors failing simultaneously is less than 0.001". The ability to formally guarantee quantitative properties of this kind is beneficial across a wide range of application domains. For example, in safety-critical systems, it may be essential to establish credible bounds on the probability with which certain failures or combinations of failures can occur. In embedded control systems, it is often important to comply with strict constraints on timing or resources. More generally, being able to derive guarantees on precisely specified levels of performance or efficiency is a valuable tool in the design of, for example, wireless networking protocols, robotic systems or power management algorithms, to name but a few. This report gives a short introduction to quantitative verification, focusing in particular on a widely used technique called model checking, and its generalisation to the analysis of quantitative aspects of a system such as timing, probabilistic behaviour or resource usage. The intended audience is industrial designers and developers of systems such as those highlighted above who could benefit from the application of quantitative verification, but lack expertise in formal verification or modelling.
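
    As a rough illustration of the kind of property mentioned above, the sketch below checks a bounded-reachability property on a small, hand-built discrete-time Markov chain. It is not taken from the report: the per-step failure probability, the time horizon and the 0.001 threshold are all assumed values chosen for the example.

    /** Minimal sketch of quantitative model checking: bounded probabilistic
     *  reachability on a hand-built discrete-time Markov chain. All numbers
     *  (failure probability, horizon, threshold) are illustrative assumptions. */
    public class SensorFailureCheck {

        public static void main(String[] args) {
            double p = 0.01;          // assumed per-step failure probability of one sensor
            int horizon = 10;         // assumed number of time steps to check
            double threshold = 0.001; // property: P(reach "both failed" within horizon) < threshold

            // States: 0 = both sensors OK, 1 = one failed, 2 = both failed (absorbing).
            double[][] P = {
                { (1 - p) * (1 - p), 2 * p * (1 - p), p * p },
                { 0.0,               1 - p,           p     },
                { 0.0,               0.0,             1.0   }
            };

            // x[s] = probability of having reached state 2 within k steps when starting in s.
            double[] x = { 0.0, 0.0, 1.0 };
            for (int k = 0; k < horizon; k++) {
                double[] next = new double[3];
                for (int s = 0; s < 3; s++)
                    for (int t = 0; t < 3; t++)
                        next[s] += P[s][t] * x[t];
                next[2] = 1.0; // the target state, once reached, stays reached
                x = next;
            }

            System.out.printf("P(both sensors fail within %d steps) = %.6f%n", horizon, x[0]);
            System.out.println("Property satisfied: " + (x[0] < threshold));
        }
    }

    Dedicated probabilistic model checkers automate exactly this kind of computation for much larger models and richer property languages; the point here is only to show what "computing a probability and comparing it to a bound" amounts to.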

    Developing a distributed electronic health-record store for India

    The DIGHT project is addressing the problem of building a scalable and highly available information store for the Electronic Health Records (EHRs) of more than one billion citizens of India.

    Building Blocks for Control System Software

    Software implementation of control laws for industrial systems seems straightforward, but it is not. The computer code stemming from the control laws typically makes up no more than 10 to 30% of the total. A building-block approach for embedded control system development is advocated to enable a fast and efficient software design process. We have developed the CTJ library, Communicating Threads for Java, resulting in fundamental elements for creating building blocks that implement communication using channels. Because the building blocks can be simulated, our building-block method is suitable for a concurrent engineering design approach. Furthermore, via a stepwise refinement process, using verification by simulation, the implementation trajectory can be carried out efficiently.
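
    To illustrate the channel-based building-block idea, the sketch below wires a "controller" block to an "actuator" block over a rendezvous channel built from standard java.util.concurrent primitives. It does not use the CTJ library itself; the block names and values are invented for the example.

    import java.util.concurrent.SynchronousQueue;

    /** Minimal sketch of channel-based building blocks using plain Java threads;
     *  the real CTJ library provides its own channel and process abstractions. */
    public class ChannelBlocksDemo {

        public static void main(String[] args) throws InterruptedException {
            // A SynchronousQueue behaves like a rendezvous channel:
            // the writer blocks until the reader takes the value.
            SynchronousQueue<Double> setpointChannel = new SynchronousQueue<>();

            // "Controller" building block: produces a few setpoint values.
            Thread controller = new Thread(() -> {
                try {
                    for (int i = 0; i < 5; i++) {
                        setpointChannel.put(i * 0.1); // blocks until the actuator reads
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });

            // "Actuator" building block: consumes setpoints from the channel.
            Thread actuator = new Thread(() -> {
                try {
                    for (int i = 0; i < 5; i++) {
                        double setpoint = setpointChannel.take();
                        System.out.println("actuator received setpoint " + setpoint);
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });

            controller.start();
            actuator.start();
            controller.join();
            actuator.join();
        }
    }

    Because the blocks interact only through the channel, each block can be replaced by a simulation stub or refined independently, which is the property the building-block approach relies on.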

    Model-based dependability analysis: state-of-the-art, challenges and future outlook

    Over the past two decades, the study of model-based dependability analysis has gathered significant research interest. Different approaches have been developed to automate and address various limitations of classical dependability techniques to contend with the increasing complexity and challenges of modern safety-critical systems. Two leading paradigms have emerged: one constructs predictive system failure models from component failure models compositionally, using the topology of the system; the other utilizes design models - typically state automata - to explore system behaviour through fault injection. This paper reviews a number of prominent techniques under these two paradigms, and provides an insight into their working mechanisms, applicability, strengths and challenges, as well as recent developments within these fields. We also discuss the emerging trends in integrated approaches and advanced analysis capabilities. Lastly, we outline the future outlook for model-based dependability analysis.
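
    As a rough sketch of the first (compositional) paradigm, the example below combines assumed, independent component failure probabilities through a tiny fault-tree-like structure; the component names and numbers are illustrative, not drawn from the paper.

    /** Minimal sketch of compositional failure analysis: component failure
     *  probabilities are combined through AND/OR gates, assuming independence.
     *  Component names and probabilities are made up for the example. */
    public class FaultTreeSketch {

        // OR gate: probability that at least one input event occurs.
        static double or(double... p) {
            double noneOccur = 1.0;
            for (double pi : p) noneOccur *= (1.0 - pi);
            return 1.0 - noneOccur;
        }

        // AND gate: probability that all input events occur.
        static double and(double... p) {
            double all = 1.0;
            for (double pi : p) all *= pi;
            return all;
        }

        public static void main(String[] args) {
            double sensorA = 1e-3, sensorB = 1e-3, controller = 1e-4, actuator = 5e-4;

            // Assumed system structure: failure of the controller, failure of the
            // actuator, or failure of both redundant sensors leads to system failure.
            double topEvent = or(controller, actuator, and(sensorA, sensorB));

            System.out.printf("System failure probability = %.6e%n", topEvent);
        }
    }

    Compositional tools derive such a failure structure automatically from the system topology and component annotations; the second paradigm instead injects faults into behavioural models such as state automata and explores the resulting behaviour.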

    The Construction of Verification Models for Embedded Systems

    The usefulness of verification hinges on the quality of the verification model. Verification is useful if it increases our confidence that an artefact behaves as expected. As modelling inherently contains non-formal elements, the quality of models cannot be captured by purely formal means. Still, we argue that modelling is not an act of irrationality and unpredictable genius, but follows rational arguments that often remain implicit. In this paper we try to identify the tacit rationalism in model construction as performed by most people doing modelling for verification. By explicating the different phases, arguments, and design decisions in the model construction, we try to develop guidelines that help to improve the process of model construction and the quality of models.

    Obtaining Formal Models through Non-Monotonic Refinement

    When designing a model for formal verification, we want to be certain that what we proved about the model also holds for the system we modelled. This raises the question of whether our model represents the system, and what makes us confident about this. By performing so-called non-monotonic refinement in the modelling process, we make the steps and decisions explicit. This helps us to (1) increase the confidence that the model represents the system, (2) structure and organize the communication with domain experts and the problem owner, and (3) identify rational steps made while modelling. We focus on embedded control systems.
    • …