
    Conformance Testing as Falsification for Cyber-Physical Systems

    In Model-Based Design of Cyber-Physical Systems (CPS), it is often desirable to develop several models of varying fidelity. Models of different fidelity levels can enable mathematical analysis, control synthesis, faster simulation, and so on. Furthermore, when transitioning (automatically or manually) from a model to its implementation on an actual computational platform, two different versions of the same system are again being developed. In all these cases, it is necessary to define a rigorous notion of conformance between different models and between models and their implementations. This paper argues that conformance should be a measure of distance between systems. Although a range of theoretical distance notions exists, no way to compute such distances for industrial-size systems and models has been proposed yet. This paper addresses exactly this problem. A universal notion of conformance as closeness between systems is rigorously defined, and evidence is presented that it implies a number of other application-dependent conformance notions. An algorithm for detecting that two systems are not conformant is then proposed, which uses existing, proven tools. A method is also proposed to measure the degree of conformance between two systems. The results are demonstrated on a range of models.
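
    The idea of conformance as a computable closeness between systems can be illustrated with a small sketch. The code below is a minimal, hypothetical illustration rather than the paper's algorithm: it assumes both models have already been simulated on the same input over a common time grid, and it reports the worst-case time-shifted output deviation, which can serve as an empirical degree of (non-)conformance.

```python
import numpy as np

def conformance_degree(traj_a, traj_b, t_grid, tau):
    """Empirical closeness sketch.  traj_a and traj_b are output traces of
    shape (N, d) sampled on the common grid t_grid.  For each sample of
    trajectory A, find the closest output of trajectory B within a time
    window of +/- tau, and return the worst-case deviation over the trace.
    A large value is a witness that the two systems are not conformant for
    the chosen tolerance."""
    worst = 0.0
    for i, t in enumerate(t_grid):
        window = np.abs(t_grid - t) <= tau                 # samples within tau of t
        dists = np.linalg.norm(traj_b[window] - traj_a[i], axis=1)
        worst = max(worst, dists.min())                     # best time-shifted match
    return worst

# Hypothetical usage, assuming both models were simulated under the same input:
# t = np.linspace(0.0, 10.0, 1001)
# eps_hat = conformance_degree(y_high_fidelity, y_low_fidelity, t, tau=0.1)
# eps_hat exceeding a tolerance epsilon would count as evidence of non-conformance.
```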

    SOTER: A Runtime Assurance Framework for Programming Safe Robotics Systems

    The recent drive towards achieving greater autonomy and intelligence in robotics has led to high levels of complexity. Autonomous robots increasingly depend on third-party off-the-shelf components and complex machine-learning techniques. This trend makes it challenging to provide strong design-time certification of correct operation. To address these challenges, we present SOTER, a robotics programming framework with two key components: (1) a programming language for implementing and testing high-level reactive robotics software and (2) an integrated runtime assurance (RTA) system that helps enable the use of uncertified components while still providing safety guarantees. SOTER provides language primitives to declaratively construct an RTA module consisting of an advanced, high-performance controller (uncertified), a safe, lower-performance controller (certified), and the desired safety specification. The framework provides a formal guarantee that a well-formed RTA module always satisfies the safety specification, without completely sacrificing performance, by using the higher-performance uncertified components whenever it is safe to do so. SOTER allows the complex robotics software stack to be constructed as a composition of RTA modules, where each uncertified component is protected by an RTA module. To demonstrate the efficacy of our framework, we consider a real-world case study of building a safe drone surveillance system. Our experiments, both in simulation and on actual drones, show that the SOTER-enabled RTA ensures the safety of the system, including when untrusted third-party components have bugs or deviate from the desired behavior.
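
    The RTA pattern described above can be sketched as a simple switching wrapper. This is a minimal illustration only, with hypothetical controller and safety-check callables; the actual SOTER framework constructs such modules from declarative language primitives and provides formal well-formedness guarantees.

```python
class RTAModule:
    """Simplex-style runtime assurance sketch: run the uncertified advanced
    controller while the state remains provably recoverable, otherwise fall
    back to the certified safe controller."""

    def __init__(self, advanced_ctrl, safe_ctrl, is_recoverable):
        self.advanced_ctrl = advanced_ctrl    # high-performance, uncertified
        self.safe_ctrl = safe_ctrl            # lower-performance, certified
        self.is_recoverable = is_recoverable  # conservative safety check
        self.using_safe = False

    def step(self, state):
        # Switch to the safe controller as soon as the advanced controller's
        # proposed action would leave the recoverable region.
        proposed = self.advanced_ctrl(state)
        if self.using_safe or not self.is_recoverable(state, proposed):
            self.using_safe = True
            return self.safe_ctrl(state)
        return proposed

# Hypothetical usage: wrap an uncertified planner with a certified hover/land
# controller and a conservative reachability-style check.
# module = RTAModule(learned_planner, certified_lander, stays_in_safe_set)
# command = module.step(current_state)
```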

    On-Line Monitoring for Temporal Logic Robustness

    In this paper, we provide a Dynamic Programming algorithm for on-line monitoring of the state robustness of Metric Temporal Logic (MTL) specifications with past-time operators. We compute the robustness of MTL specifications with unbounded past and bounded future temporal operators over sampled traces of Cyber-Physical Systems. We implemented our tool in Matlab as a Simulink block that can be used in any Simulink model. We experimentally demonstrate that the overhead of the MTL robustness monitoring is acceptable for certain classes of practical specifications.
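
    The dynamic-programming flavor of online robustness monitoring can be hinted at with a small sketch for the unbounded past-time operator "once". This is an assumption-laden illustration in Python rather than the authors' Matlab/Simulink implementation; it only shows why each new sample can be processed in constant time.

```python
import math

class OnceMonitor:
    """Online robustness sketch for the unbounded past-time operator
    "once phi": rob(t) = max(rob_phi(t), rob(t-1)).  Each new sample is
    processed in O(1), which is what keeps online monitoring cheap."""

    def __init__(self, rob_phi):
        self.rob_phi = rob_phi     # robustness of the subformula, e.g. x - c for "x > c"
        self.value = -math.inf     # robustness of "once phi" over the trace so far

    def update(self, sample):
        self.value = max(self.rob_phi(sample), self.value)
        return self.value

# Hypothetical usage on a sampled trace: monitor "once (x > 2.5)".
# mon = OnceMonitor(lambda x: x - 2.5)
# for x in [1.0, 2.0, 3.1, 0.5]:
#     print(mon.update(x))   # robustness estimates: -1.5, -0.5, ~0.6, ~0.6
```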

    Developing Methods of Obtaining Quality Failure Information from Complex Systems

    The complexity of most engineering systems is constantly growing due to ever-increasing technological advancements. This results in a corresponding need for methods that adequately account for the reliability of such systems based on failure information from the components that make them up. This dissertation presents an approach to validating qualitative function-failure results from model abstraction details. The impact of the level of detail available to a system designer during the conceptual stages of design is considered for failure-space exploration in a complex system. Specifically, the study develops an efficient approach to the detailed function and behavior modeling required for complex system analyses. In addition, a comprehensive survey and documentation of existing function-failure analysis methodologies is synthesized into identified structural groupings. Using simulations, known governing equations are evaluated for component and system models to study responses to faults by accounting for detailed failure scenarios, component behaviors, fault propagation paths, and overall system performance. The components were simulated at nominal states and at varying degrees of fault representing actual modes of operation. Information on product design and provisions on the expected working conditions of components were used in the simulations to address areas normally overlooked during installation. The results of the system model simulations were investigated using clustering analysis to develop an efficient grouping method and a measure of confidence for the obtained results. The intellectual merit of this work is the use of a simulation-based approach to studying how generated failure scenarios reveal component fault interactions, leading to a better understanding of fault propagation within design models. The information gained from using varying-fidelity models for system analysis helps identify models that are sufficient at the conceptual design stages to highlight potential faults. This will reduce resources such as cost, manpower, and time spent during system design. A broader impact of the project is to help design engineers identify critical components, quantify the risks associated with using particular components in their prototypes early in the design process, and improve fault-tolerant system designs. This research looks to eventually establish a baseline for validating and comparing theories of complex systems analysis.
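
    The simulation-plus-clustering step described above might look roughly like the following sketch. The simulate function is a hypothetical stand-in for the governing-equation models, and scikit-learn's KMeans is merely one possible clustering choice, not necessarily the one used in the dissertation.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_fault_responses(simulate, fault_modes, n_clusters=3, seed=0):
    """Sketch of simulation plus clustering: run the system model once per
    fault scenario, collect each output trace as a feature vector, and
    group scenarios with similar failure behavior together."""
    responses = np.array([simulate(mode) for mode in fault_modes])   # (scenarios, trace length)
    model = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed)
    return model.fit_predict(responses)

# Hypothetical usage: `simulate` wraps the governing-equation model and
# returns a fixed-length output trace for a given component fault mode.
# labels = cluster_fault_responses(simulate, ["nominal", "valve_stuck", "sensor_drift"])
```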

    2020 NASA Technology Taxonomy

    This document is an update (new photos used) of the PDF version of the 2020 NASA Technology Taxonomy that will be available to download on the OCT Public Website. The updated 2020 NASA Technology Taxonomy, or "technology dictionary", uses a technology-discipline-based approach that realigns like technologies independent of their application within the NASA mission portfolio. This tool is meant to serve as a common technology-discipline-based communication tool across the agency and with its partners in other government agencies, academia, industry, and across the world.

    Conformance Testing for Stochastic Cyber-Physical Systems

    Conformance is defined as a measure of distance between the behaviors of two dynamical systems. The notion of conformance can accelerate system design when models of varying fidelities are available on which analysis and control design can be done more efficiently. Ultimately, conformance can capture the distance between design models and their real implementations and thus aid in robust system design. In this paper, we are interested in the conformance of stochastic dynamical systems. We argue that probabilistic reasoning over the distribution of distances between model trajectories is a good measure for stochastic conformance. Additionally, we propose the non-conformance risk to reason about the risk of stochastic systems not being conformant. We show that both notions have the desirable transference property, meaning that conformant systems satisfy similar system specifications, i.e., if the first model satisfies a desirable specification, the second model will satisfy (nearly) the same specification. Lastly, we propose how stochastic conformance and the non-conformance risk can be estimated from data using statistical tools such as conformal prediction. We present empirical evaluations of our method on an F-16 aircraft, an autonomous vehicle, a spacecraft, and a Dubins vehicle.
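
    A rough sketch of how a conformal bound on trajectory distance could be estimated from calibration data follows. The split-conformal rank adjustment is standard, but the distance definition and variable names are assumptions for illustration, not necessarily the paper's exact construction.

```python
import numpy as np

def conformal_conformance_bound(distances, alpha=0.1):
    """Split conformal prediction sketch: given calibration distances between
    paired trajectories of two stochastic systems, return a bound C such that
    a fresh trajectory pair stays within distance C with probability at least
    1 - alpha (under exchangeability of the distances)."""
    d = np.sort(np.asarray(distances, dtype=float))
    n = d.size
    # Finite-sample-adjusted rank used by split conformal prediction.
    rank = int(np.ceil((n + 1) * (1.0 - alpha)))
    if rank > n:
        return float("inf")   # too little calibration data for this alpha
    return float(d[rank - 1])

# Hypothetical usage: distances[i] = max_t ||x_model(t, w_i) - x_impl(t, w_i)||,
# where both systems are driven by the same sampled disturbance w_i.
# bound = conformal_conformance_bound(distances, alpha=0.05)
```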