682 research outputs found

    Falsification of Cyber-Physical Systems with Robustness-Guided Black-Box Checking

    Full text link
    For exhaustive formal verification, industrial-scale cyber-physical systems (CPSs) are often too large and complex, so lightweight alternatives (e.g., monitoring and testing) have attracted the attention of both industrial practitioners and academic researchers. Falsification is a popular testing method for CPSs that utilizes stochastic optimization. In state-of-the-art falsification methods, the results of previous falsification trials are discarded, and each trial starts without any prior knowledge. To concisely memorize such prior information about the CPS model and exploit it, we employ black-box checking (BBC), a combination of automata learning and model checking. Moreover, we enhance BBC using the robust semantics of STL formulas, an essential gadget in falsification. Our experimental results suggest that our robustness-guided BBC outperforms a state-of-the-art falsification tool. Comment: Accepted to HSCC 2020
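As an illustration of the robust semantics mentioned above, here is a minimal Python sketch (the helper names are hypothetical, and this is not the paper's implementation) that evaluates the robustness of an "always" specification over a sampled trace:

```python
def rob_pred(trace, f):
    """Pointwise robustness of the atomic predicate f(x) >= 0."""
    return [f(x) for x in trace]

def rob_always(rho):
    """'Always phi': the worst-case margin over the whole trace."""
    return min(rho)

# Trace of a scalar signal x(t); specification: always |x| <= 2.
trace = [0.5, 1.25, -1.75, 0.75]
rho = rob_always(rob_pred(trace, lambda x: 2.0 - abs(x)))
print(rho)  # 0.25: positive, so the trace satisfies the spec with margin 0.25
```

A positive value means the specification holds with that margin; a negative value witnesses a violation, which is exactly what falsification tries to drive the robustness toward.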

    Conformance Testing as Falsification for Cyber-Physical Systems

    Full text link
    In Model-Based Design of Cyber-Physical Systems (CPS), it is often desirable to develop several models of varying fidelity. Models of different fidelity levels can enable mathematical analysis of the model, control synthesis, faster simulation, etc. Furthermore, when (automatically or manually) transitioning from a model to its implementation on an actual computational platform, two different versions of the same system are again being developed. In all the previous cases, it is necessary to define a rigorous notion of conformance between different models and between models and their implementations. This paper argues that conformance should be a measure of distance between systems. Although a range of theoretical distance notions exists, a way to compute such distances for industrial-size systems and models has not been proposed yet. This paper addresses exactly this problem. A universal notion of conformance as closeness between systems is rigorously defined, and evidence is presented that it implies a number of other application-dependent conformance notions. An algorithm for detecting that two systems are not conformant is then proposed, which uses existing proven tools. A method is also proposed to measure the degree of conformance between two systems. The results are demonstrated on a range of models.
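The idea of conformance as a computable distance between systems can be illustrated with a toy example (my own construction, not the paper's algorithm): two models of the same plant, an exact solution and a forward-Euler simulation of dx/dt = -x, compared in the sup norm over equally sampled traces:

```python
import math

def exact(x0, dt, n):
    """Exact sampled solution of dx/dt = -x."""
    return [x0 * math.exp(-k * dt) for k in range(n)]

def euler(x0, dt, n):
    """Lower-fidelity model: forward-Euler simulation of the same system."""
    xs, x = [], x0
    for _ in range(n):
        xs.append(x)
        x += dt * (-x)
    return xs

def trace_distance(a, b):
    """Sup-norm distance between two equally sampled traces."""
    return max(abs(p - q) for p, q in zip(a, b))

d = trace_distance(exact(1.0, 0.1, 50), euler(1.0, 0.1, 50))
print(d)  # small but nonzero discretization gap between the two models
```

A conformance notion of this kind lets the designer quantify how far a lower-fidelity model (or an implementation) may deviate from the reference model.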

    On Optimization-Based Falsification of Cyber-Physical Systems

    Get PDF
    In what is commonly referred to as cyber-physical systems (CPSs), computational and physical resources are closely interconnected. An example is the closed-loop behavior of perception, planning, and control algorithms executing on a computer and interacting with a physical environment. Many CPSs are safety-critical, and it is thus important to guarantee that they behave according to given specifications that define the correct behavior. CPS models typically include differential equations, state machines, and code written in general-purpose programming languages. This heterogeneity makes it generally infeasible to use analytical methods to evaluate the system's correctness. Instead, model-based testing of a simulation of the system is more viable. Optimization-based falsification is an approach that uses a simulation model to automatically check for the existence of input signals that make the CPS violate given specifications. Quantitative semantics estimate how far the specification is from being violated for a given scenario. The decision variables in the optimization problems are parameters that determine the type and shape of the generated input signals. This thesis contributes to increasing the efficiency of optimization-based falsification in four ways: (i) a method for using multiple quantitative semantics during optimization-based falsification; (ii) a direct search approach, called line-search falsification, that prioritizes extreme values, which are known to often falsify specifications, and has a good balance between exploration and exploitation of the parameter space; (iii) an adaptation of Bayesian optimization that allows for injecting prior knowledge and uses a special acquisition function aimed at finding falsifying points rather than the global minimum; and (iv) an investigation of different input-signal parameterizations and their coverage of the parameter space and the time and frequency domains.
The proposed methods have been implemented and evaluated on standard falsification benchmark problems. Based on these empirical studies, we show the efficiency of the proposed methods. Taken together, they are important contributions to the falsification of CPSs and enable a more efficient falsification process.
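The optimization-based falsification loop described above can be sketched as follows (a toy model and specification of my choosing; real tools use far more sophisticated optimizers than plain random search):

```python
import random

def simulate(amp, dt=0.1, n=100):
    """Forward-Euler simulation of dx/dt = -x + u with step input u = amp."""
    xs, x = [], 0.0
    for _ in range(n):
        xs.append(x)
        x += dt * (-x + amp)
    return xs

def robustness(trace, bound=1.5):
    """Quantitative semantics of 'always x <= bound': min margin over the trace."""
    return min(bound - x for x in trace)

random.seed(0)
best = float("inf")
for _ in range(200):                    # simulation budget
    amp = random.uniform(0.0, 2.0)      # decision variable: input amplitude
    best = min(best, robustness(simulate(amp)))
    if best < 0:                        # falsifying input signal found
        break
print(best < 0)
```

The optimizer treats the robustness value as the objective: any input whose trace yields a negative value is a counterexample to the specification.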

    Combining Machine Learning and Formal Methods for Complex Systems Design

    Get PDF
    During the last 20 years, model-based design has become a standard practice in many fields such as automotive and aerospace engineering and systems and synthetic biology. This approach allows a considerable improvement of the final product's quality and reduces the overall prototyping costs. In these contexts, formal methods, such as temporal logics, and model checking approaches have been successfully applied. They allow a precise description and automatic verification of the prototype's requirements. In recent years, the market demand for better-performing and safer devices has grown unstoppably, inevitably leading to the creation of more and more complicated devices. The rise of cyber-physical systems, which are on their way to becoming massively pervasive, brings the complexity to the next level and opens many new challenges. First, the descriptive power of standard temporal logics is no longer sufficient to handle all the kinds of requirements designers need (consider, for example, non-functional requirements). Second, standard model checking techniques are unable to manage such a level of complexity (consider the well-known curse of state-space explosion). In this thesis, we leverage machine learning techniques, active learning, and optimization approaches to face the challenges mentioned above. In particular, we define signal measure logic, a novel temporal logic suited to describing non-functional requirements. We also use evolutionary algorithms and signal temporal logic to tackle a supervised classification problem and a system design problem involving multiple conflicting requirements (i.e., multi-objective optimization problems).
Finally, we use an active learning approach, based on Gaussian processes, to deal with falsification problems in the automotive field and to solve a so-called threshold synthesis problem, discussing an epidemics case study.
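The Gaussian-process-based active learning idea can be sketched as a Bayesian-optimization-style loop (a generic illustration with a hand-rolled GP and a toy robustness function, not the thesis implementation):

```python
import math
import random

def robustness(a):
    """Toy robustness objective: negative (falsified) only near a = 1.7."""
    return 0.3 - math.exp(-20.0 * (a - 1.7) ** 2)

def rbf(a, b, ls=0.3):
    """Squared-exponential kernel with unit signal variance."""
    return math.exp(-((a - b) ** 2) / (2.0 * ls * ls))

def solve(K, b):
    """Solve K x = b by Gaussian elimination (K is tiny and well conditioned)."""
    n = len(b)
    A = [row[:] + [bi] for row, bi in zip(K, b)]
    for i in range(n):
        for j in range(i + 1, n):
            f = A[j][i] / A[i][i]
            for k in range(i, n + 1):
                A[j][k] -= f * A[i][k]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (A[i][n] - sum(A[i][j] * x[j] for j in range(i + 1, n))) / A[i][i]
    return x

def gp_posterior(xs, ys, xq, noise=1e-6):
    """GP posterior mean and variance at query point xq (zero prior mean)."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j]) + (noise if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    ks = [rbf(x, xq) for x in xs]
    mean = sum(k * a for k, a in zip(ks, solve(K, ys)))
    var = max(1e-12, 1.0 + noise - sum(k * v for k, v in zip(ks, solve(K, ks))))
    return mean, var

def lcb(q):
    """Lower confidence bound: favour low predicted robustness + uncertainty."""
    m, v = gp_posterior(xs, ys, q)
    return m - 2.0 * math.sqrt(v)

random.seed(1)
xs = [random.uniform(0.0, 2.0) for _ in range(3)]   # initial random design
ys = [robustness(x) for x in xs]
grid = [i * 0.02 for i in range(101)]               # candidate parameter values

for _ in range(15):                                 # active-learning budget
    x_next = min(grid, key=lcb)                     # acquisition step
    xs.append(x_next)
    ys.append(robustness(x_next))
    if min(ys) < 0:                                 # falsifying value found
        break

print(min(ys) < 0)
```

The surrogate lets each expensive simulation be chosen where a violation is most likely, which is the core appeal of active learning for falsification.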