
    Counterexample Generation in Probabilistic Model Checking

    Providing evidence for the refutation of a property is an essential, if not the most important, feature of model checking. This paper considers algorithms for counterexample generation for probabilistic CTL formulae in discrete-time Markov chains. Finding the strongest evidence (i.e., the most probable path) violating a (bounded) until-formula is shown to be reducible to a single-source (hop-constrained) shortest path problem. Counterexamples of smallest size that deviate most from the required probability bound can be obtained by applying (small amendments to) k-shortest (hop-constrained) paths algorithms. These results can be extended to Markov chains with rewards, to LTL model checking, and are useful for Markov decision processes. Experimental results show that the size of a counterexample is typically excessive. To obtain much more compact representations, we present a simple algorithm to generate (minimal) regular expressions that can act as counterexamples. The feasibility of our approach is illustrated by means of two communication protocols: leader election in an anonymous ring network and the Crowds protocol.
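
    The shortest-path reduction is straightforward to make concrete: maximising the probability of a path is equivalent to minimising the sum of the negative log-probabilities of its transitions. The sketch below is an illustration only, not the paper's implementation; the dictionary-based DTMC encoding and the function name are assumptions. It finds the strongest evidence for reaching a set of target states with Dijkstra's algorithm; the k-shortest-paths and hop-constrained variants mentioned above build on the same weight transformation.

```python
import heapq
import math

def strongest_evidence(transitions, source, targets):
    """Most probable path from `source` to any state in `targets` of a DTMC.

    `transitions[s]` is a dict {successor: probability}. Maximising the path
    probability prod(p_i) is the same as minimising sum(-log p_i), so a plain
    Dijkstra search over the weights -log(p) yields the strongest evidence.
    """
    dist = {source: 0.0}          # accumulated -log probability
    pred = {source: None}
    heap = [(0.0, source)]
    while heap:
        d, s = heapq.heappop(heap)
        if s in targets:
            # reconstruct the path and its probability
            path = []
            while s is not None:
                path.append(s)
                s = pred[s]
            return list(reversed(path)), math.exp(-d)
        if d > dist.get(s, float("inf")):
            continue
        for t, p in transitions.get(s, {}).items():
            nd = d - math.log(p)
            if nd < dist.get(t, float("inf")):
                dist[t] = nd
                pred[t] = s
                heapq.heappush(heap, (nd, t))
    return None, 0.0
```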

    Scalable Synthesis and Verification: Towards Reliable Autonomy

    We have seen the growing deployment of autonomous systems in our daily life, ranging from safety-critical self-driving cars to dialogue agents. While impactful and impressive, these systems often do not come with guarantees and are not rigorously evaluated for failure cases. This is in part due to the limited scalability of the tools available for designing correct-by-construction systems or verifying them post hoc. Another key limitation is the lack of models for the complex environments with which autonomous systems often have to interact. To overcome these bottlenecks to designing reliable autonomous systems, this thesis makes contributions along three fronts. First, we develop an approach for parallelized synthesis from linear-time temporal logic specifications corresponding to the generalized reactivity (1) fragment. We begin by identifying a special case corresponding to singleton liveness goals that allows for a decomposition of the synthesis problem, which facilitates parallelized synthesis. Based on the intuition from this special case, we propose a more general approach for parallelized synthesis that relies on identifying equicontrollable states. Second, we consider learning-based approaches to enable verification at scale for complex systems and for autonomous systems that interact with black-box environments. For the former, we propose a new abstraction refinement procedure based on machine learning to improve the performance of nonlinear constraint solving algorithms on large-scale problems. For the latter, we present a data-driven approach based on chance-constrained optimization that allows a system to be evaluated for specification conformance without an accurate model of the environment. We demonstrate this approach on several tasks, including a lane-change scenario with real-world driving data. Lastly, we consider the problem of interpreting and verifying learning-based components such as neural networks. We introduce a new method based on Craig interpolants for computing compact symbolic abstractions of pre-images for neural networks. Our approach relies on iteratively computing approximations that provably overapproximate and underapproximate the pre-images at all layers. Further, building on existing work on training neural networks for verifiability in the classification setting, we propose extensions that generalize the approach to more general architectures and temporal specifications.
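
    As a generic illustration of the data-driven evaluation idea (not the chance-constrained formulation developed in the thesis), the following sketch estimates specification conformance purely from sampled closed-loop traces and reports a Hoeffding confidence interval. The trace sampler, the specification checker, and all names are hypothetical.

```python
import math
import random

def estimate_conformance(sample_trace, satisfies, n=10_000, delta=1e-3):
    """Estimate the probability that sampled closed-loop traces satisfy a spec.

    `sample_trace()` draws one trace of the system interacting with the
    (black-box) environment; `satisfies(trace)` checks the specification.
    Returns the empirical satisfaction rate and a two-sided Hoeffding bound
    that holds with probability at least 1 - delta.
    """
    hits = sum(satisfies(sample_trace()) for _ in range(n))
    p_hat = hits / n
    eps = math.sqrt(math.log(2 / delta) / (2 * n))   # Hoeffding radius
    return p_hat, (max(0.0, p_hat - eps), min(1.0, p_hat + eps))

# hypothetical usage: a coin-flip "environment" and a trivial safety check
if __name__ == "__main__":
    trace = lambda: [random.random() < 0.9 for _ in range(10)]
    spec = lambda tr: all(tr)          # "always safe" on a 10-step trace
    print(estimate_conformance(trace, spec))
```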

    IST Austria Thesis

    This dissertation concerns the automatic verification of probabilistic systems and programs with arrays by statistical and logical methods. Although statistical and logical methods are different in nature, we show that they can be successfully combined for system analysis. In the first part of the dissertation, we present a new statistical algorithm for the verification of probabilistic systems with respect to unbounded properties, including linear temporal logic. Our algorithm often performs faster than previous approaches and at the same time requires less information about the system. In addition, our method can be generalized to unbounded quantitative properties such as mean-payoff bounds. In the second part, we introduce two techniques for comparing probabilistic systems. Probabilistic systems are typically compared using the notion of equivalence, which requires the systems to assign equal probabilities to all behaviors. However, this notion is often too strict, since probabilities are typically only empirically estimated, and any imprecision may break the relation between processes. On the one hand, we propose to replace the Boolean notion of equivalence by a quantitative distance of similarity. For this purpose, we introduce a statistical framework for estimating distances between Markov chains based on their simulation runs, and we investigate which distances can be approximated in our framework. On the other hand, we propose to compare systems with respect to a new qualitative logic, which expresses that behaviors occur with probability one or with positive probability. This qualitative analysis is robust with respect to modeling errors and applicable to many domains. In the last part, we present a new quantifier-free logic for integer arrays, which allows us to express counting. Counting properties are prevalent in array-manipulating programs; however, they cannot be expressed in the quantified fragments of the theory of arrays. We present a decision procedure for our logic and provide several complexity results.
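
    To make the idea of estimating distances from simulation runs concrete, the sketch below computes an empirical total-variation-style distance between the trace distributions that two labelled Markov chains induce over a fixed prefix length. This is a generic illustration under an assumed encoding, not the specific distances or estimators developed in the dissertation.

```python
from collections import Counter
import random

def simulate(chain, start, length):
    """Sample one trace of `length` labels from a labelled Markov chain.

    `chain[s]` is a list of (probability, successor, label) triples whose
    probabilities sum to one.
    """
    s, trace = start, []
    for _ in range(length):
        r, acc = random.random(), 0.0
        for p, t, lab in chain[s]:
            acc += p
            if r <= acc:
                s = t
                trace.append(lab)
                break
    return tuple(trace)

def empirical_distance(chain_a, chain_b, start, length=5, runs=20_000):
    """Empirical total-variation distance between the distributions the two
    chains induce over traces of a fixed length, estimated from samples."""
    ca = Counter(simulate(chain_a, start, length) for _ in range(runs))
    cb = Counter(simulate(chain_b, start, length) for _ in range(runs))
    support = set(ca) | set(cb)
    return 0.5 * sum(abs(ca[t] / runs - cb[t] / runs) for t in support)
```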

    Explicit Model Checking of Very Large MDP using Partitioning and Secondary Storage

    The applicability of model checking is hindered by the state space explosion problem in combination with limited amounts of main memory. To extend its reach, the large available capacities of secondary storage such as hard disks can be exploited. Due to the specific performance characteristics of secondary storage technologies, specialised algorithms are required. In this paper, we present a technique to use secondary storage for probabilistic model checking of Markov decision processes. It combines state space exploration based on partitioning with a block-iterative variant of value iteration over the same partitions for the analysis of probabilistic reachability and expected-reward properties. A sparse matrix-like representation is used to store partitions on secondary storage in a compact format. All file accesses are sequential, and compression can be used without affecting runtime. The technique has been implemented within the Modest Toolset. We evaluate its performance on several benchmark models of up to 3.5 billion states. In the analysis of time-bounded properties on real-time models, our method neutralises the state space explosion induced by the time bound in its entirety. (The final publication is available at Springer via http://dx.doi.org/10.1007/978-3-319-24953-7_1).
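
    For reference, the core numerical step is ordinary value iteration for maximum reachability probabilities in an MDP. The sketch below keeps the whole value vector in main memory and uses an assumed adjacency-list encoding; the paper's contribution is to run this computation block-iteratively over partitions residing on secondary storage, which is not reproduced here.

```python
def max_reachability(mdp, targets, epsilon=1e-6):
    """Value iteration for the maximum probability of reaching `targets`.

    `mdp[s]` is a list of actions, each a list of (probability, successor)
    pairs; every state appears as a key and every non-target state has at
    least one action. Iterates until the largest update falls below epsilon.
    """
    values = {s: (1.0 if s in targets else 0.0) for s in mdp}
    while True:
        delta = 0.0
        for s in mdp:
            if s in targets:
                continue
            best = max(sum(p * values[t] for p, t in action) for action in mdp[s])
            delta = max(delta, abs(best - values[s]))
            values[s] = best
        if delta < epsilon:
            return values
```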

    Synthesizing stream control

    For the management of reactive systems, controllers must coordinate time, data streams, and data transformations, all joined by the high-level perspective of their control flow. This control flow is required to drive the system correctly and continuously, which turns the development into a challenge. The process is error-prone, time-consuming, unintuitive, and costly. An attractive alternative is to synthesize the system instead, where the developer only needs to specify the desired behavior. The synthesis engine then automatically takes care of all the technical details. However, while current algorithms for the synthesis of reactive systems are well suited to handle control, they fail on complex data transformations due to the complexity of the comparably large data space. Thus, to overcome the challenge of explicitly handling the data, we must separate data and control. We introduce Temporal Stream Logic (TSL), a logic which exclusively argues about the control of the controller, while treating data and functional transformations as interchangeable black boxes. In TSL it is possible to specify control flow properties independently of the complexity of the handled data. Furthermore, with TSL at hand, a synthesis engine can check for realizability even without a concrete implementation of the data transformations. We present a modular development framework that first uses synthesis to identify the high-level control flow of a program. If successful, the resulting control flow is then extended with concrete data transformations in order to be compiled into a final executable. Our results also show that current synthesis approaches cannot immediately replace existing manual development workflows. During the development of a reactive system, the developer may still start with incomplete or faulty specifications, which need to be refined after a subsequent inspection. In the worst case, constraints are contradictory or miss important assumptions, which leads to unrealizable specifications. In both scenarios, the developer needs additional feedback from the synthesis engine to debug errors and ultimately improve the system specification. To this end, we explore two further possible improvements. On the one hand, we consider output-sensitive synthesis metrics, which allow us to synthesize simple and well-structured solutions that help the developer understand and verify the underlying behavior quickly. On the other hand, we consider the extension with delay, the need for which is a frequent reason for unrealizability. With both methods at hand, we resolve the aforementioned problems and thereby help the developer effectively create a safe and correct reactive system.
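
    The control/data separation can be illustrated with a small sketch: a control skeleton (standing in for the output of synthesis) that only decides when to apply black-box data transformations, whose concrete implementations are plugged in afterwards. The example is purely hypothetical and uses neither TSL syntax nor the thesis's toolchain.

```python
# Toy illustration of the control/data separation behind TSL-style synthesis:
# the "synthesized" control skeleton below only decides *when* to apply the
# black-box transformations; their concrete implementations are supplied later.
# All names (smooth, exceeds, alarm) are hypothetical examples.

def make_controller(smooth, exceeds, alarm):
    """Control skeleton parameterised over uninterpreted functions/predicates."""
    state = {"level": None}

    def step(reading):
        # control flow only: apply `smooth`, branch on `exceeds`, emit `alarm`
        state["level"] = reading if state["level"] is None else smooth(state["level"], reading)
        return alarm() if exceeds(state["level"]) else None

    return step

# concrete data transformations, provided after the control flow is fixed
controller = make_controller(
    smooth=lambda old, new: 0.8 * old + 0.2 * new,
    exceeds=lambda level: level > 100.0,
    alarm=lambda: "ALARM",
)
print([controller(r) for r in (90, 105, 130)])   # [None, None, 'ALARM']
```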

    Computing Bounds for Counter Automata

    Qualitative formal verification, which seeks Boolean answers about the behavior of a system, is often insufficient for practical purposes. Observing quantitative information is of greater interest, e.g. for the calibration of a battery or a real-time scheduler. Historically, the focus has been on quantities in the continuous domain, but recent years have shown a renewed interest in discrete quantitative domains. Counter Automata (CA) are a quantitative extension of classical omega-automata. Recently, a nice theory has been developed for them that extends the qualitative setting, with counterparts in terms of logics, automata and algebraic structure. We propose an adaptation of this formalism, with plenty of practical applications, to express properties over discrete quantitative domains. The behavior of a Counter Automaton defines a function from infinite words to integers. Finding the bounds of such a function over a given set of words can be seen as an extension of qualitative universal and existential model checking. Although the problem of determining whether such bounds are finite has already been addressed, efficient algorithms to compute their exact values are still lacking. We propose a non-naive method for the computation of the exact values of these bounds. It relies on a generalization of the emptiness problem of omega-automata. To solve this generalized emptiness problem, we propose an algorithm that extends emptiness check algorithms based on SCC enumeration.
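
    For background, the classical SCC-based emptiness check for Büchi automata that the proposed algorithm generalises can be sketched as follows: the language is non-empty iff some SCC reachable from the initial state contains an accepting state and supports a cycle. The graph encoding and function name below are assumptions; the generalized emptiness check for computing exact bounds of counter automata is not reproduced.

```python
def buchi_nonempty(graph, initial, accepting):
    """Classical SCC-based emptiness check for a Büchi automaton.

    `graph[s]` lists successors of state s; `accepting` is a set of states.
    Runs Tarjan's SCC algorithm from `initial` and reports whether some
    reachable SCC both supports a cycle and contains an accepting state.
    """
    index, low, on_stack, stack = {}, {}, set(), []
    counter = [0]
    found = [False]

    def strongconnect(v):
        index[v] = low[v] = counter[0]
        counter[0] += 1
        stack.append(v)
        on_stack.add(v)
        for w in graph.get(v, []):
            if w not in index:
                strongconnect(w)
                low[v] = min(low[v], low[w])
            elif w in on_stack:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:                      # v is the root of an SCC
            scc = set()
            while True:
                w = stack.pop()
                on_stack.discard(w)
                scc.add(w)
                if w == v:
                    break
            has_cycle = len(scc) > 1 or v in graph.get(v, [])
            if has_cycle and scc & accepting:
                found[0] = True

    strongconnect(initial)
    return found[0]
```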