
    On Minimal Cut Sets Representation with Binary Decision Diagrams

    Since their introduction as a canonical representation of logical functions, Binary Decision Diagrams (BDDs) have gained wide acceptance in numerous industrial applications. This paper summarizes the properties of the BDD representation of the Minimal Cut Sets (MCS) of Fault Tree (FT) models most typically encountered in nuclear power engineering. Cut sets are defined as paths from the top BDD node to the terminal nodes of the BDD, on which quantitative and qualitative FT analysis (FTA) is performed. The core of FTA on BDDs rests on two fundamental algorithms, one for conditional probability evaluation and another for the selection of cut sets. Accurate conditional probability evaluation is essential for unbiased quantitative analysis, such as computing the top event probability or determining event importance measures. The cut set selection algorithm is presented in a generic version that expresses its selection criteria as logical predicates. The efficiency of both algorithms depends only on the number of BDD nodes used for the FT representation. To appraise the compactness of the BDD representation of FT models, its characteristics are evaluated here on several real-life models from the Nuclear Power Plant Krško. It is this compactness that makes advanced dynamic (i.e. what-if) analysis of FT models feasible, an efficiency recognized by commercial vendors, who are upgrading their FT tools with BDD-based algorithms.
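
    As a concrete illustration of the conditional probability evaluation mentioned above, the sketch below evaluates the top event probability of a small BDD by Shannon expansion with memoization. The node layout, event names and probabilities are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical sketch: top event probability of a BDD-encoded fault tree by
# Shannon expansion, P(node) = p * P(high) + (1 - p) * P(low), memoized per node.
from dataclasses import dataclass
from typing import Union

@dataclass(frozen=True)
class Node:
    var: str                      # basic event tested at this node
    low: Union["Node", bool]      # child followed when the event does not occur
    high: Union["Node", bool]     # child followed when the event occurs

def top_event_probability(node, prob, cache=None):
    """Evaluate P(top event) given independent basic-event probabilities."""
    if cache is None:
        cache = {}
    if isinstance(node, bool):    # terminal: True = system failure, False = success
        return 1.0 if node else 0.0
    if id(node) in cache:
        return cache[id(node)]
    p = prob[node.var]
    result = (p * top_event_probability(node.high, prob, cache)
              + (1 - p) * top_event_probability(node.low, prob, cache))
    cache[id(node)] = result
    return result

# Example: TOP = A AND (B OR C), variable order A < B < C.
c = Node("C", False, True)
b = Node("B", c, True)
top = Node("A", False, b)
print(top_event_probability(top, {"A": 0.1, "B": 0.2, "C": 0.3}))  # 0.044
```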

    Selected Topics in Network Optimization: Aligning Binary Decision Diagrams for a Facility Location Problem and a Search Method for Dynamic Shortest Path Interdiction

    This work deals with three different combinatorial optimization problems: minimizing the total size of a pair of binary decision diagrams (BDDs) under a certain structural property, a variant of the facility location problem, and a dynamic version of the Shortest-Path Interdiction (DSPI) problem. These problems share a core idea: each stems from representing an optimization problem as a decision diagram. We begin with cases in which a diagram representation of reasonable size may exist but is difficult to find. For the first problem, we develop a heuristic for enforcing a structural property on a collection of BDDs that allows them to be merged efficiently into a single diagram. In the second problem, we consider a specific combinatorial problem that admits a natural representation by a pair of BDDs. We use the previous result and ideas developed earlier in the literature to reformulate this problem as a linear program over a single BDD. This approach yields sensitivity information, while often enjoying runtimes comparable to a mixed integer program solved with a commercial solver once the computational overhead of building the diagram has been paid (e.g., when re-solving the problem with different costs but the same graph topology). In the last part, we examine the DSPI, for which building the full decision diagram is generally impractical. We formalize the concept of a game tree for the DSPI and design a heuristic that builds only selected parts of this exponentially sized decision diagram (which is no longer binary). We use a Monte Carlo Tree Search framework to establish near-optimal policies. To mitigate the size of the game tree, we leverage previously derived bounds for the DSPI and employ alpha-beta pruning for the minimax optimization; a generic sketch of this pruning step is given below. We highlight the practicality of these ideas in a series of numerical experiments.
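
    The following is a textbook alpha-beta routine of the kind referred to above; the children and value callbacks are placeholders supplied by the caller, and nothing here is specific to the DSPI formulation in the thesis.

```python
# Generic alpha-beta pruning for minimax search (textbook version, not the
# thesis' DSPI-specific routine). children() and value() are caller-supplied.
import math

def alphabeta(state, depth, alpha, beta, maximizing, children, value):
    succ = list(children(state))
    if depth == 0 or not succ:
        return value(state)
    if maximizing:
        best = -math.inf
        for s in succ:
            best = max(best, alphabeta(s, depth - 1, alpha, beta, False, children, value))
            alpha = max(alpha, best)
            if beta <= alpha:          # the minimizer already has a better option: prune
                break
        return best
    best = math.inf
    for s in succ:
        best = min(best, alphabeta(s, depth - 1, alpha, beta, True, children, value))
        beta = min(beta, best)
        if beta <= alpha:              # the maximizer already has a better option: prune
            break
    return best

# Usage: alphabeta(root, depth=4, alpha=-math.inf, beta=math.inf,
#                  maximizing=True, children=expand_fn, value=eval_fn)
```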

    A Fast Compiler for NetKAT

    High-level programming languages play a key role in a growing number of networking platforms, streamlining application development and enabling precise formal reasoning about network behavior. Unfortunately, current compilers only handle "local" programs that specify hop-by-hop forwarding behavior, or modest extensions such as simple paths. To encode richer "global" behaviors, programmers must add extra state -- something that is tricky to get right and makes programs harder to write and maintain. Making matters worse, existing compilers can take tens of minutes to generate the forwarding state for the network, even on relatively small inputs. This forces programmers to waste time working around performance issues or even revert to using hardware-level APIs. This paper presents a new compiler for the NetKAT language that handles rich features including regular paths and virtual networks, yet is several orders of magnitude faster than previous compilers. The compiler uses symbolic automata to calculate the extra state needed to implement "global" programs, and an intermediate representation based on binary decision diagrams to dramatically improve performance. We describe the design and implementation of three essential compiler stages: from virtual programs (which specify behavior in terms of virtual topologies) to global programs (which specify network-wide behavior in terms of physical topologies), from global programs to local programs (which specify single-switch behavior), and from local programs to hardware-level forwarding tables. We present results from experiments on real-world benchmarks that quantify performance in terms of compilation time and forwarding table size.
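
    The decision-diagram intermediate representation can be pictured as a tree of tests over packet fields whose leaves carry forwarding actions. The sketch below is only a toy illustration of that idea, with made-up field names and actions; it is not the compiler's actual data structure.

```python
# Toy decision diagram over packet fields: internal nodes test field == value,
# leaves are tuples of forwarding actions. Field names and actions are made up.
from dataclasses import dataclass
from typing import Tuple, Union

@dataclass(frozen=True)
class Test:
    field: str
    value: object
    true_branch: Union["Test", Tuple[str, ...]]
    false_branch: Union["Test", Tuple[str, ...]]

def evaluate(node, packet):
    """Walk the diagram and return the forwarding actions for one packet."""
    while isinstance(node, Test):
        node = node.true_branch if packet.get(node.field) == node.value else node.false_branch
    return node

# Example: send VLAN 10 traffic out of port 2, everything else out of port 1.
diagram = Test("vlan", 10, ("fwd:2",), ("fwd:1",))
print(evaluate(diagram, {"vlan": 10, "dst_ip": "10.0.0.1"}))   # ('fwd:2',)
print(evaluate(diagram, {"vlan": 20}))                          # ('fwd:1',)
```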

    Multi-core Decision Diagrams

    Decision diagrams are fundamental data structures that revolutionized fields such as model checking, automated reasoning and decision processes. As performance gains in the current era come mostly from parallel processing, an ongoing challenge is to develop data structures and algorithms for modern multi-core architectures. This chapter describes the parallelization of decision diagram operations as implemented in the parallel decision diagram package Sylvan, which allows sequential algorithms that use decision diagrams to exploit the power of multi-core machines.
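
    The operation being parallelized is essentially the classic recursive "apply" over two diagrams, whose two sub-calls are independent. The sequential sketch below shows that recursion with a computed table; the comments mark where a work-stealing scheduler such as Sylvan's would fork the sub-calls as parallel tasks. The node layout and names are illustrative, not Sylvan's C API.

```python
# Sequential sketch of the classic BDD "apply" recursion with a computed table.
# The two recursive sub-calls are independent; a work-stealing runtime (as in
# Sylvan) can execute them as parallel tasks. Structure is illustrative only.
from dataclasses import dataclass
from typing import Union

@dataclass(frozen=True)
class BDD:
    var: int
    low: Union["BDD", bool]
    high: Union["BDD", bool]

def apply_op(f, g, op, cache=None):
    """Combine two BDDs with a boolean operator, e.g. op = lambda a, b: a and b."""
    if cache is None:
        cache = {}
    if isinstance(f, bool) and isinstance(g, bool):
        return op(f, g)
    key = (id(f), id(g))
    if key in cache:
        return cache[key]
    fv = f.var if isinstance(f, BDD) else float("inf")
    gv = g.var if isinstance(g, BDD) else float("inf")
    v = min(fv, gv)                              # split on the smaller top variable
    f0, f1 = (f.low, f.high) if fv == v else (f, f)
    g0, g1 = (g.low, g.high) if gv == v else (g, g)
    low = apply_op(f0, g0, op, cache)            # independent sub-call 1
    high = apply_op(f1, g1, op, cache)           # independent sub-call 2 (a parallel task in Sylvan)
    node = low if low == high else BDD(v, low, high)   # reduction rule
    cache[key] = node
    return node

# Example: conjunction of variables 1 and 2.
x1 = BDD(1, False, True)
x2 = BDD(2, False, True)
conj = apply_op(x1, x2, lambda a, b: a and b)
```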

    Alternative Automata-based Approaches to Probabilistic Model Checking

    In this thesis we focus on new methods for probabilistic model checking (PMC) with linear temporal logic (LTL). The standard approach translates an LTL formula into a deterministic ω-automaton with a double-exponential blow-up. There are approaches for Markov chain analysis against LTL with exponential runtime, which motivates the search for non-deterministic automata with restricted forms of non-determinism that make them suitable for PMC. For MDPs, the approach via deterministic automata matches the double-exponential lower bound, but practical applications might benefit from approaches via non-deterministic automata. We first investigate good-for-games (GFG) automata, in which the non-determinism can be resolved on a finite prefix without knowing the infinite suffix while still obtaining an accepting run for every accepted word. We explain that GFG automata are well suited for MDP analysis at a theoretical level, but our experiments show that they cannot compete with deterministic automata. We have also studied another form of pseudo-determinism, namely unambiguity, where every accepted word has exactly one accepting run. We present a polynomial-time approach for PMC of Markov chains against specifications given by an unambiguous Büchi automaton (UBA). Its two key elements are determining whether the induced probability is positive and, if so, identifying a state set that induces probability 1. Additionally, we examine the new symbolic Muller acceptance described in the Hanoi Omega-Automata Format, which we call Emerson-Lei acceptance: a positive Boolean formula over unconditional fairness constraints. We present a construction of small deterministic automata using Emerson-Lei acceptance. Deciding whether an MDP has a positive maximal probability of satisfying an Emerson-Lei condition is NP-complete, which motivates a DPLL-based algorithm for deciding positivity.
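
    An Emerson-Lei condition is a positive Boolean formula over Inf(c) and Fin(c) atoms, evaluated against the set of colours a run visits infinitely often. The toy sketch below evaluates such a formula given that set explicitly; the tuple encoding of formulas is an assumption for illustration, not the HOA format itself.

```python
# Illustrative evaluation of an Emerson-Lei acceptance condition: a positive
# Boolean formula over Inf(c)/Fin(c) atoms, checked against the set of colours
# visited infinitely often on a run (here given explicitly).

def inf(colour):
    return ("Inf", colour)

def fin(colour):
    return ("Fin", colour)

def holds(formula, inf_colours):
    """formula is built from Inf/Fin atoms and ('and', ...) / ('or', ...) nodes."""
    kind = formula[0]
    if kind == "Inf":
        return formula[1] in inf_colours
    if kind == "Fin":
        return formula[1] not in inf_colours
    if kind == "and":
        return all(holds(sub, inf_colours) for sub in formula[1:])
    if kind == "or":
        return any(holds(sub, inf_colours) for sub in formula[1:])
    raise ValueError(f"unknown node {kind!r}")

# Example: (Fin(0) and Inf(1)) or Inf(2), with colours {1, 3} seen infinitely often.
acc = ("or", ("and", fin(0), inf(1)), inf(2))
print(holds(acc, {1, 3}))  # True
```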

    Applying Formal Methods to Networking: Theory, Techniques and Applications

    Despite its great importance, modern network infrastructure is remarkable for the lack of rigor in its engineering. The Internet, which began as a research experiment, was never designed to handle the users and applications it hosts today. The lack of formalization of the Internet architecture meant limited abstractions and modularity, especially for the control and management planes, so that every new need required a new protocol built from scratch. This led to an unwieldy, ossified Internet architecture resistant to any attempts at formal verification, and an Internet culture where expediency and pragmatism are favored over formal correctness. Fortunately, recent work in the space of clean-slate Internet design---especially the software-defined networking (SDN) paradigm---offers the Internet community another chance to develop the right kind of architecture and abstractions. This has also led to a great resurgence of interest in applying formal methods to the specification, verification, and synthesis of networking protocols and applications. In this paper, we present a self-contained tutorial covering the formidable amount of work that has been done in formal methods, and a survey of its applications to networking.

    SAT-Solving in Practice, with a Tutorial Example from Supervisory Control

    Satisfiability solving, the problem of deciding whether the variables of a propositional formula can be assigned in such a way that the formula evaluates to true, is one of the classic problems in computer science. It is of theoretical interest because it is the canonical NP-complete problem, and of practical interest because modern SAT solvers can be used to solve many important practical problems. In this tutorial paper, we show briefly how such SAT solvers are implemented and point to some typical applications. Our aim is to provide sufficient information (much of it through the reference list) to kick-start researchers from new fields wishing to apply SAT solvers to their problems. Supervisory control theory originated within the control community and is a framework for reasoning about a plant to be controlled and a specification that the closed-loop system must fulfil. This paper aims to bridge the gap between the computer science community and the control community by illustrating how SAT-based techniques can be used to solve some supervisory control problems.
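
    To make the "how such SAT solvers are implemented" point concrete, here is a minimal DPLL sketch over DIMACS-style clause lists. It is a toy baseline rather than a modern CDCL solver, and the input encoding is an assumption for illustration.

```python
# A minimal DPLL sketch for CNF formulas given as lists of integer literals
# (DIMACS-style: 3 means x3 is true, -3 means x3 is false). Modern CDCL
# solvers add clause learning, watched literals and heuristics on top of this.

def dpll(clauses):
    """Return a set of true literals satisfying all clauses, or None if UNSAT."""
    clauses = [list(c) for c in clauses]
    if not clauses:
        return set()                              # all clauses satisfied
    if any(len(c) == 0 for c in clauses):
        return None                               # an empty clause is unsatisfiable
    # Prefer a literal forced by a unit clause; otherwise branch on a literal.
    unit = next((c[0] for c in clauses if len(c) == 1), None)
    lit = unit if unit is not None else clauses[0][0]
    for choice in ([lit] if unit is not None else [lit, -lit]):
        reduced = []
        for c in clauses:
            if choice in c:
                continue                          # clause satisfied by the choice
            reduced.append([l for l in c if l != -choice])
        result = dpll(reduced)
        if result is not None:
            return result | {choice}
    return None

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
print(dpll([[1, 2], [-1, 3], [-2, -3]]))          # e.g. {1, 3, -2}
```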

    Fault Tree Analysis: a survey of the state-of-the-art in modeling, analysis and tools

    Fault tree analysis (FTA) is a very prominent method for analyzing the risks related to safety- and economically critical assets, like power plants, airplanes, data centers and web shops. FTA methods comprise a wide variety of modelling and analysis techniques, supported by a wide range of software tools. This paper surveys over 150 papers on fault tree analysis, providing an in-depth overview of the state-of-the-art in FTA. Concretely, we review standard fault trees as well as extensions such as dynamic FTs, repairable FTs, and extended FTs. For these models, we review both qualitative analysis methods, like cut sets and common cause failures, and quantitative techniques, including a wide variety of stochastic methods to compute failure probabilities. Numerous examples illustrate the various approaches, and tables present a quick overview of results.
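
    As a small worked example of the qualitative analysis mentioned above, the sketch below computes the minimal cut sets of a static fault tree with AND/OR gates by bottom-up expansion and subset minimization. The tuple encoding of gates and the event names are assumptions for illustration; the tools surveyed use far more scalable (e.g. BDD-based) algorithms.

```python
# Toy sketch: minimal cut sets of a static fault tree with AND/OR gates.
# Gates are encoded as tuples; names are illustrative.
from itertools import product

def minimal_cut_sets(gate):
    """gate is ('event', name), ('or', *children) or ('and', *children)."""
    kind = gate[0]
    if kind == "event":
        return [frozenset([gate[1]])]
    child_sets = [minimal_cut_sets(child) for child in gate[1:]]
    if kind == "or":
        candidates = {cs for sets in child_sets for cs in sets}
    elif kind == "and":
        candidates = {frozenset().union(*combo) for combo in product(*child_sets)}
    else:
        raise ValueError(f"unknown gate type {kind!r}")
    # Keep only minimal sets: drop any candidate that has a proper subset present.
    return [c for c in candidates if not any(o < c for o in candidates)]

# TOP = (A or B) and (A or C)  ->  minimal cut sets {A} and {B, C}
top = ("and",
       ("or", ("event", "A"), ("event", "B")),
       ("or", ("event", "A"), ("event", "C")))
print(minimal_cut_sets(top))
```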