
    Presentation of the 9th Edition of the Model Checking Contest

    The Model Checking Contest (MCC) is an annual competition of software tools for model checking. Tools must process a growing benchmark gathered from the whole community and may participate in various examinations: state space generation, computation of global properties, computation of upper bounds in the model, evaluation of reachability formulas, evaluation of CTL formulas, and evaluation of LTL formulas. For each examination and each model instance, participating tools are given up to 3600 seconds and 16 gigabytes of memory. Tool answers are then analyzed and compared with the results produced by the other competing tools to detect diverging answers (which are quite rare at this stage of the competition, and lead to penalties). For each examination, gold, silver, and bronze medals are awarded to the three best tools. CPU usage and memory consumption are reported, which is also valuable information for tool developers.
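
    As an illustration of the cross-checking step described above, the sketch below (hypothetical, not the MCC's actual infrastructure) flags tools whose verdict diverges from the consensus on one examination of one model instance. The tool names and verdicts are invented.

    from collections import Counter

    def find_divergences(answers):
        """answers: tool name -> verdict for one examination on one model
        instance. Returns the consensus verdict and the set of tools that
        disagree with it (candidates for a penalty)."""
        consensus, _ = Counter(answers.values()).most_common(1)[0]
        diverging = {tool for tool, v in answers.items() if v != consensus}
        return consensus, diverging

    answers = {"ToolA": True, "ToolB": True, "ToolC": False}  # hypothetical
    print(find_divergences(answers))  # (True, {'ToolC'})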

    Advances in Functional Decomposition: Theory and Applications

    Functional decomposition aims at finding efficient representations for Boolean functions. It is used in many applications, including multi-level logic synthesis, formal verification, and testing. This dissertation presents novel heuristic algorithms for functional decomposition. These algorithms take advantage of suitable representations of the Boolean functions in order to be efficient. The first two algorithms compute simple-disjoint and disjoint-support decompositions. They are based on representing the target function by a Reduced Ordered Binary Decision Diagram (BDD). Unlike other BDD-based algorithms, the presented ones can deal with larger target functions and produce more decompositions without requiring expensive manipulations of the representation, particularly BDD reordering. The third algorithm also finds disjoint-support decompositions, but it is based on a technique which integrates circuit graph analysis and BDD-based decomposition. The combination of the two approaches results in an algorithm which is more robust than a purely BDD-based one and which improves both the quality of the results and the running time. The fourth algorithm uses circuit graph analysis to obtain non-disjoint decompositions. We show that the problem of computing non-disjoint decompositions can be reduced to the problem of computing multiple-vertex dominators. We also prove that multiple-vertex dominators can be found in polynomial time. This result is important because there is no known polynomial-time algorithm for computing all non-disjoint decompositions of a Boolean function. The fifth algorithm provides an efficient means to decompose a function at the circuit graph level by using information derived from a BDD representation. This is done without the expensive circuit re-synthesis normally associated with BDD-based decomposition approaches. Finally, we present two publications that resulted from the many detours we have taken along the winding path of our research.
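
    To make the notion of disjoint-support decomposition concrete, the sketch below implements the classical Ashenhurst decomposition-chart test on a truth table: f(A, B) = h(g(A), B) exists for a bound set A if and only if the chart has at most two distinct columns. This is a textbook brute-force check, not one of the dissertation's BDD-based algorithms, and the example function is invented.

    from itertools import product

    def ashenhurst_decomposable(f, n, bound):
        """f: Boolean function over tuples of n bits; bound: variable
        indices of the bound set A. True iff f(A, B) = h(g(A), B)."""
        free = [i for i in range(n) if i not in bound]
        columns = set()
        for a in product((0, 1), repeat=len(bound)):
            col = []
            for b in product((0, 1), repeat=len(free)):
                x = [0] * n
                for i, v in zip(bound, a):
                    x[i] = v
                for i, v in zip(free, b):
                    x[i] = v
                col.append(f(tuple(x)))
            columns.add(tuple(col))
        return len(columns) <= 2  # at most two distinct chart columns

    # f(x0, x1, x2) = (x0 XOR x1) AND x2 decomposes with bound set {x0, x1}.
    f = lambda x: (x[0] ^ x[1]) & x[2]
    print(ashenhurst_decomposable(f, 3, [0, 1]))  # True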

    Structural model checking

    The introduction of symbolic approaches, based on Binary Decision Diagrams (BDD), to Model Checking has led to significant improvements in Formal Verification, by allowing the analysis of very large systems, such as complex circuit designs. These were previously beyond the reach of traditional, explicit methods, due to the state space explosion phenomenon. However, after the initial success, the BDD technology has peaked, due to a similar problem, the BDD explosion. We present a new approach to symbolic Model Checking that is based on exploiting the system structure. This technique is characterized by several unique features, including an encoding of states with Multiway Decision Diagrams (MDD) and of transitions with Boolean Kronecker matrices. This approach naturally captures the property of event locality, inherently present in the class of globally asynchronous/locally synchronous systems. The most important contribution of our work is the saturation algorithm for state space construction. Using saturation, the peak size of the MDD during the exploration is drastically reduced, often to sizes equal or comparable to the final MDD size, which is essentially optimal in this respect. Saturation achieves similar reductions in runtime. When compared to the leading state-of-the-art tools based on traditional symbolic approaches, saturation is up to 100,000 times faster and uses up to 1,000 times less memory. This enables our approach to study much larger systems than previously considered. Following the success in state space exploration, we extend the applicability of the saturation algorithm to CTL Model Checking, and also to the efficient generation of shortest-length counterexamples for safety properties, with similar results. This approach to automatic verification is implemented in the tool SMART. We test the new model checker on a real-life, industrial-size application: the NASA Runway Safety Monitor (RSM). The analysis exposes a number of potential problems with the decision procedure designed to signal all hazardous situations during takeoff and landing procedures on runways. Attempts to verify RSM with other model checkers (NuSMV, SPIN) fail due to excessive memory consumption, showing that our structural method is superior to existing symbolic approaches.
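
    For contrast with the symbolic techniques above, the sketch below generates a state space explicitly, by breadth-first search over asynchronous events that each read and write only part of the state vector; this is the event locality that saturation exploits symbolically. The two-component producer/buffer model is invented, and the code is a baseline illustration, not the saturation algorithm.

    from collections import deque

    # Each event is a (guard, update) pair touching only some components.
    events = [
        (lambda s: s[0] > 0, lambda s: (s[0] - 1, s[1] + 1)),  # produce into buffer
        (lambda s: s[1] > 0, lambda s: (s[0], s[1] - 1)),      # consume from buffer
    ]

    def state_space(initial):
        """Explicit reachability: the set of states reachable from initial."""
        seen, frontier = {initial}, deque([initial])
        while frontier:
            s = frontier.popleft()
            for guard, update in events:
                if guard(s):
                    t = update(s)
                    if t not in seen:
                        seen.add(t)
                        frontier.append(t)
        return seen

    print(sorted(state_space((2, 0))))
    # [(0, 0), (0, 1), (0, 2), (1, 0), (1, 1), (2, 0)]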

    SiMAMT: A Framework for Strategy-Based Multi-Agent Multi-Team Systems

    Multi-agent multi-team systems are commonly seen in environments where hierarchical layers of goals are at play. For example, in theater-wide combat scenarios, multiple levels of command and control are required for the proper execution of goals, from the general to the foot soldier. Similar structures can be seen in game environments, where agents work together as teams to compete with other teams. The different agents within the same team must, while maintaining their own ‘personality’, work together and coordinate with each other to achieve a common team goal. This research develops strategy-based multi-agent multi-team systems, where strategy is framed as an instrument at the team level to coordinate the multiple agents of a team in a cohesive way. A formal specification of strategy and strategy-based multi-agent multi-team systems is provided. A framework called SiMAMT (strategy-based multi-agent multi-team systems) is developed. The different components of the framework, including strategy simulation, strategy inference, strategy evaluation, and strategy selection, are described. A graph-matching approximation algorithm is also developed to support effective and efficient strategy inference. Examples and experimental results are given throughout to illustrate the proposed framework, including each of its composite elements, and its overall efficacy. This research makes several contributions to the field of multi-agent multi-team systems: a specification for strategy and strategy-based systems, and a framework for implementing them in real-world, interactive-time scenarios; a robust simulation space for such complex and intricate interaction; an approximation algorithm that allows for strategy inference within these systems in interactive time; and experimental results that verify the various sub-elements, along with a full-scale integration experiment showing the efficacy of the proposed framework.
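
    The sketch below gives a deliberately simplified picture of strategy inference as graph matching: score an observed team-behavior graph against a library of known strategy graphs and pick the best match. The Jaccard edge-overlap measure, the strategy names, and the graphs are all invented stand-ins, not the paper's approximation algorithm.

    def jaccard(edges_a, edges_b):
        """Edge-overlap similarity between two graphs given as edge sets."""
        a, b = set(edges_a), set(edges_b)
        return len(a & b) / len(a | b) if a | b else 1.0

    def infer_strategy(observed_edges, strategy_library):
        """Return the library strategy whose graph best matches the observation."""
        return max(strategy_library,
                   key=lambda name: jaccard(observed_edges, strategy_library[name]))

    library = {  # hypothetical strategy graphs over agent-role nodes
        "flank": {("scout", "wing"), ("wing", "leader")},
        "hold": {("leader", "anchor"), ("anchor", "reserve")},
    }
    observed = {("scout", "wing"), ("wing", "leader"), ("leader", "anchor")}
    print(infer_strategy(observed, library))  # flank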

    Tools and Algorithms for the Construction and Analysis of Systems

    This book is Open Access under a CC BY licence. The LNCS 11427 and 11428 proceedings set constitutes the proceedings of the 25th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2019, which took place in Prague, Czech Republic, in April 2019, held as part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2019. The 42 full papers and 8 short tool-demo papers presented in these volumes were carefully reviewed and selected from 164 submissions. The papers are organized in topical sections as follows: Part I: SAT and SMT, SAT solving and theorem proving; verification and analysis; model checking; tool demos; and machine learning. Part II: concurrent and distributed systems; monitoring and runtime verification; hybrid and stochastic systems; synthesis; symbolic verification; and safety and fault-tolerant systems.

    Probabilistic Inference Using Partitioned Bayesian Networks: Introducing a Compositional Framework

    Probability theory offers an intuitive and formally sound way to reason in situations that involve uncertainty. The automation of probabilistic reasoning has many applications, such as predicting future events or prognostics, providing decision support, action planning under uncertainty, dealing with multiple uncertain measurements, making a diagnosis, and so forth. Bayesian networks in particular have been used to represent probability distributions that model the various applications of uncertainty reasoning. However, present-day automated reasoning approaches involving uncertainty struggle when models increase in size and complexity to fit real-world applications. In this thesis, we explore and extend a state-of-the-art automated reasoning method, called inference by Weighted Model Counting (WMC), when applied to increasingly complex Bayesian network models. WMC consists of two distinct phases: compilation and inference. The computational cost of compilation has limited the applicability of WMC. To overcome this limitation, we have proposed theoretical and practical solutions that have been tested extensively in empirical studies using real-world Bayesian network models. We have proposed a weighted variant of OBDDs, called Weighted Positive Binary Decision Diagrams (WPBDD), which in turn is based on the new notion of positive Shannon decomposition. WPBDDs are particularly well suited to represent discrete probabilistic models. The conciseness of WPBDDs leads to a reduction in the cost of probabilistic inference. We have introduced Compositional Weighted Model Counting (CWMC), a language-agnostic framework for probabilistic inference that partitions a Bayesian network into subproblems. These subproblems are then compiled and subsequently composed in order to perform inference. This approach significantly reduces the cost of compilation, yet increases the cost of inference. The best results are obtained by seeking a partitioning that allows compilation to (barely) become feasible, but no more, as compilation cost can be amortized over multiple inference queries. The theoretical concepts have been implemented in a readily available open-source tool called ParaGnosis. Further implementation improvements have been found through parallelism, by exploiting independencies that are introduced by CWMC. The proposed methods combined push the boundaries of WMC, allowing this state-of-the-art method to be used on much larger models than before.
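
    As a minimal illustration of what the inference phase computes, the sketch below performs weighted model counting by brute-force enumeration: it sums, over all satisfying assignments of a formula, the product of the chosen literal weights. The two-variable formula and weights are invented; compiled representations such as WPBDDs exist precisely to avoid this exponential enumeration.

    from itertools import product

    def wmc(variables, formula, weights):
        """weights[(var, value)] -> float; formula(assignment) -> bool."""
        total = 0.0
        for values in product((False, True), repeat=len(variables)):
            assignment = dict(zip(variables, values))
            if formula(assignment):
                prod = 1.0
                for var in variables:
                    prod *= weights[(var, assignment[var])]
                total += prod
        return total

    weights = {("a", True): 0.3, ("a", False): 0.7,
               ("b", True): 0.6, ("b", False): 0.4}
    formula = lambda m: m["a"] or m["b"]      # hypothetical query formula
    print(wmc(["a", "b"], formula, weights))  # ~0.72, i.e. 1 - 0.7 * 0.4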

    The Compilation of Reversible Circuits and a New Optimization Game

    The focus of this thesis is reversible circuit compilation. We explore the use of pebble games for circuit analysis. The usefulness of this technique is demonstrated by finding a new space bound for the Karatsuba algorithm and, more generally, for any similar algorithm based on recurrence relations. We also present a new pebble game, based on the reversible pebble game, which better captures the use of in-place operations. Finally, we construct a circuit to compute trigonometric functions based on the CORDIC algorithm and analyze it using this game.
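
    To make the underlying game concrete, the sketch below checks move sequences in the classical reversible pebble game on a dependency chain v1, ..., vn: a pebble may be placed on, or removed from, vi only while vi-1 is pebbled (v1 may always be toggled). The symmetry between placing and removing models uncomputation in reversible circuits; this is the standard game, not the thesis's new variant.

    def play(n, moves):
        """moves: node indices (1-based) to toggle. Returns the final pebble
        set, or raises ValueError if a move violates the game's rule."""
        pebbled = set()
        for i in moves:
            if not 1 <= i <= n:
                raise ValueError(f"no such node v{i}")
            if i != 1 and (i - 1) not in pebbled:
                raise ValueError(f"illegal toggle of v{i}: v{i - 1} unpebbled")
            pebbled ^= {i}  # place if absent, remove if present (reversible)
        return pebbled

    # Pebble v3, then uncompute the intermediate pebbles on v1 and v2:
    print(play(3, [1, 2, 3, 2, 1]))  # {3}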