12 research outputs found

    Generating and Solving Symbolic Parity Games

    We present a new tool for verifying modal mu-calculus formulae on process specifications, based on symbolic parity games. It enhances an existing method that first encodes the problem as a Parameterised Boolean Equation System (PBES) and then instantiates the PBES to a parity game. We improved the translation from specification to PBES so that the structure of the specification is preserved in the PBES, extended LTSmin to instantiate PBESs to symbolic parity games, and implemented Zielonka's recursive parity game solving algorithm for symbolic parity games. We use Multi-valued Decision Diagrams (MDDs) to represent sets and relations, enabling the tools to deal with very large systems. The transition relation is partitioned based on the structure of the specification, which allows for efficient manipulation of the MDDs. We performed two case studies on modular specifications, which demonstrate that the new method has better time and memory performance than existing PBES-based tools and can be faster (though slightly less memory efficient) than the symbolic model checker NuSMV.
    Comment: In Proceedings GRAPHITE 2014, arXiv:1407.767
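
    The solver named in the abstract is Zielonka's recursive algorithm, which the tool runs over MDD-represented node sets. As a rough illustration of the recursion itself, here is a minimal explicit-state sketch in Python; the function names, the plain-set representation, and the dead-end handling are our own simplifications, not the tool's symbolic implementation.

        def attractor(nodes, edges, owner, target, player):
            # Least superset of `target` from which `player` can force the
            # play into it: player-owned nodes need one edge in, opponent-
            # owned nodes need all edges in (a node with no outgoing edges
            # loses for its owner).
            attr = set(target)
            changed = True
            while changed:
                changed = False
                for v in nodes - attr:
                    succ = edges[v]
                    if (any(s in attr for s in succ) if owner[v] == player
                            else all(s in attr for s in succ)):
                        attr.add(v)
                        changed = True
            return attr

        def restrict(edges, sub):
            # Subgame induced on the node set `sub`.
            return {v: [s for s in edges[v] if s in sub] for v in sub}

        def solve(nodes, edges, owner, priority):
            # Returns (W0, W1): the winning regions of players 0 and 1.
            if not nodes:
                return set(), set()
            d = max(priority[v] for v in nodes)
            p = d % 2                              # player favoured by priority d
            top = {v for v in nodes if priority[v] == d}
            a = attractor(nodes, edges, owner, top, p)
            w = solve(nodes - a, restrict(edges, nodes - a), owner, priority)
            if not w[1 - p]:
                win = [set(), set()]
                win[p] = a | w[p]                  # p wins the whole game
                return tuple(win)
            b = attractor(nodes, edges, owner, w[1 - p], 1 - p)
            w2 = solve(nodes - b, restrict(edges, nodes - b), owner, priority)
            win = [set(), set()]
            win[p] = w2[p]
            win[1 - p] = w2[1 - p] | b
            return tuple(win)

        # Tiny example: two nodes pointing at each other; the forced play
        # sees highest priority 1 (odd) infinitely often, so player 1 wins.
        nodes = {0, 1}
        edges = {0: [1], 1: [0]}
        owner = {0: 0, 1: 1}
        priority = {0: 0, 1: 1}
        print(solve(nodes, edges, owner, priority))   # -> (set(), {0, 1})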

    Analysing the Control Software of the Compact Muon Solenoid Experiment at the Large Hadron Collider

    The control software of the CERN Compact Muon Solenoid experiment contains over 30,000 finite state machines. These state machines are organised hierarchically: commands are sent down the hierarchy and state changes are sent upwards. The sheer size of the system makes it virtually impossible to fully understand the details of its behaviour at the macro level. This is compounded by ambiguities that already exist at the micro level. We have solved the latter problem by formally describing the finite state machines in the mCRL2 process algebra. The translation has been implemented using the ASF+SDF meta-environment, and its correctness was assessed by means of simulations and visualisations of individual finite state machines and through formal verification of subsystems of the control software. Based on the formalised semantics of the finite state machines, we have developed dedicated tooling for checking properties that can be verified on finite state machines in isolation.
    Comment: To appear in FSEN'11. Extended version with details of the ASF+SDF translation of SML into mCRL2
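
    The command-down / state-up discipline the abstract describes is easy to picture with a small sketch. The class below is our own illustration: real CMS nodes are finite state machines written in SML with far richer behaviour, and the two-state logic and names here are hypothetical.

        class FSMNode:
            # Toy node in a control hierarchy: commands flow down to the
            # leaves, derived state changes flow back up to the root.
            def __init__(self, name, children=()):
                self.name = name
                self.children = list(children)
                self.parent = None
                self.state = "OFF"
                for child in self.children:
                    child.parent = self

            def command(self, cmd):
                # Commands flow DOWN; only leaves act on them directly.
                if self.children:
                    for child in self.children:
                        child.command(cmd)
                else:
                    self._set_state("ON" if cmd == "SWITCH_ON" else "OFF")

            def _set_state(self, new_state):
                if new_state != self.state:
                    self.state = new_state
                    if self.parent is not None:      # state changes flow UP
                        self.parent._child_changed()

            def _child_changed(self):
                # Derive own state from the children's reported states.
                states = {child.state for child in self.children}
                self._set_state(states.pop() if len(states) == 1 else "MIXED")

        leaves = [FSMNode("pixel"), FSMNode("tracker")]
        root = FSMNode("detector", leaves)
        root.command("SWITCH_ON")
        print(root.state)   # -> ON, derived bottom-up from the children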

    Model Checking a C++ Software Framework, a Case Study

    This paper presents a case study on applying two model checkers, SPIN and DIVINE, to verify key properties of a C++ software framework, known as ADAPRO, originally developed at CERN. SPIN was used for verifying properties at the design level; DIVINE was used for verifying simple test applications that interacted with the implementation. Both model checkers were found to have their own respective pros and cons, but the overall experience was positive. Because the two model checkers were used in a complementary manner, they provided valuable new insights into the framework that would arguably have been hard to gain with traditional testing and analysis tools alone. Translating the C++ source code into the modeling language of the SPIN model checker helped to find flaws in the original design. With DIVINE, defects were found in parts of the code base that had already been subject to hundreds of hours of unit tests, integration tests, and acceptance tests. Most importantly, model checking was found to be easy to integrate into the workflow of the software project and to bring added value, not only as a verification but also as a validation methodology. Therefore, using model checking for developing library-level code seems realistic and worth the effort.
    Comment: In Proceedings of the 27th ACM Joint European Software Engineering Conference and Symposium on the Foundations of Software Engineering (ESEC/FSE '19), August 26-30, 2019, Tallinn, Estonia. ACM, New York, NY, USA, 11 pages
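
    The finding that a model checker exposes defects in heavily tested code is plausible because it explores every interleaving, not just the few a test run happens to schedule. The toy checker below is our own illustration, unrelated to ADAPRO's code: it enumerates all interleavings of two non-atomic increments and finds the lost updates that a single test execution would almost always miss.

        from itertools import permutations

        def run(schedule):
            # Execute one interleaving of the atomic steps of two
            # non-atomic increments: each thread reads the counter,
            # then writes read + 1.
            counter = 0
            reg = {0: 0, 1: 0}
            for tid, step in schedule:
                if step == "read":
                    reg[tid] = counter
                else:
                    counter = reg[tid] + 1
            return counter

        steps = [(0, "read"), (0, "write"), (1, "read"), (1, "write")]
        bad = 0
        for sched in permutations(steps):
            # Respect each thread's program order: read before write.
            if (sched.index((0, "read")) < sched.index((0, "write"))
                    and sched.index((1, "read")) < sched.index((1, "write"))):
                if run(sched) != 2:   # invariant: both increments took effect
                    bad += 1
        print(bad, "interleavings lose an update")   # -> 4 of 6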

    Modelling a Distributed Data Acquisition System

    This thesis discusses the formal modelling and verification of certain non-real-time aspects of correctness of a mission-critical distributed software system known as the ALICE Data Point Service (ADAPOS). The domain of this distributed system is data acquisition from a particle detector control system in experimental high-energy particle physics research. ADAPOS is part of the upgrade effort of A Large Ion Collider Experiment (ALICE) at the European Organisation for Nuclear Research (CERN), near Geneva in France/Switzerland, for the third run of the Large Hadron Collider (LHC). ADAPOS is based on the publicly available ALICE Data Point Processing (ADAPRO) C++14 framework and works within the free and open-source GNU/Linux ecosystem. The model checker Spin was chosen for modelling and verifying ADAPOS. The model focuses on the general specification of ADAPOS. It includes the ADAPOS processes, a load generator process, and rudimentary interpretations of the network protocols used between the processes. To experiment with different interpretations of the underlying network protocols, and to cope with the state space explosion problem, eight variants of the model were developed and studied. Nine Linear Temporal Logic (LTL) properties were defined for all of those variants. Large numbers of states were covered during model checking, even though the model turned out to have a reachable state space too large to exhaust fully. No counter-examples were found to the safety properties, yielding a significant amount of evidence that ADAPOS is safe. Liveness properties and implementation-level verification, among other possible research directions, remain open.
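
    Checking a safety property in Spin amounts to a search over the model's reachable global states, stopping at the first state that violates the property; it is this `seen` set of states that blows up for large models. The breadth-first checker below is a minimal sketch of that idea under our own simplified state encoding, not the ADAPOS model itself.

        from collections import deque

        def check_safety(initial, successors, invariant):
            # Explore states breadth-first; return a violating state or None.
            seen = {initial}
            queue = deque([initial])
            while queue:
                state = queue.popleft()
                if not invariant(state):
                    return state          # counter-example found
                for nxt in successors(state):
                    if nxt not in seen:   # the `seen` set is what explodes
                        seen.add(nxt)
                        queue.append(nxt)
            return None                   # invariant holds on all reachable states

        # Toy model: a bounded buffer between a producer and a consumer.
        # State = (items_in_buffer,); invariant: never exceed capacity.
        CAPACITY = 3

        def successors(state):
            n = state[0]
            nxt = []
            if n < CAPACITY:
                nxt.append((n + 1,))      # producer enqueues
            if n > 0:
                nxt.append((n - 1,))      # consumer dequeues
            return nxt

        bad = check_safety((0,), successors, lambda s: s[0] <= CAPACITY)
        print("safe" if bad is None else f"violation at {bad}")   # -> safe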

    The Integration of Product Data with Workflow Management Systems Through a Common Data Model

    Traditionally, product models and their definitions have been handled separately from process models and their definitions. In industry, each has been managed by database systems designed for its specific domain, e.g. Product Data Management (PDM) systems for product definitions and Workflow Management (WfM) systems for process definitions. There is little or no overlap between these two views of systems, even though product and process information interact over the complete life cycle from design to production…
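
    A common data model in the sense of this abstract means that product items and workflow activities share one schema, so that each side can reference the other. The sketch below uses our own illustrative types, not the paper's model: an activity records which product versions it consumes and produces, which lets either view query the other.

        from dataclasses import dataclass, field

        @dataclass(frozen=True)
        class ProductVersion:
            # PDM-style product definition: a part at a given revision.
            part_number: str
            revision: str

        @dataclass
        class Activity:
            # WfM-style process step, linked to product data it touches.
            name: str
            inputs: list = field(default_factory=list)
            outputs: list = field(default_factory=list)

        @dataclass
        class Process:
            name: str
            activities: list = field(default_factory=list)

            def lineage(self, product):
                # Which activities touch a given product version?
                return [a.name for a in self.activities
                        if product in a.inputs or product in a.outputs]

        blank = ProductVersion("PN-100", "A")
        machined = ProductVersion("PN-100", "B")
        p = Process("machining", [Activity("mill", [blank], [machined])])
        print(p.lineage(machined))   # -> ['mill']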

    Advanced reduction techniques for model checking
