
    Modularity and Petri Nets

    The systems to be modelled nowadays are very large, and their specification is often decomposed into several steps, leading to modularly or incrementally designed models. Petri net analysis is generally achieved via state space exploration, which is often impossible to perform due to the so-called state space explosion problem. Several methods reduce the size of the occurrence graph, e.g. using partial orders or symmetries. Here, we focus on techniques which take advantage of the modular design of the system, and hence build the state space in a modular or incremental way.

    An incremental modular technique for checking LTL-X properties on Petri nets

    Model checking is a powerful and widespread technique for the verification of finite-state concurrent systems. However, the main hindrance to wider application of this technique is the well-known state explosion problem. Modular verification is a promising, natural approach to tackling this problem. It is based on the "divide and conquer" principle and aims at deducing the properties of the system from those of its components analysed in isolation. Unfortunately, several issues make the use of modular verification techniques difficult in practice. First, deciding how to partition the system into components is not trivial and can have a significant impact on the resources needed for verification. Second, when model checking a component in isolation, how should the environment of that component be described? In this paper, we address these problems in the framework of model checking LTL\X action-based properties on Petri nets. We propose an incremental and modular verification approach in which the system model is partitioned according to the actions occurring in the property to be verified, and the environment of a component is taken into account using the linear place invariants of the system.
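The linear place invariants mentioned in the abstract have a simple algebraic characterisation: a weight vector y over the places is a P-invariant when y·C = 0 for the net's incidence matrix C, so the weighted token count is conserved by every transition. A minimal, dependency-free sketch on a hypothetical toy net (the net and names are illustrative, not taken from the paper):

```python
# Incidence matrix of a toy Petri net (rows = places p1..p3, columns =
# transitions t1, t2): t1 moves a token p1 -> p2, t2 moves it back p2 -> p1.
C = [
    [-1,  1],   # p1
    [ 1, -1],   # p2
    [ 0,  0],   # p3
]

def is_place_invariant(y, C):
    """True iff y^T * C == 0, i.e. the weighted token count y.M is
    unchanged by firing any transition."""
    return all(sum(y[p] * C[p][t] for p in range(len(C))) == 0
               for t in range(len(C[0])))

print(is_place_invariant([1, 1, 0], C))   # True: p1 + p2 is constant
print(is_place_invariant([1, 0, 0], C))   # False: firing t1 changes p1
```

In a modular setting, such invariants are useful precisely because they constrain a component's environment without enumerating its states.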

    Model Checking of Stream Processing Pipelines

    Event stream processing (ESP) is the application of a computation to a set of input sequences of arbitrary data objects, called "events", in order to produce other sequences of data objects. In recent years, a large number of ESP systems have been developed; however, none of them is easily amenable to formal verification of properties of their execution. In this paper, we show how stream processing pipelines built with an existing ESP library called BeepBeep 3 can be exported as a Kripke structure for the NuXmv model checker. This makes it possible to formally verify properties of these pipelines, and opens the way to the use of such pipelines directly within a model checker as an extension of its specification language.
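The core idea — unrolling a pipeline's internal variables into a Kripke structure and checking a property over all reachable states — can be sketched without either tool. The stage below (a counter mod 2 plus the last emitted bit) and all names are illustrative assumptions, not BeepBeep's API:

```python
from collections import deque

# States of a toy one-stage pipeline: (counter mod 2, last output bit).
init = {(0, 0)}

def successors(state):
    c, b = state
    # On each Boolean input x the counter flips; the stage re-emits x
    # only on odd ticks, otherwise it repeats its previous output.
    return {(1 - c, x if c == 1 else b) for x in (0, 1)}

def holds_globally(pred):
    """Check a safety property 'AG pred' by exhaustive state exploration."""
    seen, queue = set(init), deque(init)
    while queue:
        s = queue.popleft()
        if not pred(s):
            return False          # counterexample state found
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return True

# The output bit never leaves {0, 1} in any reachable state.
print(holds_globally(lambda s: s[1] in (0, 1)))  # True
```

A tool such as NuXmv performs the same exploration symbolically, with full LTL/CTL support, once the pipeline has been exported in its input language.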

    Safety, Liveness and Run-time Refinement for Modular Process-Aware Information Systems with Dynamic Sub Processes

    We study modularity, run-time adaptation and refinement under safety and liveness constraints in event-based process models with dynamic sub-process instantiation. The study is part of a larger programme to provide semantically well-founded technologies for the modelling, implementation and verification of flexible, run-time adaptable process-aware information systems, moved into practice via the Dynamic Condition Response (DCR) Graphs notation co-developed with our industrial partner. Our key contributions are: (1) a formal theory of dynamic sub-process instantiation for declarative, event-based processes under safety and liveness constraints, given as the DCR* process language, equipped with a compositional operational semantics and conservatively extending the DCR Graphs notation; (2) an expressiveness analysis revealing that the DCR* process language is Turing-complete, while the fragment corresponding to DCR Graphs (without dynamic sub-process instantiation) characterises exactly the languages that are the union of a regular and an omega-regular language; (3) a formalisation of run-time refinement and adaptation by composition for DCR* processes and a proof that such refinement is undecidable in general; and finally (4) a decidable and practically useful sub-class of run-time refinements. Our results are illustrated by a running example inspired by a recent Electronic Case Management solution based on DCR Graphs and delivered by our industrial partner. An online prototype implementation of the DCR* language (including examples from the paper) and its visualisation as DCR Graphs is available.
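The core DCR Graphs execution semantics the abstract builds on can be sketched compactly: a marking records executed, pending and included events, conditions gate enabledness, and responses create pending obligations; a run is accepting when no included event is still pending. The sketch below covers only this core (no dynamic sub-process instantiation, which is what DCR* adds), and the example events are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class DCRGraph:
    events: set
    conditions: set = field(default_factory=set)   # (e, f): e must precede f
    responses: set = field(default_factory=set)    # (e, f): e makes f pending
    excludes: set = field(default_factory=set)     # (e, f): e removes f
    includes: set = field(default_factory=set)     # (e, f): e re-adds f

def initial_marking(g):
    # (executed, pending, included)
    return (frozenset(), frozenset(), frozenset(g.events))

def enabled(g, m, f):
    ex, pe, inc = m
    # f is enabled if included and every *included* condition was executed.
    return f in inc and all(e in ex or e not in inc
                            for (e, f2) in g.conditions if f2 == f)

def execute(g, m, f):
    assert enabled(g, m, f)
    ex, pe, inc = m
    ex = ex | {f}
    pe = (pe - {f}) | {f2 for (e, f2) in g.responses if e == f}
    inc = (inc - {f2 for (e, f2) in g.excludes if e == f}) \
          | {f2 for (e, f2) in g.includes if e == f}
    return (ex, pe, inc)

def accepting(m):
    ex, pe, inc = m
    return not (pe & inc)   # no included event is still pending

# Toy process: 'review' must precede 'approve', and executing 'review'
# demands an eventual 'approve' as a response (liveness obligation).
g = DCRGraph(events={"review", "approve"},
             conditions={("review", "approve")},
             responses={("review", "approve")})
m = initial_marking(g)
print(enabled(g, m, "approve"))   # False: condition not yet executed
m = execute(g, m, "review")
print(accepting(m))               # False: 'approve' is now pending
m = execute(g, m, "approve")
print(accepting(m))               # True
```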

    Model checking on stream processing pipelines

    Event stream processing (ESP) is the processing of a continuous flow of objects, called an event sequence, in order to analyse or transform it. The formal computer science laboratory at UQAC (LIF) has been developing for several years an open-source stream processing engine called BeepBeep 3, which makes the concept of ESP easy to use. A formal verification system has been integrated into BeepBeep 3: with only a few additional lines of code, it is now possible to automatically generate a Kripke structure from a given processor chain. This adds interesting applications to BeepBeep's already broad usefulness, such as comparing pipelines using formulas in linear temporal logic (LTL) or computation tree logic (CTL), or having a processor chain monitor a Kripke structure. In this thesis, we explain the whole reasoning and implementation process that led to the automated construction, from a BeepBeep processor chain, of a Kripke model valid for analysis in the NuXMV tool.

    LTL Model Checking for Modular Petri Nets

    We consider the problem of model checking modular Petri nets for the linear-time logic LTL-X. An algorithm is presented which can use the synchronisation graph from modular analysis, as presented by Christensen and Petrucci, to perform LTL-X model checking. We have implemented our method in the reachability analyser Maria and performed experiments. As is the case for modular analysis in general, in some cases the gains can be considerable, while in other cases the gain is negligible.
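The idea behind modular analysis that this paper builds on can be sketched in a few lines: each module explores its local actions independently, and modules move together only on shared (synchronised) actions, so the combined state space is built lazily from the local ones. The representation below (a module as states, an initial state, and per-action move relations) is a deliberate simplification of the synchronisation-graph construction, with hypothetical toy modules:

```python
def compose(m1, m2):
    """Reachable states and edges of two modules synchronising on shared actions."""
    (s1, i1, t1), (s2, i2, t2) = m1, m2
    shared = t1.keys() & t2.keys()
    seen, frontier, edges = {(i1, i2)}, [(i1, i2)], []
    while frontier:
        a, b = frontier.pop()
        moves = []
        for act, rel in t1.items():
            for (p, q) in rel:
                if p != a:
                    continue
                if act in shared:   # both modules must move together
                    moves += [(act, (q, b2)) for (p2, b2) in t2[act] if p2 == b]
                else:               # local action of module 1 interleaves
                    moves.append((act, (q, b)))
        for act, rel in t2.items():
            if act in shared:
                continue            # already handled above
            moves += [(act, (a, q)) for (p, q) in rel if p == b]
        for act, nxt in moves:
            edges.append(((a, b), act, nxt))
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen, edges

# Two toy modules that must take the shared action "sync" together.
m1 = ({"p0", "p1"}, "p0", {"a": {("p0", "p1")}, "sync": {("p1", "p0")}})
m2 = ({"q0", "q1"}, "q0", {"b": {("q0", "q1")}, "sync": {("q1", "q0")}})
states, edges = compose(m1, m2)
print(len(states))   # 4 reachable combined states
```

The gain of modular analysis comes from exploring the local move relations once per module instead of once per combined state, which pays off when modules interact rarely.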

    28th International Symposium on Temporal Representation and Reasoning (TIME 2021)

    The 28th International Symposium on Temporal Representation and Reasoning (TIME 2021) was planned to take place in Klagenfurt, Austria, but had to move online due to the uncertainties and restrictions caused by the pandemic. Since its first edition in 1994, the TIME Symposium has been quite unique in the panorama of scientific conferences, as its main goal is to bring together researchers from distinct research areas involving the management and representation of temporal data, as well as reasoning about temporal aspects of information. Moreover, the TIME Symposium aims to bridge theoretical and applied research, and to serve as an interdisciplinary forum for exchange among researchers from the areas of artificial intelligence, database management, logic and verification, and beyond.

    Qualitatively modelling genetic regulatory networks: Petri net techniques and tools

    The development of post-genomic technologies has led to a paradigm shift in the way we study genetic regulatory networks (GRNs), the underlying systems which mediate cell function. To complement this, the focus is on devising scalable, unambiguous and automated formal techniques for holistically modelling and analysing these complex systems. Quantitative approaches offer one possible solution, but do not appear to be commensurate with currently available data. This motivates qualitative approaches such as Boolean networks (BNs), which abstractly model the system without requiring such a high level of data completeness. Qualitative approaches enable fundamental dynamical properties to be studied, and are well-suited to initial investigations. However, strengthened formal techniques and tool support are required if they are to meet the demands of the biological community. This thesis aims to investigate, develop and evaluate the application of Petri nets (PNs) for qualitatively modelling and analysing GRNs. PNs are well-established in the field of computer science, and enjoy a number of attractive benefits, such as a wide range of techniques and tools, which make them ideal for studying biological systems. We take an existing qualitative PN approach for modelling GRNs based on BNs, and extend it to more general models based on multi-valued networks (MVNs). Importantly, we develop tool support to automate model construction. We illustrate our approach with two detailed case studies on Boolean models for carbon stress in Escherichia coli and sporulation in Bacillus subtilis, and then consider a multi-valued model of the former. These case studies explore the analysis power of PNs by exploiting a range of techniques and tools. A number of behavioural differences are identified between the two E. coli models which lead us to question their formal relationship. We investigate this by proposing a framework for reasoning about the behaviour of MVNs at different levels of abstraction. We develop tool support for practical models, and show a number of important results which motivate the need for multi-valued modelling. Asynchronous BNs can be seen to be more biologically realistic than their synchronous counterparts. However, they have the drawback of capturing behaviour which is unrealisable in practice. We propose a novel approach for refining such behaviour using signal transition graphs, a PN formalism from asynchronous circuit design. We automate our approach, and demonstrate it using a BN of the lysis-lysogeny switch in phage λ. Our results show that a more realistic asynchronous model can be derived which preserves the stochastic switch.
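The synchronous/asynchronous distinction the abstract turns on is easy to make concrete: under synchronous semantics every gene updates at once (one successor state), while under asynchronous semantics one gene updates at a time (potentially many successors, capturing interleavings). A minimal sketch on a hypothetical two-gene mutual-inhibition toggle, reminiscent of (but not taken from) the lysis-lysogeny switch:

```python
# Illustrative Boolean network: each gene is ON iff its repressor is OFF.
rules = {
    "cI":  lambda s: not s["cro"],
    "cro": lambda s: not s["cI"],
}

def sync_step(state):
    """Synchronous semantics: all genes update simultaneously -> one successor."""
    return {g: f(state) for g, f in rules.items()}

def async_steps(state):
    """Asynchronous semantics: one gene updates at a time -> many successors."""
    succs = []
    for g, f in rules.items():
        new = dict(state, **{g: f(state)})
        if new != state:
            succs.append(new)
    return succs

s = {"cI": True, "cro": True}
print(sync_step(s))    # {'cI': False, 'cro': False}: both switch off at once
print(async_steps(s))  # two interleavings, one per gene updating first
```

The asynchronous successors include interleavings that may be unrealisable in the actual biology, which is exactly the behaviour the thesis proposes to refine away using signal transition graphs.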

    Flexible Process Notations for Cross-organizational Case Management Systems

    In recent times western economies have become increasingly focussed on knowledge work. Knowledge work processes depend heavily on the expert knowledge of workers and therefore tend to require more flexibility than the processes seen in traditional production work. Over-constrained processes cause frustration and inefficiency because they do not allow workers to use their expert experience to make the best judgements on how to solve the unique challenges they are faced with. However, some structuring of their work is still required to ensure that laws and business rules are being followed. IT systems for process control have a large role to play in structuring and organizing such processes; however, most of these systems have been developed with a focus on production work and fail to support the more flexible processes required by knowledge workers. The problem arises at the core of these systems: the notations in which the processes are defined. Traditional process notations are flow-based: control of the process flows from one activity to the next. This paradigm in