
    Towards modular verification of pathways: fairness and assumptions

    Modular verification is a technique used to address the state explosion problem often encountered when verifying properties of complex systems such as concurrent interactive systems. The modular approach is based on the observation that properties of interest often concern a rather small portion of the system. As a consequence, reduced models can be constructed which approximate the overall system behaviour, thus allowing more efficient verification. Biochemical pathways can be seen as complex concurrent interactive systems. Consequently, verification of their properties is often computationally very expensive and could take advantage of the modular approach. In this paper we report preliminary results on the development of a modular verification framework for biochemical pathways. We view biochemical pathways as concurrent systems of reactions competing for molecular resources. A modular verification technique could be based on reduced models containing only reactions involving the molecular resources of interest. For a proper description of the system behaviour we argue that it is essential to consider a suitable notion of fairness, which is a well-established notion in concurrency theory but novel in the field of pathway modelling. We propose a modelling approach that includes fairness and we identify the assumptions under which verification of properties can be done in a modular way. We prove the correctness of the approach and demonstrate it on the model of the EGF receptor-induced MAP kinase cascade by Schoeberl et al. (Comment: In Proceedings MeCBIC 2012, arXiv:1211.347)
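
    To make the reduction idea concrete, here is a minimal Python sketch (not the authors' construction, and using a deliberately simplified set of hypothetical reactions rather than the Schoeberl model): a reduced model keeps only the reactions whose reactants or products mention the molecular resources of interest.

    ```python
    # Minimal sketch: build a reduced model by keeping only the reactions that
    # involve the molecular resources of interest. Species and reactions are
    # hypothetical and much simpler than the Schoeberl MAP kinase model.

    from typing import NamedTuple

    class Reaction(NamedTuple):
        name: str
        reactants: frozenset
        products: frozenset

    PATHWAY = [
        Reaction("bind",     frozenset({"EGF", "EGFR"}),     frozenset({"EGF:EGFR"})),
        Reaction("activate", frozenset({"EGF:EGFR", "Raf"}), frozenset({"Raf*"})),
        Reaction("phospho",  frozenset({"Raf*", "MEK"}),     frozenset({"MEK*"})),
        Reaction("decay",    frozenset({"MEK*"}),            frozenset({"MEK"})),
    ]

    def reduced_model(reactions, species_of_interest):
        """Keep reactions whose reactants or products mention a species of interest."""
        keep = set(species_of_interest)
        return [r for r in reactions if (r.reactants | r.products) & keep]

    if __name__ == "__main__":
        for r in reduced_model(PATHWAY, {"MEK", "MEK*"}):
            print(r.name)   # prints: phospho, decay
    ```

    Verification would then be performed on reduced models of this kind, under the fairness assumptions the paper identifies.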

    A safety analysis approach to clinical workflows: application and evaluation

    Clinical workflows are safety-critical workflows, as they have the potential to cause harm or death to patients. Their safety needs to be considered as early as possible in the development process. Effective safety analysis methods are required to ensure the safety of these high-risk workflows, because errors that occur during routine workflow could propagate within the workflow and result in harmful failures of the system’s output. This paper shows how to apply an approach for the safety analysis of clinical workflows to analyse the safety of the workflow within a radiology department, and evaluates the approach in terms of usability and benefits. The outcomes of using this approach include identification of the root causes of hazardous workflow failures that may put patients’ lives at risk. We show that the approach is applicable to this area of healthcare and adds value through detailed information on possible failures, both their causes and effects; therefore, it has the potential to improve the safety of radiology and other clinical workflows
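
    As a hedged illustration of the kind of cause-to-effect reasoning such a safety analysis supports, the Python sketch below propagates failure modes through an invented, heavily simplified workflow graph to see which early errors can reach a patient-facing decision; the step and failure names are hypothetical, not those of the evaluated radiology workflow.

    ```python
    # Hypothetical sketch: propagate failure modes through a simplified workflow
    # graph to see which early errors can reach a patient-facing decision. The
    # steps and failure modes are invented, not the evaluated radiology workflow.

    WORKFLOW = {                              # step -> downstream steps
        "order_entry":        ["scheduling"],
        "scheduling":         ["image_acquisition"],
        "image_acquisition":  ["reporting"],
        "reporting":          ["treatment_decision"],
        "treatment_decision": [],
    }

    FAILURE_MODES = {
        "order_entry": "wrong patient identifier",
        "reporting":   "report sent to the wrong clinician",
    }

    def reachable(graph, start):
        """All steps reachable from `start`, including `start` itself."""
        seen, stack = set(), [start]
        while stack:
            node = stack.pop()
            if node not in seen:
                seen.add(node)
                stack.extend(graph[node])
        return seen

    # A failure mode is flagged here if it can propagate to the treatment decision.
    for step, mode in FAILURE_MODES.items():
        if "treatment_decision" in reachable(WORKFLOW, step):
            print(f"hazard: '{mode}' at {step} may affect the treatment decision")
    ```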

    Multiple verification in computational modeling of bone pathologies

    We introduce a model checking approach to diagnose the emergence of bone pathologies. The implementation of a new model of bone remodeling in PRISM has led to an interesting characterization of osteoporosis as defective bone remodeling dynamics with respect to other bone pathologies. Our approach allows us to derive three types of model checking-based diagnostic estimators. The first diagnostic measure focuses on the level of bone mineral density, which is currently used in medical practice. In addition, we have introduced a novel diagnostic estimator which uses the full patient clinical record, here simulated using the modeling framework. This estimator detects rapid (months) negative changes in bone mineral density. Independently of the actual bone mineral density, when the decrease occurs rapidly it is important to alert the patient and monitor them more closely to detect the onset of other bone co-morbidities. A third estimator takes into account the variance of the bone density, which could support the investigation of metabolic syndromes, diabetes and cancer. Our implementation could make use of different logical combinations of these statistical estimators and could incorporate other biomarkers for other systemic co-morbidities (for example diabetes and thalassemia). We are delighted to report that the combination of stochastic modeling with formal methods motivates a new diagnostic framework for complex pathologies. In particular, our approach takes into consideration important properties of biosystems such as multiscale behaviour and self-adaptiveness. The multi-diagnosis could be further expanded, inching towards the complexity of human diseases. Finally, we briefly introduce self-adaptiveness in formal methods, a key property of the regulative mechanisms of biological systems that is well known in other mathematical and engineering areas. (Comment: In Proceedings CompMod 2011, arXiv:1109.104)
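
    The three estimators can be pictured on a simulated bone-mineral-density record, as in the hedged Python sketch below; the series, thresholds and window sizes are invented for illustration and are not taken from the paper or from clinical practice.

    ```python
    # Illustrative sketch of three diagnostic estimators over a simulated monthly
    # bone-mineral-density (BMD) record. The series, thresholds and window sizes
    # are invented for demonstration and are not clinical values.

    from statistics import variance

    bmd = [0.92, 0.92, 0.91, 0.91, 0.90, 0.85, 0.81, 0.78]   # simulated, monthly

    def low_density(series, threshold=0.85):
        """Estimator 1: is the current BMD below a reference level?"""
        return series[-1] < threshold

    def rapid_decline(series, window=3, drop=0.05):
        """Estimator 2: did BMD fall by more than `drop` over the last `window` months?"""
        return series[-1] - series[-1 - window] < -drop

    def high_variability(series, limit=0.002):
        """Estimator 3: is the variance of the record unusually large?"""
        return variance(series) > limit

    print(low_density(bmd), rapid_decline(bmd), high_variability(bmd))
    ```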

    Model-based dependability analysis: state-of-the-art, challenges and future outlook

    Over the past two decades, the study of model-based dependability analysis has gathered significant research interest. Different approaches have been developed to automate and address various limitations of classical dependability techniques, in order to contend with the increasing complexity and challenges of modern safety-critical systems. Two leading paradigms have emerged: one constructs predictive system failure models compositionally from component failure models using the topology of the system; the other uses design models - typically state automata - to explore system behaviour through fault injection. This paper reviews a number of prominent techniques under these two paradigms, and provides an insight into their working mechanisms, applicability, strengths and challenges, as well as recent developments within these fields. We also discuss the emerging trends in integrated approaches and advanced analysis capabilities. Lastly, we outline the future outlook for model-based dependability analysis
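
    As a loose illustration of the second paradigm (fault injection into design models), the Python sketch below injects a "sensor stuck low" fault into an invented toy controller automaton and compares the resulting behaviour with the nominal one; it is far simpler than the formalisms the survey covers.

    ```python
    # Loose sketch of fault injection into a toy controller model: compare the
    # final state with and without an injected "sensor stuck low" fault. The
    # automaton is invented and far simpler than the surveyed formalisms.

    def run(fault_stuck_low, max_steps=10):
        """Simulate a heater controller; returns the trace of (mode, temp) pairs."""
        mode, temp = "heating", 0
        trace = [(mode, temp)]
        for _ in range(max_steps):
            reading = 0 if fault_stuck_low else temp   # injected fault: stuck low
            if mode == "heating":
                temp += 1
                if reading >= 3:
                    mode = "off"          # nominal: switch off when sensed hot
                elif temp >= 5:
                    mode = "overheat"     # hazardous state
            trace.append((mode, temp))
            if mode != "heating":
                break
        return trace

    print(run(fault_stuck_low=False)[-1])  # ('off', 4): switches off safely
    print(run(fault_stuck_low=True)[-1])   # ('overheat', 5): fault leads to hazard
    ```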

    Petri nets for systems and synthetic biology

    We give a description of a Petri net-based framework for modelling and analysing biochemical pathways, which unifies the qualitative, stochastic and continuous paradigms. Each perspective adds its contribution to the understanding of the system, thus the three approaches do not compete, but complement each other. We illustrate our approach by applying it to an extended model of the three-stage cascade, which forms the core of the ERK signal transduction pathway. Consequently our focus is on transient behaviour analysis. We demonstrate how qualitative descriptions are abstractions over stochastic or continuous descriptions, and show that the stochastic and continuous models approximate each other. Although our framework is based on Petri nets, it can be applied more widely to other formalisms which are used to model and analyse biochemical networks
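
    The idea of one net structure supporting several semantics can be sketched in Python as below: a toy two-place, two-transition net (not the ERK cascade model) is read qualitatively (reachable markings), stochastically (Gillespie simulation) and continuously (mass-action ODEs). Rates and markings are invented.

    ```python
    # Hedged sketch: one Petri-net-like structure (two places, two transitions)
    # read three ways: qualitative reachability, stochastic simulation (Gillespie)
    # and continuous mass-action ODEs. The net is a toy, not the ERK cascade model.

    import math
    import random

    # transition name -> (tokens consumed, tokens produced, rate constant)
    NET = {
        "convert": ({"A": 1}, {"B": 1}, 1.0),
        "revert":  ({"B": 1}, {"A": 1}, 0.5),
    }
    INITIAL = {"A": 3, "B": 0}

    def fire(marking, consumed, produced):
        m = dict(marking)
        for p, n in consumed.items():
            m[p] -= n
        for p, n in produced.items():
            m[p] += n
        return m

    def qualitative_reachable(initial):
        """Qualitative reading: the set of markings reachable from `initial`."""
        seen, frontier = set(), [initial]
        while frontier:
            m = frontier.pop()
            key = tuple(sorted(m.items()))
            if key in seen:
                continue
            seen.add(key)
            for consumed, produced, _ in NET.values():
                if all(m[p] >= n for p, n in consumed.items()):
                    frontier.append(fire(m, consumed, produced))
        return seen

    def gillespie(initial, t_end=10.0):
        """Stochastic reading: exact simulation with mass-action propensities."""
        m, t = dict(initial), 0.0
        while True:
            props = {name: k * math.prod(m[p] ** n for p, n in c.items())
                     for name, (c, _, k) in NET.items()}
            total = sum(props.values())
            if total == 0:
                return m
            tau = random.expovariate(total)
            if t + tau > t_end:
                return m
            t += tau
            pick, acc = random.random() * total, 0.0
            for name, (c, prod, _) in NET.items():
                acc += props[name]
                if pick <= acc:
                    m = fire(m, c, prod)
                    break

    def ode_euler(initial, t_end=10.0, dt=0.01):
        """Continuous reading: mass-action ODEs integrated with forward Euler."""
        x = {p: float(n) for p, n in initial.items()}
        for _ in range(int(t_end / dt)):
            flux = {name: k * math.prod(x[p] ** n for p, n in c.items())
                    for name, (c, _, k) in NET.items()}
            for name, (c, prod, _) in NET.items():
                for p, n in c.items():
                    x[p] -= dt * n * flux[name]
                for p, n in prod.items():
                    x[p] += dt * n * flux[name]
        return x

    print(len(qualitative_reachable(INITIAL)), gillespie(INITIAL), ode_euler(INITIAL))
    ```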

    BigraphER: rewriting and analysis engine for bigraphs

    BigraphER is a suite of open-source tools providing an efficient implementation of rewriting, simulation, and visualisation for bigraphs, a universal formalism, first introduced by Milner, for modelling interacting systems that evolve in time and space. BigraphER consists of an OCaml library that provides programming interfaces for the manipulation of bigraphs, their constituents and reaction rules, and a command-line tool capable of simulating Bigraphical Reactive Systems (BRSs) and computing their transition systems. Other features are native support for both bigraphs and bigraphs with sharing, stochastic reaction rules, rule priorities, instantiation maps, parameterised controls, predicate checking, graphical output and integration with the probabilistic model checker PRISM
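
    BigraphER's actual OCaml API and command-line interface are not reproduced here; as a loose, language-shifted illustration of the underlying idea of computing a transition system by exhaustively applying reaction rules to states, the Python sketch below uses plain string rewriting (not bigraph matching) with invented rules.

    ```python
    # Loose illustration only: a generic rewriting engine that builds a transition
    # system by exhaustively applying rules to states. It uses plain string
    # rewriting, not bigraph matching, and does not reproduce BigraphER's API.

    RULES = [("AB", "BA"), ("BA", "AB")]   # invented rules: swap adjacent tokens

    def successors(state):
        """All states reachable from `state` in one rewrite step."""
        out = set()
        for lhs, rhs in RULES:
            i = state.find(lhs)
            while i != -1:
                out.add(state[:i] + rhs + state[i + len(lhs):])
                i = state.find(lhs, i + 1)
        return out

    def transition_system(initial):
        """Breadth-first construction of all states and transitions."""
        states, edges, frontier = {initial}, set(), [initial]
        while frontier:
            s = frontier.pop()
            for t in successors(s):
                edges.add((s, t))
                if t not in states:
                    states.add(t)
                    frontier.append(t)
        return states, edges

    states, edges = transition_system("AAB")
    print(sorted(states))   # ['AAB', 'ABA', 'BAA']
    print(len(edges))       # number of transitions between those states
    ```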

    Complementary approaches to understanding the plant circadian clock

    Circadian clocks are oscillatory genetic networks that help organisms adapt to the 24-hour day/night cycle. The clock of the green alga Ostreococcus tauri is the simplest plant clock discovered so far. Its many advantages as an experimental system facilitate the testing of computational predictions. We present a model of the Ostreococcus clock in the stochastic process algebra Bio-PEPA and exploit its mapping to different analysis techniques, such as ordinary differential equations, stochastic simulation algorithms and model-checking. The small number of molecules reported for this system tests the limits of the continuous approximation underlying differential equations. We investigate the difference between continuous-deterministic and discrete-stochastic approaches. Stochastic simulation and model-checking allow us to formulate new hypotheses on the system behaviour, such as the presence of self-sustained oscillations in single cells under constant light conditions. We investigate how to model the timing of dawn and dusk in the context of model-checking, which we use to compute how the probability distributions of key biochemical species change over time. These show that the relative variation in expression level is smallest at the time of peak expression, making peak time an optimal experimental phase marker. Building on these analyses, we use approaches from evolutionary systems biology to investigate how changes in the rate of mRNA degradation impact the phase of a key protein likely to affect fitness. We explore how robust this circadian clock is to such potential mutational changes in its underlying biochemistry. Our work shows that multiple approaches lead to a more complete understanding of the clock
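
    The limits of the continuous approximation at low molecule counts can be illustrated with the hedged Python sketch below, which compares an ODE reading and stochastic (Gillespie) runs of a simple birth-death gene-expression model; the rates are invented and this is not the Bio-PEPA clock model.

    ```python
    # Hedged illustration (not the Bio-PEPA clock model): a birth-death model of
    # an mRNA species at low copy numbers, compared under a continuous ODE
    # reading and a discrete stochastic (Gillespie) reading. Rates are invented.

    import random

    K_PROD, K_DEG = 1.0, 0.1     # production (molecules/h) and degradation (1/h)
    T_END = 50.0

    def ode_level(t_end=T_END, dt=0.01):
        """Continuous reading: dx/dt = K_PROD - K_DEG * x, forward Euler."""
        x = 0.0
        for _ in range(int(t_end / dt)):
            x += dt * (K_PROD - K_DEG * x)
        return x

    def gillespie(t_end=T_END):
        """Discrete stochastic reading of the same two reactions."""
        n, t = 0, 0.0
        while True:
            total = K_PROD + K_DEG * n
            t += random.expovariate(total)
            if t > t_end:
                return n
            n += 1 if random.random() * total < K_PROD else -1

    runs = [gillespie() for _ in range(200)]
    print(f"ODE level ~ {ode_level():.1f}, stochastic mean ~ {sum(runs) / len(runs):.1f}, "
          f"min/max over runs: {min(runs)}/{max(runs)}")
    ```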

    Experimental Biological Protocols with Formal Semantics

    Both experimental and computational biology are becoming increasingly automated. Laboratory experiments are now performed automatically on high-throughput machinery, while computational models are synthesized or inferred automatically from data. However, integration between automated tasks in the process of biological discovery is still lacking, largely due to incompatible or missing formal representations. While theories are expressed formally as computational models, existing languages for encoding and automating experimental protocols often lack formal semantics. This makes it challenging to extract novel understanding by identifying when theory and experimental evidence disagree due to errors in the models or the protocols used to validate them. To address this, we formalize the syntax of a core protocol language, which provides a unified description for the models of biochemical systems being experimented on, together with the discrete events representing the liquid-handling steps of biological protocols. We present both a deterministic and a stochastic semantics for this language, each defined in terms of hybrid processes. In particular, the stochastic semantics captures uncertainties in equipment tolerances, making it a suitable tool for both experimental and computational biologists. We illustrate how the proposed protocol language can be used for automated verification and synthesis of laboratory experiments, using case studies from the fields of chemistry and molecular programming
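
    As a hedged illustration of combining discrete liquid-handling events with a continuous model, the Python sketch below defines a tiny invented protocol notation (not the paper's syntax) and gives it a deterministic reading and a stochastic reading in which dispensed volumes carry equipment tolerance noise.

    ```python
    # Hedged sketch of an invented mini protocol notation (not the paper's syntax):
    # discrete liquid-handling events interleaved with a toy continuous model of
    # degradation during incubation. The stochastic reading adds tolerance noise
    # to dispensed volumes; all rates, volumes and tolerances are invented.

    import random

    PROTOCOL = [
        ("dispense", {"volume_ml": 1.0, "conc_mM": 10.0}),  # add 1 ml of 10 mM stock
        ("incubate", {"hours": 2.0}),
        ("dispense", {"volume_ml": 1.0, "conc_mM": 0.0}),   # dilute with 1 ml buffer
        ("incubate", {"hours": 1.0}),
    ]

    DEG_PER_HOUR = 0.2   # fraction of substance lost per hour of incubation (invented)

    def run(protocol, stochastic=False, tolerance=0.05):
        volume, amount = 0.0, 0.0              # total volume (ml), substance (mM*ml)
        for event, args in protocol:
            if event == "dispense":
                v = args["volume_ml"]
                if stochastic:                 # stochastic semantics: pipetting error
                    v *= random.gauss(1.0, tolerance)
                volume += v
                amount += v * args["conc_mM"]
            elif event == "incubate":          # deterministic first-order decay
                amount *= (1 - DEG_PER_HOUR) ** args["hours"]
        return amount / volume                 # final concentration (mM)

    print("deterministic:", round(run(PROTOCOL), 3))
    print("stochastic:   ", [round(run(PROTOCOL, stochastic=True), 3) for _ in range(3)])
    ```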