8 research outputs found
A generic framework to coarse-grain stochastic reaction networks by Abstract Interpretation
In recent decades, logical or discrete models have emerged as a successful paradigm for capturing and predicting the behaviors of systems of molecular interactions. Intuitively, they consist of sampling the abundance of each kind of biochemical entity within finite sets of intervals and deriving transitions accordingly. On the one hand, a formally proven sound derivation from more precise descriptions (such as reaction networks) may include many fictitious behaviors. On the other hand, direct modeling usually favors dominant interactions with no guarantee about the behaviors that are neglected. In this paper, we formalize a sound coarse-graining approach for stochastic reaction networks. Its originality relies on two main ingredients. Firstly, we abstract values by intervals that overlap, so that a minimal effort is required for the system to return to the previous interval, hence limiting fictitious oscillations in the coarse-grained models. Secondly, for pairs of transitions in the coarse-grained model, we compute bounds on the probability that one will occur before the other. We illustrate our ideas on two case studies and demonstrate how techniques from Abstract Interpretation can be used to design more precise discretization methods, while providing a framework to further investigate the underlying structure of logical and discrete models.
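The overlapping-interval idea can be illustrated with a minimal sketch: the abstract state only changes once a value moves a minimal distance past an interval boundary, which suppresses spurious back-and-forth switching. The thresholds, overlap width, and function names below are illustrative assumptions, not the paper's actual construction.

```python
def make_abstraction(thresholds, overlap):
    """Map a concrete abundance to an abstract interval index with hysteresis.

    `thresholds` are the interval boundaries; intervals overlap by `overlap`
    around each boundary, so the abstract state changes only once the value
    has cleared the boundary by at least `overlap` -- limiting fictitious
    oscillations in the coarse-grained model. (Illustrative sketch only.)
    """
    def abstract(value, current_index):
        # move up only if the value clears the upper boundary plus the overlap
        while current_index < len(thresholds) and value > thresholds[current_index] + overlap:
            current_index += 1
        # move down only if the value drops below the lower boundary minus the overlap
        while current_index > 0 and value < thresholds[current_index - 1] - overlap:
            current_index -= 1
        return current_index
    return abstract
```

For example, with boundaries at 10 and 20 and an overlap of 2, a value of 11 observed from interval 0 does not yet trigger a switch, whereas a value of 13 does.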
High-Performance Modelling and Simulation for Big Data Applications
This open access book was prepared as the final publication of the COST Action IC1406 “High-Performance Modelling and Simulation for Big Data Applications (cHiPSet)” project. Long considered important pillars of the scientific method, Modelling and Simulation have evolved from traditional discrete numerical methods to complex data-intensive continuous analytical optimisations. Resolution, scale, and accuracy have become essential to predict and analyse natural and complex systems in science and engineering. As their level of abstraction rises to give a better discernment of the domain at hand, their representation becomes increasingly demanding of computational and data resources. On the other hand, High Performance Computing typically entails the effective use of parallel and distributed processing units coupled with efficient storage, communication and visualisation systems to underpin complex data-intensive applications in distinct scientific and technical domains. A seamless interaction of High Performance Computing with Modelling and Simulation is therefore arguably required in order to store, compute, analyse, and visualise large data sets in science and engineering. Funded by the European Commission, cHiPSet has provided a dynamic trans-European forum for its members and distinguished guests to openly discuss novel perspectives and topics of interest for these two communities. This cHiPSet compendium presents a set of selected case studies related to healthcare, biological data, computational advertising, multimedia, finance, bioinformatics, and telecommunications.
Formal modelling and approximation-based analysis for mode-switching population dynamics
This thesis explores aspects of model specification and analysis for population dynamics which arise when modelling complex interactions and communication structures in
agent or component collectives. The motivating examples come from the design of
man-made systems where the optimal parametrisations for the behaviours of agents or
components are not known a priori. In particular, we introduce a formal modelling
framework to support the specification of control problems for collective dynamics in
a high-level process algebraic language. A natural choice for the underlying semantics
is to consider continuous time Markov decision processes due to their close relation to
continuous time Markov chains that have traditionally been used as the mathematical
model in numerous high-level modelling languages for stochastic dynamics.
Although the theory of the resulting decision processes has a long history, the
practical considerations, like computation time, present challenges due to the problem
of state space explosion when considering large systems with complex behaviours. State
space explosion problems are especially apparent in formal modelling paradigms where
the specification of models usually happens at a component or an agent level in terms
of a discrete set of states with defined rules for composing the specified behaviours into
the dynamics of a system. Such specifications often give rise to very large models which
are costly to analyse in full detail. However, when analysing models of collectives we
are usually interested in the resulting macro-scale dynamics in terms of some aggregate
measures. With that in mind, the second aspect of analysing collective dynamics that
is considered in this thesis relates to fluid, linear noise and moment closure-based
approximation methods which aim to give a good representation of the macro-scale
dynamics of the models while being computationally less costly to analyse.
We address a class of models where the population structure results from the assumption that components or agents can be distinguished from each other only by
the state they are in, and we focus on the particular cases where the population dynamics
can be separated into a discrete set of modes. Our study of these models is motivated
by considering information propagation via broadcast communication where the behaviour of components can change drastically when new information is received from
the rest of the population. We consider existing approximation methods for resulting
stochastic processes and propose a novel approach for applying these methods to models incorporating broadcast communication where each level of information available to
the collective corresponds to a discrete dynamic mode. The resulting approximations
combine continuous dynamics with discrete stochastic jumps and are not immediately
simple to treat numerically. To that end, we propose further approximations that allow for a computationally efficient analysis. Finally, we demonstrate how the formal modelling framework, in conjunction with the developed approximation methods, can
be used for an example in policy synthesis.
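The fluid approximation mentioned above replaces a large stochastic population model with a deterministic ODE for aggregate quantities. A minimal sketch, using Euler integration of the mean-field equation for a simple broadcast/contact information-spread model; the logistic rate form, the constant `k`, and the initial fraction `x0` are illustrative assumptions, not the thesis's actual models.

```python
def fluid_si(k, x0, dt=0.01, t_end=20.0):
    """Euler integration of the mean-field (fluid) ODE for an SI-style
    information-spread model: dx/dt = k * x * (1 - x), where x is the
    fraction of informed agents. (Illustrative sketch only.)
    """
    x, t = x0, 0.0
    trajectory = [(t, x)]
    while t < t_end:
        # deterministic drift replaces the average of many stochastic jumps
        x += dt * k * x * (1.0 - x)
        t += dt
        trajectory.append((t, x))
    return trajectory
```

For large populations the stochastic trajectories concentrate around this deterministic curve, which is why such approximations can sidestep the state space explosion described above.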
Computational model validation using a novel multiscale multidimensional spatio-temporal meta model checking approach
This thesis was submitted for the award of Doctor of Philosophy and was awarded by Brunel University London. Computational models of complex biological systems can provide a better understanding of how living systems function, but need to be validated before they are employed for real-life (e.g. clinical) applications. One of the most frequently employed in silico approaches for validating such models is model checking. Traditional model checking approaches are limited to uniscale non-spatial computational models because they do not explicitly distinguish between different scales, and do not take properties of (emergent) spatial structures (e.g. the density of a multicellular population) into account. This thesis defines a novel multiscale multidimensional spatio-temporal meta model checking methodology which enables validating multiscale (spatial) computational models of biological systems relative to how both numeric (e.g. concentrations) and spatial system properties are expected to change over time and across multiple scales. The methodology has two important advantages. First, it supports computational models encoded using various high-level modelling formalisms, because it is defined relative to time series data and not the models used to produce them. Second, the methodology is generic, because it can be automatically reconfigured according to case-study-specific types of spatial structures and properties using the meta model checking approach. In addition, the methodology could
be employed in multiple domains of science, but we illustrate its applicability here only against biological case studies. To automate the computational model validation process, the approach was implemented in software tools, which are made freely available online. Their efficacy is illustrated against two uniscale computational models (encoding phase variation in bacterial colonies and the chemotactic aggregation of cells) and four multiscale quantitative computational models (encoding the rat cardiovascular system dynamics, the uterine contractions of labour, the Xenopus laevis cell cycle, and the acute inflammation of the gut and lung). This novel model checking approach will enable the efficient construction of
reliable multiscale computational models of complex systems.
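Checking properties against time series data rather than against the model itself can be illustrated with a minimal monitor for a bounded-response property on a finite trace. The property shape, trace values, and function name below are illustrative assumptions, not the thesis's actual specification language.

```python
def check_response(series, trigger, response, k):
    """Check a bounded-response property, G(trigger -> F<=k response),
    on a finite numeric trace: every sample satisfying `trigger` must be
    followed, within k steps, by a sample satisfying `response`.
    Returns the index of the first violation, or None if the trace
    satisfies the property. (Illustrative sketch only.)
    """
    for i, value in enumerate(series):
        if trigger(value):
            # look at the next k samples, including the triggering one
            window = series[i:i + k + 1]
            if not any(response(v) for v in window):
                return i
    return None
```

Because the monitor only reads the output trace, the same check applies unchanged to traces produced by models written in any formalism, which is the advantage the methodology claims.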
Formal language for statistical inference of uncertain stochastic systems
Stochastic models, in particular Continuous Time Markov Chains, are a commonly
employed mathematical abstraction for describing natural or engineered dynamical
systems. While the theory behind them is well-studied, their specification can be
problematic in a number of ways. Firstly, the size and complexity of the model can
make its description difficult without using a high-level language. Secondly, knowledge
of the system is usually incomplete, leaving one or more parameters with unknown
values, thus impeding further analysis. Sophisticated machine learning algorithms have
been proposed for the statistically rigorous estimation and handling of this uncertainty;
however, their applicability is often limited to systems with a finite state space, and
their use on high-level descriptions has not been considered. Similarly,
high-level formal languages have been long used for describing and reasoning about
stochastic systems, but require a full specification; efforts to estimate parameters for
such formal models have been limited to simple inference algorithms.
This thesis explores how these two approaches can be brought together, drawing
ideas from the probabilistic programming paradigm. We introduce ProPPA, a process
algebra for the specification of stochastic systems with uncertain parameters. The
language is equipped with a semantics, allowing a formal interpretation of models
written in it. This is the first time that uncertainty has been incorporated into the syntax
and semantics of a formal language, and we describe a new mathematical object capable
of capturing this information. We provide a series of algorithms for inference which can
be automatically applied to ProPPA models without the need to write extra code. As
part of these, we develop a novel inference scheme for infinite-state systems, based on
random truncations of the state-space. The expressive power and inference capabilities
of the framework are demonstrated in a series of small examples as well as a larger-scale
case study. We also present a review of the state-of-the-art in both machine learning
and formal modelling with respect to stochastic systems. We close with a discussion of
potential extensions of this work, and thoughts about different ways in which the fields
of statistical machine learning and formal modelling can be further integrated.
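Likelihood-free inference of an uncertain rate parameter, of the kind such frameworks automate, can be sketched with ABC rejection sampling over Gillespie simulations of a tiny CTMC. The pure-death model, prior range, tolerance, and all names below are illustrative assumptions, not ProPPA's actual algorithms.

```python
import random

def simulate_decay(theta, x0, t_end, rng):
    """Gillespie simulation of a pure-death CTMC: X -> X-1 at rate theta*X.
    Returns the population surviving at time t_end. (Illustrative sketch.)"""
    x, t = x0, 0.0
    while x > 0:
        t += rng.expovariate(theta * x)  # waiting time to the next death
        if t > t_end:
            break
        x -= 1
    return x

def abc_rejection(observed, prior_sample, n_samples, tol, x0, t_end, seed=0):
    """ABC rejection: keep parameter draws whose simulated outcome lies
    within `tol` of the observation -- a minimal stand-in for the
    likelihood-free inference of uncertain rates described above."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_samples):
        theta = prior_sample(rng)
        if abs(simulate_decay(theta, x0, t_end, rng) - observed) <= tol:
            accepted.append(theta)
    return accepted
```

With 50 initial individuals and 18 observed survivors after one time unit, the accepted draws concentrate near theta = ln(50/18), roughly 1.0, without ever evaluating a likelihood.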