Enhanced Operational Semantics in Systems Biology
We are faced with a great challenge: the cross-fertilization between the fields of formal methods for concurrency, in the computer science domain, and systems biology, in the biological realm
Process Calculi Abstractions for Biology
Several approaches have been proposed to model biological systems by means of the formal techniques and tools available in computer science. To mention just a few, some representations are inspired by Petri Nets theory, and others by stochastic processes. A more recent approach consists in interpreting living entities as terms of process calculi, where the behavior of the represented systems can be inferred by applying syntax-driven rules. A comprehensive picture of the state of the art of the process calculi approach to biological modeling is still missing. This paper goes in the direction of providing such a picture by presenting a comparative survey of the process calculi that have been used and proposed to describe the behavior of living entities. This is the preliminary version of a paper that was published in Algorithmic Bioprocesses. The original publication is available at http://www.springer.com/computer/foundations/book/978-3-540-88868-
Design Environments for Complex Systems
The paper describes an approach for modeling complex systems that hides as many formal details as possible from the user while still allowing verification and simulation of the model. The interface is based on UML to make the environment available to the largest audience. To carry out analysis, verification and simulation, we automatically extract process algebra specifications from UML models. The results of the analysis are then reflected back into the UML model by annotating diagrams. The formal model includes stochastic information to handle quantitative parameters. We present here the stochastic π-calculus and we discuss the implementation of its probabilistic support, which allows simulation of processes. We exploit the benefits of our approach in two application domains: global computing and systems biology
Issues about the Adoption of Formal Methods for Dependable Composition of Web Services
Web Services provide interoperable mechanisms for describing, locating and
invoking services over the Internet; composition further enables building
complex services out of simpler ones for complex B2B applications. While
current studies on these topics are mostly focused - from the technical
viewpoint - on standards and protocols, this paper investigates the adoption of
formal methods, especially for composition. We logically classify and analyze
three different (but interconnected) kinds of important issues towards this
goal, namely foundations, verification and extensions. The aim of this work is
to identify the proper questions on the adoption of formal methods for
dependable composition of Web Services, not necessarily to find the optimal
answers. Nevertheless, we still try to propose some tentative answers based on
our proposal for a composition calculus, which we hope can stimulate a proper
discussion
Model checking probabilistic and stochastic extensions of the pi-calculus
We present an implementation of model checking for probabilistic and stochastic extensions of the pi-calculus, a process algebra which supports the modelling of concurrency and mobility. Formal verification techniques for such extensions have clear applications in several domains, including mobile ad-hoc network protocols, probabilistic security protocols and biological pathways. Despite this, no implementation of automated verification exists. Building upon the pi-calculus model checker MMC, we first show an automated procedure for constructing the underlying semantic model of a probabilistic or stochastic pi-calculus process. This can then be verified using existing probabilistic model checkers such as PRISM. Secondly, we demonstrate how, for processes of a specific structure, a more efficient, compositional approach is applicable, which uses our extension of MMC on each parallel component of the system and then translates the results into a high-level modular description for the PRISM tool. The feasibility of our techniques is demonstrated through a number of case studies from the pi-calculus literature
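The abstract above describes exporting the CTMC semantics of a stochastic pi-calculus process to a probabilistic model checker. As a minimal, self-contained sketch of the kind of analysis such a backend performs (an invented example, not the MMC/PRISM pipeline; states, rates, and function names are all assumptions), the following computes the transient state distribution of a two-state CTMC via uniformization:

```python
# Hypothetical sketch: transient analysis of the CTMC underlying a tiny
# stochastic process with two states A <-> B, rate(A->B) = 2.0 and
# rate(B->A) = 1.0, using uniformization.
import math

def transient(Q, p0, t, eps=1e-12):
    """Probability distribution at time t for generator matrix Q."""
    n = len(Q)
    lam = max(-Q[i][i] for i in range(n)) * 1.1 + 1e-9  # uniformization rate
    # Discrete-time uniformized matrix P = I + Q/lam (stochastic matrix).
    P = [[(1.0 if i == j else 0.0) + Q[i][j] / lam for j in range(n)]
         for i in range(n)]
    result = [0.0] * n
    v = list(p0)
    k, weight = 0, math.exp(-lam * t)  # Poisson(lam*t) weight for k jumps
    total = 0.0
    while total < 1.0 - eps and k < 10000:
        for i in range(n):
            result[i] += weight * v[i]
        total += weight
        v = [sum(v[i] * P[i][j] for i in range(n)) for j in range(n)]
        k += 1
        weight *= lam * t / k
    return result

Q = [[-2.0, 2.0],
     [1.0, -1.0]]
p = transient(Q, [1.0, 0.0], 10.0)
# Long-run distribution of this chain is (1/3, 2/3).
```

In a real analysis one would hand the generator matrix to a dedicated tool such as PRISM rather than a hand-rolled routine; the sketch only illustrates the underlying semantic object being verified.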
A new tool for the performance analysis of massively parallel computer systems
We present a new tool, GPA, that can generate key performance measures for
very large systems. Based on solving systems of ordinary differential equations
(ODEs), this method of performance analysis is far more scalable than
stochastic simulation. The GPA tool is the first to produce higher moment
analysis from differential equation approximation, which is essential, in many
cases, to obtain an accurate performance prediction. We identify so-called
switch points as the source of error in the ODE approximation. We investigate
the switch point behaviour in several large models and observe that, as the
scale of the model increases, the ODE performance prediction generally
improves in accuracy. In the case of the variance measure, we are able to
justify theoretically that in the limit of model scale, the ODE approximation
can be expected to tend to the actual variance of the model
Simulation of non-Markovian Processes in BlenX
BlenX is a programming language, inspired by Beta-binders, explicitly designed for modeling biological processes. The current framework assumes that biochemical interactions are exponentially distributed, i.e., that an underlying Markov process is associated with BlenX programs. In this paper we relax this condition by providing formal tools for managing non-Markovian processes within BlenX
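A simple way to see what dropping the Markov assumption involves is to race a memoryless (exponential) duration against a generally distributed one inside a next-event simulation step. The sketch below is a hypothetical illustration (not the BlenX engine), with all names and parameters invented:

```python
# Hypothetical sketch: racing a Markovian (exponential) delay against a
# non-Markovian (Weibull, shape 2) delay, as in a next-event simulation
# with generally distributed durations.
import random

def race(trials=200_000, seed=1):
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        t_exp = rng.expovariate(1.0)          # Markovian action, rate 1
        t_wei = rng.weibullvariate(1.0, 2.0)  # non-Markovian action
        if t_wei < t_exp:
            wins += 1
    return wins / trials

p = race()
# Analytically P(W < E) = 1 - (sqrt(pi)/2) * e^{1/4} * erfc(1/2), about 0.454.
```

With exponential delays only, the race could be resolved directly from the rates thanks to memorylessness; with a Weibull competitor no such shortcut exists, which is what forces the extra formal machinery described in the abstract.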
Flux Analysis in Process Models via Causality
We present an approach for flux analysis in process algebra models of
biological systems. We perceive flux as the flow of resources in stochastic
simulations. We resort to an established correspondence between event
structures, a broadly recognised model of concurrency, and state transitions of
process models, seen as Petri nets. We show that, in this way, we can extract
the causal resource dependencies between individual state transitions in
simulations as partial orders of events. We propose transformations on the
partial orders that provide means for further analysis, and introduce a
software tool, which implements these ideas. By means of an example of a
published model of the Rho GTP-binding proteins, we argue that this approach
can provide a substitute for flux analysis techniques on ordinary
differential equation models within the stochastic setting of process algebras
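The core step described above, recovering causal dependencies between state transitions from a simulation trace, can be illustrated with a small sketch (hypothetical code, not the authors' tool; event structure and names are invented): each event consumes and produces resource instances, Petri-net style, and an event depends on whichever earlier event produced what it consumes:

```python
# Hypothetical sketch: extracting causal dependencies from a trace.
# Event j depends on event i if j consumes a resource instance i produced.
def causal_order(trace):
    """trace: list of (consumed, produced) lists of resource names.
    Returns a set of (i, j) pairs: event i causally precedes event j."""
    available = {}   # resource name -> indices of events that produced it
    deps = set()
    for j, (consumed, produced) in enumerate(trace):
        for r in consumed:
            producers = available.get(r, [])
            if producers:
                deps.add((producers.pop(0), j))  # consume oldest instance
        for r in produced:
            available.setdefault(r, []).append(j)
    return deps

# Toy pathway: e0 produces A; e1 turns A into B; e2 turns B into C.
trace = [([], ["A"]), (["A"], ["B"]), (["B"], ["C"])]
# causal_order(trace) -> {(0, 1), (1, 2)}: e0 and e2 are related only
# transitively, giving a partial order of events.
```

One could then read fluxes off such a structure by counting, per resource, how many instances flow along each causal edge over a simulation run.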
Process Algebras
Process Algebras are mathematically rigorous languages with well defined semantics that permit describing and verifying properties of concurrent communicating systems.
They can be seen as models of processes, regarded as agents that act and interact continuously with other similar agents and with their common environment. The agents may be real-world objects (even people), or they may be artifacts, embodied perhaps in computer hardware or software systems.
Many different approaches (operational, denotational, algebraic) are taken to describe the meaning of processes. However, the operational approach is the reference one. By relying on the so-called Structural Operational Semantics (SOS), labelled transition systems are built and composed by using the different operators of the many different process algebras. Behavioral equivalences are used to abstract from unwanted details and to identify those systems that react similarly to external experiments
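As a small, self-contained illustration of behavioral equivalence on labelled transition systems (an invented example, not tied to any particular process algebra), the following computes strong bisimulation classes by naive partition refinement:

```python
# Hypothetical sketch: strong bisimulation classes of a labelled
# transition system via naive partition refinement.
def bisimulation_classes(states, trans):
    """trans: set of (source, label, target). Returns a set of blocks."""
    labels = {l for (_, l, _) in trans}
    part = {frozenset(states)}
    changed = True
    while changed:
        changed = False
        new_part = set()
        for block in part:
            # Signature of s: which blocks s can reach, per label.
            def sig(s):
                return frozenset(
                    (l, b) for l in labels for b in part
                    if any((s, l, t) in trans and t in b for t in states))
            groups = {}
            for s in block:
                groups.setdefault(sig(s), set()).add(s)
            if len(groups) > 1:
                changed = True
            new_part.update(frozenset(g) for g in groups.values())
        part = new_part
    return part

# Two vending machines p and q each loop coin;coffee, while r accepts a
# coin and then deadlocks. p and q are bisimilar; r is not.
states = {"p", "p1", "q", "q1", "r", "r1"}
trans = {("p", "coin", "p1"), ("p1", "coffee", "p"),
         ("q", "coin", "q1"), ("q1", "coffee", "q"),
         ("r", "coin", "r1")}
classes = bisimulation_classes(states, trans)
```

Here p and q land in the same class because every move of one can be matched by the other, while r is distinguished by its deadlock after coin: exactly the "react similarly to external experiments" test the text describes.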