Data-driven modelling of biological multi-scale processes
Biological processes involve a variety of spatial and temporal scales. A
holistic understanding of many biological processes therefore requires
multi-scale models which capture the relevant properties on all these scales.
In this manuscript we review mathematical modelling approaches used to describe
the individual spatial scales and how they are integrated into holistic models.
We discuss the relation between spatial and temporal scales and its
implications for multi-scale modelling. Based on this overview of
state-of-the-art modelling approaches, we formulate key challenges in the
mathematical and computational modelling of biological multi-scale and
multi-physics processes. In particular, we consider the availability of
analysis tools for multi-scale models and model-based multi-scale data
integration. We provide a compact review of methods for model-based data
integration and model-based hypothesis testing. Furthermore, novel approaches
and recent trends are discussed, including computation-time reduction using
reduced-order and surrogate models, which contribute to the solution of
inference problems. We conclude the manuscript by providing a few ideas for the
development of tailored multi-scale inference methods.
Comment: This manuscript will appear in the Journal of Coupled Systems and Multiscale Dynamics (American Scientific Publishers).
A Mathematical Framework for Agent Based Models of Complex Biological Networks
Agent-based modeling and simulation is a useful method to study biological
phenomena in a wide range of fields, from molecular biology to ecology. Since
there is currently no agreed-upon standard way to specify such models, it is not
always easy to use published models. Also, since model descriptions are not
usually given in mathematical terms, it is difficult to bring mathematical
analysis tools to bear, so that models are typically studied through
simulation. In order to address this issue, Grimm et al. proposed a protocol
for model specification, the so-called ODD protocol, which provides a standard
way to describe models. This paper proposes an addition to the ODD protocol
that allows the description of an agent-based model as a dynamical system,
thereby providing access to computational and theoretical tools for its analysis.
The mathematical framework is that of algebraic models, that is, time-discrete
dynamical systems with algebraic structure. It is shown by way of several
examples how this mathematical specification can help with model analysis.
Comment: To appear in the Bulletin of Mathematical Biology.
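For concreteness, here is a minimal sketch of the kind of algebraic model the framework is built on: a time-discrete dynamical system over the finite field F_2, where each state coordinate is updated by a polynomial. The three-node network and its update polynomials are invented for illustration and do not come from the paper.

```python
# Minimal sketch of an algebraic model: a time-discrete dynamical system
# over the finite field F_2. The three-node network and its update
# polynomials are illustrative only, not taken from the paper.
from itertools import product

def step(state):
    """One synchronous update; each coordinate is a polynomial over F_2."""
    x1, x2, x3 = state
    return (
        (x2 * x3) % 2,        # f1 = x2 * x3
        (x1 + x3) % 2,        # f2 = x1 + x3
        (x1 + x2 * x3) % 2,   # f3 = x1 + x2 * x3
    )

# Because the state space {0,1}^3 is finite, the full dynamics can be
# enumerated exactly -- the kind of analytical access the ODD addition
# is meant to provide, as opposed to simulation alone.
fixed_points = [s for s in product((0, 1), repeat=3) if step(s) == s]
print("fixed points:", fixed_points)
```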
Reinforcement Learning: A Survey
This paper surveys the field of reinforcement learning from a
computer-science perspective. It is written to be accessible to researchers
familiar with machine learning. Both the historical basis of the field and a
broad selection of current work are summarized. Reinforcement learning is the
problem faced by an agent that learns behavior through trial-and-error
interactions with a dynamic environment. The work described here has a
resemblance to work in psychology, but differs considerably in the details and
in the use of the word "reinforcement." The paper discusses central issues of
reinforcement learning, including trading off exploration and exploitation,
establishing the foundations of the field via Markov decision theory, learning
from delayed reinforcement, constructing empirical models to accelerate
learning, making use of generalization and hierarchy, and coping with hidden
state. It concludes with a survey of some implemented systems and an assessment
of the practical utility of current methods for reinforcement learning.
Comment: See http://www.jair.org/ for any accompanying files.
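As an illustration of the trial-and-error learning problem the survey formalizes, the following sketch runs tabular Q-learning with epsilon-greedy exploration on a toy chain-shaped MDP. The environment, reward, and hyperparameters are invented for illustration and are not taken from the survey.

```python
# Minimal sketch of the trial-and-error loop the survey describes:
# tabular Q-learning with epsilon-greedy exploration on a toy 1-D
# chain MDP. Environment and hyperparameters are illustrative only.
import random

N_STATES, GOAL = 5, 4          # states 0..4; reward only at state 4
ACTIONS = (-1, +1)             # move left / move right
alpha, gamma, eps = 0.1, 0.9, 0.1
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def env_step(s, a):
    s2 = min(max(s + a, 0), N_STATES - 1)
    return s2, (1.0 if s2 == GOAL else 0.0)

for episode in range(500):
    s = 0
    while s != GOAL:
        # the exploration/exploitation trade-off, via epsilon-greedy
        if random.random() < eps:
            a = random.choice(ACTIONS)                     # explore
        else:
            a = max(ACTIONS, key=lambda act: Q[(s, act)])  # exploit
        s2, r = env_step(s, a)
        # delayed reinforcement propagates back via the bootstrap term
        best_next = max(Q[(s2, act)] for act in ACTIONS)
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
        s = s2

# greedy policy learned for each non-goal state
print({s: max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(GOAL)})
```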
Group Testing with Probabilistic Tests: Theory, Design and Application
Identification of defective members of large populations has been widely
studied in the statistics community under the name of group testing. It
involves grouping subsets of items into different pools and detecting defective
members based on the set of test results obtained for each pool.
In the classical noiseless group testing setup, it is assumed that the sampling
procedure is fully known to the reconstruction algorithm, in the sense that the
presence of a defective member in a pool results in a positive test outcome for
that pool. However, this may not always be a valid assumption in some
cases of interest. In particular, we consider the case where the defective
items in a pool can become independently inactive with a certain probability.
Hence, one may obtain a negative test result for a pool even though it
contains defective items. As a result, any sampling and reconstruction method should be
able to cope with two different types of uncertainty, i.e., the unknown set of
defective items and the partially unknown, probabilistic testing procedure.
In this work, motivated by the application of detecting infected people in
viral epidemics, we design non-adaptive sampling procedures that allow
successful identification of the defective items through a set of probabilistic
tests. Our design requires only a small number of tests to single out the
defective items. In particular, for a population of size $N$ and at most $K$
defective items with activation probability $p$, our results show that
$M = O(K^2 \log(N/K)/p^3)$ tests is sufficient if the sampling procedure should
work for all possible sets of defective items, while $M = O(K \log(N)/p^3)$
tests is enough to be successful for any single set of defective items.
Moreover, we show that the defective members can be recovered using a simple
reconstruction algorithm with complexity of $O(MN)$.
Comment: Full version of the conference paper "Compressed Sensing with Probabilistic Measurements: A Group Testing Solution" appearing in the proceedings of the 47th Annual Allerton Conference on Communication, Control, and Computing, 2009 (arXiv:0909.3508). To appear in IEEE Transactions on Information Theory.
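The following simulation sketch illustrates the probabilistic test model described above: a defective item in a pool activates only with probability $p$, so a pool containing defectives can still test negative. The Bernoulli pooling design and the simple threshold decoder are illustrative assumptions, not the paper's exact construction, though the decoder's cost is $O(MN)$, matching the complexity quoted in the abstract.

```python
# Minimal simulation sketch of probabilistic group testing: each
# defective item in a pool is "active" in that test only with
# probability p, so pools containing defectives may still test negative.
# Design and decoder are illustrative, not the paper's construction.
import random

N, K, p = 200, 3, 0.7          # population, defectives, activation prob.
M = 120                        # number of non-adaptive tests
random.seed(1)

defective = set(random.sample(range(N), K))
# Bernoulli design: each item joins each pool with probability 1/K.
pools = [{i for i in range(N) if random.random() < 1.0 / K}
         for _ in range(M)]

# A pool tests positive if at least one defective member activates.
outcome = [any(random.random() < p for i in pool if i in defective)
           for pool in pools]

def positive_rate(i):
    """Fraction of item i's pools that came back positive."""
    mine = [t for t, pool in enumerate(pools) if i in pool]
    return sum(outcome[t] for t in mine) / max(len(mine), 1)

# Threshold decoder: defectives drive positives, so they rank highest.
estimate = set(sorted(range(N), key=positive_rate, reverse=True)[:K])
print("true:", sorted(defective), "estimated:", sorted(estimate))
```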
Robust Stochastic Chemical Reaction Networks and Bounded Tau-Leaping
The behavior of some stochastic chemical reaction networks is largely unaffected by slight inaccuracies in reaction rates. We formalize the robustness of state probabilities to reaction rate deviations, and describe a formal connection between robustness and efficiency of simulation. Without robustness guarantees, stochastic simulation seems to require computational time proportional to the total number of reaction events. Even if the concentration (molecular count per volume) stays bounded, the number of reaction events can be linear in the duration of simulated time and total molecular count. We show that the behavior of robust systems can be predicted such that the computational work scales linearly with the duration of simulated time and concentration, and only polylogarithmically in the total molecular count. Thus our asymptotic analysis captures the dramatic speedup when molecular counts are large, and shows that for bounded concentrations the computation time is essentially invariant with molecular count. Finally, by noticing that even robust stochastic chemical reaction networks are capable of embedding complex computational problems, we argue that the linear dependence on simulated time and concentration is likely optimal.
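To make the cost argument concrete, here is a minimal sketch of the baseline the abstract starts from: Gillespie's exact stochastic simulation algorithm, which performs one state update per reaction event, so its running time grows with the number of events rather than with simulated time. The toy dimerization network and rate constants are invented for illustration; this is not the paper's bounded tau-leaping algorithm.

```python
# Minimal sketch of the baseline the abstract argues against: exact
# stochastic simulation (Gillespie's SSA), one update per reaction
# event. The toy network 2X -> Y, Y -> 2X and its rates are
# illustrative only, not taken from the paper.
import math, random

def ssa(x, y, k1, k2, t_end):
    t, events = 0.0, 0
    while t < t_end:
        a1 = k1 * x * (x - 1) / 2      # propensity of 2X -> Y
        a2 = k2 * y                    # propensity of Y -> 2X
        a0 = a1 + a2
        if a0 == 0:
            break
        t += -math.log(random.random()) / a0   # exponential waiting time
        if random.random() * a0 < a1:
            x, y = x - 2, y + 1
        else:
            x, y = x + 2, y - 1
        events += 1                    # one unit of work per event
    return x, y, events

x, y, events = ssa(x=1000, y=0, k1=0.001, k2=0.5, t_end=10.0)
# 'events' grows with the molecular count -- exactly the linear cost
# that bounded tau-leaping avoids for robust networks.
print(x, y, events)
```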