Markovian Dynamics on Complex Reaction Networks
Complex networks, comprised of individual elements that interact with each
other through reaction channels, are ubiquitous across many scientific and
engineering disciplines. Examples include biochemical, pharmacokinetic,
epidemiological, ecological, social, neural, and multi-agent networks. A common
approach to modeling such networks is by a master equation that governs the
dynamic evolution of the joint probability mass function of the underlying
population process and naturally leads to Markovian dynamics for that process.
Due however to the nonlinear nature of most reactions, the computation and
analysis of the resulting stochastic population dynamics is a difficult task.
This review article provides a coherent and comprehensive coverage of recently
developed approaches and methods to tackle this problem. After reviewing a
general framework for modeling Markovian reaction networks and giving specific
examples, the authors present numerical and computational techniques capable of
evaluating or approximating the solution of the master equation, discuss a
recently developed approach for studying the stationary behavior of Markovian
reaction networks using a potential energy landscape perspective, and provide
an introduction to the emerging theory of thermodynamic analysis of such
networks. Three representative problems of opinion formation, transcription
regulation, and neural network dynamics are used as illustrative examples.
Comment: 52 pages, 11 figures, for freely available MATLAB software, see
http://www.cis.jhu.edu/~goutsias/CSS%20lab/software.htm
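The master-equation framework surveyed above can be made concrete with a toy model. The sketch below is our own illustration (not taken from the article or its MATLAB software): it builds the generator of the birth-death network 0 → X, X → 0 and integrates dp/dt = Qᵀp forward in time.

```python
import numpy as np

# Toy model (our own illustration, not from the article): the master equation
# for the birth-death network
#   0 -> X  with propensity k,    X -> 0  with propensity g*x,
# truncated to states x = 0..N.  It reads dp/dt = Q^T p with generator Q.
def birth_death_generator(k, g, N):
    Q = np.zeros((N + 1, N + 1))
    for x in range(N + 1):
        if x < N:
            Q[x, x + 1] = k              # birth jump x -> x+1
        if x > 0:
            Q[x, x - 1] = g * x          # death jump x -> x-1
        Q[x, x] = -Q[x].sum()            # rows of a generator sum to zero
    return Q

def evolve(p0, Q, t, steps=20000):
    """Explicit-Euler integration of the master equation dp/dt = Q^T p."""
    p, dt = p0.copy(), t / steps
    for _ in range(steps):
        p = p + dt * (Q.T @ p)
    return p

k, g, N = 2.0, 1.0, 30
Q = birth_death_generator(k, g, N)
p0 = np.zeros(N + 1)
p0[0] = 1.0                              # start with zero molecules
p = evolve(p0, Q, t=20.0)
print(round(float(np.arange(N + 1) @ p), 3))   # → 2.0, the Poisson mean k/g
```

For this linear network the stationary law is Poisson with mean k/g, which the integrated mass function reproduces; for nonlinear propensities such direct integration quickly becomes infeasible, which is precisely why the review's approximation methods are needed.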
Efficient transfer entropy analysis of non-stationary neural time series
Information theory allows us to investigate information processing in neural
systems in terms of information transfer, storage and modification. Especially
the measure of information transfer, transfer entropy, has seen a dramatic
surge of interest in neuroscience. Estimating transfer entropy from two
processes requires the observation of multiple realizations of these processes
to estimate associated probability density functions. To obtain these
observations, available estimators assume stationarity of processes to allow
pooling of observations over time. This assumption, however, is a major obstacle
to the application of these estimators in neuroscience as observed processes
are often non-stationary. As a solution, Gomez-Herrero and colleagues
theoretically showed that the stationarity assumption may be avoided by
estimating transfer entropy from an ensemble of realizations. Such an ensemble
is often readily available in neuroscience experiments in the form of
experimental trials. Thus, in this work we combine the ensemble method with a
recently proposed transfer entropy estimator to make transfer entropy
estimation applicable to non-stationary time series. We present an efficient
implementation of the approach that deals with the increased computational
demand of the ensemble method's practical application. In particular, we use a
massively parallel implementation for a graphics processing unit to handle the
computationally most heavy aspects of the ensemble method. We test the
performance and robustness of our implementation on data from simulated
stochastic processes and demonstrate the method's applicability to
magnetoencephalographic data. While we mainly evaluate the proposed method for
neuroscientific data, we expect it to be applicable in a variety of fields that
are concerned with the analysis of information transfer in complex biological,
social, and artificial systems.
Comment: 27 pages, 7 figures, submitted to PLOS ONE
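The ensemble idea, pooling realizations across experimental trials at a fixed time point rather than across time, can be sketched with a deliberately simplified plug-in estimator on binary data. Everything below is our own toy, not the authors' GPU-based nearest-neighbour implementation.

```python
import random
from collections import Counter
from math import log2

# Simplified sketch of the ensemble method (our own toy plug-in estimator on
# binary data): estimate transfer entropy X -> Y at a single time point t by
# pooling observations across trials, so stationarity over time is not assumed.
def transfer_entropy_at_t(trials_x, trials_y, t):
    # one (y_{t+1}, y_t, x_t) realization per trial
    triples = [(y[t + 1], y[t], x[t]) for x, y in zip(trials_x, trials_y)]
    n = len(triples)
    c_full = Counter(triples)
    c_y1y0 = Counter((y1, y0) for y1, y0, _ in triples)
    c_y0x0 = Counter((y0, x0) for _, y0, x0 in triples)
    c_y0 = Counter(y0 for _, y0, _ in triples)
    te = 0.0
    for (y1, y0, x0), c in c_full.items():
        # plug-in estimate of p(y1 | y0, x0) / p(y1 | y0)
        te += (c / n) * log2((c / c_y0x0[(y0, x0)])
                             / (c_y1y0[(y1, y0)] / c_y0[y0]))
    return te

random.seed(0)
T, n_trials = 6, 5000
xs = [[random.randint(0, 1) for _ in range(T)] for _ in range(n_trials)]
ys = [[0] + x[:-1] for x in xs]      # y copies x with a one-step delay
print(transfer_entropy_at_t(xs, ys, t=2))
```

Because y simply copies x one step later, the estimate should come out close to 1 bit, while the reverse direction should be close to zero.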
Kinematic Basis of Emergent Energetics of Complex Dynamics
Stochastic kinematic description of a complex dynamics is shown to dictate an
energetic and thermodynamic structure. An energy function φ(x) emerges
as the limit of the generalized, nonequilibrium free energy of a Markovian
dynamics with vanishing fluctuations. In terms of the gradient ∇φ(x) and its
orthogonal field γ(x) ⊥ ∇φ(x), a general vector field b(x) can be decomposed
into b(x) = −D(x)∇φ(x) + γ(x), where ∇·(ω(x)γ(x)) = 0.
The matrix D(x) and scalar ω(x), two additional characteristics to the b(x)
alone, represent the local geometry and density of states intrinsic to
the statistical motion in the state space at x. φ(x) and ω(x)
are interpreted as the emergent energy and degeneracy of the motion, with an
energy balance equation dφ(x(t))/dt = −∇φ·D∇φ ≤ 0 along the noiseless flow
ẋ = b(x), reflecting the geometrical orthogonality of γ to ∇φ. The
partition function employed in statistical mechanics and J. W. Gibbs' method of
ensemble change naturally arise; a fluctuation-dissipation theorem is
established via the two leading-order asymptotics of entropy production as
fluctuations vanish. The present theory provides a mathematical basis for P. W.
Anderson's emergent behavior in the hierarchical structure of complexity
science.
Comment: 7 pages
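The orthogonal decomposition of a drift field into a gradient part and a transverse part can be checked numerically on a linear example. Everything below is our own illustration; the choices b(x) = Bx, D = I, and φ(x) = |x|²/2 are assumptions for this toy case, not taken from the paper.

```python
import numpy as np

# Numerical check (our own toy construction) of a decomposition
#   b(x) = -D grad phi(x) + gamma(x)   with   gamma(x) ⟂ grad phi(x),
# for the linear field b(x) = B x with constant diffusion D = I.
B = np.array([[-1.0, 1.0], [-1.0, -1.0]])
D = np.eye(2)

# For this B the stationary covariance Sigma = I solves
# B Sigma + Sigma B^T + 2 D = 0, so phi(x) = |x|^2 / 2.
grad_phi = lambda x: x                   # gradient of phi
gamma = lambda x: (B + np.eye(2)) @ x    # gamma = b + D grad phi (a rotation)

rng = np.random.default_rng(1)
for _ in range(5):
    x = rng.normal(size=2)
    b = B @ x
    # the two defining properties of the decomposition:
    assert np.allclose(b, -D @ grad_phi(x) + gamma(x))
    assert abs(float(gamma(x) @ grad_phi(x))) < 1e-12
print("decomposition verified")
```

Here γ(x) = (x₂, −x₁) is a pure rotation, so γ·∇φ vanishes identically, and along ẋ = b(x) the energy φ decays as dφ/dt = −∇φ·D∇φ ≤ 0, matching the balance relation in the abstract.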
Parametric Sensitivity Analysis for Biochemical Reaction Networks based on Pathwise Information Theory
Stochastic modeling and simulation provide powerful predictive methods for
the intrinsic understanding of fundamental mechanisms in complex biochemical
networks. Typically, such mathematical models involve networks of coupled jump
stochastic processes with a large number of parameters that need to be suitably
calibrated against experimental data. In this direction, the parameter
sensitivity analysis of reaction networks is an essential mathematical and
computational tool, yielding information regarding the robustness and the
identifiability of model parameters. However, existing sensitivity analysis
approaches such as variants of the finite difference method can have an
overwhelming computational cost in models with a high-dimensional parameter
space. We develop a sensitivity analysis methodology suitable for complex
stochastic reaction networks with a large number of parameters. The proposed
approach is based on Information Theory methods and relies on the
quantification of information loss due to parameter perturbations between
time-series distributions. For this reason, we need to work in path space,
i.e., on the set of all stochastic trajectories; hence the proposed approach
is referred to as "pathwise". The pathwise sensitivity analysis method
is realized by employing the rigorously-derived Relative Entropy Rate (RER),
which is directly computable from the propensity functions. A key aspect of the
method is that an associated pathwise Fisher Information Matrix (FIM) is
defined, which in turn constitutes a gradient-free approach to quantifying
parameter sensitivities. The structure of the FIM turns out to be
block-diagonal, revealing hidden parameter dependencies and sensitivities in
reaction networks.
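Since the FIM is built from propensities alone, it can be written down in closed form for a simple birth-death network, 0 → X with propensity a₁(x) = k and X → 0 with propensity a₂(x) = g·x. The formula below is our reading of a stationary pathwise FIM (propensity-weighted outer products of ∇θ log aⱼ averaged over the stationary law), a hedged sketch rather than code from the paper.

```python
import math

# Sketch of a pathwise FIM (our own reading, an assumption, not the paper's
# code): F(theta) = sum_x pi(x) sum_j a_j(x) * (grad_theta log a_j)(...)^T,
# for the birth-death network with a1(x) = k, a2(x) = g*x, theta = (k, g),
# and stationary law pi = Poisson(k/g).
def pathwise_fim(k, g, N=60):
    lam = k / g
    pi = math.exp(-lam)                      # Poisson pmf at x = 0
    F00 = F11 = 0.0                          # off-diagonal terms vanish here
    for x in range(N):
        F00 += pi * k * (1.0 / k) ** 2       # grad log a1 = (1/k, 0)
        F11 += pi * (g * x) * (1.0 / g) ** 2 # grad log a2 = (0, 1/g)
        pi *= lam / (x + 1)                  # pmf recursion to x + 1
    return [[F00, 0.0], [0.0, F11]]

F = pathwise_fim(k=2.0, g=1.0)
print(round(F[0][0], 6), round(F[1][1], 6))  # → 0.5 2.0, i.e. 1/k and k/g^2
```

The gradient-free character is visible here: only propensities and their parameter derivatives enter, and the block-diagonal (here diagonal) structure exposes that k and g are informed by disjoint reactions.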