
    Markovian Dynamics on Complex Reaction Networks

    Complex networks, comprised of individual elements that interact with each other through reaction channels, are ubiquitous across many scientific and engineering disciplines. Examples include biochemical, pharmacokinetic, epidemiological, ecological, social, neural, and multi-agent networks. A common approach to modeling such networks is by a master equation that governs the dynamic evolution of the joint probability mass function of the underlying population process and naturally leads to Markovian dynamics for that process. Due, however, to the nonlinear nature of most reactions, the computation and analysis of the resulting stochastic population dynamics is a difficult task. This review article provides a coherent and comprehensive coverage of recently developed approaches and methods to tackle this problem. After reviewing a general framework for modeling Markovian reaction networks and giving specific examples, the authors present numerical and computational techniques capable of evaluating or approximating the solution of the master equation, discuss a recently developed approach for studying the stationary behavior of Markovian reaction networks using a potential energy landscape perspective, and provide an introduction to the emerging theory of thermodynamic analysis of such networks. Three representative problems of opinion formation, transcription regulation, and neural network dynamics are used as illustrative examples.
    Comment: 52 pages, 11 figures; for freely available MATLAB software, see http://www.cis.jhu.edu/~goutsias/CSS%20lab/software.htm
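Even when the master equation of such a network is intractable to solve directly, exact sample paths of the underlying Markov jump process can be drawn, for instance with Gillespie's stochastic simulation algorithm. A minimal sketch for a hypothetical birth-death network (the rates, function name, and network are illustrative assumptions, not the review's MATLAB software):

```python
import random

def gillespie_birth_death(birth_rate, death_rate, x0, t_max, seed=0):
    """Sample one trajectory of a birth-death reaction network:
    X -> X + 1 at rate birth_rate, X -> X - 1 at rate death_rate * X.
    Hypothetical example for illustration only."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    times, states = [t], [x]
    while t < t_max:
        a_birth = birth_rate           # propensity of the birth channel
        a_death = death_rate * x       # propensity of the death channel
        a_total = a_birth + a_death
        if a_total == 0.0:
            break                      # absorbing state: no reaction can fire
        t += rng.expovariate(a_total)  # exponential waiting time to next event
        x += 1 if rng.random() * a_total < a_birth else -1
        times.append(t)
        states.append(x)
    return times, states

times, states = gillespie_birth_death(birth_rate=5.0, death_rate=0.5,
                                      x0=0, t_max=100.0)
```

The empirical distribution over many such trajectories approximates the master-equation solution, which is what makes simulation a practical stand-in when direct computation fails.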

    Efficient transfer entropy analysis of non-stationary neural time series

    Information theory allows us to investigate information processing in neural systems in terms of information transfer, storage, and modification. The measure of information transfer in particular, transfer entropy, has seen a dramatic surge of interest in neuroscience. Estimating transfer entropy between two processes requires the observation of multiple realizations of these processes to estimate the associated probability density functions. To obtain these observations, available estimators assume stationarity of the processes so that observations can be pooled over time. This assumption, however, is a major obstacle to the application of these estimators in neuroscience, as observed processes are often non-stationary. As a solution, Gomez-Herrero and colleagues showed theoretically that the stationarity assumption may be avoided by estimating transfer entropy from an ensemble of realizations. Such an ensemble is often readily available in neuroscience experiments in the form of experimental trials. In this work, we therefore combine the ensemble method with a recently proposed transfer entropy estimator to make transfer entropy estimation applicable to non-stationary time series. We present an efficient implementation of the approach that deals with the increased computational demand of the ensemble method's practical application. In particular, we use a massively parallel implementation on a graphics processing unit to handle the computationally heaviest aspects of the ensemble method. We test the performance and robustness of our implementation on data from simulated stochastic processes and demonstrate the method's applicability to magnetoencephalographic data. While we mainly evaluate the proposed method on neuroscientific data, we expect it to be applicable in a variety of fields concerned with the analysis of information transfer in complex biological, social, and artificial systems.
    Comment: 27 pages, 7 figures, submitted to PLOS ONE
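The ensemble idea, pooling realizations across trials at a fixed time point rather than over time, can be illustrated with a plug-in estimator for discrete-valued signals. A simplified sketch with history length 1 (the function name and setup are hypothetical; the paper's actual estimator is a GPU-based nearest-neighbour method for continuous data):

```python
from collections import Counter
from math import log2

def ensemble_transfer_entropy(trials_x, trials_y, t):
    """Plug-in transfer entropy Y -> X at time point t, estimated across
    an ensemble of trials of discrete-valued signals (history length 1):
    TE = sum p(x_{t+1}, x_t, y_t) * log2[ p(x_{t+1}|x_t,y_t) / p(x_{t+1}|x_t) ]."""
    joint = Counter()   # counts of (x_next, x_past, y_past)
    cond = Counter()    # counts of (x_past, y_past)
    pair = Counter()    # counts of (x_next, x_past)
    marg = Counter()    # counts of (x_past,)
    n = len(trials_x)
    for x, y in zip(trials_x, trials_y):
        joint[(x[t + 1], x[t], y[t])] += 1
        cond[(x[t], y[t])] += 1
        pair[(x[t + 1], x[t])] += 1
        marg[(x[t],)] += 1
    te = 0.0
    for (xn, xp, yp), c in joint.items():
        p_joint = c / n                        # p(x_next, x_past, y_past)
        p_cond = c / cond[(xp, yp)]            # p(x_next | x_past, y_past)
        p_pair = pair[(xn, xp)] / marg[(xp,)]  # p(x_next | x_past)
        te += p_joint * log2(p_cond / p_pair)
    return te
```

For example, if each trial has x perfectly copying a fair binary y with a one-step delay, the estimate approaches 1 bit; for independent processes it approaches 0. Because all probabilities are estimated across trials at one time point, no stationarity over time is assumed.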

    Kinematic Basis of Emergent Energetics of Complex Dynamics

    Stochastic kinematic description of a complex dynamics is shown to dictate an energetic and thermodynamic structure. An energy function $\varphi(x)$ emerges as the limit of the generalized, nonequilibrium free energy of a Markovian dynamics with vanishing fluctuations. In terms of $\nabla\varphi$ and its orthogonal field $\gamma(x)\perp\nabla\varphi$, a general vector field $b(x)$ can be decomposed into $-D(x)\nabla\varphi+\gamma$, where $\nabla\cdot\big(\omega(x)\gamma(x)\big)=-\nabla\omega\, D(x)\nabla\varphi$. The matrix $D(x)$ and scalar $\omega(x)$, two additional characteristics beyond $b(x)$ alone, represent the local geometry and density of states intrinsic to the statistical motion in the state space at $x$. $\varphi(x)$ and $\omega(x)$ are interpreted as the emergent energy and degeneracy of the motion, with an energy balance equation $d\varphi(x(t))/dt=\gamma D^{-1}\gamma-bD^{-1}b$, reflecting the geometrical identity $\|D\nabla\varphi\|^2+\|\gamma\|^2=\|b\|^2$. The partition function employed in statistical mechanics and J. W. Gibbs' method of ensemble change naturally arise; a fluctuation-dissipation theorem is established via the two leading-order asymptotics of entropy production as $\epsilon\to 0$. The present theory provides a mathematical basis for P. W. Anderson's emergent behavior in the hierarchical structure of complexity science.
    Comment: 7 pages
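The decomposition $b=-D\nabla\varphi+\gamma$ and the accompanying norm identity can be checked numerically in a simple linear case. A sketch under the illustrative assumptions $\varphi(x)=|x|^2/2$ (so $\nabla\varphi=x$), $D=I$, and a circulating field $\gamma(x)=\Omega x$ with $\Omega$ antisymmetric, which makes $\gamma\perp\nabla\varphi$ pointwise:

```python
# Hypothetical 2-D linear example of the decomposition b(x) = -D grad(phi) + gamma(x),
# with phi(x) = |x|^2 / 2, D = identity, and gamma(x) = Omega x for antisymmetric
# Omega = [[0, 1], [-1, 0]]; then gamma is orthogonal to grad(phi) at every x.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def decompose(x):
    grad_phi = list(x)                               # grad(phi) = x
    gamma = [x[1], -x[0]]                            # Omega applied to x
    b = [-g + c for g, c in zip(grad_phi, gamma)]    # b = -D grad(phi) + gamma
    return grad_phi, gamma, b

grad_phi, gamma, b = decompose([0.7, -1.3])
orthogonal = dot(gamma, grad_phi)  # orthogonality: should vanish
norm_gap = dot(grad_phi, grad_phi) + dot(gamma, gamma) - dot(b, b)  # should vanish
```

With $D=I$ the identity reduces to a Pythagorean relation: the dissipative component $-\nabla\varphi$ and the circulating component $\gamma$ split the drift $b$ orthogonally.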

    Parametric Sensitivity Analysis for Biochemical Reaction Networks based on Pathwise Information Theory

    Stochastic modeling and simulation provide powerful predictive methods for the intrinsic understanding of fundamental mechanisms in complex biochemical networks. Typically, such mathematical models involve networks of coupled jump stochastic processes with a large number of parameters that need to be suitably calibrated against experimental data. In this direction, the parameter sensitivity analysis of reaction networks is an essential mathematical and computational tool, yielding information regarding the robustness and the identifiability of model parameters. However, existing sensitivity analysis approaches, such as variants of the finite difference method, can have an overwhelming computational cost in models with a high-dimensional parameter space. We develop a sensitivity analysis methodology suitable for complex stochastic reaction networks with a large number of parameters. The proposed approach is based on information theory methods and relies on quantifying the information loss between time-series distributions caused by parameter perturbations. For this reason, we need to work on path space, i.e., the set consisting of all stochastic trajectories, hence the proposed approach is referred to as "pathwise". The pathwise sensitivity analysis method is realized by employing the rigorously derived Relative Entropy Rate (RER), which is directly computable from the propensity functions. A key aspect of the method is that an associated pathwise Fisher Information Matrix (FIM) is defined, which in turn constitutes a gradient-free approach to quantifying parameter sensitivities. The structure of the FIM turns out to be block-diagonal, revealing hidden parameter dependencies and sensitivities in reaction networks.
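The pathwise FIM can be made concrete on a small example. For a stationary jump process it takes the form $F(\theta)=\mathbb{E}_\pi\big[\sum_j a_j(x;\theta)\,\nabla_\theta\log a_j\,\nabla_\theta\log a_j^T\big]$, computable directly from the propensities. A hedged sketch for a hypothetical birth-death network with propensities $a_1(x)=k_1$ and $a_2(x)=k_2 x$, whose stationary law is Poisson with mean $k_1/k_2$ (the network, parameter values, and function name are illustrative assumptions, not the paper's code):

```python
import math

def pathwise_fim_birth_death(k1, k2, x_max=200):
    """Pathwise Fisher Information Matrix for propensities a1(x) = k1,
    a2(x) = k2 * x, averaged over the stationary Poisson(k1/k2) law
    (truncated at x_max). Illustrative sketch only."""
    lam = k1 / k2
    # Poisson pmf built recursively to avoid factorial overflow
    pi = [math.exp(-lam)]
    for x in range(1, x_max):
        pi.append(pi[-1] * lam / x)
    # gradients of log-propensities w.r.t. theta = (k1, k2): each propensity
    # depends on a single parameter, so the gradients are axis-aligned
    dloga = [[1.0 / k1, 0.0],
             [0.0, 1.0 / k2]]
    F = [[0.0, 0.0], [0.0, 0.0]]
    for x, p in enumerate(pi):
        a = [k1, k2 * x]                     # propensities at state x
        for j in range(2):
            if a[j] == 0.0:
                continue                     # channel cannot fire at this state
            for r in range(2):
                for c in range(2):
                    F[r][c] += p * a[j] * dloga[j][r] * dloga[j][c]
    return F

F = pathwise_fim_birth_death(k1=5.0, k2=0.5)
```

Here the off-diagonal entries vanish because each propensity depends on only one parameter, a toy instance of the block-diagonal structure the abstract describes; analytically $F_{11}=1/k_1$ and $F_{22}=k_1/k_2^2$.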