
    Jump Markov models and transition state theory: the Quasi-Stationary Distribution approach

    We are interested in the connection between a metastable continuous state space Markov process (satisfying, e.g., the Langevin or overdamped Langevin equation) and a jump Markov process on a discrete state space. More precisely, we use the notion of quasi-stationary distribution within a metastable state of the continuous state space Markov process to parametrize the exit event from that state. This approach is useful for analyzing and justifying methods that use the jump Markov process underlying a metastable dynamics as a support to efficiently sample the state-to-state dynamics (accelerated dynamics techniques). Moreover, this approach makes it possible to quantify the error on the exit event when the parametrization of the jump Markov model is based on the Eyring-Kramers formula. It therefore provides a mathematical framework justifying the use of transition state theory and the Eyring-Kramers formula to build kinetic Monte Carlo or Markov state models.
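
    As a reference point for how such a parametrization can be used in practice, the sketch below builds a two-state kinetic Monte Carlo model of a one-dimensional overdamped Langevin dynamics in a double well, with jump rates given by the one-dimensional overdamped Eyring-Kramers formula. The potential V, the noise intensity eps and the helper functions eyring_kramers_rate and kmc are illustrative assumptions, not objects defined in the paper.

        import numpy as np

        # Hedged sketch: a two-state jump (kinetic Monte Carlo) model for the overdamped
        # Langevin dynamics dX = -V'(X) dt + sqrt(2*eps) dW in a 1D double-well potential,
        # with rates given by the 1D overdamped Eyring-Kramers formula
        #   k = sqrt(V''(x_min) * |V''(x_saddle)|) / (2*pi) * exp(-(V(x_saddle) - V(x_min)) / eps).
        # All numerical choices are illustrative, not taken from the paper.

        eps = 0.15                                   # noise intensity (temperature)
        V   = lambda x: (x**2 - 1.0)**2              # double-well potential, minima at +/-1
        d2V = lambda x: 12.0 * x**2 - 4.0            # second derivative of V

        def eyring_kramers_rate(x_min, x_saddle):
            """Eyring-Kramers exit rate from the well at x_min over the saddle x_saddle."""
            prefactor = np.sqrt(d2V(x_min) * abs(d2V(x_saddle))) / (2.0 * np.pi)
            return prefactor * np.exp(-(V(x_saddle) - V(x_min)) / eps)

        # The double well is symmetric, so the same rate governs jumps in both directions.
        k = eyring_kramers_rate(x_min=1.0, x_saddle=0.0)
        rates = {(-1, +1): k, (+1, -1): k}

        def kmc(state, t_end, rng=np.random.default_rng(0)):
            """Minimal kinetic Monte Carlo loop on the two metastable states {-1, +1}."""
            t, path = 0.0, [(0.0, state)]
            while t < t_end:
                t += rng.exponential(1.0 / rates[(state, -state)])  # exponential waiting time
                state = -state                                      # jump to the other well
                path.append((t, state))
            return path

        print(f"Eyring-Kramers rate: {k:.4e}")
        print(kmc(state=-1, t_end=5.0 / k)[:5])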

    Fluid limit theorems for stochastic hybrid systems with application to neuron models

    This paper establishes limit theorems for a class of stochastic hybrid systems (continuous deterministic dynamics coupled with jump Markov processes) in the fluid limit (small jumps at high frequency), thus extending known results for jump Markov processes. We prove a functional law of large numbers with exponential convergence speed, derive a diffusion approximation, and establish a functional central limit theorem. We apply these results to neuron models with stochastic ion channels, estimating the convergence to the deterministic model as the number of channels goes to infinity. In terms of neural coding, we apply our central limit theorems to numerically estimate the impact of channel noise on both frequency and spike-timing coding.
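
    The simplest instance of such a fluid limit is a population of N independent two-state ion channels; the sketch below compares an exact (Gillespie) simulation of the resulting jump Markov process with the deterministic kinetic equation it converges to as N grows. The rates alpha and beta, the horizon T and the function names are illustrative assumptions, not taken from the paper.

        import numpy as np

        # Hedged sketch of a fluid limit: N independent two-state ion channels
        # (closed <-> open with per-channel rates alpha, beta) give a jump Markov
        # process on {0, ..., N}; as N -> infinity the open fraction n/N converges
        # to the solution of the deterministic ODE  dx/dt = alpha*(1 - x) - beta*x.

        alpha, beta, T = 2.0, 1.0, 5.0          # per-channel rates and time horizon

        def gillespie_open_fraction(N, rng=np.random.default_rng(1)):
            """Exact simulation of the fraction of open channels out of N, started all closed."""
            t, n = 0.0, 0
            while t < T:
                rate_up, rate_down = alpha * (N - n), beta * n
                total = rate_up + rate_down
                t += rng.exponential(1.0 / total)               # waiting time to next jump
                n += 1 if rng.random() < rate_up / total else -1
            return n / N

        def fluid_limit(dt=1e-3):
            """Euler integration of the limiting ODE dx/dt = alpha*(1-x) - beta*x."""
            x = 0.0
            for _ in range(int(T / dt)):
                x += dt * (alpha * (1.0 - x) - beta * x)
            return x

        # The law of large numbers says the stochastic open fraction tracks the ODE
        # ever more closely as N grows.
        for N in (10, 100, 10000):
            print(N, gillespie_open_fraction(N))
        print("ODE limit:", fluid_limit())      # stationary value alpha/(alpha+beta) = 2/3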

    Donsker-Varadhan asymptotics for degenerate jump Markov processes

    We consider a class of continuous time Markov chains on a compact metric space that admit both absorbing states and an invariant measure which is strictly positive on open sets. We prove the joint large deviation principle for the empirical measure and the empirical flow. Due to the lack of uniform ergodicity, the zero level set of the rate function is not a singleton. As corollaries, we obtain the Donsker-Varadhan rate function for the empirical measure and a variational expression for the rate function of the empirical flow.
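
    For orientation, in the classical non-degenerate setting (an irreducible chain with jump rates r(x,y) on a finite state space), the joint rate function for the empirical measure \mu and the empirical flow Q is commonly written as

        I(\mu, Q) = \sum_{x \neq y} \Big[ Q(x,y) \log \frac{Q(x,y)}{\mu(x)\, r(x,y)} - Q(x,y) + \mu(x)\, r(x,y) \Big],

    finite only for divergence-free flows, i.e. \sum_y Q(x,y) = \sum_y Q(y,x) for every x; contracting over Q recovers the Donsker-Varadhan functional for the empirical measure alone. This standard formula is quoted here only as background and is not taken from the paper, whose contribution concerns precisely how this picture changes when absorbing states destroy uniform ergodicity.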

    Occupation Times for Jump Processes

    We consider a class of pure jump Markov processes in $\mathbb{R}^d$ whose jump kernels are comparable to those of symmetric stable processes. We prove a support theorem, establish a lower bound on the occupation times of sets, and show that resolvents can be approximated using smooth functions.
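
    The model case for such jump kernels is the symmetric alpha-stable process itself; the sketch below samples a one-dimensional path on a time grid with the Chambers-Mallows-Stuck method (symmetric case, alpha != 1) and measures the occupation time of an interval. The parameter values and the set [-1, 1] are illustrative assumptions, not choices made in the paper.

        import numpy as np

        # Illustrative sketch only: the paper treats pure jump processes whose kernels are
        # merely *comparable* to those of a symmetric alpha-stable process; here we sample
        # the model case, a standard symmetric alpha-stable Levy process, on a time grid.

        def symmetric_stable_increments(alpha, n, dt, rng=np.random.default_rng(2)):
            """n increments over time step dt of a standard symmetric alpha-stable process."""
            U = rng.uniform(-np.pi / 2, np.pi / 2, size=n)   # uniform angle
            E = rng.exponential(1.0, size=n)                 # unit exponential
            X = (np.sin(alpha * U) / np.cos(U) ** (1.0 / alpha)
                 * (np.cos((1.0 - alpha) * U) / E) ** ((1.0 - alpha) / alpha))
            return dt ** (1.0 / alpha) * X                   # rescale by self-similarity

        alpha, dt, n = 1.5, 0.01, 10000
        path = np.cumsum(symmetric_stable_increments(alpha, n, dt))

        # Occupation time of the set A = [-1, 1] up to time n*dt, the kind of quantity
        # bounded from below in the paper (here in dimension d = 1).
        occupation_time = dt * np.count_nonzero(np.abs(path) <= 1.0)
        print(f"occupation time of [-1, 1] on [0, {n * dt:.0f}]: {occupation_time:.2f}")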

    Metastability in a stochastic neural network modeled as a velocity jump Markov process

    One of the major challenges in neuroscience is to determine how noise that is present at the molecular and cellular levels affects dynamics and information processing at the macroscopic level of synaptically coupled neuronal populations. Often noise is incorporated into deterministic network models using extrinsic noise sources. An alternative approach is to assume that noise arises intrinsically as a collective population effect, which has led to a master equation formulation of stochastic neural networks. In this paper we extend the master equation formulation by introducing a stochastic model of neural population dynamics in the form of a velocity jump Markov process. The latter has the advantage of keeping track of synaptic processing as well as spiking activity, and it reduces to the neural master equation in a particular limit. The population synaptic variables evolve according to piecewise deterministic dynamics that depend on population spiking activity. The latter is characterised by a set of discrete stochastic variables evolving according to a jump Markov process, with transition rates that depend on the synaptic variables. We consider the particular problem of rare transitions between metastable states of a network operating in a bistable regime in the deterministic limit. Assuming that the synaptic dynamics is much slower than the transitions between discrete spiking states, we use a WKB approximation and singular perturbation theory to determine the mean first passage time to cross the separatrix between the two metastable states. Such an analysis can also be applied to other velocity jump Markov processes, including stochastic voltage-gated ion channels and stochastic gene networks.
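
    To make the construction concrete, the sketch below simulates a minimal caricature of a velocity jump Markov process: a single synaptic variable u follows piecewise deterministic dynamics du/dt = (a - u)/tau between jumps, while a discrete activity state a in {0, 1} jumps with u-dependent rates. The drift, the rate functions rate_on and rate_off and all parameters are invented for illustration and are not the network model of the paper.

        import numpy as np

        # Minimal caricature of a velocity jump Markov process (not the paper's model):
        # a piecewise deterministic synaptic variable u coupled to a two-state activity a.

        tau = 10.0                                    # slow synaptic time scale
        rate_on  = lambda u: 1.0 + 5.0 * u            # rate of the jump a: 0 -> 1
        rate_off = lambda u: 2.0                      # rate of the jump a: 1 -> 0

        def simulate(t_end, dt=1e-3, rng=np.random.default_rng(3)):
            """Euler drift for du/dt = (a - u)/tau plus jumps fired with probability rate*dt."""
            u, a, traj = 0.0, 0, []
            for k in range(int(t_end / dt)):
                u += dt * (a - u) / tau               # piecewise deterministic drift
                rate = rate_on(u) if a == 0 else rate_off(u)
                if rng.random() < rate * dt:          # approximate jump of the discrete state
                    a = 1 - a
                traj.append((k * dt, u, a))
            return traj

        print(simulate(t_end=50.0)[-1])               # (time, synaptic variable, activity state)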