A variational method for analyzing stochastic limit cycle oscillators
We introduce a variational method for analyzing limit cycle oscillators
driven by Gaussian noise. This allows us to derive exact
stochastic differential equations (SDEs) for the amplitude and phase of the
solution, which are accurate over long timescales set by the amplitude of the
noise and the rate of decay of transverse fluctuations. Within the
variational framework, different choices of the amplitude-phase decomposition
correspond to different choices of inner product space. For
concreteness, we take a weighted Euclidean norm, so that the minimization
scheme determines the phase by projecting the full solution onto the limit
cycle using Floquet vectors. Since there is coupling between the amplitude and
phase equations, even in the weak noise limit, there is a small but non-zero
probability of a rare event in which the stochastic trajectory makes a large
excursion away from a neighborhood of the limit cycle. We use the amplitude and
phase equations to bound the probability of such an excursion, and to
estimate the typical time the system takes to leave a neighborhood of the
oscillator.
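The amplitude-phase decomposition can be illustrated on the Stuart-Landau normal form, for which polar coordinates already give exact amplitude and phase SDEs. The following is a minimal Euler-Maruyama sketch; the model, parameters, and noise amplitude are illustrative assumptions, not the paper's general variational construction (which handles arbitrary limit cycles via Floquet projection):

```python
import math
import random

def simulate_stuart_landau(eps=0.1, omega=1.0, dt=1e-3, n_steps=20000, seed=0):
    """Euler-Maruyama for the exact polar-coordinate SDEs
        dr     = r (1 - r^2) dt + eps dW_r
        dtheta = omega dt + (eps / r) dW_theta
    of a noisy Stuart-Landau oscillator (limit cycle at r = 1)."""
    rng = random.Random(seed)
    r, theta = 1.0, 0.0
    rs = []
    sqdt = math.sqrt(dt)
    for _ in range(n_steps):
        r += r * (1.0 - r * r) * dt + eps * sqdt * rng.gauss(0.0, 1.0)
        theta += omega * dt + (eps / r) * sqdt * rng.gauss(0.0, 1.0)
        rs.append(r)
    return rs, theta

rs, theta = simulate_stuart_landau()
mean_r = sum(rs) / len(rs)
print(f"mean amplitude over the run: {mean_r:.3f}")
```

For weak noise the amplitude fluctuates in a small neighborhood of the deterministic cycle r = 1 while the phase slowly diffuses; the variational framework in the abstract generalizes this exact decomposition to limit cycles without rotational symmetry.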
Asymptotic description of stochastic neural networks. I - existence of a Large Deviation Principle
We study the asymptotic law of a network of interacting neurons when the
number of neurons becomes infinite. The dynamics of the neurons is described by
a set of stochastic differential equations in discrete time. The neurons
interact through synaptic weights that are correlated Gaussian random
variables. Unlike previous works, which made the biologically
unrealistic assumption that the weights are i.i.d. random variables, we allow
them to be correlated. We introduce the process-level empirical measure of
the trajectories of the solutions to the equations of the finite network of
neurons and the averaged law (with respect to the synaptic weights) of the
trajectories of the solutions to the equations of the network of neurons. The
result is that the image law through the empirical measure satisfies a large
deviation principle with a good rate function. We provide an analytical
expression of this rate function in terms of the spectral representation of
certain Gaussian processes.
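To see concretely what a large deviation principle with a good rate function asserts, consider the classical Cramér-type toy case of i.i.d. standard Gaussians (a deliberately simple stand-in, not the paper's network model): the probability that the empirical mean exceeds a decays like exp(-n I(a)) with I(a) = a^2/2, so -(1/n) log P converges to I(a). Here the Gaussian tail is computed exactly via `math.erfc`:

```python
import math

def rate_estimate(n, a):
    """-(1/n) log P(S_n / n >= a) for S_n a sum of n i.i.d. N(0,1) variables,
    using the exact Gaussian tail P = 0.5 * erfc(a * sqrt(n) / sqrt(2))."""
    tail = 0.5 * math.erfc(a * math.sqrt(n) / math.sqrt(2.0))
    return -math.log(tail) / n

a = 0.5
I = a * a / 2.0  # Cramer rate function of N(0,1) evaluated at a
for n in (10, 100, 1000):
    print(f"n = {n:5d}: empirical rate = {rate_estimate(n, a):.4f}, I(a) = {I}")
```

The empirical rate decreases toward I(a) = 0.125 as n grows, which is exactly the exponential-decay statement an LDP formalizes; the papers above establish the analogous statement at the level of path-space empirical measures of the network.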
Large Deviations of a Spatially-Stationary Network of Interacting Neurons
In this work we determine a process-level Large Deviation Principle (LDP) for
a model of interacting neurons indexed by a lattice. The neurons
are subject to noise, which is modelled as a correlated martingale. The
probability law governing the noise is strictly stationary, and we are
therefore able to find an LDP for the probability laws governing the
stationary empirical measure generated by the neurons in a cube,
as the side length of the cube grows. We use this LDP to determine an LDP for the neural network
model. The connection weights between the neurons evolve according to a
learning rule (neuronal plasticity), and these results are adaptable to a large
variety of neural network models. This LDP is of great use in the mathematical
modelling of neural networks, because it allows a quantification of the
likelihood of the system deviating from its limit, and also a determination of
the direction in which the system is likely to deviate. The work is also of interest
because there are nontrivial correlations between the neurons even in the
asymptotic limit, thereby presenting itself as a generalisation of traditional
mean-field models.
Euler Characteristic in Odd Dimensions
It is well known that the Euler characteristic of an odd-dimensional compact
manifold is zero. An Euler complex is a combinatorial analogue of a compact
manifold. We present here an elementary proof of the corresponding result for
Euler complexes.
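For a closed manifold the classical statement follows in one line from Poincaré duality; sketched here with Z/2 coefficients so that orientability is not needed (the combinatorial proof for Euler complexes announced in the abstract proceeds differently):

```latex
% For a closed n-manifold M with n odd, Poincare duality with
% Z/2 coefficients gives b_i(M) = b_{n-i}(M). Hence
\chi(M) = \sum_{i=0}^{n} (-1)^i b_i
        = \sum_{i=0}^{n} (-1)^i b_{n-i}
        = (-1)^n \sum_{j=0}^{n} (-1)^j b_j
        = -\chi(M),
\qquad\text{so } \chi(M) = 0 .
```

The substitution j = n - i uses (-1)^i = (-1)^n (-1)^j, and (-1)^n = -1 since n is odd.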
Synchronization of stochastic hybrid oscillators driven by a common switching environment
Many systems in biology, physics and chemistry can be modeled through
ordinary differential equations, which are piecewise smooth, but switch between
different states according to a Markov jump process. In the fast switching
limit, the dynamics converges to a deterministic ODE. In this paper we suppose
that this limit ODE supports a stable limit cycle. We demonstrate that a set of
such oscillators can synchronize even when they are uncoupled, provided they share the
same switching Markov jump process. The latter is taken to represent the effect
of a common randomly switching environment. We determine the leading order of
the Lyapunov coefficient governing the rate of decay of the phase difference in
the fast switching limit. The analysis bears some similarities to the classical
analysis of synchronization of stochastic oscillators subject to common white
noise. However, the discrete nature of the Markov jump process raises some
difficulties: in fact, we find that the Lyapunov coefficient from the
quasi-steady-state approximation differs from the Lyapunov coefficient one
obtains from a second-order perturbation expansion in the waiting time between
jumps. Finally, we demonstrate synchronization numerically in the radial
isochron clock model and show that the latter Lyapunov exponent is more
accurate.
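The mechanism can be sketched numerically: two uncoupled copies of a planar oscillator are driven by the same two-state (telegraph) Markov jump process, which switches an additive drift term. All model details below are illustrative assumptions (a Stuart-Landau oscillator rather than the radial isochron clock of the papers, with arbitrary switching rate and drift values):

```python
import math
import random

def step(state, c, dt):
    """One Euler step of a Stuart-Landau oscillator (limit cycle at r = 1)
    with an additive drift c set by the shared environmental state."""
    x, y = state
    r2 = x * x + y * y
    dx = (x - y - x * r2 + c) * dt
    dy = (x + y - y * r2) * dt
    return x + dx, y + dy

def simulate(T=300.0, dt=0.01, rate=2.0, c_vals=(0.3, -0.3), seed=1):
    """Two uncoupled oscillators sharing one switching environment."""
    rng = random.Random(seed)
    k = 0              # shared environment state (0 or 1)
    a = (1.0, 0.0)     # oscillator 1
    b = (0.0, 1.0)     # oscillator 2, a quarter-cycle out of phase
    for _ in range(int(T / dt)):
        if rng.random() < rate * dt:   # Markov jump: switch environment
            k = 1 - k
        c = c_vals[k]
        a = step(a, c, dt)
        b = step(b, c, dt)             # same switching signal, no coupling
    return a, b

a, b = simulate()
dist = math.hypot(a[0] - b[0], a[1] - b[1])
print(f"final distance between the two oscillators: {dist:.4f}")
```

Because both copies see the identical switching path, their phase difference performs a random walk whose average drift is governed by the (typically negative) Lyapunov coefficient discussed above, so over long runs the distance between the trajectories tends to decay even though the oscillators are uncoupled.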
Asymptotic description of stochastic neural networks. II - Characterization of the limit law
We continue the development, begun in the first paper of this series, of the asymptotic description of
certain stochastic neural networks. We use the Large Deviation Principle (LDP)
and the good rate function H announced there to prove that H has a unique
minimum mu_e, a stationary measure on the set of trajectories. We characterize
this measure by its two marginals, at time 0, and from time 1 to T. The second
marginal is a stationary Gaussian measure. With an eye on applications, we show
that its mean and covariance operator can be inductively computed. Finally we
use the LDP to establish various convergence results, both averaged and quenched.