Mean-field equations for stochastic firing-rate neural fields with delays: Derivation and noise-induced transitions
In this manuscript we analyze the collective behavior of mean-field limits of
large-scale, spatially extended stochastic neuronal networks with delays.
Rigorously, the asymptotic regime of such systems is characterized by a very
intricate stochastic delayed integro-differential McKean-Vlasov equation that
remains impenetrable, leaving the stochastic collective dynamics of such
networks poorly understood. In order to study these macroscopic dynamics, we
analyze networks of firing-rate neurons, i.e. with linear intrinsic dynamics
and sigmoidal interactions. In that case, we prove that the solution of the
mean-field equation is Gaussian, hence characterized by its first two moments,
and that these two quantities satisfy a set of coupled delayed
integro-differential equations. These equations are similar to usual neural
field equations, and incorporate noise levels as a parameter, allowing analysis
of noise-induced transitions. We identify through bifurcation analysis several
qualitative transitions due to noise in the mean-field limit. In particular,
stabilization of spatially homogeneous solutions, synchronized oscillations,
bumps, chaotic dynamics, wave or bump splitting are exhibited and arise from
static or dynamic Turing-Hopf bifurcations. These surprising phenomena allow further exploration of the role of noise in the nervous system.
Comment: Updated to the latest published version, and clarified the space dependence of the Brownian motion
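The Gaussian reduction described above can be made concrete in a minimal sketch. Assuming a single-population, space-free caricature of the model (coupling J, delay tau, noise level sigma, and sigmoid slope theta are all illustrative parameters, not the paper's), with a probit sigmoid S(x) = Phi(x/theta) the Gaussian expectation has the exact closed form E[S(x)] = Phi(mu / sqrt(theta^2 + v)), so the mean mu and variance v obey a small set of delayed ODEs that can be integrated directly:

```python
import math

def phi(z):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def simulate(J=2.0, tau=1.0, sigma=0.5, theta=1.0, dt=0.01, T=50.0):
    """Euler integration of the delayed moment equations
         mu' = -mu + J * E[S(x(t - tau))],   v' = -2 v + sigma^2,
    with S(x) = Phi(x/theta) and x Gaussian, so that
         E[S(x)] = Phi(mu / sqrt(theta^2 + v))   (exact Gaussian identity)."""
    n_delay = int(round(tau / dt))
    mu, v = 0.1, 0.0
    hist = [mu] * (n_delay + 1)    # hist[0] holds mu(t - tau)
    vhist = [v] * (n_delay + 1)
    traj = []
    for _ in range(int(T / dt)):
        mu_d, v_d = hist[0], vhist[0]
        drive = phi(mu_d / math.sqrt(theta**2 + v_d))
        mu += dt * (-mu + J * drive)
        v += dt * (-2.0 * v + sigma**2)
        hist = hist[1:] + [mu]
        vhist = vhist[1:] + [v]
        traj.append((mu, v))
    return traj
```

In this scalar caricature the variance simply relaxes to sigma^2/2, while the delay enters the mean equation through the history buffer; the noise level appears in the deterministic moment equations as a parameter, which is exactly what makes noise-induced transitions accessible to bifurcation analysis.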
Linear stability analysis of retrieval state in associative memory neural networks of spiking neurons
We study associative memory neural networks of the Hodgkin-Huxley type of
spiking neurons in which multiple periodic spatio-temporal patterns of spike
timing are memorized as limit-cycle-type attractors. In encoding the
spatio-temporal patterns, we assume the spike-timing-dependent synaptic
plasticity with an asymmetric time window. Analysis of the periodic solution of the retrieval state reveals that if the area of the negative part of the time window equals that of the positive part, then crosstalk among the encoded patterns vanishes. A phase transition due to the loss of stability of the periodic solution is observed when we assume a fast alpha function for the direct interaction among neurons. In order to evaluate the critical point of this
phase transition, we employ Floquet theory, in which the stability problem for the infinite number of spiking neurons interacting through the alpha function is reduced to an eigenvalue problem for a finite-size matrix. Numerical integration of the single-body dynamics yields the explicit entries of this matrix, which enables us to determine the critical point of the phase transition with a high degree of precision.
Comment: Accepted for publication in Phys. Rev.
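The Floquet machinery invoked above amounts to computing eigenvalues of a monodromy matrix. As a generic illustration (a damped Mathieu oscillator with illustrative parameters, not the paper's spiking-network matrix), the sketch below propagates the two canonical initial conditions over one forcing period with RK4, assembles the monodromy matrix from the resulting columns, and reads off the Floquet multipliers; a periodic solution is stable when all multipliers lie inside the unit circle:

```python
import math

def monodromy_multipliers(gamma=0.5, delta=1.0, eps=0.1, n_steps=2000):
    """Floquet multipliers of  x'' + gamma x' + (delta + eps cos t) x = 0.
    Integrate the 2x2 fundamental matrix over one forcing period T = 2*pi,
    then take the eigenvalues of the monodromy matrix M."""
    T = 2.0 * math.pi
    dt = T / n_steps

    def rhs(t, y):
        # y = (x, x'); linear, time-periodic vector field
        x, xd = y
        return (xd, -gamma * xd - (delta + eps * math.cos(t)) * x)

    def rk4_step(t, y):
        k1 = rhs(t, y)
        k2 = rhs(t + dt/2, (y[0] + dt/2*k1[0], y[1] + dt/2*k1[1]))
        k3 = rhs(t + dt/2, (y[0] + dt/2*k2[0], y[1] + dt/2*k2[1]))
        k4 = rhs(t + dt, (y[0] + dt*k3[0], y[1] + dt*k3[1]))
        return (y[0] + dt/6*(k1[0] + 2*k2[0] + 2*k3[0] + k4[0]),
                y[1] + dt/6*(k1[1] + 2*k2[1] + 2*k3[1] + k4[1]))

    # Propagate the canonical initial conditions -> columns of M
    cols = []
    for y0 in [(1.0, 0.0), (0.0, 1.0)]:
        t, y = 0.0, y0
        for _ in range(n_steps):
            y = rk4_step(t, y)
            t += dt
        cols.append(y)
    (a, c), (b, d) = cols           # M = [[a, b], [c, d]]
    tr, det = a + d, a * d - b * c
    disc = tr * tr - 4.0 * det
    if disc >= 0.0:                 # real multipliers
        r = math.sqrt(disc)
        mults = [abs((tr + r) / 2.0), abs((tr - r) / 2.0)]
    else:                           # complex pair, common modulus sqrt(det)
        mults = [math.sqrt(det)] * 2
    return mults, det
```

Liouville's formula gives det M = exp(-gamma * T) here, which is a convenient correctness check on the numerical integration.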
Desynchronization in diluted neural networks
The dynamical behaviour of a weakly diluted fully-inhibitory network of
pulse-coupled spiking neurons is investigated. Upon increasing the coupling
strength, a transition from a regular to a stochastic-like regime is observed. In the weak-coupling phase, periodic dynamics is rapidly approached, with all
neurons firing with the same rate and mutually phase-locked. The
strong-coupling phase is characterized by an irregular pattern, even though the
maximum Lyapunov exponent is negative. The paradox is resolved by drawing an analogy with the phenomenon of ``stable chaos'', i.e. by observing that the stochastic-like behaviour is limited to a transient that is exponentially long in the system size. Remarkably, the transient dynamics turns out to be stationary.
Comment: 11 pages, 13 figures, submitted to Phys. Rev.
Hopf Bifurcation and Chaos in Tabu Learning Neuron Models
In this paper, we consider the nonlinear dynamical behaviors of some tabu learning neuron models. We first consider a tabu learning single-neuron model.
By choosing the memory decay rate as a bifurcation parameter, we prove that
Hopf bifurcation occurs in the neuron. The stability of the bifurcating
periodic solutions and the direction of the Hopf bifurcation are determined by
applying the normal form theory. We give a numerical example to verify the
theoretical analysis. Then, we demonstrate the chaotic behavior in such a
neuron with sinusoidal external input, via computer simulations. Finally, we
study the chaotic behaviors in tabu learning two-neuron models, with linear and quadratic proximity functions respectively.
Comment: 14 pages, 13 figures, accepted by International Journal of Bifurcation and Chaos
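The normal-form analysis referenced above predicts a specific signature at a supercritical Hopf bifurcation. As a minimal, generic illustration (the supercritical Hopf normal form, not the tabu learning model itself), the sketch below shows it numerically: below the bifurcation the equilibrium is stable, and past it a stable limit cycle of radius sqrt(mu) appears:

```python
import math

def hopf_amplitude(mu, dt=0.001, T=100.0):
    """Supercritical Hopf normal form in Cartesian coordinates:
         x' = mu x - y - x (x^2 + y^2),
         y' = x + mu y - y (x^2 + y^2).
    For mu > 0 the stable limit cycle has radius sqrt(mu);
    for mu < 0 trajectories decay to the origin.
    Returns the radius reached after time T (Euler integration)."""
    x, y = 0.1, 0.0
    for _ in range(int(T / dt)):
        r2 = x * x + y * y
        dx = mu * x - y - x * r2
        dy = x + mu * y - y * r2
        x += dt * dx
        y += dt * dy
    return math.hypot(x, y)
```

Determining the direction of the bifurcation in a concrete model, as the abstract does, amounts to computing the sign of the cubic coefficient in this normal form from the model's nonlinearities.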
Emergence of Synchronous Oscillations in Neural Networks Excited by Noise
The presence of noise in nonlinear dynamical systems can play a constructive role, increasing the degree of order and coherence or improving the performance of the system. Examples of this positive influence in a biological system are the impulse transmission in neurons and the synchronization of a neural network. By numerically integrating the Fokker-Planck equation, we exhibit a self-induced synchronized oscillation. Such an oscillatory state appears in a neural network coupled with a feedback term, when the system is excited by noise and the noise strength lies within a certain range.
Comment: 12 pages, 18 figures
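The numerical Fokker-Planck integration mentioned above can be sketched in a minimal form. The solver below treats a single Ornstein-Uhlenbeck population density with illustrative parameters; in the network setting of the abstract, the feedback term would make the drift depend on a moment of the density itself, but the discretization is the same:

```python
import math

def fokker_planck_ou(mu=1.0, D=0.25, xmin=-4.0, xmax=6.0,
                     nx=161, dt=1e-3, T=8.0):
    """Explicit (FTCS) integration of the 1D Fokker-Planck equation
         dP/dt = d/dx[(x - mu) P] + D d2P/dx2
    for an Ornstein-Uhlenbeck process; the stationary law is N(mu, D).
    Returns (total mass, mean, variance) of the final density."""
    dx = (xmax - xmin) / (nx - 1)
    x = [xmin + i * dx for i in range(nx)]
    # narrow Gaussian initial condition centred at 0, normalised to mass 1
    s0 = 0.2
    P = [math.exp(-0.5 * (xi / s0) ** 2) for xi in x]
    Z = sum(P) * dx
    P = [p / Z for p in P]
    for _ in range(int(T / dt)):
        Pn = P[:]
        for i in range(1, nx - 1):
            # central differences for drift and diffusion terms
            drift = ((x[i+1] - mu) * P[i+1] - (x[i-1] - mu) * P[i-1]) / (2*dx)
            diff = D * (P[i+1] - 2*P[i] + P[i-1]) / dx**2
            Pn[i] = P[i] + dt * (drift + diff)
        Pn[0] = Pn[-1] = 0.0    # far boundaries, where the density vanishes
        P = Pn
    mass = sum(P) * dx
    mean = sum(xi * p for xi, p in zip(x, P)) * dx
    var = sum((xi - mean) ** 2 * p for xi, p in zip(x, P)) * dx
    return mass, mean, var
```

The time step respects the explicit stability bound D*dt/dx^2 <= 1/2, and the central-difference drift telescopes, so probability mass is conserved to high accuracy; a noise-induced oscillation in the full model would show up as a periodically breathing density rather than this relaxation to a stationary Gaussian.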
Limits and dynamics of stochastic neuronal networks with random heterogeneous delays
Realistic networks display heterogeneous transmission delays. We analyze here the limits of large stochastic multi-population networks with stochastic coupling and random interconnection delays. We show that, depending on the nature of the delay distributions, a quenched or averaged propagation of chaos
takes place in these networks, and that the network equations converge towards
a delayed McKean-Vlasov equation with distributed delays. Our approach is
mostly fitted to neuroscience applications. We instantiate in particular a classical neuronal model, the Wilson-Cowan system, and show that the
obtained limit equations have Gaussian solutions whose mean and standard
deviation satisfy a closed set of coupled delay differential equations in which
the distribution of delays and the noise levels appear as parameters. This
allows us to uncover precisely the effects of noise, delays and coupling on the
dynamics of such heterogeneous networks, in particular their role in the
emergence of synchronized oscillations. We show in several examples that not only the average delay but also its dispersion governs the dynamics of such networks.
Comment: Corrected misprint (useless stopping time) in proof of Lemma 1 and clarified a regularity hypothesis (Remark 1)
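A minimal sketch of how a distribution of delays enters such limit equations, using a scalar Wilson-Cowan-type rate equation with a discrete delay distribution (the coupling J, the delay values, and their weights are illustrative, not the paper's full mean/standard-deviation system):

```python
import math

def wilson_cowan_distributed(J=1.5, delays=(0.5, 1.5), weights=(0.5, 0.5),
                             dt=0.01, T=100.0):
    """Euler integration of a scalar rate equation with distributed delays:
         u'(t) = -u(t) + J * sum_k w_k * S(u(t - tau_k)),
    with S(x) = 1/(1 + exp(-x)). Returns the trajectory of u."""
    offsets = [int(round(tau / dt)) for tau in delays]
    max_d = max(offsets)
    S = lambda x: 1.0 / (1.0 + math.exp(-x))
    hist = [0.2] * (max_d + 1)    # constant initial history on [-max delay, 0]
    traj = []
    for _ in range(int(T / dt)):
        u = hist[-1]
        # delayed drive averaged over the delay distribution
        drive = sum(w * S(hist[-1 - off]) for w, off in zip(weights, offsets))
        u_new = u + dt * (-u + J * drive)
        hist.append(u_new)
        traj.append(u_new)
    return traj
```

With these parameters the delayed feedback is weak enough that the trajectory settles onto the fixed point u* = J S(u*); in the oscillatory regimes studied in the abstract, both the delay values and their spread shift where that equilibrium loses stability.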
Phase-locking in weakly heterogeneous neuronal networks
We examine analytically the existence and stability of phase-locked states in
a weakly heterogeneous neuronal network. We consider a model of N neurons with
all-to-all synaptic coupling where the heterogeneity is in the firing frequency
or intrinsic drive of the neurons. We consider both inhibitory and excitatory
coupling. We derive the conditions under which stable phase-locking is
possible. In homogeneous networks, many different periodic phase-locked states
are possible. Their stability depends on the dynamics of the neuron and the
coupling. For weak heterogeneity, the phase-locked states are perturbed from
the homogeneous states and can remain stable if their homogeneous counterparts are stable. For sufficiently strong heterogeneity, phase-locked solutions either lose stability or are destroyed completely. We analyze the possible states the network can take when phase-locking is broken.
Comment: RevTeX, 27 pages, 3 figures
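The existence condition for locking can be illustrated with a hedged phase reduction. The sketch below uses a Kuramoto-type phase-difference equation as a stand-in for the paper's synaptically coupled neurons: two oscillators with frequency mismatch delta_omega and coupling K lock when |delta_omega| <= 2K, and otherwise their phase difference drifts at the classical rate sqrt(delta_omega^2 - (2K)^2):

```python
import math

def phase_lock(delta_omega, K, dt=0.01, T=200.0):
    """Phase difference phi = theta1 - theta2 of two coupled phase
    oscillators obeys  phi' = delta_omega - 2 K sin(phi).
    A locked fixed point exists iff |delta_omega| <= 2K.
    Returns the mean drift rate of phi over the second half of the run
    (approximately 0 when phase-locked)."""
    phi = 0.5
    half = int(T / dt / 2)
    phi_mid = phi
    for n in range(2 * half):
        if n == half:
            phi_mid = phi
        phi += dt * (delta_omega - 2.0 * K * math.sin(phi))
    return abs(phi - phi_mid) / (half * dt)
```

Heterogeneity in the intrinsic drives plays the role of delta_omega here: widening it past the coupling-set threshold destroys the locked solution, mirroring the breakdown of phase-locking analyzed in the abstract.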
Collective Almost Synchronization in Complex Networks
This work introduces the phenomenon of Collective Almost Synchronization
(CAS), which describes a universal way in which patterns can appear in complex
networks even for small coupling strengths. The CAS phenomenon appears due to
the existence of an approximately constant local mean field and is
characterized by having nodes with trajectories evolving around periodic stable
orbits. Common intuition based on statistical reasoning would lead one to interpret the appearance of a constant local mean field as a consequence of the behavior of each node being uncorrelated with that of the others. Contrary to this intuition, we show that various well-known weaker
forms of synchronization (almost, time-lag, phase synchronization, and
generalized synchronization) appear as a result of the onset of an almost
constant local mean field. If memory is formed in the brain by minimising the coupling strength among neurons while maximising the number of possible patterns, then the CAS phenomenon is a plausible explanation for it.
Comment: 3 figures
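The nearly constant local mean field at the heart of CAS can be probed numerically. The sketch below uses a random network of weakly coupled logistic maps as an illustrative substitute for the paper's node dynamics: even while each node remains chaotic, the mean field it receives from its neighbours fluctuates far less than the node's own trajectory:

```python
import random

def local_mean_field_fluctuations(N=150, k=40, eps=0.1,
                                  t_trans=200, t_meas=300, seed=0):
    """Coupled logistic maps on a random k-neighbour network:
         x_i <- (1 - eps) f(x_i) + eps * mean_{j in N(i)} f(x_j),
    with f(x) = 4 x (1 - x). Returns (temporal std of node 0's local
    mean field, temporal std of node 0's own trajectory): a nearly
    constant local mean field amid chaotic individual dynamics is the
    CAS signature."""
    rng = random.Random(seed)
    f = lambda x: 4.0 * x * (1.0 - x)
    # each node receives input from k randomly chosen other nodes
    nbrs = [rng.sample([j for j in range(N) if j != i], k) for i in range(N)]
    x = [rng.random() for _ in range(N)]
    field0, node0 = [], []
    for t in range(t_trans + t_meas):
        fx = [f(xi) for xi in x]
        fields = [sum(fx[j] for j in nbrs[i]) / k for i in range(N)]
        x = [(1.0 - eps) * fx[i] + eps * fields[i] for i in range(N)]
        if t >= t_trans:            # measure only after the transient
            field0.append(fields[0])
            node0.append(x[0])
    def std(v):
        m = sum(v) / len(v)
        return (sum((u - m) ** 2 for u in v) / len(v)) ** 0.5
    return std(field0), std(node0)
```

At this weak coupling the maps stay desynchronized, so the averaging over neighbours suppresses the fluctuations of the local field; that approximately constant field is what allows each node's trajectory to organise around a periodic orbit in the CAS regime.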