Noise-induced behaviors in neural mean field dynamics
The collective behavior of cortical neurons is strongly affected by the
presence of noise at the level of individual cells. In order to study these
phenomena in large-scale assemblies of neurons, we consider networks of
firing-rate neurons with linear intrinsic dynamics and nonlinear coupling,
belonging to a few types of cell populations and receiving noisy currents.
Asymptotic equations as the number of neurons tends to infinity (mean field
equations) are rigorously derived based on a probabilistic approach. These
equations are implicit on the probability distribution of the solutions which
generally makes their direct analysis difficult. However, in our case, the
solutions are Gaussian, and their moments satisfy a closed system of nonlinear
ordinary differential equations (ODEs), which are much easier to study than the
original stochastic network equations, and the statistics of the empirical
process uniformly converge towards the solutions of these ODEs. Based on this
description, we analytically and numerically study the influence of noise on
the collective behaviors, and compare these asymptotic regimes to simulations
of the network. We observe that the mean field equations provide an accurate
description of the solutions of the network equations for network sizes as
small as a few hundred neurons. In particular, we observe that the level of
noise in the system qualitatively modifies its collective behavior, producing
for instance synchronized oscillations of the whole network, desynchronization
of oscillating regimes, and stabilization or destabilization of stationary
solutions. These results shed new light on the role of noise in shaping the
collective dynamics of neurons, and give us clues for understanding similar
phenomena observed in biological networks.
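The moment reduction described in this abstract can be illustrated on a toy one-population version of the model (all parameter values below, and the choice of a probit sigmoid S = Phi, are illustrative assumptions, not the paper's multi-population setup). For a Gaussian state X ~ N(mu, v), the identity E[Phi(X)] = Phi(mu / sqrt(1 + v)) closes the hierarchy, yielding two coupled ODEs for the mean and variance:

```python
import math

def Phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def moment_odes(J=0.8, sigma=0.5, mu0=1.0, v0=0.0, dt=1e-3, T=20.0):
    """Euler integration of the closed Gaussian moment ODEs for a toy
    one-population firing-rate model with probit coupling (parameters are
    illustrative):
        mu' = -mu + J * E[Phi(X)],   v' = -2*v + sigma**2,
    using E[Phi(X)] = Phi(mu / sqrt(1 + v)) for X ~ N(mu, v)."""
    mu, v = mu0, v0
    for _ in range(int(T / dt)):
        mu += dt * (-mu + J * Phi(mu / math.sqrt(1.0 + v)))
        v += dt * (-2.0 * v + sigma ** 2)
    return mu, v

mu_inf, v_inf = moment_odes()
# The variance equation decouples, so at equilibrium v = sigma^2 / 2 exactly.
```

Comparing such ODE trajectories with the empirical mean and variance of a simulated network of a few hundred neurons reproduces the kind of agreement reported above.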
Mean-field equations for stochastic firing-rate neural fields with delays: Derivation and noise-induced transitions
In this manuscript we analyze the collective behavior of mean-field limits of
large-scale, spatially extended stochastic neuronal networks with delays.
Rigorously, the asymptotic regime of such systems is characterized by a very
intricate stochastic delayed integro-differential McKean-Vlasov equation that
remains impenetrable, leaving the stochastic collective dynamics of such
networks poorly understood. In order to study these macroscopic dynamics, we
analyze networks of firing-rate neurons, i.e. with linear intrinsic dynamics
and sigmoidal interactions. In that case, we prove that the solution of the
mean-field equation is Gaussian, hence characterized by its first two moments,
and that these two quantities satisfy a set of coupled delayed
integro-differential equations. These equations are similar to usual neural
field equations, and incorporate noise levels as a parameter, allowing analysis
of noise-induced transitions. We identify through bifurcation analysis several
qualitative transitions due to noise in the mean-field limit. In particular,
stabilization of spatially homogeneous solutions, synchronized oscillations,
bumps, chaotic dynamics, wave or bump splitting are exhibited and arise from
static or dynamic Turing-Hopf bifurcations. These surprising phenomena allow
further exploration of the role of noise in the nervous system.
Comment: Updated to the latest version published, and clarified the dependence in space of the Brownian motion
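A minimal numerical sketch of such delayed moment equations is the scalar delayed mean equation below (the tanh nonlinearity, constant history, and all parameter values are illustrative assumptions, not the paper's spatially extended system):

```python
import math

def delayed_mean(J=0.5, tau=0.5, dt=1e-3, T=30.0, mu0=1.0):
    """Euler scheme for the scalar delayed mean equation
        mu'(t) = -mu(t) + J * tanh(mu(t - tau)),
    with constant history mu(t) = mu0 on [-tau, 0]; a toy stand-in for the
    coupled delayed integro-differential moment equations of the abstract."""
    lag = int(round(tau / dt))
    vals = [mu0] * (lag + 1)   # vals[-1] is mu(t), vals[-(lag+1)] is mu(t - tau)
    for _ in range(int(T / dt)):
        mu, mu_del = vals[-1], vals[-(lag + 1)]
        vals.append(mu + dt * (-mu + J * math.tanh(mu_del)))
    return vals[-1]
```

With this weak gain (J < 1) the delayed feedback cannot sustain oscillations and the mean relaxes to the unique fixed point mu = 0; increasing J, tau and the noise-dependent terms is where delay-induced (Turing-Hopf-type) transitions would be probed.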
Bifurcation analysis in an associative memory model
We previously reported the chaos induced by the frustration of interaction in
a non-monotonic sequential associative memory model, and showed the chaotic
behaviors at absolute zero. We have now analyzed bifurcation in a stochastic
system, namely a finite-temperature model of the non-monotonic sequential
associative memory model. We derived order-parameter equations from the
stochastic microscopic equations. Two-parameter bifurcation diagrams obtained
from those equations show the coexistence of attractors, which do not appear at
absolute zero, and the disappearance of chaos due to the temperature effect.
Comment: 19 pages
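The coexistence of attractors as temperature varies can be illustrated with the simplest mean-field order-parameter iteration m -> tanh(beta * m), a generic Ising-like toy rather than the paper's actual order-parameter equations:

```python
import math

def iterate(beta, m0, n_iter=2000):
    """Iterate the toy order-parameter map m -> tanh(beta * m),
    where beta = 1/T plays the role of inverse temperature."""
    m = m0
    for _ in range(n_iter):
        m = math.tanh(beta * m)
    return m

# Low temperature (beta > 1): two symmetric attractors m = +/- m* coexist.
m_plus, m_minus = iterate(2.0, 0.5), iterate(2.0, -0.5)
# High temperature (beta < 1): the only attractor is m = 0.
m_hot = iterate(0.5, 0.5)
```

Scanning two parameters (e.g. temperature together with a field or loading term) and recording which attractors survive is exactly the construction of a two-parameter bifurcation diagram.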
Finite-size and correlation-induced effects in Mean-field Dynamics
The brain's activity is characterized by the interaction of a very large
number of neurons that are strongly affected by noise. However, signals often
arise at macroscopic scales integrating the effect of many neurons into a
reliable pattern of activity. In order to study such large neuronal assemblies,
one is often led to derive mean-field limits summarizing the effect of the
interaction of a large number of neurons into an effective signal. Classical
mean-field approaches consider the evolution of a deterministic variable, the
mean activity, thus neglecting the stochastic nature of neural behavior. In
this article, we build upon two recent approaches that include correlations and
higher order moments in mean-field equations, and study how these stochastic
effects influence the solutions of the mean-field equations, both in the limit
of an infinite number of neurons and for large yet finite networks. We
introduce a new model, the infinite model, which arises from both sets of
equations by a rescaling of the variables and which is invertible for
finite-size networks, hence providing equations equivalent to the previously
derived models. The study of this model allows us to understand the
qualitative behavior of such
large-scale networks. We show that, though the solutions of the deterministic
mean-field equation constitute uncorrelated solutions of the new mean-field
equations, the stability properties of limit cycles are modified by the
presence of correlations, and additional non-trivial behaviors including
periodic orbits appear when there were none in the mean field. The origin of
all these behaviors is then explored in finite-size networks where interesting
mesoscopic scale effects appear. This study leads us to show that the
infinite-size system appears as a singular limit of the network equations: for
any finite network, the dynamics differ from those of the infinite-size system.
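The size of the gap between a finite network and its mean-field limit can be made concrete with the generic 1/sqrt(N) scaling of empirical-mean fluctuations (a plain i.i.d. illustration of finite-size effects, not the paper's rescaled model):

```python
import random
import statistics

def empirical_mean_std(N, trials=300, seed=1):
    """Standard deviation of the empirical mean of N i.i.d. unit Gaussians,
    estimated over independent trials; theory predicts 1 / sqrt(N)."""
    rng = random.Random(seed)
    means = [sum(rng.gauss(0.0, 1.0) for _ in range(N)) / N
             for _ in range(trials)]
    return statistics.pstdev(means)

std_small = empirical_mean_std(100)    # about 1/sqrt(100)  = 0.1
std_large = empirical_mean_std(2500)   # about 1/sqrt(2500) = 0.02
```

This is why a few hundred neurons can already track the mean-field prediction closely, while for any finite N a residual fluctuation of order 1/sqrt(N) separates the network from its infinite-size limit.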
Propagation of chaos in neural fields
We consider the problem of the limit of bio-inspired spatially extended
neuronal networks including an infinite number of neuronal types (space
locations), with space-dependent propagation delays modeling neural fields. The
propagation of chaos property is proved in this setting under mild assumptions
on the neuronal dynamics, valid for most models used in neuroscience, in a
mesoscopic limit, the neural-field limit, in which we can resolve the fine
spatial structure of neuronal activity and where averaging effects
occur. The mean-field equations obtained are of a new type: they take the form
of well-posed infinite-dimensional delayed integro-differential equations with
a nonlocal mean-field term and a singular spatio-temporal Brownian motion. We
also show how these intricate equations can be used in practice to uncover
mathematically the precise mesoscopic dynamics of the neural field in a
particular model where the mean-field equations exactly reduce to deterministic
nonlinear delayed integro-differential equations. These results have several
theoretical implications in neuroscience, which we review in the discussion.
Comment: Updated to correct an erroneous suggestion of extension of the results in Appendix B, and to clarify some measurability questions in the proof of Theorem
Noise-induced synchronization and anti-resonance in excitable systems: implications for information processing in Parkinson's disease and Deep Brain Stimulation
We study the statistical physics of a surprising phenomenon arising in large
networks of excitable elements in response to noise: while at low noise,
solutions remain in the vicinity of the resting state and large-noise solutions
show asynchronous activity, the network displays orderly, perfectly
synchronized periodic responses at intermediate levels of noise. We show that
this phenomenon is fundamentally stochastic and collective in nature. Indeed,
for noise and coupling within specific ranges, an asymmetry in the transition
rates between a resting and an excited regime progressively builds up, leading
to an increase in the fraction of excited neurons eventually triggering a chain
reaction associated with a macroscopic synchronized excursion and a collective
return to rest where this process starts afresh, thus yielding the observed
periodic synchronized oscillations. We further uncover a novel anti-resonance
phenomenon: noise-induced synchronized oscillations disappear when the system
is driven by periodic stimulation with frequency within a specific range. In
that anti-resonance regime, the system is optimal for measures of information
capacity. This observation provides a new hypothesis accounting for the
efficiency of Deep Brain Stimulation therapies in Parkinson's disease, a
neurodegenerative disease characterized by an increased synchronization of
brain motor circuits. We further discuss the universality of these phenomena in
the class of stochastic networks of excitable elements with confining coupling,
and illustrate this universality by analyzing various classical models of
neuronal networks. Altogether, these results uncover some universal mechanisms
supporting a regularizing impact of noise in excitable systems, reveal a novel
anti-resonance phenomenon in these systems, and propose a new hypothesis for
the efficiency of high-frequency stimulation in Parkinson's disease.
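The transition from rest to noise-induced collective oscillations can be probed numerically in a standard network of excitable elements. The active-rotator model below is one classical choice of such a network (all parameter values are illustrative assumptions), simulated with Euler-Maruyama and summarized by the Kuramoto order parameter:

```python
import math
import random

def order_parameter(thetas):
    """Kuramoto order parameter r in [0, 1]; r = 1 means perfect synchrony."""
    c = sum(math.cos(t) for t in thetas)
    s = sum(math.sin(t) for t in thetas)
    return math.hypot(c, s) / len(thetas)

def simulate(sigma, N=80, K=1.0, omega=0.9, dt=1e-2, T=30.0, seed=0):
    """Euler-Maruyama for noisy, globally coupled active rotators
        dtheta_i = (omega - sin(theta_i)
                    + (K/N) * sum_j sin(theta_j - theta_i)) dt + sigma dW_i,
    excitable for omega < 1 (stable rest state at theta = arcsin(omega)).
    The mean-field coupling uses sin(a - b) = sin(a)cos(b) - cos(a)sin(b)."""
    rng = random.Random(seed)
    th = [math.asin(omega)] * N            # start every unit at rest
    sq = sigma * math.sqrt(dt)
    for _ in range(int(T / dt)):
        C = sum(math.cos(t) for t in th) / N
        S = sum(math.sin(t) for t in th) / N
        th = [t + dt * (omega - math.sin(t)
                        + K * (S * math.cos(t) - C * math.sin(t)))
              + sq * rng.gauss(0.0, 1.0)
              for t in th]
    return order_parameter(th)
```

Sweeping sigma and recording the time-averaged order parameter (and the mean oscillation frequency) exposes the three regimes described above: quiescence at low noise, synchronized periodic firing at intermediate noise, and asynchronous activity at large noise.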
Intrinsic activity in the fly brain gates visual information during behavioral choices
The small insect brain is often described as an input/output system that executes reflex-like behaviors. It can also initiate neural activity and behaviors intrinsically, seen as spontaneous behaviors, different arousal states and sleep. However, less is known about how intrinsic activity in neural circuits affects sensory information processing in the insect brain and variability in behavior. Here, by simultaneously monitoring Drosophila's behavioral choices and brain activity in a flight simulator system, we identify intrinsic activity that is associated with the act of selecting between visual stimuli. We recorded neural output (multiunit action potentials and local field potentials) in the left and right optic lobes of a tethered flying Drosophila, while its attempts to follow visual motion (yaw torque) were measured by a torque meter. We show that when facing competing motion stimuli on its left and right, Drosophila typically generate large torque responses that flip from side to side. The delayed onset (0.1-1 s) and spontaneous switch-like dynamics of these responses, and the fact that the flies sometimes oppose the stimuli by flying straight, make this behavior different from the classic steering reflexes. Drosophila thus seem to choose one stimulus at a time and attempt to rotate toward its direction. With this behavior, the neural output of the optic lobes alternates, being augmented on the side chosen for body rotation and suppressed on the opposite side, even though the visual input to the fly eyes stays the same. Thus, the flow of information from the fly eyes is gated intrinsically. Such modulation can be noise-induced or intentional, with one possibility being that the fly brain highlights chosen information while ignoring the irrelevant, similar to what we know to occur in higher animals.
One-Dimensional Population Density Approaches to Recurrently Coupled Networks of Neurons with Noise
Mean-field systems have been previously derived for networks of coupled,
two-dimensional integrate-and-fire neurons such as the Izhikevich, adaptive
exponential (AdEx) and quartic integrate-and-fire (QIF) models, among others.
Unfortunately, the mean-field systems have a degree of frequency error and the
networks analyzed often do not include noise when there is adaptation. Here, we
derive a one-dimensional partial differential equation (PDE) approximation for
the marginal voltage density under a first order moment closure for coupled
networks of integrate-and-fire neurons with white noise inputs. The PDE has
substantially less frequency error than the mean-field system, and provides a
great deal more information, at the cost of analytical tractability. The
convergence properties of the mean-field system in the low noise limit are
elucidated. A novel method for the analysis of the stability of the
asynchronous tonic firing solution is also presented and implemented. Unlike
previous attempts at stability analysis with these network types, information
about the marginal densities of the adaptation variables is used. This method
can in principle be applied to other systems with nonlinear partial
differential equations.
Comment: 26 pages, 6 figures
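A minimal instance of the population-density approach is the 1-D Fokker-Planck equation for a noisy leaky (OU-type) voltage without threshold or adaptation, discretized with explicit centered finite differences. Grid sizes and parameters below are illustrative; the scheme in the paper additionally handles spiking, reset and the adaptation marginals:

```python
def ou_density(sigma=1.0, L=4.0, nx=101, dt=1e-3, T=8.0):
    """Explicit finite-difference solver for the 1-D Fokker-Planck equation
        dp/dt = d/dv (v * p) + (sigma**2 / 2) * d2p/dv2
    (leaky drift -v, white-noise input, no spike threshold), starting from a
    delta-like density at v = 0 on the truncated domain [-L, L]."""
    dx = 2.0 * L / (nx - 1)
    v = [-L + i * dx for i in range(nx)]
    p = [0.0] * nx
    p[nx // 2] = 1.0 / dx                  # unit mass concentrated at v = 0
    D = sigma ** 2 / 2.0                   # diffusion coefficient
    for _ in range(int(T / dt)):
        new = [0.0] * nx                   # density pinned to 0 at the far boundaries
        for i in range(1, nx - 1):
            drift = (v[i + 1] * p[i + 1] - v[i - 1] * p[i - 1]) / (2.0 * dx)
            diff = D * (p[i + 1] - 2.0 * p[i] + p[i - 1]) / dx ** 2
            new[i] = p[i] + dt * (drift + diff)
        p = new
    return v, p
```

The stationary density here is Gaussian with variance sigma^2 / 2, so checking the numerical mass and variance against these values is a quick sanity test before adding threshold, reset and adaptation terms.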