A constructive mean-field analysis of multi-population neural networks with random synaptic weights and stochastic inputs
We deal with the problem of bridging the gap between two scales in neuronal
modeling. At the first (microscopic) scale, neurons are considered individually
and their behavior described by stochastic differential equations that govern
the time variations of their membrane potentials. They are coupled by synaptic
connections acting on their resulting activity, a nonlinear function of their
membrane potential. At the second (mesoscopic) scale, interacting populations
of neurons are described individually by similar equations. The equations
describing the dynamical and the stationary mean field behaviors are considered
as functional equations on a set of stochastic processes. Using this new point
of view allows us to prove that these equations are well-posed on any finite
time interval and to provide a constructive method for effectively computing
their unique solution. This method is proved to converge to the unique solution
and we characterize its complexity and convergence rate. We also provide
partial results for the stationary problem on infinite time intervals. These
results shed new light on neural mass models such as that of Jansen and
Rit \cite{jansen-rit:95}: their dynamics appear as a coarse approximation of
the much richer dynamics that emerges from our analysis. Our numerical
experiments confirm that the framework we propose and the numerical methods we
derive from it provide a new and powerful tool for the exploration of neural
behaviors at different scales.
Comment: 55 pages, 4 figures, to appear in "Frontiers in Neuroscience".
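To make the constructive flavour of the method concrete, here is a minimal Python sketch of a fixed-point iteration of this kind, under strong simplifying assumptions (a single population with linear intrinsic dynamics and an illustrative probit sigmoid; all parameters are hypothetical, and this is not the paper's algorithm):

```python
# Minimal sketch: iterate the mean-field map on a time grid until the
# effective input m(t) = E[S(V_t)] stops changing. With linear intrinsic
# dynamics, V_t stays Gaussian, so only its mean and variance are propagated.
import numpy as np
from scipy.special import ndtr  # standard normal CDF, used as the sigmoid S

tau, J, sigma = 1.0, 0.8, 0.5        # assumed leak time, coupling, noise level
T, dt = 10.0, 0.01
t = np.arange(0.0, T, dt)
m = np.zeros_like(t)                 # initial guess for E[S(V_t)]

for _ in range(100):
    mu = np.zeros_like(t)            # mean of V_t
    nu = np.zeros_like(t)            # variance of V_t
    for k in range(len(t) - 1):      # explicit Euler for the Gaussian moments
        mu[k + 1] = mu[k] + dt * (-mu[k] / tau + J * m[k])
        nu[k + 1] = nu[k] + dt * (-2.0 * nu[k] / tau + sigma**2)
    # For S = Phi, the Gaussian expectation has the closed form below.
    m_new = ndtr(mu / np.sqrt(1.0 + nu))
    if np.max(np.abs(m_new - m)) < 1e-8:
        break                        # fixed point of the mean-field map reached
    m = m_new
```

On a finite time interval an iteration of this kind converges to a unique limit, which mirrors the well-posedness and convergence results the abstract describes.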
Stochastic firing rate models
We review a recent approach to the mean-field limits in neural networks that
takes into account the stochastic nature of input current and the uncertainty
in synaptic coupling. This approach was proved to be a rigorous limit of the
network equations in a general setting, and we express here the results in a
more customary and simpler framework. We propose a heuristic argument to derive
these equations, providing a more intuitive understanding of their origin. These
equations are characterized by a strong coupling between the different moments
of the solutions. We analyse the equations, present an algorithm to simulate
their solutions, and investigate them numerically. In particular, we build a
bridge between these equations and the approach of Sompolinsky and
collaborators (1988, 1990), and show how the coupling between the mean and the
covariance function deviates from customary approaches.
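As a rough illustration of that moment coupling, equations of this type often take a form like the following (the notation and the Gaussian closure are illustrative, not the paper's exact equations):

```latex
\begin{aligned}
\frac{d\mu(t)}{dt} &= -\frac{\mu(t)}{\tau} + \bar{J}\,\mathbb{E}\big[S(V_t)\big],\\
\frac{d\nu(t)}{dt} &= -\frac{2\,\nu(t)}{\tau} + \sigma^2 + \sigma_J^2\,\mathbb{E}\big[S(V_t)^2\big],
\qquad V_t \sim \mathcal{N}\big(\mu(t),\,\nu(t)\big).
\end{aligned}
```

The mean equation depends on the variance through the Gaussian expectation of S, while the uncertainty in the synaptic coupling (the sigma_J^2 term) injects the second moment of the activity into the variance, so the two moments cannot be integrated separately.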
Spontaneous and stimulus-induced coherent states of critically balanced neuronal networks
How the information microscopically processed by individual neurons is
integrated and used in organizing the behavior of an animal is a central
question in neuroscience. The coherence of neuronal dynamics over different
scales has been suggested as a clue to the mechanisms underlying this
integration. Balanced excitation and inhibition may amplify microscopic
fluctuations to a macroscopic level, thus providing a mechanism for generating
coherent multiscale dynamics. Previous theories of brain dynamics, however,
were restricted to cases in which inhibition dominated excitation and
suppressed fluctuations in the macroscopic population activity. In the present
study, we investigate the dynamics of neuronal networks at a critical point
between excitation-dominant and inhibition-dominant states. In these networks,
the microscopic fluctuations are amplified by the strong excitation and
inhibition to drive the macroscopic dynamics, while the macroscopic dynamics
determine the statistics of the microscopic fluctuations. Developing a novel
type of mean-field theory applicable to this class of interscale interactions,
we show that the amplification mechanism generates spontaneous, irregular
macroscopic rhythms similar to those observed in the brain. Through the same
mechanism, microscopic inputs to a small number of neurons effectively entrain
the dynamics of the whole network. These network dynamics undergo a
probabilistic transition to a coherent state, as the magnitude of either the
balanced excitation and inhibition or the external inputs is increased. Our
mean-field theory successfully predicts the behavior of this model.
Furthermore, we numerically demonstrate that the coherent dynamics can be used
for state-dependent read-out of information from the network. These results
show a novel form of neuronal information processing that connects neuronal
dynamics on different scales.
Comment: 20 pages, 12 figures (main text) + 23 pages, 6 figures (appendix); some
of the results have been removed in this revision to reduce the volume. See the
previous version for more results.
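A toy numerical analogue of the amplification mechanism can be sketched as follows; this is a generic random balanced rate network, not the paper's model, and every parameter is an illustrative assumption:

```python
# Toy sketch of balance-driven amplification: zero-mean couplings of size
# O(1/sqrt(N)) make excitation and inhibition cancel on average, so
# microscopic fluctuations, not the mean drive, shape the dynamics.
import numpy as np

rng = np.random.default_rng(0)
N, g, dt, steps = 1000, 4.0, 0.05, 4000
W = g * rng.standard_normal((N, N)) / np.sqrt(N)   # balanced random weights
x = 0.1 * rng.standard_normal(N)                   # initial rates

pop = np.empty(steps)
for s in range(steps):
    x += dt * (-x + W @ np.tanh(x))   # rate dynamics dx/dt = -x + W phi(x)
    pop[s] = np.tanh(x).mean()        # macroscopic (population) activity
# For strong coupling g the population activity fluctuates irregularly
# rather than settling to a fixed point: a crude stand-in for the
# spontaneous macroscopic rhythms described in the abstract.
```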
Modeling networks of spiking neurons as interacting processes with memory of variable length
We consider a new class of non-Markovian processes with a countable number of
interacting components, both in discrete and continuous time. Each component is
represented by a point process indicating if it has a spike or not at a given
time. The system evolves as follows. For each component, the rate (in
continuous time) or the probability (in discrete time) of having a spike
depends on the entire time evolution of the system since the last spike time of
the component. In discrete time this class of systems extends in a non-trivial
way both Spitzer's interacting particle systems, which are Markovian, and
Rissanen's stochastic chains with memory of variable length, which have a finite
state space. In continuous time they can be seen as a variable-length-memory
version, in Rissanen's sense, of the class of self-exciting point processes
also called "Hawkes processes", but with infinitely many
components. These features make this class a good candidate to describe the
time evolution of networks of spiking neurons. In this article we present a
critical reader's guide to recent papers dealing with this class of models,
both in discrete and in continuous time. We briefly sketch results concerning
perfect simulation and existence issues, de-correlation between successive
interspike intervals, the long-time behavior of finite non-excited systems, and
propagation of chaos in mean-field systems.
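In discrete time, the dynamics just described can be sketched in a few lines; the sizes, weights, and spiking probability phi below are illustrative assumptions, not a model taken from the papers under review:

```python
# Hedged sketch of the discrete-time dynamics described above: each neuron
# spikes with a probability that depends on the inputs accumulated since its
# own last spike (memory of variable length).
import numpy as np

rng = np.random.default_rng(1)
N, T = 50, 1000
W = rng.normal(0.0, 0.3, (N, N)) / np.sqrt(N)   # synaptic weights
np.fill_diagonal(W, 0.0)

def phi(u):
    # Spiking probability as a function of accumulated input, clipped to [0, 1].
    return np.clip(0.05 + u, 0.0, 1.0)

U = np.zeros(N)                       # input accumulated since the last spike
spikes = np.zeros((T, N), dtype=int)
for t in range(T):
    X = (rng.random(N) < phi(U)).astype(int)     # spike indicators at time t
    spikes[t] = X
    U = np.where(X == 1, 0.0, U + W @ X)         # reset on spike, else integrate
```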
The complexity of dynamics in small neural circuits
Mean-field theory is a powerful tool for studying large neural networks.
However, when the system is composed of a few neurons, macroscopic differences
between the mean-field approximation and the real behavior of the network can
arise. Here we study the dynamics of a small firing-rate network
with excitatory and inhibitory populations, in terms of local and global
bifurcations of the neural activity. Our approach is analytically tractable in
many respects, and sheds new light on the finite-size effects of the system. In
particular, we focus on the formation of multiple branching solutions of the
neural equations through spontaneous symmetry-breaking, since this phenomenon
considerably increases the complexity of the dynamical behavior of the network.
For these reasons, branching points may reveal important mechanisms through
which neurons interact and process information, which are not accounted for by
the mean-field approximation.
Comment: 34 pages, 11 figures. Supplementary materials added; colors of
figures 8 and 9 fixed; results unchanged.
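The flavour of such branching can be seen in a deliberately minimal example (not the paper's network): two identical rate units with mutual inhibition undergo a pitchfork bifurcation in which the exchange-symmetric state gives way to asymmetric, winner-take-all branches.

```python
# Minimal symmetry-breaking sketch: for |J| > 1 the symmetric fixed point
# x1 = x2 = 0 coexists with a pair of asymmetric branches.
import numpy as np
from scipy.optimize import fsolve

def rhs(x, J):
    # dx_i/dt = -x_i + tanh(J * x_other); J < 0 means mutual inhibition
    return np.array([-x[0] + np.tanh(J * x[1]),
                     -x[1] + np.tanh(J * x[0])])

for J in (-0.5, -1.5):
    sym = fsolve(rhs, [0.0, 0.0], args=(J,))
    asym = fsolve(rhs, [0.9, -0.9], args=(J,))
    print(f"J={J}: symmetric {sym.round(3)}, asymmetric {asym.round(3)}")
# For |J| > 1 the asymmetric initial guess converges to a genuinely
# asymmetric solution; for |J| < 1 it collapses back to the symmetric state.
```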
Front propagation in stochastic neural fields
We analyse the effects of extrinsic multiplicative noise on front propagation in a scalar neural field with excitatory connections. Using a separation of time scales, we represent the fluctuating front in terms of a diffusive-like displacement (wandering) of the front from its uniformly translating position at long time scales, and fluctuations in the front profile around its instantaneous position at short time scales. One major result of our analysis is a comparison between freely propagating fronts and fronts locked to an externally moving stimulus. We show that the latter are much more robust to noise, since the stochastic wandering of the mean front profile is described by an Ornstein-Uhlenbeck process rather than a Wiener process, so that the variance in front position saturates in the long time limit rather than increasing linearly with time. Finally, we consider a stochastic neural field that supports a pulled front in the deterministic limit, and show that the wandering of such a front is now subdiffusive.
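A scalar stochastic neural field of the kind analysed here is commonly written in the following generic form (the notation is illustrative rather than the paper's):

```latex
dU(x,t) = \Big[-U(x,t) + \int_{-\infty}^{\infty} w(x-y)\,F\big(U(y,t)\big)\,dy\Big]\,dt
        + \varepsilon^{1/2}\,g\big(U(x,t)\big)\,dW(x,t),
```

where w is the excitatory weight kernel, F a firing-rate function, g(U) the multiplicative noise amplitude, epsilon the small noise strength used in the separation of time scales, and dW(x,t) a spatially extended Wiener process.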
Noise-induced behaviors in neural mean field dynamics
The collective behavior of cortical neurons is strongly affected by the
presence of noise at the level of individual cells. In order to study these
phenomena in large-scale assemblies of neurons, we consider networks of
firing-rate neurons with linear intrinsic dynamics and nonlinear coupling,
belonging to a few types of cell populations and receiving noisy currents.
Asymptotic equations as the number of neurons tends to infinity (mean field
equations) are rigorously derived based on a probabilistic approach. These
equations are implicit in the probability distribution of the solutions, which
generally makes their direct analysis difficult. However, in our case, the
solutions are Gaussian, and their moments satisfy a closed system of nonlinear
ordinary differential equations (ODEs), which are much easier to study than the
original stochastic network equations, and the statistics of the empirical
process uniformly converge towards the solutions of these ODEs. Based on this
description, we analytically and numerically study the influence of noise on
the collective behaviors, and compare these asymptotic regimes to simulations
of the network. We observe that the mean field equations provide an accurate
description of the solutions of the network equations for network sizes as
small as a few hundred neurons. In particular, we observe that the level of
noise in the system qualitatively modifies its collective behavior, producing
for instance synchronized oscillations of the whole network, desynchronization
of oscillating regimes, and stabilization or destabilization of stationary
solutions. These results shed new light on the role of noise in shaping the
collective dynamics of neurons, and give us clues for understanding similar
phenomena observed in biological networks.
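A small hedged illustration of how noise can qualitatively reshape such moment equations (the model, the threshold, and all parameters are hypothetical, not those of the paper): with linear intrinsic dynamics and additive noise, the stationary variance is sigma^2/2, and counting the equilibria of the mean shows bistability collapsing as sigma grows.

```python
# Count equilibria of the mean equation mu' = -mu + J * E[S(V)], with
# V ~ N(mu, sigma^2/2) and S a probit sigmoid with (assumed) threshold 1.9.
import numpy as np
from scipy.special import ndtr

def mu_dot(mu, J, sigma):
    nu = sigma**2 / 2.0                      # stationary variance
    return -mu + J * ndtr((mu - 1.9) / np.sqrt(1.0 + nu))

mu_grid = np.linspace(-1.0, 5.0, 20001)
for sigma in (0.2, 3.0):
    f = mu_dot(mu_grid, J=4.0, sigma=sigma)
    n_eq = np.count_nonzero(np.diff(np.sign(f)))
    print(f"sigma={sigma}: {n_eq} equilibria")
# Three equilibria (a bistable mean) at low noise collapse to a single one
# at high noise: the noise level qualitatively changes the collective state.
```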
Mean-field description of, and propagation of chaos in, recurrent multipopulation networks of Hodgkin-Huxley and FitzHugh-Nagumo neurons
We derive the mean-field equations arising as the limit of a network of
interacting spiking neurons, as the number of neurons goes to infinity. The
neurons belong to a fixed number of populations and are represented either by
the Hodgkin-Huxley model or by one of its simplified versions, the
FitzHugh-Nagumo model. The synapses between neurons are either electrical or
chemical. The network is assumed to be fully connected. The maximum
conductances vary randomly. Under the condition that all neurons' initial
conditions are drawn independently from the same law, which depends only on the
population they belong to, we prove that a propagation-of-chaos phenomenon
takes place: in the mean-field limit, any finite number of neurons
become independent and, within each population, have the same probability
distribution. This probability distribution is the solution of a set of
implicit equations: either nonlinear stochastic differential equations
resembling the McKean-Vlasov equations, or non-local partial differential
equations resembling the McKean-Vlasov-Fokker-Planck equations. We prove the
well-posedness of
these equations, i.e. the existence and uniqueness of a solution. We also show
the results of some preliminary numerical experiments that indicate that the
mean-field equations are a good representation of the mean activity of a finite
size network, even for modest sizes. These experiments also indicate that the
McKean-Vlasov-Fokker-Planck equations may be a good way to understand the
mean-field dynamics through, e.g., a bifurcation analysis.
Comment: 55 pages, 9 figures.
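For concreteness, here is a minimal Euler-Maruyama sketch of the kind of particle system whose large-N limit is being described; the parameters, the purely electrical coupling, and the single population are simplifying assumptions, not the paper's full setting:

```python
# N FitzHugh-Nagumo neurons, fully connected through electrical (gap-junction)
# coupling to the population mean, driven by independent noise.
import numpy as np

rng = np.random.default_rng(2)
N, steps, dt = 500, 2000, 0.01
a, b, eps, I = 0.7, 0.8, 0.08, 0.5   # classic FitzHugh-Nagumo constants
J, sigma = 0.1, 0.3                  # coupling strength and noise level

v = 0.1 * rng.standard_normal(N)     # membrane potentials
w = np.zeros(N)                      # recovery variables
for _ in range(steps):
    coupling = J * (v.mean() - v)    # electrical coupling to the empirical mean
    v += dt * (v - v**3 / 3.0 - w + I + coupling) \
         + sigma * np.sqrt(dt) * rng.standard_normal(N)
    w += dt * eps * (v + a - b * w)
# Propagation of chaos: as N grows, any fixed group of neurons becomes
# asymptotically independent, each solving the McKean-Vlasov equation in
# which v.mean() is replaced by E[v_t].
```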