A mean-field model for conductance-based networks of adaptive exponential integrate-and-fire neurons
Voltage-sensitive dye imaging (VSDi) has revealed fundamental properties of
neocortical processing at mesoscopic scales. Since VSDi signals report the
average membrane potential, it seems natural to use a mean-field formalism to
model such signals. Here, we investigate a mean-field model of networks of
Adaptive Exponential (AdEx) integrate-and-fire neurons, with conductance-based
synaptic interactions. The AdEx model can capture the spiking response of
different cell types, such as regular-spiking (RS) excitatory neurons and
fast-spiking (FS) inhibitory neurons. We use a Master Equation formalism,
together with a semi-analytic approach to the transfer function of AdEx
neurons. We compare the predictions of this mean-field model to simulated
networks of RS-FS cells, first at the level of the spontaneous activity of the
network, which is well predicted by the mean-field model. Second, we
investigate the response of the network to time-varying external input, and
show that the mean-field model accurately predicts the response time course of
the population. One notable exception was that the "tail" of the response at
long times was not well predicted, because the mean-field does not include
adaptation mechanisms. We conclude that the Master Equation formalism can yield
mean-field models that predict well the behavior of nonlinear networks with
conductance-based interactions and various electrophysiological properties, and
should be a good candidate to model VSDi signals where both excitatory and
inhibitory neurons contribute.
Comment: 21 pages, 7 figures
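The single-neuron dynamics underlying this mean-field model can be sketched with a simple Euler integration of the AdEx equations. The parameter values below are illustrative regular-spiking-like values, not taken from the paper:

```python
from math import exp

def simulate_adex(I=0.5e-9, T=0.3, dt=1e-5):
    """Count spikes of a single AdEx neuron under constant current I (Euler scheme)."""
    C, gL, EL = 200e-12, 10e-9, -70e-3   # capacitance, leak conductance, leak reversal (illustrative)
    VT, DT = -50e-3, 2e-3                # exponential threshold and slope factor
    a, b, tau_w = 2e-9, 60e-12, 0.2      # subthreshold adaptation, spike increment, time constant
    Vcut, Vreset = -30e-3, -58e-3        # numerical spike cutoff and reset voltage
    V, w, spikes = EL, 0.0, 0
    for _ in range(int(T / dt)):
        dV = (-gL * (V - EL) + gL * DT * exp((V - VT) / DT) - w + I) / C
        dw = (a * (V - EL) - w) / tau_w
        V += dt * dV
        w += dt * dw
        if V >= Vcut:                    # spike: reset membrane, increment adaptation
            V, w, spikes = Vreset, w + b, spikes + 1
    return spikes

print(simulate_adex())        # suprathreshold current: fires repeatedly
print(simulate_adex(I=0.0))   # no input: stays at rest
```

The spike-triggered increment of w is the adaptation mechanism whose absence in the mean-field model explains the mispredicted "tail" of the response.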
Stochastic neural field theory and the system-size expansion
We analyze a master equation formulation of stochastic neurodynamics for a network of synaptically coupled homogeneous neuronal populations, each consisting of N identical neurons. The state of the network is specified by the fraction of active or spiking neurons in each population, and transition rates are chosen so that in the thermodynamic or deterministic limit (N → ∞) we recover standard activity-based or voltage-based rate models. We derive the lowest order corrections to these rate equations for large but finite N using two different approximation schemes, one based on the Van Kampen system-size expansion and the other based on path integral methods. Both methods yield the same series expansion of the moment equations, which at O(1/N) can be truncated to form a closed system of equations for the first and second order moments. Taking a continuum limit of the moment equations whilst keeping the system size N fixed generates a system of integrodifferential equations for the mean and covariance of the corresponding stochastic neural field model. We also show how the path integral approach can be used to study large deviation or rare event statistics underlying escape from the basin of attraction of a stable fixed point of the mean-field dynamics; such an analysis is not possible using the system-size expansion, since the latter cannot accurately determine exponentially small transitions.
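As a toy instance of this setup, one can simulate a one-population birth-death master equation with Gillespie's algorithm and check that, for large N, the active fraction hovers near the fixed point of the deterministic (N → ∞) rate equation. The sigmoidal rate f and all parameter values here are illustrative assumptions, not the paper's model:

```python
import math, random

def f(x, gain=4.0, theta=0.5):
    """Sigmoidal activation rate (illustrative choice)."""
    return 1.0 / (1.0 + math.exp(-gain * (x - theta)))

def gillespie_mean(N=200, T=200.0, burn=50.0, seed=1):
    """Exact stochastic simulation of the birth-death master equation:
    k -> k+1 at rate (N-k) f(k/N), k -> k-1 at rate k.
    Returns the time-averaged active fraction after a burn-in period."""
    random.seed(seed)
    k, t, acc, wt = N // 2, 0.0, 0.0, 0.0
    while t < T:
        up = (N - k) * f(k / N)          # activation events
        down = k                          # decay events (unit rate per neuron)
        rate = up + down
        dt = random.expovariate(rate)
        if t > burn:
            acc += (k / N) * min(dt, T - t)
            wt += min(dt, T - t)
        t += dt
        k += 1 if random.random() < up / rate else -1
    return acc / wt

def rate_fixed_point(x=0.5, dt=0.01, steps=20000):
    """Deterministic (N -> infinity) limit: dx/dt = (1 - x) f(x) - x."""
    for _ in range(steps):
        x += dt * ((1 - x) * f(x) - x)
    return x

print(gillespie_mean(), rate_fixed_point())
```

For finite N the stochastic mean fluctuates around the deterministic fixed point; the O(1/N) corrections discussed above quantify this discrepancy, while exponentially rare escapes between attractors would require the path integral machinery.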
Biologically realistic mean-field models of conductance-based networks of spiking neurons with adaptation
Accurate population models are needed to build very large scale neural models, but their derivation is difficult for realistic networks of neurons, in particular when nonlinear properties are involved, such as conductance-based interactions and spike-frequency adaptation. Here, we consider such models based on networks of Adaptive Exponential Integrate and Fire excitatory and inhibitory neurons. Using a Master Equation formalism, we derive a mean-field model of such networks and compare it to the full network dynamics. The mean-field model correctly predicts the average spontaneous activity levels in asynchronous irregular regimes similar to in vivo activity. It also captures the transient temporal response of the network to complex external inputs. Finally, the mean-field model is also able to quantitatively describe regimes where high and low activity states alternate (UP-DOWN state dynamics), leading to slow oscillations. We conclude that such mean-field models are "biologically realistic" in the sense that they can capture both spontaneous and evoked activity, and they naturally appear as candidates to build very large scale models involving multiple brain areas.
Finite-size and correlation-induced effects in Mean-field Dynamics
The brain's activity is characterized by the interaction of a very large
number of neurons that are strongly affected by noise. However, signals often
arise at macroscopic scales integrating the effect of many neurons into a
reliable pattern of activity. In order to study such large neuronal assemblies,
one is often led to derive mean-field limits summarizing the effect of the
interaction of a large number of neurons into an effective signal. Classical
mean-field approaches consider the evolution of a deterministic variable, the
mean activity, thus neglecting the stochastic nature of neural behavior. In
this article, we build upon two recent approaches that include correlations and
higher order moments in mean-field equations, and study how these stochastic
effects influence the solutions of the mean-field equations, both in the limit
of an infinite number of neurons and for large yet finite networks. We
introduce a new model, the infinite model, which arises from both sets of
equations by a rescaling of the variables; this rescaling is invertible for
finite-size networks, and hence the new model provides equations equivalent to
those of the previously derived models.
The study of this model allows us to understand qualitative behavior of such
large-scale networks. We show that, though the solutions of the deterministic
mean-field equation constitute uncorrelated solutions of the new mean-field
equations, the stability properties of limit cycles are modified by the
presence of correlations, and additional non-trivial behaviors including
periodic orbits appear when there were none in the mean field. The origin of
all these behaviors is then explored in finite-size networks where interesting
mesoscopic scale effects appear. This study leads us to show that the
infinite-size system appears as a singular limit of the network equations, and
for any finite network, the system will differ from the infinite system.
Vast TVB parameter space exploration: A Modular Framework for Accelerating the Multi-Scale Simulation of Human Brain Dynamics
Global neural dynamics emerge from multi-scale brain structures, with neurons
communicating through synapses to form transiently communicating networks.
Network activity arises from intercellular communication that depends on the
structure of connectome tracts and local connections, intracellular signalling
cascades, and the extracellular molecular milieu that regulate cellular
properties. Multi-scale models of brain function have begun to directly link
the emergence of global brain dynamics in conscious and unconscious brain
states to microscopic changes at the level of cells. In particular, AdEx
mean-field models representing statistical properties of local populations of
neurons have been connected following human tractography data to represent
multi-scale neural phenomena in simulations using The Virtual Brain (TVB).
While mean-field models can be run on personal computers for short simulations,
or in parallel on high-performance computing (HPC) architectures for longer
simulations and parameter scans, the computational burden remains high and vast
areas of the parameter space remain unexplored. In this work, we report that
our TVB-HPC framework, a modular set of methods used here to implement the
TVB-AdEx model for GPU and analyze emergent dynamics, notably accelerates
simulations and substantially reduces computational resource requirements. The
framework preserves the stability and robustness of the TVB-AdEx model, thus
facilitating finer resolution exploration of vast parameter spaces as well as
longer simulations previously near impossible to perform. Given that simulation
and analysis toolkits are made public as open-source packages, our framework
serves as a template onto which other models can be easily scripted and
personalized datasets can be used for studies of inter-individual variability
of parameters related to functional brain dynamics.
Comment: 21 pages, 9 figures
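The parameter-scan pattern described above can be sketched generically: a model is evaluated over a grid of parameter pairs dispatched to a worker pool. Here `run_model` is a hypothetical stand-in (a one-dimensional rate model), not the TVB-AdEx model; a real scan would replace it with a TVB simulation call and, for CPU-bound models, a process pool or GPU kernel rather than threads:

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import product
from math import tanh

def run_model(params):
    """Hypothetical stand-in for one simulation: relax dr/dt = -r + tanh(g*r + I)
    to its fixed point for a single (g, I) parameter pair."""
    g, I = params
    r = 0.1                      # small positive seed so the unstable r = 0 state is left
    for _ in range(5000):
        r += 0.01 * (-r + tanh(g * r + I))
    return g, I, r

def scan(gs, Is, workers=4):
    """Evaluate run_model over the Cartesian grid gs x Is in a worker pool."""
    grid = list(product(gs, Is))
    with ThreadPoolExecutor(max_workers=workers) as ex:
        return list(ex.map(run_model, grid))

for g, I, r in scan([0.5, 1.5], [0.0, 0.2]):
    print(f"g={g:.1f} I={I:.1f} -> r={r:.3f}")
```

The toy model already shows why scans matter: the weakly coupled regime relaxes to a quiescent state, while strong coupling sustains activity, a qualitative boundary only visible by sweeping the grid.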
Mean-field equations for stochastic firing-rate neural fields with delays: Derivation and noise-induced transitions
In this manuscript we analyze the collective behavior of mean-field limits of
large-scale, spatially extended stochastic neuronal networks with delays.
Rigorously, the asymptotic regime of such systems is characterized by a very
intricate stochastic delayed integro-differential McKean-Vlasov equation that
remains impenetrable, leaving the stochastic collective dynamics of such
networks poorly understood. In order to study these macroscopic dynamics, we
analyze networks of firing-rate neurons, i.e. with linear intrinsic dynamics
and sigmoidal interactions. In that case, we prove that the solution of the
mean-field equation is Gaussian, hence characterized by its first two moments,
and that these two quantities satisfy a set of coupled delayed
integro-differential equations. These equations are similar to usual neural
field equations, and incorporate noise levels as a parameter, allowing analysis
of noise-induced transitions. We identify through bifurcation analysis several
qualitative transitions due to noise in the mean-field limit. In particular,
stabilization of spatially homogeneous solutions, synchronized oscillations,
bumps, chaotic dynamics, wave or bump splitting are exhibited and arise from
static or dynamic Turing-Hopf bifurcations. These surprising phenomena allow
further exploring the role of noise in the nervous system.
Comment: Updated to the latest published version, and clarified the spatial dependence of the Brownian motion
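A minimal scalar caricature of the noise-as-parameter mechanism: for an erf nonlinearity, the Gaussian expectation has the closed form E[erf(g X)] = erf(g m / sqrt(1 + 2 g^2 v)) for X ~ N(m, v), so the fluctuation variance v simply rescales the effective gain of the delayed mean equation. All parameters below are illustrative assumptions; with them, the low-noise system shows delay-induced oscillations that a high noise level suppresses, i.e. a noise-induced stabilization of the stationary solution:

```python
import math

def simulate_mean(v, J=-2.0, g=4.0, tau=1.0, T=200.0, dt=0.01):
    """Euler scheme for the delayed mean equation
       dm/dt = -m + J * E[erf(g * m(t - tau))],
    where the Gaussian expectation with variance v has the closed form
    erf(g m / sqrt(1 + 2 g^2 v)): noise enters purely as a parameter.
    Returns the standard deviation of m over the second half of the run
    (near zero if the fixed point is stable, large if oscillations persist)."""
    d = int(tau / dt)
    hist = [0.1] * (d + 1)               # constant initial history
    out = []
    n = int(T / dt)
    for i in range(n):
        m, m_del = hist[-1], hist[0]     # current and delayed values
        drive = math.erf(g * m_del / math.sqrt(1.0 + 2.0 * g * g * v))
        hist.append(m + dt * (-m + J * drive))
        hist.pop(0)
        if i > n // 2:
            out.append(hist[-1])
    mean = sum(out) / len(out)
    return math.sqrt(sum((x - mean) ** 2 for x in out) / len(out))

print("low noise :", simulate_mean(v=0.0))
print("high noise:", simulate_mean(v=5.0))
```

The transition is a gain effect: increasing v divides the linearized feedback slope by sqrt(1 + 2 g^2 v), pulling the system back across the delay-induced Hopf threshold.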
Stochastic firing rate models
We review a recent approach to the mean-field limits in neural networks that
takes into account the stochastic nature of input current and the uncertainty
in synaptic coupling. This approach was proved to be a rigorous limit of the
network equations in a general setting, and we express here the results in a
more customary and simpler framework. We propose a heuristic argument to derive
these equations providing a more intuitive understanding of their origin. These
equations are characterized by a strong coupling between the different moments
of the solutions. We analyse the equations, present an algorithm to simulate
the solutions of these mean-field equations, and investigate numerically the
equations. In particular, we build a bridge between these equations and
the approach of Sompolinsky and collaborators (1988, 1990), and show how the
coupling between the mean and the covariance function deviates from customary
approaches.
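One simple algorithm of the kind mentioned above replaces the law-dependent drift of the mean-field equation by an empirical average over interacting particles. The sketch below uses a tanh nonlinearity and illustrative parameters, integrating dX = (-X + J E[tanh(X)]) dt + sigma dW with Euler-Maruyama; since the drift is linear in X plus a law-dependent constant, the stationary variance should approach sigma^2 / 2:

```python
import math, random

def simulate_mcv(N=1000, J=2.0, sigma=0.5, T=15.0, dt=0.01, seed=0):
    """Particle approximation of the mean-field (McKean-Vlasov type) equation
       dX = (-X + J * E[tanh(X)]) dt + sigma dW,
    with the expectation E[tanh(X)] replaced by the empirical average
    over N interchangeable particles (Euler-Maruyama time stepping)."""
    random.seed(seed)
    xs = [random.gauss(0.5, 0.1) for _ in range(N)]
    sq = sigma * math.sqrt(dt)
    for _ in range(int(T / dt)):
        m = sum(math.tanh(x) for x in xs) / N       # empirical E[tanh(X)]
        xs = [x + dt * (-x + J * m) + sq * random.gauss(0.0, 1.0)
              for x in xs]
    mu = sum(xs) / N
    var = sum((x - mu) ** 2 for x in xs) / N
    return mu, var

mu, var = simulate_mcv()
print("mean:", mu, "variance:", var)   # variance should settle near sigma**2 / 2
```

The moment coupling discussed in the abstract shows up here implicitly: the drift felt by every particle depends on the full empirical law, not just on the mean.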
Noise-induced behaviors in neural mean field dynamics
The collective behavior of cortical neurons is strongly affected by the
presence of noise at the level of individual cells. In order to study these
phenomena in large-scale assemblies of neurons, we consider networks of
firing-rate neurons with linear intrinsic dynamics and nonlinear coupling,
belonging to a few types of cell populations and receiving noisy currents.
Asymptotic equations as the number of neurons tends to infinity (mean field
equations) are rigorously derived based on a probabilistic approach. These
equations are implicit on the probability distribution of the solutions which
generally makes their direct analysis difficult. However, in our case, the
solutions are Gaussian, and their moments satisfy a closed system of nonlinear
ordinary differential equations (ODEs), which are much easier to study than the
original stochastic network equations, and the statistics of the empirical
process uniformly converge towards the solutions of these ODEs. Based on this
description, we analytically and numerically study the influence of noise on
the collective behaviors, and compare these asymptotic regimes to simulations
of the network. We observe that the mean field equations provide an accurate
description of the solutions of the network equations for network sizes as
small as a few hundred neurons. In particular, we observe that the level of
noise in the system qualitatively modifies its collective behavior, producing
for instance synchronized oscillations of the whole network, desynchronization
of oscillating regimes, and stabilization or destabilization of stationary
solutions. These results shed new light on the role of noise in shaping the
collective dynamics of neurons, and give us clues for understanding similar
phenomena observed in biological networks.
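The closed moment system described above can be made concrete for an erf nonlinearity, where the Gaussian expectation is available in closed form. In the sketch below (illustrative parameters; erf chosen purely for tractability), a low noise level preserves a high-activity equilibrium while a high noise level destroys it, i.e. noise stabilizes the trivial stationary solution:

```python
import math

def moment_ode(sigma, mu0=1.5, J=2.0, T=50.0, dt=0.01):
    """Closed moment equations for the Gaussian mean-field solution:
       dmu/dt = -mu + J * E[S(X)],   dv/dt = -2 v + sigma^2,
    with S = erf and X ~ N(mu, v), so E[S(X)] = erf(mu / sqrt(1 + 2 v)).
    Returns the asymptotic mean mu."""
    mu, v = mu0, 0.0
    for _ in range(int(T / dt)):
        drive = math.erf(mu / math.sqrt(1.0 + 2.0 * v))
        mu += dt * (-mu + J * drive)
        v += dt * (-2.0 * v + sigma ** 2)
    return mu

print("sigma=0.2 ->", moment_ode(0.2))   # high-activity equilibrium survives
print("sigma=3.0 ->", moment_ode(3.0))   # collapses to the trivial state
```

Two ODEs replace the full stochastic network here, which is what makes the bifurcation analysis of noise-induced transitions tractable.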
Mean-field description and propagation of chaos in recurrent multipopulation networks of Hodgkin-Huxley and FitzHugh-Nagumo neurons
We derive the mean-field equations arising as the limit of a network of
interacting spiking neurons, as the number of neurons goes to infinity. The
neurons belong to a fixed number of populations and are represented either by
the Hodgkin-Huxley model or by one of its simplified versions, the
FitzHugh-Nagumo model. The synapses between neurons are either electrical or
chemical. The network is assumed to be fully connected. The maximum
conductances vary randomly. Under the condition that all neurons' initial
conditions are drawn independently from the same law that depends only on the
population they belong to, we prove that a propagation of chaos phenomenon
takes place, namely that in the mean-field limit, any finite number of neurons
become independent and, within each population, have the same probability
distribution. This probability distribution is the solution of a set of implicit
equations, either nonlinear stochastic differential equations resembling the
McKean-Vlasov equations, or non-local partial differential equations resembling
the McKean-Vlasov-Fokker-Planck equations. We prove the well-posedness of
these equations, i.e. the existence and uniqueness of a solution. We also show
the results of some preliminary numerical experiments that indicate that the
mean-field equations are a good representation of the mean activity of a finite
size network, even for modest sizes. These experiments also indicate that the
McKean-Vlasov-Fokker-Planck equations may be a good way to understand the
mean-field dynamics through, e.g., a bifurcation analysis.
Comment: 55 pages, 9 figures
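A minimal Euler-Maruyama sketch of such a fully connected network, here with FitzHugh-Nagumo units and electrical (gap-junction-like) coupling through the population mean voltage. All parameter values are illustrative textbook values, not taken from the paper:

```python
import math, random

def simulate_fhn_network(N=100, g=0.2, sigma=0.1, I=0.5, T=100.0, dt=0.05, seed=2):
    """Euler-Maruyama simulation of N fully connected FitzHugh-Nagumo neurons
    with electrical coupling g * (mean(v) - v_i) and independent voltage noise.
    Returns the time course of the population mean voltage."""
    random.seed(seed)
    eps, a, b = 0.08, 0.7, 0.8                  # classic FHN recovery parameters
    vs = [random.gauss(-1.0, 0.2) for _ in range(N)]
    ws = [random.gauss(-0.5, 0.2) for _ in range(N)]
    sq = sigma * math.sqrt(dt)
    mean_trace = []
    for _ in range(int(T / dt)):
        vbar = sum(vs) / N                      # mean-field (gap-junction) term
        for i in range(N):
            dv = vs[i] - vs[i] ** 3 / 3 - ws[i] + I + g * (vbar - vs[i])
            dw = eps * (vs[i] + a - b * ws[i])
            vs[i] += dt * dv + sq * random.gauss(0.0, 1.0)
            ws[i] += dt * dw
        mean_trace.append(sum(vs) / N)
    return mean_trace

trace = simulate_fhn_network()
print("range of mean voltage:", min(trace), max(trace))
```

Because the interaction enters only through the empirical mean, the coupling term converges to a deterministic law-dependent drift as N grows, which is the structural reason the propagation-of-chaos result applies to networks of this form.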