Structured chaos shapes spike-response noise entropy in balanced neural networks
Large networks of sparsely coupled excitatory and inhibitory cells occur
throughout the brain. A striking feature of these networks is that they are
chaotic. How does this chaos manifest in the neural code? Specifically, how
variable are the spike patterns that such a network produces in response to an
input signal? To answer this, we derive a bound for the entropy of multi-cell
spike pattern distributions in large recurrent networks of spiking neurons
responding to fluctuating inputs. The analysis is based on results from random
dynamical systems theory and is complemented by detailed numerical simulations.
We find that the spike pattern entropy is an order of magnitude lower than what
would be extrapolated from single cells. This holds despite the fact that
network coupling becomes vanishingly sparse as network size grows -- a
phenomenon that depends on "extensive chaos," as previously discovered for
balanced networks without stimulus drive. Moreover, we show how spike pattern
entropy is controlled by temporal features of the inputs. Our findings provide
insight into how neural networks may encode stimuli in the presence of
inherently chaotic dynamics.

Comment: 9 pages, 5 figures
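The entropy bound itself rests on random dynamical systems machinery, but the quantity being bounded is straightforward to estimate numerically. The sketch below is a minimal toy, not the paper's model or analysis: the LIF dynamics, the coupling scale, the sinusoidal drive, and the plug-in entropy estimator are all illustrative assumptions. It simulates a sparsely coupled balanced network driven by a fluctuating input and estimates the entropy of binary multi-cell spike "words":

```python
import numpy as np

def simulate_balanced_lif(n=100, steps=2000, dt=0.1, seed=0):
    """Simulate a sparsely coupled balanced E/I network of leaky
    integrate-and-fire units driven by a shared fluctuating input.
    All parameter values are illustrative, not from the paper."""
    rng = np.random.default_rng(seed)
    n_e = n // 2                  # first half excitatory, rest inhibitory
    p = 0.1                       # sparse connection probability
    j = 1.0 / np.sqrt(p * n)      # balanced-network coupling scale
    w = (rng.random((n, n)) < p) * j
    w[:, n_e:] *= -4.0            # inhibitory columns, stronger to balance
    v = rng.random(n)             # membrane potentials in [0, 1)
    spikes = np.zeros((steps, n), dtype=bool)
    stim = np.sin(2 * np.pi * 0.01 * np.arange(steps))  # fluctuating drive
    for t in range(steps):
        fired = v >= 1.0
        spikes[t] = fired
        v[fired] = 0.0            # reset after a spike
        v += dt * (-v + 1.2 + 0.3 * stim[t]) + w @ fired
        v += 0.05 * np.sqrt(dt) * rng.standard_normal(n)
    return spikes

def word_entropy(spikes, cells, bin_size=20):
    """Plug-in entropy (bits) of binary spike 'words' over a few cells,
    with time discretized into bins of bin_size steps."""
    binned = spikes[:, cells].reshape(-1, bin_size, len(cells)).any(axis=1)
    words, counts = np.unique(binned, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

spikes = simulate_balanced_lif()
h = word_entropy(spikes, cells=[0, 1, 2, 3, 4])
print(f"joint 5-cell word entropy: {h:.2f} bits")
```

Comparing `h` against five times the single-cell word entropy is the kind of extrapolation the abstract says overestimates the true joint entropy.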
Mean-field equations for stochastic firing-rate neural fields with delays: Derivation and noise-induced transitions
In this manuscript we analyze the collective behavior of mean-field limits of
large-scale, spatially extended stochastic neuronal networks with delays.
Rigorously, the asymptotic regime of such systems is characterized by a very
intricate stochastic delayed integro-differential McKean-Vlasov equation that
remains impenetrable, leaving the stochastic collective dynamics of such
networks poorly understood. In order to study these macroscopic dynamics, we
analyze networks of firing-rate neurons, i.e. with linear intrinsic dynamics
and sigmoidal interactions. In that case, we prove that the solution of the
mean-field equation is Gaussian, hence characterized by its first two moments,
and that these two quantities satisfy a set of coupled delayed
integro-differential equations. These equations are similar to usual neural
field equations, and incorporate noise levels as a parameter, allowing analysis
of noise-induced transitions. We identify through bifurcation analysis several
qualitative transitions due to noise in the mean-field limit. In particular,
stabilization of spatially homogeneous solutions, synchronized oscillations,
bumps, chaotic dynamics, wave or bump splitting are exhibited and arise from
static or dynamic Turing-Hopf bifurcations. These surprising phenomena allow
further exploration of the role of noise in the nervous system.

Comment: Updated to the latest published version; clarified the spatial
dependence of the Brownian motion
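The structure of those coupled delayed moment equations can be illustrated on a toy scalar analogue. The sketch below is an assumption-laden reduction, not the paper's spatially extended system: a single population, a logistic sigmoid, and one discrete delay are all assumptions. It integrates a delayed mean/variance pair by an Euler scheme, with the Gaussian expectation of the sigmoid evaluated by Gauss-Hermite quadrature:

```python
import numpy as np

# Probabilists' Gauss-Hermite nodes/weights; the weights sum to sqrt(2*pi),
# so dividing by it turns the weighted sum into an expectation under N(0, 1).
_NODES, _WEIGHTS = np.polynomial.hermite_e.hermegauss(21)

def expected_sigmoid(m, v):
    """E[S(x)] for x ~ N(m, v) with logistic S, via quadrature."""
    x = m + np.sqrt(max(v, 0.0)) * _NODES
    return float(_WEIGHTS @ (1.0 / (1.0 + np.exp(-x)))) / np.sqrt(2 * np.pi)

def integrate_moments(J=2.0, tau=1.0, sigma=0.5, T=50.0, dt=0.01):
    """Euler scheme for a toy delayed moment pair (illustrative form only):
        dm/dt = -m + J * E[S(x(t - tau))],   dv/dt = -2 v + sigma**2,
    where x(t) ~ N(m(t), v(t)). Noise enters the macroscopic equations
    through sigma, mirroring how noise level appears as a parameter."""
    steps, lag = int(T / dt), int(tau / dt)
    m, v = np.zeros(steps + 1), np.zeros(steps + 1)
    for t in range(steps):
        md, vd = (m[t - lag], v[t - lag]) if t >= lag else (m[0], v[0])
        m[t + 1] = m[t] + dt * (-m[t] + J * expected_sigmoid(md, vd))
        v[t + 1] = v[t] + dt * (-2.0 * v[t] + sigma**2)
    return m, v

m, v = integrate_moments()
print(f"stationary mean ~ {m[-1]:.3f}, variance ~ {v[-1]:.3f}")
```

Sweeping `sigma` (or the delay `tau`) and tracking where the stationary solution loses stability is the toy version of the noise-induced bifurcation analysis described in the abstract.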
Spontaneous and stimulus-induced coherent states of critically balanced neuronal networks
How the information microscopically processed by individual neurons is
integrated and used in organizing the behavior of an animal is a central
question in neuroscience. The coherence of neuronal dynamics over different
scales has been suggested as a clue to the mechanisms underlying this
integration. Balanced excitation and inhibition may amplify microscopic
fluctuations to a macroscopic level, thus providing a mechanism for generating
coherent multiscale dynamics. Previous theories of brain dynamics, however,
were restricted to cases in which inhibition dominated excitation and
suppressed fluctuations in the macroscopic population activity. In the present
study, we investigate the dynamics of neuronal networks at a critical point
between excitation-dominant and inhibition-dominant states. In these networks,
the microscopic fluctuations are amplified by the strong excitation and
inhibition to drive the macroscopic dynamics, while the macroscopic dynamics
determine the statistics of the microscopic fluctuations. Developing a novel
type of mean-field theory applicable to this class of interscale interactions,
we show that the amplification mechanism generates spontaneous, irregular
macroscopic rhythms similar to those observed in the brain. Through the same
mechanism, microscopic inputs to a small number of neurons effectively entrain
the dynamics of the whole network. These network dynamics undergo a
probabilistic transition to a coherent state, as the magnitude of either the
balanced excitation and inhibition or the external inputs is increased. Our
mean-field theory successfully predicts the behavior of this model.
Furthermore, we numerically demonstrate that the coherent dynamics can be used
for state-dependent read-out of information from the network. These results
show a novel form of neuronal information processing that connects neuronal
dynamics on different scales.

Comment: 20 pages, 12 figures (main text) + 23 pages, 6 figures (Appendix);
some results were removed in this revision to reduce length. See the previous
version for more results.
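The amplification mechanism can be caricatured with a linear rate model: when the net recurrent gain on the population mean approaches its critical value, microscopic noise is strongly amplified at the macroscopic level, whereas inhibition-dominant coupling suppresses it. The sketch below is a toy linear analogue, not the paper's model or its mean-field theory; the uniform coupling, the gain values, and the noise level are assumptions chosen only to make the contrast visible:

```python
import numpy as np

def population_variance(g, n=200, steps=40000, dt=0.01, seed=1):
    """Variance of the population-mean activity of the linear rate network
        dx_i/dt = -x_i + (g / n) * sum_j x_j + noise.
    Here g < 0 plays the role of inhibition-dominant coupling, while g near
    1 is the critical point where mean fluctuations are amplified."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    means = np.empty(steps)
    for t in range(steps):
        x += dt * (-x + g * x.mean()) \
             + 0.1 * np.sqrt(dt) * rng.standard_normal(n)
        means[t] = x.mean()
    return float(means[steps // 4:].var())   # discard the initial transient

v_critical = population_variance(0.95)   # near-critical gain
v_inhib = population_variance(-2.0)      # inhibition-dominant coupling
print(f"near-critical / inhibition-dominant variance ratio: "
      f"{v_critical / v_inhib:.1f}")
```

In this linearized toy the population mean follows an Ornstein-Uhlenbeck process with relaxation rate 1 - g, so its stationary variance scales like 1 / (2n(1 - g)): large near g = 1, small for strongly negative g, which is the contrast between the critical and inhibition-dominant regimes.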