A microscopic mechanism for self-organized quasiperiodicity in random networks of nonlinear oscillators
Self-organized quasiperiodicity is one of the most puzzling dynamical phases
observed in systems of nonlinear coupled oscillators. The individual dynamical
units are not locked to the periodic mean field they produce, yet they still
behave coherently, through a complex and so far unexplained form of
correlation. We consider a class of leaky integrate-and-fire oscillators on
random sparse and massive networks with dynamical synapses, featuring
self-organized quasiperiodicity, and we show how complex collective
oscillations arise from constructive interference of microscopic dynamics. In
particular, we find a simple quantitative relationship between two relevant
microscopic dynamical time scales and the macroscopic time scale of the global
signal. We show that the proposed relation is a general property of collective
oscillations, common to all the partially synchronous dynamical phases
analyzed. We argue that an analogous mechanism could be at the origin of
similar network dynamics.
Comment: to appear in Phys. Rev.
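The relationship above ties two microscopic time scales (membrane leak and synaptic recovery) to the macroscopic oscillation. As a hedged, illustrative sketch of the microscopic ingredients only (not the paper's exact model), the following simulates a single leaky integrate-and-fire unit with a Tsodyks-Markram-style depression variable; all parameter names and values (`tau_m`, `tau_d`, `u_frac`, ...) are assumptions chosen for illustration:

```python
import math

# Illustrative sketch only (not the paper's exact model): one leaky
# integrate-and-fire neuron with constant drive, plus a Tsodyks-Markram-
# style depression variable y tracking available synaptic resources.
# All parameter names and values are assumptions for illustration.
def simulate(t_max=0.5, dt=1e-5, tau_m=0.02, i_ext=1.5,
             v_th=1.0, v_reset=0.0, tau_d=0.2, u_frac=0.2):
    v, y = 0.0, 1.0              # membrane potential, synaptic resources
    spikes, t = [], 0.0
    while t < t_max:
        v += dt * (i_ext - v) / tau_m    # leaky integration
        y += dt * (1.0 - y) / tau_d      # resource recovery
        if v >= v_th:                    # spike: reset v, deplete y
            spikes.append(t)
            v = v_reset
            y *= (1.0 - u_frac)
        t += dt
    return spikes, y

spikes, y = simulate()
# the simulated inter-spike interval should match the analytic LIF
# period tau_m * ln(i_ext / (i_ext - v_th))
isi = spikes[1] - spikes[0]
t_analytic = 0.02 * math.log(1.5 / 0.5)
```

Here `tau_m` sets the fast membrane time scale and `tau_d` the slower synaptic-recovery one; in the paper it is the interplay of two such microscopic scales that fixes the period of the global signal.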
Heterogeneous Mean Field for neural networks with short-term plasticity
We report on the main dynamical features of a model of excitatory leaky
integrate-and-fire neurons with short-term plasticity defined on random massive
networks. We investigate the dynamics by a Heterogeneous Mean-Field formulation
of the model, which is able to reproduce dynamical phases characterized by the
presence of quasi-synchronous events. This formulation also allows one to solve
the inverse problem of reconstructing the in-degree distribution for different
network topologies from the knowledge of the global activity field. We study
the robustness of this inversion procedure by providing numerical evidence
that the in-degree distribution can also be recovered in the presence of noise
and disorder in the external currents. Finally, we discuss the validity of the
heterogeneous mean-field approach for sparse networks, with a sufficiently
large average in-degree.
Chaos and correlated avalanches in excitatory neural networks with synaptic plasticity
A collective chaotic phase with power law scaling of activity events is
observed in a disordered mean field network of purely excitatory leaky
integrate-and-fire neurons with short-term synaptic plasticity. The dynamical
phase diagram exhibits two transitions from quasi-synchronous and asynchronous
regimes to the nontrivial, collective, bursty regime with avalanches. In the
homogeneous case without disorder, the system synchronizes and the bursty
behavior is reflected in a period-doubling transition to chaos for a
two-dimensional discrete map. Numerical simulations show that the bursty chaotic
phase with avalanches exhibits a spontaneous emergence of time correlations and
enhanced Kolmogorov complexity. Our analysis reveals a mechanism for the
generation of irregular avalanches that emerges from the combination of
disorder and underlying deterministic chaotic dynamics.
Comment: 5 pages, 5 figures; SI 26 pages, 14 figures. Improved editing, 3
subsections added in SI
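The abstract's period-doubling transition concerns a two-dimensional discrete map derived from the homogeneous network. As a generic, hedged illustration of the period-doubling route to chaos (not the paper's map), one can track the attractor period of the one-dimensional logistic map as its parameter grows:

```python
# Generic illustration of the period-doubling route to chaos (the
# paper's map is two-dimensional; the logistic map x -> r*x*(1-x) is
# used here purely as a stand-in): the attractor period doubles as r
# increases and no short period survives in the chaotic band.
def attractor_period(r, max_period=8, n_transient=10_000, tol=1e-6):
    x = 0.3
    for _ in range(n_transient):           # discard the transient
        x = r * x * (1.0 - x)
    x0 = x
    for p in range(1, max_period + 1):     # look for a repeat within tol
        x = r * x * (1.0 - x)
        if abs(x - x0) < tol:
            return p
    return None                            # no short period found

p1 = attractor_period(2.8)   # fixed point (period 1)
p2 = attractor_period(3.2)   # after the first doubling (period 2)
p3 = attractor_period(3.5)   # after the second doubling (period 4)
p4 = attractor_period(3.9)   # chaotic band: generically no short period
```

The same diagnostic applied to a two-dimensional map (testing recurrence of the full state vector) reveals the doubling cascade the abstract refers to.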
Average synaptic activity and neural networks topology: a global inverse problem
The dynamics of neural networks is often characterized by collective behavior
and quasi-synchronous events, where a large fraction of neurons fire in short
time intervals, separated by uncorrelated firing activity. These global
temporal signals are crucial for brain functioning. They strongly depend on the
topology of the network and on the fluctuations of the connectivity. We propose
a heterogeneous mean-field approach to neural dynamics on random networks,
that explicitly preserves the disorder in the topology at growing network
sizes, and leads to a set of self-consistent equations. Within this approach,
we provide an effective description of microscopic and large scale temporal
signals in a leaky integrate-and-fire model with short-term plasticity, where
quasi-synchronous events arise. Our equations provide a clear analytical
picture of the dynamics, evidencing the contributions of both periodic (locked)
and aperiodic (unlocked) neurons to the measurable average signal. In
particular, we formulate and solve a global inverse problem of reconstructing
the in-degree distribution from the knowledge of the average activity field.
Our method is very general and applies to a large class of dynamical models on
dense random networks.
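The global inverse problem above reconstructs the in-degree distribution from the average activity field via self-consistent equations. A minimal toy sketch of the underlying idea (not the paper's equations): if each neuron's time-averaged activity is a known monotone function a(k) of its in-degree k, the degree distribution can be read off by inverting a(k) on the measured activities. The form a(k) = k/(k + k0) below is an assumption made purely for illustration:

```python
import random

# Toy inverse problem (illustrative only): activities are generated by
# an assumed monotone map forward(k) and degrees are recovered by its
# analytic inverse. k0 = 50 is an arbitrary illustrative constant.
def forward(k, k0=50.0):
    return k / (k + k0)

def inverse(a, k0=50.0):
    return a * k0 / (1.0 - a)

random.seed(0)
true_degrees = [random.randint(20, 200) for _ in range(1000)]
activities = [forward(k) for k in true_degrees]      # "measured" field
recovered = [round(inverse(a)) for a in activities]  # inverted degrees
```

In the actual heterogeneous mean-field setting the map from degree to activity is itself determined self-consistently by the dynamics, which is what makes the inversion nontrivial.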
Synchronous dynamics in the presence of short-term plasticity
We investigate the occurrence of quasisynchronous events in a random network
of excitatory leaky integrate-and-fire neurons equipped with short-term
plasticity. The dynamics is analyzed by monitoring both the evolution of
global synaptic variables and, at the microscopic level, the interspike
intervals of the individual neurons. We find that quasisynchronous events are
the result of a mixture of synchronized and unsynchronized motion, analogously
to the emergence of synchronization in the Kuramoto model. In the present
context, disorder is due to the random structure of the network and thereby
vanishes for a diverging network size N (i.e., in the thermodynamic limit),
when statistical fluctuations become negligible. Remarkably, the fraction of
asynchronous neurons remains strictly larger than zero for arbitrarily large
N. This is due to the presence of a robust homoclinic cycle in the
self-generated synchronous dynamics. The nontrivial large-N behavior is
confirmed by the anomalous scaling of the maximum Lyapunov exponent, which is
strictly positive in a finite network and decreases as N^{-0.27}. Finally, we
have checked the robustness of this dynamical phase with respect to the
addition of noise, applied to either the reset potential or the leak current.
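The Lyapunov scaling above is measured on the full network. As a hedged numerical sketch of how a maximum Lyapunov exponent is estimated from a trajectory (illustrated here on the fully chaotic logistic map at r = 4, whose exponent is analytically ln 2, rather than on a network):

```python
import math

# Hedged sketch of a maximum-Lyapunov-exponent estimate: average the
# log of the local stretching factor |f'(x)| along a trajectory. The
# logistic map at r = 4 is used only as a stand-in system with a known
# exponent, ln 2 ~ 0.693.
def lyapunov_logistic(r=4.0, n=100_000, x0=0.3, n_transient=1_000):
    x = x0
    for _ in range(n_transient):             # discard the transient
        x = r * x * (1.0 - x)
    s = 0.0
    for _ in range(n):
        s += math.log(abs(r * (1.0 - 2.0 * x)))  # log |f'(x)|
        x = r * x * (1.0 - x)
    return s / n

lam = lyapunov_logistic()   # close to ln 2 for these parameters
```

For a network, the same time average is taken over the norm growth of a tangent vector; repeating it at increasing N is what exposes the anomalous N^{-0.27} decay reported above.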
Discrete synaptic events induce global oscillations in balanced neural networks
Neural dynamics is triggered by discrete synaptic inputs of finite amplitude.
However, the neural response is usually obtained within the diffusion
approximation (DA) representing the synaptic inputs as Gaussian noise. We
derive a mean-field formalism encompassing synaptic shot-noise for sparse
balanced networks of spiking neurons. For low (high) external drives (synaptic
strengths), irregular global oscillations emerge via continuous and hysteretic
transitions, correctly predicted by our approach, but not from the DA. These
oscillations display frequencies in biologically relevant bands.
Comment: 6 pages, 3 figures
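The contrast drawn above between synaptic shot noise and the diffusion approximation (DA) can be made concrete with a hedged numerical illustration (values are arbitrary, not the paper's): the summed synaptic input over a window T from a Poisson train of rate nu with jump size J has mean J·nu·T and variance J²·nu·T, and the DA replaces the discrete jumps with a Gaussian matching just these two moments:

```python
import random

# Illustrative check (arbitrary values): the first two moments of
# Poisson shot-noise input match the Gaussian used by the diffusion
# approximation, even though the finite jump size is lost.
random.seed(1)
nu, J, T, n_trials = 200.0, 0.1, 1.0, 5000

totals = []
for _ in range(n_trials):
    t, count = 0.0, 0
    while True:                       # Poisson arrivals in [0, T]
        t += random.expovariate(nu)
        if t > T:
            break
        count += 1
    totals.append(J * count)          # summed input for this trial

mean_da = J * nu * T                  # DA (Gaussian) mean
var_da = J * J * nu * T               # DA (Gaussian) variance
mean_est = sum(totals) / n_trials
var_est = sum((x - mean_est) ** 2 for x in totals) / n_trials
```

Matching moments is exactly what the DA gets right; what it discards is the discreteness of the finite-amplitude jumps, which the abstract identifies as the ingredient behind the oscillations the DA fails to predict.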
A reduction methodology for fluctuation driven population dynamics
Lorentzian distributions have been widely employed in statistical mechanics
to obtain exact results for heterogeneous systems. Analytic continuation of
these results is impossible even for slightly deformed Lorentzian
distributions, due to the divergence of all the moments (cumulants). We have
solved this problem by introducing a 'pseudo-cumulants' expansion. This allows
us to develop a reduction methodology for heterogeneous spiking neural networks
subject to extrinsic and endogenous noise sources, thus generalizing the
mean-field formulation introduced in [E. Montbrió et al., Phys. Rev. X 5,
021028 (2015)].
Comment: 10 pages (with supplementary materials), 3 figures
A new generation of reduction methods for networks of neurons with complex dynamic phenotypes
The collective dynamics of spiking networks of neurons has been of central
interest to both computational neuroscience and network science. Over the past
years, a new generation of neural population models based on exact reductions
(ER) of spiking networks have been developed. However, most of these efforts
have been limited to networks of neurons with simple dynamics (e.g., the
quadratic integrate-and-fire model). Here, we present an extension of ER to
conductance-based networks of two-dimensional Izhikevich neuron models. We
employ an adiabatic approximation, which allows us to analytically solve the
continuity equation describing the evolution of the state of the neural
population and thus to reduce model dimensionality. We validate our results by
showing that the reduced mean-field description we derived can qualitatively
and quantitatively describe the macroscopic behaviour of populations of
two-dimensional QIF neurons with different electrophysiological profiles
(regular firing, adapting, resonator and type III excitable). Most notably, we
apply this technique to develop an ER for networks of neurons with bursting
dynamics.
Comment: reduction method for bursting neurons added
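Exact reductions of the kind described above are built on the quadratic integrate-and-fire (QIF) neuron. As a hedged single-neuron sketch (illustrative values, not the paper's): for dv/dt = v² + I with reset from +v_peak back to −v_peak and constant I > 0, the inter-spike interval is (2/√I)·atan(v_peak/√I), which a direct simulation reproduces:

```python
import math

# Hedged QIF sketch: dv/dt = v^2 + I, reset from +v_peak to -v_peak.
# The simulated inter-spike interval should match the analytic value
# (2/sqrt(I)) * atan(v_peak/sqrt(I)). Numbers are illustrative.
def qif_isi(i_ext=1.0, v_peak=50.0, dt=1e-4):
    v, t = -v_peak, 0.0
    while v < v_peak:
        v += dt * (v * v + i_ext)   # Euler step of dv/dt = v^2 + I
        t += dt
    return t

t_sim = qif_isi()
t_exact = (2.0 / math.sqrt(1.0)) * math.atan(50.0 / math.sqrt(1.0))
```

It is this analytic tractability of the single-neuron dynamics that makes population-level exact reductions possible; extending them to two-dimensional Izhikevich-type neurons, as the abstract describes, requires the additional adiabatic approximation.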
Biologically realistic mean-field models of conductance-based networks of spiking neurons with adaptation
Accurate population models are needed to build very large scale neural
models, but their derivation is difficult for realistic networks of neurons,
in particular when nonlinear properties are involved, such as
conductance-based interactions and spike-frequency adaptation. Here, we
consider such models based on networks of Adaptive Exponential
Integrate-and-Fire excitatory and inhibitory neurons. Using a Master Equation
formalism, we derive a mean-field model of such networks and compare it to
the full network dynamics. The mean-field model is able to correctly predict
the average spontaneous activity levels in asynchronous irregular regimes
similar to in vivo activity. It also captures the transient temporal response
of the network to complex external inputs. Finally, the mean-field model is
also able to quantitatively describe regimes where high and low activity
states alternate (UP-DOWN state dynamics), leading to slow oscillations. We
conclude that such mean-field models are "biologically realistic" in the
sense that they can capture both spontaneous and evoked activity, and they
naturally appear as candidates to build very large scale models involving
multiple brain areas.
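The spike-frequency adaptation that the mean-field model above must capture can be sketched at the single-neuron level. A hedged illustration (the exponential spike-initiation term of the full AdEx model is omitted for brevity, and all parameters are illustrative): each spike increments an adaptation current w, so successive inter-spike intervals lengthen:

```python
# Hedged adaptive leaky integrate-and-fire sketch (the exponential
# spike-initiation term of the full AdEx model is omitted for brevity;
# all parameter values are illustrative). Each spike increments the
# adaptation current w, which opposes the drive, so ISIs lengthen.
def adaptive_lif_isis(n_spikes=4, dt=1e-5, tau_m=0.02, tau_w=0.5,
                      i_ext=2.0, v_th=1.0, v_reset=0.0, b=0.2):
    v, w, t = 0.0, 0.0, 0.0
    spikes = []
    while len(spikes) < n_spikes:
        v += dt * (i_ext - v - w) / tau_m   # membrane with adaptation
        w += dt * (-w) / tau_w              # adaptation current decay
        if v >= v_th:
            spikes.append(t)
            v = v_reset
            w += b                          # spike-triggered increment
        t += dt
    return [t2 - t1 for t1, t2 in zip(spikes, spikes[1:])]

isis = adaptive_lif_isis()   # successive ISIs grow under adaptation
```

The slow adaptation variable (tau_w much larger than tau_m) is the ingredient that lets a population alternate between high- and low-activity states, i.e., the UP-DOWN dynamics the abstract describes.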