Homeostatic plasticity and external input shape neural network dynamics
In vitro and in vivo spiking activity clearly differ. Whereas networks in
vitro develop strong bursts separated by periods of very little spiking
activity, in vivo cortical networks show continuous activity. This is puzzling
considering that both networks presumably share similar single-neuron dynamics
and plasticity rules. We propose that the defining difference between in vitro
and in vivo dynamics is the strength of external input. In vitro, networks are
virtually isolated, whereas in vivo every brain area receives continuous input.
We analyze a model of spiking neurons in which the input strength, mediated by
spike rate homeostasis, determines the characteristics of the dynamical state.
In more detail, our analytical and numerical results on various network
topologies show consistently that under increasing input, homeostatic
plasticity generates distinct dynamic states, from bursting to close-to-critical,
reverberating, and irregular states. This implies that the dynamic state of a
neural network is not fixed but readily adapts to the input strength. Indeed,
our results match experimental spike recordings in
vitro and in vivo: the in vitro bursting behavior is consistent with a state
generated by very low network input (< 0.1%), whereas in vivo activity suggests
that on the order of 1% recorded spikes are input-driven, resulting in
reverberating dynamics. Importantly, this predicts that one can abolish the
ubiquitous bursts of in vitro preparations, and instead impose dynamics
comparable to in vivo activity by exposing the system to weak long-term
stimulation, thereby opening new paths to establish an in vivo-like assay in
vitro for basic as well as neurological studies.
Comment: 14 pages, 8 figures, accepted at Phys. Rev.
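The proposed interplay can be caricatured in a rate-based toy model (a sketch only, not the paper's spiking network; the set point, time constants, and input strength below are assumed values): homeostasis adjusts the recurrent weight until the target rate is met, so weaker external input is compensated by stronger recurrence.

```python
import numpy as np

dt, n_steps = 1e-3, 50_000           # 50 s of simulated time
target = 1.0                         # Hz, homeostatic set point (assumed)
tau_hp = 20.0                        # s, homeostasis time constant (assumed)
h = 0.2                              # external input rate (assumed)

rate, w = 0.0, 0.5                   # population rate, recurrent weight
rates, weights = [], []
for _ in range(n_steps):
    # leaky rate dynamics driven by recurrent and external input
    rate += dt * (-rate + w * rate + h)
    rate = max(rate, 0.0)
    # homeostatic plasticity: adjust w to hold the rate at the set point
    w += dt / tau_hp * (target - rate)
    w = max(w, 0.0)
    rates.append(rate)
    weights.append(w)

# weak input is compensated by strong recurrence (w -> 1 - h/target),
# so most activity is recurrently generated, as in the in vitro regime
```

In this caricature, lowering h pushes w closer to the instability at w = 1, which is the regime where bursty, internally generated dynamics dominate.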
Whole Brain Network Dynamics of Epileptic Seizures at Single Cell Resolution
Epileptic seizures are characterised by abnormal brain dynamics at multiple
scales, engaging single neurons, neuronal ensembles and coarse brain regions.
Key to understanding the cause of such emergent population dynamics is
capturing the collective behaviour of neuronal activity at multiple brain
scales. In this thesis I make use of the larval zebrafish to capture single
cell neuronal activity across the whole brain during epileptic seizures.
Firstly, I make use of statistical physics methods to quantify the collective
behaviour of single neuron dynamics during epileptic seizures. Here, I
demonstrate a population mechanism through which single neuron dynamics
organise into seizures: brain dynamics deviate from a phase transition.
Secondly, I make use of single neuron network models to identify the synaptic
mechanisms that actually cause this shift to occur. Here, I show that the
density of neuronal connections in the network is key for driving generalised
seizure dynamics. Interestingly, such changes also disrupt network response
properties and flexible dynamics in brain networks, thus linking microscale
neuronal changes with emergent brain dysfunction during seizures. Thirdly, I
make use of non-linear causal inference methods to study the nature of the
underlying neuronal interactions that enable seizures to occur. Here I show
that seizures are driven by high synchrony but also by highly non-linear
interactions between neurons. Interestingly, these non-linear signatures are
filtered out at the macroscale, and therefore may represent a neuronal
signature that could be used for microscale interventional strategies. This
thesis demonstrates the utility of studying multi-scale dynamics in the larval
zebrafish, to link neuronal activity at the microscale with emergent properties
during seizures.
Fractals in the Nervous System: Conceptual Implications for Theoretical Neuroscience
This essay is presented with two principal objectives in mind: first, to
document the prevalence of fractals at all levels of the nervous system, giving
credence to the notion of their functional relevance; and second, to draw
attention to the as yet still unresolved issues of the detailed relationships
among power law scaling, self-similarity, and self-organized criticality. As
regards criticality, I will document that it has become a pivotal reference
point in Neurodynamics. Furthermore, I will emphasize the not yet fully
appreciated significance of allometric control processes. For dynamic fractals,
I will assemble reasons for attributing to them the capacity to adapt task
execution to contextual changes across a range of scales. The final Section
consists of general reflections on the implications of the reviewed data, and
identifies what appear to be issues of fundamental importance for future
research in the rapidly evolving topic of this review.
Simple model of complex dynamics of activity patterns in developing networks of neuronal cultures
Living neuronal networks in dissociated neuronal cultures are widely known
for their ability to generate highly robust spatiotemporal activity patterns in
various experimental conditions. These include neuronal avalanches that satisfy
a power-law scaling and thereby exemplify self-organized criticality in
living systems. A crucial question is how these patterns can be explained and
modeled in a way that is biologically meaningful, mathematically tractable and
yet broad enough to account for neuronal heterogeneity and complexity. Here we
propose a simple model which may offer an answer to this question. Our
derivations are based on just a few phenomenological observations concerning the
input-output behavior of an isolated neuron. A distinctive feature of the model
is that, at the simplest level of description, it comprises only two
variables, a network activity variable and an exogenous variable corresponding
to energy needed to sustain the activity and modulate the efficacy of signal
transmission. Strikingly, this simple model is already capable of explaining the
emergence of network spikes and bursts in developing neuronal cultures. The
model's behavior and predictions are supported by empirical observations and
published experimental evidence on the behavior of cultured neurons exposed to
oxygen and energy deprivation. At the larger, network scale, introducing the
energy-dependent regulatory mechanism enables the network to balance on the
edge of the network percolation transition. Network activity in this state
shows population bursts satisfying the scaling avalanche conditions. This
network state is self-sustaining and represents a balance between global
network-wide processes and the spontaneous activity of individual elements.
Self-organized Criticality in Neural Networks by Inhibitory and Excitatory Synaptic Plasticity
Neural networks show intrinsic ongoing activity even in the absence of information processing and task-driven activities. This spontaneous activity has been reported to have specific characteristics ranging from scale-free avalanches in microcircuits to the power-law decay of the power spectrum of oscillations in coarse-grained recordings of large populations of neurons. The emergence of scale-free activity and power-law distributions of observables has encouraged researchers to postulate that the neural system is operating near a continuous phase transition. At such a phase transition, changes in control parameters or the strength of the external input lead to a change in the macroscopic behavior of the system. On the other hand, at a critical point due to critical slowing down, the phenomenological mesoscopic modeling of the system becomes realizable. Two distinct types of phase transitions have been suggested as the operating point of the neural system, namely active-inactive and synchronous-asynchronous phase transitions.
In contrast to normal phase transitions, in which fine-tuning of the control parameter(s) is required to bring the system to the critical point, neural systems should be supplemented with self-tuning mechanisms that adaptively adjust the system near the critical point (or critical region) in phase space.
In this work, we introduce a self-organized critical model of the neural network. We consider the dynamics of sparsely connected excitatory and inhibitory (EI) populations of spiking leaky integrate-and-fire neurons with conductance-based synapses. Ignoring inhomogeneities and internal fluctuations, we first analyze the mean-field model. We choose the strength of the external excitatory input and the average strength of excitatory-to-excitatory synapses as control parameters and analyze the bifurcation diagram of the mean-field equations. We focus on bifurcations in the low-firing-rate regime, in which the quiescent state loses stability through saddle-node or Hopf bifurcations. In particular, at the Bogdanov-Takens (BT) bifurcation point, which is the intersection of the Hopf and saddle-node bifurcation lines of the 2D dynamical system, the network shows avalanche dynamics with power-law avalanche size and duration distributions. This matches the characteristics of low-firing-rate spontaneous activity in the cortex. By linearizing the gain functions and the excitatory and inhibitory nullclines, we can approximate the location of the BT bifurcation point. This point in the control-parameter space corresponds to an internal balance of excitation and inhibition and a slight excess of external excitatory input to the excitatory population. Due to the tight balance of average excitatory and inhibitory currents, the firing of individual cells is fluctuation-driven. Around the BT point, the spiking of neurons is a Poisson process and the population-averaged membrane potential lies approximately in the middle of the operating interval. Moreover, the EI network is close to both the oscillatory and the active-inactive phase transition regimes.
Next, we consider self-tuning of the system to this critical point. The self-organizing parameter in our network is the balance between the opposing forces of the inhibitory and excitatory populations' activities, and the self-organizing mechanisms are long-term synaptic plasticity and short-term depression of the synapses. The former tunes the overall strength of the excitatory and inhibitory pathways close to a balanced regime of these currents; the latter, based on the finite amount of resources in brain areas, acts as an adaptive mechanism that tunes micropopulations of neurons subject to fluctuating external inputs to attain this balance over a wider range of external input strengths.
Using the Poisson firing assumption, we propose a microscopic Markovian model which captures the internal fluctuations in the network due to its finite size and matches the macroscopic mean-field equation upon coarse-graining. Near the critical point, a phenomenological mesoscopic model for the excitatory and inhibitory fields of activity is possible due to the time-scale separation of slowly changing variables and fast degrees of freedom. We will show that the mesoscopic model corresponding to the neural field model near the local Bogdanov-Takens bifurcation point matches the Langevin description of the directed percolation process. Tuning the system to the critical point can be achieved by coupling the fast population dynamics with slow adaptive gain and synaptic weight dynamics, which makes the system wander around the phase transition point. Therefore, by introducing short-term and long-term synaptic plasticity, we have proposed a self-organized critical stochastic neural field model.
1. Introduction
1.1. Scale-free Spontaneous Activity
1.1.1. Nested Oscillations in the Macro-scale Collective Activity
1.1.2. Up and Down States Transitions
1.1.3. Avalanches in Local Neuronal Populations
1.2. Criticality and Self-organized Criticality in Systems out of Equilibrium
1.2.1. Sandpile Models
1.2.2. Directed Percolation
1.3. Critical Neural Models
1.3.1. Self-Organizing Neural Automata
1.3.2. Criticality in the Mesoscopic Models of Cortical Activity
1.4. Balance of Inhibition and Excitation
1.5. Functional Benefits of Being in the Critical State
1.6. Arguments Against the Critical State of the Brain
1.7. Organization of the Current Work
2. Single Neuron Model
2.1. Impulse Response of the Neuron
2.2. Response of the Neuron to the Constant Input
2.3. Response of the Neuron to the Poisson Input
2.3.1. Potential Distribution of a Neuron Receiving Poisson Input
2.3.2. Firing Rate and Interspike intervals’ CV Near the Threshold
2.3.3. Linear Poisson Neuron Approximation
3. Interconnected Homogeneous Population of Excitatory and Inhibitory Neurons
3.1. Linearized Nullclines and Different Dynamic Regimes
3.2. Logistic Function Approximation of Gain Functions
3.3. Dynamics Near the BT Bifurcation Point
3.4. Avalanches in the Region Close to the BT Point
3.5. Stability Analysis of the Fixed Points in the Linear Regime
3.6. Characteristics of Avalanches
4. Long-Term and Short-Term Synaptic Plasticity Rules Tune the EI Population Close to the BT Bifurcation Point
4.1. Long Term Synaptic Plasticity by STDP Tunes Synaptic Weights Close to the Balanced State
4.2. Short-Term Plasticity and Up-Down States Transitions
5. Interconnected network of EI populations: Wilson-Cowan Neural Field Model
6. Stochastic Neural Field
6.1. Finite size fluctuations in a single EI population
6.2. Stochastic Neural Field with a Tuning Mechanism to the Critical State
7. Conclusion
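For intuition about the avalanche statistics discussed above, a critical branching process is a standard stand-in for activity propagation at the active-inactive transition (it is not the thesis's conductance-based network): at branching ratio m = 1, avalanche sizes follow the heavy-tailed distribution P(S) ~ S^(-3/2).

```python
import numpy as np

rng = np.random.default_rng(2)

def avalanche_size(m=1.0, max_size=100_000):
    """Total number of activations triggered by one seed unit
    in a branching process with branching ratio m."""
    active, size = 1, 1
    while active and size < max_size:
        # each active unit independently triggers Poisson(m) descendants
        active = rng.poisson(m, size=active).sum()
        size += active
    return size

# at criticality (m = 1) most avalanches are tiny, but a non-negligible
# fraction are orders of magnitude larger than the median
sizes = np.array([avalanche_size() for _ in range(20_000)])
```

Setting m below 1 (subcritical) produces an exponential cutoff, while m above 1 produces runaway events; only the critical point yields the scale-free statistics the thesis associates with the BT point.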
Autocorrelations from emergent bistability in homeostatic spiking neural networks on neuromorphic hardware
A fruitful approach towards neuromorphic computing is to mimic mechanisms of the brain in physical devices,
which has led to successful replication of neuronlike dynamics and learning in the past. However, there remains a
large set of neural self-organization mechanisms whose role for neuromorphic computing has yet to be explored.
One such mechanism is homeostatic plasticity, which has recently been proposed to play a key role in shaping
network dynamics and correlations. Here, we study—from a statistical-physics point of view—the emergent
collective dynamics in a homeostatically regulated neuromorphic device that emulates a network of excitatory
and inhibitory leaky integrate-and-fire neurons. Importantly, homeostatic plasticity is only active during the
training stage and results in a heterogeneous weight distribution that we fix during the analysis stage. We verify
the theoretical prediction that reducing the external input in a homeostatically regulated neural network increases
temporal correlations, measuring autocorrelation times exceeding 500 ms, despite single-neuron timescales of
only 20 ms, both in experiments on neuromorphic hardware and in computer simulations. However, unlike
theoretically predicted near-critical fluctuations, we find that temporal correlations can originate from an
emergent bistability. We identify this bistability as fluctuation-induced stochastic switching between metastable
active and quiescent states in the vicinity of a nonequilibrium phase transition. Our results thereby constitute a
complementary mechanism for emergent autocorrelations in networks of spiking neurons with implications for
future developments in neuromorphic computing.
Funding: European Union Sixth Framework Programme (FP6/2002-2006), Grant Agreement No. 15879 (FACETS); European Union Seventh Framework Programme (FP7/2007-2013) under Grant Agreements No. 604102 (HBP), No. 269921 (BrainScaleS), and No. 243914 (Brain-i-Nets); Horizon 2020 Framework Programme (H2020/2014-2020), Grant Agreements No. 720270, No. 785907, and No. 945539 (HBP); Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany's Excellence Strategy EXC 2181/1-390900948 (the Heidelberg STRUCTURES Excellence Cluster); Helmholtz Association Initiative and Networking Fund [Advanced Computing Architectures (ACA)] under Project No. SO-092; Spanish Ministry and Agencia Estatal de Investigación (AEI) through Project I+D+i (Reference No. PID2020-113681GBI00), MICIN/AEI/10.13039/501100011033 and FEDER "A way to make Europe"; Consejería de Conocimiento, Investigación y Universidad, Junta de Andalucía, and European Regional Development Fund, Project No. P20-00173; Plan Propio de Investigación y Transferencia de la Universidad de Granada; Grant No. INST 39/963-1 FUGG (bwForCluster NEM
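The mechanism of the preceding abstract, long autocorrelations from fluctuation-driven switching between metastable states, can be illustrated with a generic bistable system: an overdamped particle in a double-well potential (a sketch, not the hardware model; the noise amplitude is an assumed value).

```python
import numpy as np

rng = np.random.default_rng(3)
dt, n_steps = 0.01, 50_000           # total time T = 500
sigma = 0.5                          # noise amplitude (assumed)

x = 1.0                              # start in the right-hand well
xs = np.empty(n_steps)
for t in range(n_steps):
    # overdamped dynamics in the double-well potential V(x) = x^4/4 - x^2/2,
    # with metastable states at x = -1 and x = +1; noise drives rare switches
    x += dt * (x - x**3) + np.sqrt(dt) * sigma * rng.normal()
    xs[t] = x

# FFT-based autocorrelation; integrate up to the first drop below 0.05
x0 = xs - xs.mean()
f = np.fft.rfft(x0, n=2 * n_steps)
acf = np.fft.irfft(f * np.conjugate(f))[:n_steps]
acf /= acf[0]
cut = int(np.argmax(acf < 0.05))
tau = dt * acf[:cut].sum()
# tau greatly exceeds the within-well relaxation time (~0.5) because the
# slow stochastic switching between wells dominates the correlations
```

This is the same qualitative effect as in the abstract: the measured autocorrelation time reflects the metastable switching, not the single-unit timescale.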
Mesoscale Systems, Finite Size Effects, and Balanced Neural Networks
Cortical populations are typically in an asynchronous state, sporadically interrupted by brief epochs of coordinated population activity. Current cortical models are at a loss to explain this combination of states. At one extreme are network models where recurrent inhibition dynamically stabilizes an asynchronous low-activity state. While these networks are widely used, they cannot produce the coherent population-wide activity that is reported in a variety of datasets. At the other extreme are models where short-term synaptic depression between excitatory neurons can generate the epochs of population-wide activity. However, in these networks inhibition plays only a perfunctory role in network stability, which is at odds with many reports across cortex. In this study we analyze spontaneously active in vitro preparations of primary auditory cortex that show dynamics emblematic of this mixture of states. To capture this complex population activity we consider models where large excitation is balanced by recurrent inhibition, yet we include short-term synaptic depression dynamics of the excitatory connections. This model gives very rich nonlinear behavior that mimics the core features of the in vitro data, including the possibility of low-frequency (2-12 Hz) rhythmic dynamics within population events. Our study extends balanced network models to account for nonlinear, population-wide correlated activity, thereby providing a critical step in a mechanistic theory of realistic cortical activity. We further investigate an extension of this model that exhibits clearly non-Arrhenius behavior, whereby lower-noise systems may exhibit faster escape from a stable state. We show that this behavior is due to the system-size-dependent vector field, intrinsically linking noise and dynamics.
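The core ingredients described above, strong excitation balanced by recurrent inhibition plus short-term depression of the excitatory synapses, can be sketched in a Wilson-Cowan-style rate model (illustrative parameters of our own choosing, not fitted to the in vitro data). Depression slowly restores excitability until the balanced low-activity state loses stability, producing a brief population event that depletes the synaptic resources again.

```python
import numpy as np

rng = np.random.default_rng(4)
dt, n_steps = 0.5, 40_000            # ms; 20 s of simulated time
tau_e, tau_i, tau_d = 10.0, 5.0, 800.0
wee, wei, wie, wii = 16.0, 8.0, 10.0, 4.0   # strong, balanced E and I
he, hi = 1.0, 0.5                    # weak external drive (assumed)

def f(x):
    """Threshold-linear gain with saturation."""
    return np.clip(x, 0.0, 50.0)

re, ri, d = 0.0, 0.0, 1.0            # E rate, I rate, synaptic resources
trace = np.empty(n_steps)
for t in range(n_steps):
    drive_e = d * wee * re - wei * ri + he + 0.3 * rng.random()
    drive_i = wie * re - wii * ri + hi
    re += dt / tau_e * (-re + f(drive_e))
    ri += dt / tau_i * (-ri + f(drive_i))
    # short-term depression: resources deplete with E activity, recover slowly
    d += dt * ((1.0 - d) / tau_d - 0.001 * d * re)
    d = min(max(d, 0.0), 1.0)
    trace[t] = re
```

Between events, inhibition stabilizes a low-rate state; the instability that launches an event is oscillatory here, which is one way such models produce rhythmic dynamics within population events.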