4,141 research outputs found
Intrinsic adaptation in autonomous recurrent neural networks
A massively recurrent neural network responds, on one side, to input stimuli
and, on the other side, is autonomously active in the absence of sensory
inputs. Stimulus and information processing depend crucially on the qualia of
the autonomous-state dynamics of the ongoing neural activity. This default
neural activity may be dynamically structured in time and space, showing
regular, synchronized, bursting or chaotic activity patterns.
We study the influence of non-synaptic plasticity on the default dynamical
state of recurrent neural networks. The non-synaptic adaptation considered acts
on intrinsic neural parameters, such as the threshold and the gain, and is
driven by the optimization of the information entropy. We observe, in the
presence of the intrinsic adaptation processes, three distinct and globally
attracting dynamical regimes, a regular synchronized, an overall chaotic and an
intermittent bursting regime. The intermittent bursting regime is characterized
by intervals of regular flows, which are quite insensitive to external stimuli,
interspersed with chaotic bursts, which respond sensitively to input signals. We
discuss these findings in the context of self-organized information processing
and critical brain dynamics.
Comment: 24 pages, 8 figures
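The entropy-driven adaptation of gain and threshold described above can be sketched with a standard intrinsic-plasticity rule of this kind (Triesch's rule for a sigmoid unit, which drives the output toward an exponential distribution, the maximum-entropy distribution for a fixed mean). The Gaussian input statistics and all parameter values are illustrative assumptions, not taken from the paper:

```python
import math
import random

def adapt_intrinsic(mu=0.2, eta=0.01, steps=50000, seed=0):
    """Adapt gain a and threshold/bias b of a sigmoid neuron so that its
    output distribution approaches an exponential with mean mu, following
    Triesch's intrinsic-plasticity rule.  Inputs are standard-normal
    samples (an assumption of this sketch).  Returns (a, b, mean output
    over the second half of training)."""
    rng = random.Random(seed)
    a, b = 1.0, 0.0
    y_sum, n = 0.0, 0
    for t in range(steps):
        x = rng.gauss(0.0, 1.0)
        y = 1.0 / (1.0 + math.exp(-(a * x + b)))
        # stochastic gradient of the KL divergence to the target exponential
        db = eta * (1.0 - (2.0 + 1.0 / mu) * y + y * y / mu)
        da = eta / a + x * db
        a += da
        b += db
        if t >= steps // 2:          # average the output once adapted
            y_sum += y
            n += 1
    return a, b, y_sum / n
```

After adaptation the mean output sits near the target mean mu, illustrating how purely intrinsic parameter changes can regulate the operating point of each neuron.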
Death and rebirth of neural activity in sparse inhibitory networks
In this paper, we clarify the mechanisms underlying a general phenomenon
present in pulse-coupled heterogeneous inhibitory networks: inhibition can
induce not only suppression of the neural activity, as expected, but it can
also promote neural reactivation. In particular, for globally coupled systems,
the number of firing neurons decreases monotonically upon increasing the strength
of inhibition (neurons' death). However, random pruning of the connections
is able to reverse the action of inhibition, i.e. in a sparse network a
sufficiently strong synaptic strength can surprisingly promote, rather than
depress, the activity of the neurons (neurons' rebirth). Thus the number of
firing neurons exhibits a minimum at some intermediate synaptic strength. We
show that this minimum signals a transition from a regime dominated by the
neurons with higher firing activity to a phase where all neurons are
effectively sub-threshold and their irregular firing is driven by current
fluctuations. We explain the origin of the transition by deriving an analytic
mean-field formulation of the problem that provides the fraction of active
neurons as well as the first two moments of their firing statistics. The
introduction of a synaptic time scale does not modify the main aspects of the
reported phenomenon. However, for sufficiently slow synapses the transition
becomes dramatic: the system passes from a perfectly regular evolution to
irregular bursting dynamics. In this latter regime the model provides
predictions consistent with experimental findings for a specific class of
neurons, namely the medium spiny neurons in the striatum.
Comment: 19 pages, 10 figures, submitted to NJ
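The first half of the mechanism, the monotonic "neurons' death" under global inhibitory coupling, can be illustrated with a minimal pulse-coupled LIF sketch. The network size, heterogeneous drives and instantaneous-kick synapses below are illustrative assumptions, not the authors' model:

```python
def count_firing(g, n=50, t_max=50.0, dt=0.01):
    """Count how many neurons fire at least once in a globally coupled
    inhibitory LIF network with heterogeneous drives a_i in [0.9, 1.1]
    (threshold 1, reset 0).  Each spike delivers an instantaneous
    inhibitory kick -g/(n-1) to every other neuron."""
    drive = [0.9 + 0.2 * i / (n - 1) for i in range(n)]
    v = [0.0] * n
    fired = set()
    for _ in range(int(t_max / dt)):
        spikers = []
        for i in range(n):
            v[i] += dt * (drive[i] - v[i])   # leaky integration
            if v[i] >= 1.0:
                v[i] = 0.0
                spikers.append(i)
                fired.add(i)
        kick = g / (n - 1)
        for i in spikers:                    # inhibitory pulse coupling
            for j in range(n):
                if j != i:
                    v[j] -= kick
        # (no rebirth here: in the globally coupled case inhibition
        #  can only suppress, as the abstract states)
    return len(fired)
```

With g = 0 exactly the supra-threshold neurons fire; increasing g silences the marginal ones. Reproducing the rebirth branch would additionally require sparse, pruned connectivity.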
Relevance of Dynamic Clustering to Biological Networks
Networks of nonlinear dynamical elements often show clustering of
synchronization induced by chaotic instability. The relevance of this clustering
to ecological, immune, neural, and cellular networks is discussed, with
emphasis on partially ordered states with chaotic itinerancy. First, clustering
with bit structures in a hypercubic lattice is studied. Spontaneous formation
and destruction of relevant bits are found, which give rise to self-organizing,
chaotic genetic algorithms. When spontaneous changes of effective couplings are
introduced, chaotic itinerancy of clusterings is widely seen through a feedback
mechanism, which supports dynamic stability allowing for complexity and
diversity, known as homeochaos. Second, synaptic dynamics of couplings is
studied in relation to neural dynamics. The clustering structure is formed
with a balance between external inputs and internal dynamics. Last, an
extension allowing for the growth of the number of elements is given, in
connection with cell differentiation. An effective time-sharing system of
resources is formed in partially ordered states.
Comment: submitted to Physica D, no figures included
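Clustering of synchronization in globally coupled chaotic elements can be illustrated with Kaneko's globally coupled logistic maps. The map parameter, system size and cluster-counting tolerance below are illustrative choices:

```python
import random

def n_clusters(eps, a=1.8, n=32, steps=2000, tol=1e-3, seed=1):
    """Iterate Kaneko's globally coupled map
        x_i(t+1) = (1-eps) f(x_i) + (eps/n) sum_j f(x_j),  f(x) = 1 - a x^2,
    from random initial conditions and count the number of distinct
    synchronized clusters in the final state (states closer than tol are
    assigned to the same cluster)."""
    rng = random.Random(seed)
    x = [rng.uniform(-1.0, 1.0) for _ in range(n)]
    for _ in range(steps):
        fx = [1.0 - a * xi * xi for xi in x]
        mean_f = sum(fx) / n
        x = [(1.0 - eps) * f + eps * mean_f for f in fx]
    clusters = []
    for xi in sorted(x):                      # greedy 1-d clustering
        if not clusters or xi - clusters[-1][-1] > tol:
            clusters.append([xi])
        else:
            clusters[-1].append(xi)
    return len(clusters)
```

Strong coupling pulls the elements into a few coherent clusters, while weak coupling leaves many desynchronized chaotic elements, the ordered and turbulent ends of the partially ordered regimes discussed above.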
Mechanisms explaining transitions between tonic and phasic firing in neuronal populations as predicted by a low dimensional firing rate model
Several firing patterns experimentally observed in neural populations have
been successfully correlated to animal behavior. Population bursting, here
regarded as a period of high firing rate followed by a period of quiescence, is
typically observed in groups of neurons during behavior. Biophysical
membrane-potential models of single cell bursting involve at least three
equations. Extending such models to study the collective behavior of neural
populations involves thousands of equations and can be very expensive
computationally. For this reason, low dimensional population models that
capture biophysical aspects of networks are needed.
The present paper uses a firing-rate model to study mechanisms that
trigger and stop transitions between tonic and phasic population firing. These
mechanisms are captured through a two-dimensional system, which can potentially
be extended to include interactions between different areas of the nervous
system with a small number of equations. The typical behavior of midbrain
dopaminergic neurons in the rodent is used as an example to illustrate and
interpret our results.
The model presented here can be used as a building block to study
interactions between networks of neurons. This theoretical approach may help
contextualize and understand the factors involved in regulating burst firing in
populations and how it may modulate distinct aspects of behavior.
Comment: 25 pages (including references and appendices); 12 figures uploaded
as separate file
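A minimal two-dimensional sketch of the tonic/phasic distinction discussed above: one population rate with recurrent excitation and one slow adaptation variable. The sigmoid nonlinearity and all parameter values are illustrative assumptions, not the paper's fitted model:

```python
import math

def simulate_rate(I, w=1.0, g=2.0, beta=10.0, theta=0.5,
                  tau=1.0, tau_a=20.0, dt=0.05, t_max=400.0):
    """Two-variable firing-rate model of a recurrently excited population
    r with slow adaptation a:
        tau   dr/dt = -r + f(w r - g a + I),  f(x) = 1/(1+exp(-beta(x-theta)))
        tau_a da/dt =  r - a.
    Returns (min, max) of r over the second half of the run."""
    r, a = 0.0, 0.0
    lo, hi = float("inf"), float("-inf")
    steps = int(t_max / dt)
    for k in range(steps):
        f = 1.0 / (1.0 + math.exp(-beta * (w * r - g * a + I - theta)))
        r += dt * (-r + f) / tau
        a += dt * (r - a) / tau_a
        if k >= steps // 2:
            lo, hi = min(lo, r), max(hi, r)
    return lo, hi
```

With drive I = 0.7 the population bursts, relaxing back and forth between a high-rate and a near-silent state as the slow adaptation variable builds up and decays; with I = 0 it stays quiescent. The trigger and stop of each burst are the knees of the fast nullcline, the same hysteresis mechanism a low-dimensional firing-rate model can expose.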
Synchronization of coupled neural oscillators with heterogeneous delays
We investigate the effects of heterogeneous delays in the coupling of two
excitable neural systems. Depending upon the coupling strengths and the time
delays in the mutual and self-coupling, the compound system exhibits different
types of synchronized oscillations of variable period. We analyze this
synchronization based on the interplay of the different time delays and support
the numerical results by analytical findings. In addition, we elaborate on
bursting-like dynamics with two competing timescales on the basis of the
autocorrelation function.
Comment: 18 pages, 14 figures
Fluctuations and information filtering in coupled populations of spiking neurons with adaptation
Finite-sized populations of spiking elements are fundamental to brain
function, but are also used in many areas of physics. Here we present a theory of
the dynamics of finite-sized populations of spiking units, based on a
quasi-renewal description of neurons with adaptation. We derive an integral
equation with colored noise that governs the stochastic dynamics of the
population activity in response to time-dependent stimulation and calculate the
spectral density in the asynchronous state. We show that systems of coupled
populations with adaptation can generate a frequency band in which sensory
information is preferentially encoded. The theory is applicable to fully as
well as randomly connected networks, and to leaky integrate-and-fire as well as
to generalized spiking neurons with adaptation on multiple time scales.
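One fluctuation effect of spike-triggered adaptation is easy to check numerically: the negative feedback suppresses slow fluctuations of the spike count, lowering the long-window Fano factor relative to a non-adapting control. A minimal single-neuron sketch with illustrative parameters, not the quasi-renewal population theory itself:

```python
import random

def fano_factor(delta_a, tau_a=10.0, mu=1.5, sigma=0.5,
                win=20.0, n_win=400, dt=0.01, seed=7):
    """Fano factor of spike counts in windows of length `win` for a noisy
    leaky integrate-and-fire neuron (threshold 1, reset 0) with
    spike-triggered adaptation: each spike increments an adaptation
    current a by delta_a; a decays with time constant tau_a.
    delta_a = 0 gives the non-adapting control."""
    rng = random.Random(seed)
    v, adap = 0.0, 0.0
    sq = sigma * dt ** 0.5
    counts = []
    for _ in range(n_win):
        c = 0
        for _ in range(int(win / dt)):
            v += dt * (mu - v - adap) + sq * rng.gauss(0.0, 1.0)
            adap += dt * (-adap / tau_a)
            if v >= 1.0:
                v = 0.0
                adap += delta_a
                c += 1
        counts.append(c)
    m = sum(counts) / n_win
    var = sum((c - m) ** 2 for c in counts) / (n_win - 1)
    return var / m
```

The reduced low-frequency count variability is the single-neuron counterpart of the spectral shaping that, in coupled populations, carves out a preferred frequency band for encoding.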
The effect of heterogeneity on decorrelation mechanisms in spiking neural networks: a neuromorphic-hardware study
High-level brain function such as memory, classification or reasoning can be
realized by means of recurrent networks of simplified model neurons. Analog
neuromorphic hardware constitutes a fast and energy-efficient substrate for the
implementation of such neural computing architectures in technical applications
and neuroscientific research. The functional performance of neural networks is
often critically dependent on the level of correlations in the neural activity.
In finite networks, correlations are typically inevitable due to shared
presynaptic input. Recent theoretical studies have shown that inhibitory
feedback, abundant in biological neural networks, can actively suppress these
shared-input correlations and thereby enable neurons to fire nearly
independently. For networks of spiking neurons, the decorrelating effect of
inhibitory feedback has so far been explicitly demonstrated only for
homogeneous networks of neurons with linear sub-threshold dynamics. Theory,
however, suggests that the effect is a general phenomenon, present in any
system with sufficient inhibitory feedback, irrespective of the details of the
network structure or the neuronal and synaptic properties. Here, we investigate
the effect of network heterogeneity on correlations in sparse, random networks
of inhibitory neurons with non-linear, conductance-based synapses. Emulations
of these networks on the analog neuromorphic hardware system Spikey allow us to
test the efficiency of decorrelation by inhibitory feedback in the presence of
hardware-specific heterogeneities. The configurability of the hardware
substrate enables us to modulate the extent of heterogeneity in a systematic
manner. We selectively study the effects of shared input and recurrent
connections on correlations in membrane potentials and spike trains. Our
results confirm ...
Comment: 20 pages, 10 figures, supplement
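The decorrelation mechanism itself can be demonstrated in a few lines with a linear rate sketch: shared input correlates the units, and inhibitory feedback of the population mean suppresses the common mode. This toy model illustrates the principle only; it is not the hardware emulation or the spiking networks of the study:

```python
import random

def mean_correlation(g, n=40, t_max=4000, a=0.8, c=1.0, sigma=1.0, seed=3):
    """Discrete-time linear rate network in which each unit receives a
    shared input s(t), private noise, and inhibitory feedback of the
    population mean with strength g:
        x_i(t+1) = a x_i(t) + c s(t) + sigma eta_i(t) - g xbar(t).
    Returns the average zero-lag pairwise correlation coefficient."""
    rng = random.Random(seed)
    x = [0.0] * n
    sum_xi = [0.0] * n
    sum_xi2 = [0.0] * n
    sum_S = sum_S2 = 0.0
    for _ in range(t_max):
        s = rng.gauss(0.0, 1.0)
        xbar = sum(x) / n
        x = [a * xi + c * s + sigma * rng.gauss(0.0, 1.0) - g * xbar
             for xi in x]
        S = sum(x)
        sum_S += S
        sum_S2 += S * S
        for i in range(n):
            sum_xi[i] += x[i]
            sum_xi2[i] += x[i] * x[i]
    var_S = sum_S2 / t_max - (sum_S / t_max) ** 2
    var_i = [sum_xi2[i] / t_max - (sum_xi[i] / t_max) ** 2 for i in range(n)]
    mean_var = sum(var_i) / n
    # Var(S) = sum_i Var_i + sum_{i != j} Cov_ij
    mean_cov = (var_S - n * mean_var) / (n * (n - 1))
    return mean_cov / mean_var
```

With g = 0 the shared input induces strong pairwise correlations; feedback that cancels the common mode (here g chosen so the common-mode gain a - g is near zero) suppresses them, while the private fluctuations are left essentially untouched.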
Intrinsically-generated fluctuating activity in excitatory-inhibitory networks
Recurrent networks of non-linear units display a variety of dynamical regimes
depending on the structure of their synaptic connectivity. A particularly
remarkable phenomenon is the appearance of strongly fluctuating, chaotic
activity in networks of deterministic, but randomly connected rate units. How
this type of intrinsically generated fluctuations appears in more realistic
networks of spiking neurons has been a long-standing question. To ease the
comparison between rate and spiking networks, recent works investigated the
dynamical regimes of randomly connected rate networks with segregated
excitatory and inhibitory populations, and firing rates constrained to be
positive. These works derived general dynamical mean field (DMF) equations
describing the fluctuating dynamics, but solved these equations only in the
case of purely inhibitory networks. Using a simplified excitatory-inhibitory
architecture in which DMF equations are more easily tractable, here we show
that the presence of excitation qualitatively modifies the fluctuating activity
compared to purely inhibitory networks. In the presence of excitation,
intrinsically generated fluctuations induce a strong increase in mean firing
rates, a phenomenon that is much weaker in purely inhibitory networks.
Excitation moreover induces two different fluctuating regimes: for moderate
overall coupling, recurrent inhibition is sufficient to stabilize fluctuations;
for strong coupling, firing rates are stabilized solely by the upper bound
imposed on activity, even if inhibition is stronger than excitation. These
results extend to more general network architectures, and to rate networks
receiving noisy inputs mimicking spiking activity. Finally, we show that
signatures of the second dynamical regime appear in networks of
integrate-and-fire neurons.
Synchronization and oscillatory dynamics in heterogeneous mutually inhibited neurons
We study some mechanisms responsible for synchronous oscillations and loss of
synchrony at physiologically relevant frequencies (10-200 Hz) in a network of
heterogeneous inhibitory neurons. We focus on the factors that determine the
level of synchrony and frequency of the network response, as well as the
effects of mild heterogeneity on network dynamics. With mild heterogeneity,
synchrony is never perfect and is relatively fragile. In addition, the effects
of inhibition are more complex in mildly heterogeneous networks than in
homogeneous ones. In the former, synchrony is broken in two distinct ways,
depending on the ratio of the synaptic decay time to the period of repetitive
action potentials (tau_s/T), where T can be determined either from the
network or from a single, self-inhibiting neuron. With tau_s/T large,
corresponding to large applied current, small synaptic strength or large
synaptic decay time, the effects of inhibition are largely tonic and
heterogeneous neurons spike relatively independently. With tau_s/T small,
synchrony breaks when faster cells begin to suppress their less excitable
neighbors; cells that fire remain nearly synchronous. We show numerically that
the behavior of mildly heterogeneous networks can be related to the behavior of
single, self-inhibiting cells, which can be studied analytically.
Comment: 17 pages, 6 figures, Kluwer.sty. Journal of Computational Neuroscience
(in press). Originally submitted to the neuro-sys archive, which was never
publicly announced (was 9802001)
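The tonic-versus-phasic character of inhibition as a function of the synaptic decay time can be seen already in a single self-inhibiting LIF unit: when tau_s is much longer than the firing period the synaptic variable is nearly constant (effectively tonic), while for short tau_s it is strongly pulsatile. All parameters below are illustrative assumptions, not the paper's conductance-based model:

```python
def synaptic_cv(tau_s, mu=2.0, g=0.5, dt=0.001, t_max=200.0):
    """Single self-inhibiting LIF neuron (threshold 1, reset 0):
        dv/dt = mu - v - g s,   ds/dt = -s/tau_s,   s -> s + 1/tau_s on spikes
    (the 1/tau_s kick keeps the time-averaged s independent of tau_s).
    Returns the coefficient of variation of s over the second half of the
    run: small CV = effectively tonic inhibition, large CV = phasic."""
    v, s = 0.0, 0.0
    vals = []
    steps = int(t_max / dt)
    for k in range(steps):
        v += dt * (mu - v - g * s)
        s += dt * (-s / tau_s)
        if v >= 1.0:
            v = 0.0
            s += 1.0 / tau_s
        if k >= steps // 2:
            vals.append(s)
    m = sum(vals) / len(vals)
    var = sum((x - m) ** 2 for x in vals) / len(vals)
    return (var ** 0.5) / m if m > 0 else 0.0
```

The same dichotomy carries over to the network: slow synapses make the inhibition each cell receives look like a constant hyperpolarizing current, while fast synapses deliver discrete suppressive pulses that can silence less excitable neighbors.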