Controlling chaos in diluted networks with continuous neurons
Diluted neural networks with continuous neurons and nonmonotonic transfer
function are studied, with both fixed and dynamic synapses. A noisy stimulus
with periodic variance results in a mechanism for controlling chaos in neural
systems with fixed synapses: a proper amount of external perturbation forces
the system to behave periodically with the same period as the stimulus. Comment: 11 pages, 8 figures
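The control ingredient described above is a zero-mean noisy stimulus whose variance oscillates with a fixed period. A minimal sketch of constructing such a stimulus (amplitude, period and modulation depth are illustrative choices, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
T, P = 1000, 50                               # number of steps and stimulus period (illustrative)
phase = 2 * np.pi * np.arange(T) / P
sigma = 0.2 * (1 + 0.8 * np.sin(phase))       # periodically modulated standard deviation
stimulus = sigma * rng.normal(size=T)         # zero-mean noise with periodic variance
```

Fed into the network as an external perturbation, a signal of this kind at the right amplitude is reported to entrain the chaotic dynamics to the stimulus period.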
Chaos in neural networks with a nonmonotonic transfer function
The time evolution of diluted neural networks with a nonmonotonic transfer
function is analytically described by flow equations for macroscopic variables.
The macroscopic dynamics shows a rich variety of behaviours: fixed-point,
periodicity and chaos. We examine in detail the structure of the strange
attractor and in particular we study the main features of the stable and
unstable manifolds, the hyperbolicity of the attractor and the existence of
homoclinic intersections. We also discuss the robustness of the chaos and
prove that in the present model chaotic behaviour is fragile (chaotic regions
are densely intercalated with periodicity windows), in agreement with a
recently discussed conjecture. Finally we perform an analysis of the
microscopic behaviour and in particular we examine the occurrence of damage
spreading by studying the time evolution of two almost identical initial
configurations. We show that for any choice of the parameters the two initial
states remain microscopically distinct. Comment: 12 pages, 11 figures. Accepted for publication in Physical Review E. Originally submitted to the neuro-sys archive, which was never publicly announced (was 9905001)
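The damage-spreading analysis in the abstract above can be sketched directly: iterate two copies of the network from almost identical initial states and track their distance. The transfer function, dilution scheme and coupling statistics below are illustrative assumptions, since the abstract does not specify them:

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 500, 20          # network size and in-degree of the diluted network

# Hypothetical nonmonotonic transfer function (the abstract does not give its
# exact form): the response decays again for large |h|.
def f(h, gain=4.0):
    return np.tanh(gain * h) * np.exp(-h ** 2)

# Diluted random couplings: each neuron receives exactly K random inputs.
J = np.zeros((N, N))
for i in range(N):
    idx = rng.choice(N, size=K, replace=False)
    J[i, idx] = rng.normal(0.0, 1.0 / np.sqrt(K), size=K)

# Two almost identical initial configurations ("damage" on a single neuron).
s1 = rng.uniform(-1.0, 1.0, N)
s2 = s1.copy()
s2[0] += 1e-6

distance = []
for _ in range(200):
    s1, s2 = f(J @ s1), f(J @ s2)
    distance.append(np.linalg.norm(s1 - s2) / np.sqrt(N))

print("final normalized distance:", distance[-1])
```

Whether the distance between the two trajectories grows or shrinks diagnoses microscopically chaotic versus regular dynamics; the paper reports that for any parameter choice the two states remain microscopically distinct.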
Transient Information Flow in a Network of Excitatory and Inhibitory Model Neurons: Role of Noise and Signal Autocorrelation
We investigate the performance of sparsely-connected networks of
integrate-and-fire neurons for ultra-short term information processing. We
exploit the fact that the population activity of networks with balanced
excitation and inhibition can switch from an oscillatory firing regime to a
state of asynchronous irregular firing or quiescence depending on the rate of
external background spikes.
We find that in terms of information buffering the network performs best for
a moderate, non-zero amount of noise. Analogous to the phenomenon of
stochastic resonance, the performance decreases for higher and lower noise
levels. The optimal amount of noise corresponds to the transition zone between
a quiescent state and a regime of stochastic dynamics. This provides a
potential explanation of the role of non-oscillatory population activity in a
simplified model of cortical micro-circuits. Comment: 27 pages, 7 figures, to appear in J. Physiology (Paris) Vol. 9
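The noise dependence described above rests on integrate-and-fire neurons driven by external background spikes. A single-neuron sketch of that drive (all parameter values are illustrative, and the paper's actual setting is a sparse network, not an isolated cell):

```python
import numpy as np

def lif_spike_count(rate_bg, T=1.0, dt=1e-4, seed=1):
    """Spike count of a single leaky integrate-and-fire neuron driven by
    Poisson background spikes (all parameter values are illustrative)."""
    rng = np.random.default_rng(seed)
    tau_m = 20e-3                # membrane time constant (s)
    v_th, v_reset = 1.0, 0.0     # threshold and reset (units of threshold)
    w = 0.05                     # voltage jump per background spike
    v, spikes = 0.0, 0
    for _ in range(int(T / dt)):
        n_in = rng.poisson(rate_bg * dt)   # background spikes in this step
        v += -v * dt / tau_m + w * n_in    # leak plus synaptic input
        if v >= v_th:
            v = v_reset
            spikes += 1
    return spikes

for nu in (200.0, 1000.0, 5000.0):
    print(nu, "Hz ->", lif_spike_count(nu), "spikes")
```

Sweeping the background rate moves the neuron from quiescence to sustained firing; in the full network, the same knob selects between quiescent, asynchronous-irregular and oscillatory regimes.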
Dynamic Adaptive Computation: Tuning network states to task requirements
Neural circuits are able to perform computations under very diverse
conditions and requirements. The required computations impose clear constraints
on their fine-tuning: a rapid and maximally informative response to stimuli in
general requires decorrelated baseline neural activity. Such network dynamics
is known as asynchronous-irregular. In contrast, spatio-temporal integration of
information requires maintenance and transfer of stimulus information over
extended time periods. This can be realized at criticality, a phase transition
where correlations, sensitivity and integration time diverge. Being able to
flexibly switch, or even combine, the above properties in a task-dependent
manner would present a clear functional advantage. We propose that cortex
operates in a "reverberating regime" because it is particularly favorable for
ready adaptation of computational properties to context and task. This
reverberating regime enables cortical networks to interpolate between the
asynchronous-irregular and the critical state by small changes in effective
synaptic strength or excitation-inhibition ratio. These changes directly adapt
computational properties, including sensitivity, amplification, integration
time and correlation length within the local network. We review recent
converging evidence that cortex in vivo operates in the reverberating regime,
and that various cortical areas have adapted their integration times to
processing requirements. In addition, we propose that neuromodulation enables a
fine-tuning of the network, so that local circuits can either decorrelate or
integrate, and quench or maintain their input depending on task. We argue that
this task-dependent tuning, which we call "dynamic adaptive computation",
represents a central organizing principle of cortical networks, and discuss
initial experimental evidence. Comment: 6 pages + references, 2 figures
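A common quantitative handle on the reverberating regime is the branching ratio m of activity propagation: the network integration (autocorrelation) time then follows as tau = -dt / ln(m) and diverges as m approaches the critical value 1. A minimal sketch (identifying m with an effective synaptic strength is a schematic simplification):

```python
import numpy as np

def integration_time(m, dt=1.0):
    """Autocorrelation (integration) time of a branching process with
    branching ratio m < 1: tau = -dt / ln(m)."""
    return -dt / np.log(m)

def simulate_activity(m, h=10.0, T=5000, seed=0):
    """Activity A_t where each active unit triggers Poisson(m) units in the
    next step, plus Poisson(h) external input (a schematic network model)."""
    rng = np.random.default_rng(seed)
    A = np.empty(T)
    a = h / (1 - m)                     # start at the stationary mean
    for t in range(T):
        a = rng.poisson(m * a) + rng.poisson(h)
        A[t] = a
    return A

for m in (0.5, 0.9, 0.98):
    print(m, integration_time(m), simulate_activity(m).mean())
```

Small changes in m thus interpolate between short integration times (asynchronous-irregular, m well below 1) and long ones (near-critical, m close to 1).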
Neural networks with dynamical synapses: from mixed-mode oscillations and spindles to chaos
Understanding short-term synaptic depression (STSD) and other forms of
synaptic plasticity is a topical problem in neuroscience. Here we study the
role of STSD in the formation of complex patterns of brain rhythms. We use a
cortical circuit model of neural networks composed of irregular spiking
excitatory and inhibitory neurons having type 1 and 2 excitability and
stochastic dynamics. In the model, neurons form a sparsely connected network
and their spontaneous activity is driven by random spikes representing synaptic
noise. Using simulations and analytical calculations, we found that if STSD
is absent, the neural network shows either asynchronous behavior or regular
network oscillations depending on the noise level. In networks with STSD,
changing parameters of synaptic plasticity and the noise level, we observed
transitions to complex patterns of collective activity: mixed-mode and spindle
oscillations, bursts of collective activity, and chaotic behaviour.
Interestingly, these patterns are stable in a certain range of the parameters
and separated by critical boundaries. Thus, the parameters of synaptic
plasticity can play the role of control parameters, or switches, between
different network states. However, changes in the parameters caused by disease may lead
to dramatic impairment of ongoing neural activity. We analyze the chaotic
neural activity using the 0-1 test for chaos (Gottwald, G. & Melbourne, I.,
2004) and show that it has a collective nature. Comment: 7 pages, Proceedings of 12th Granada Seminar, September 17-21, 201
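The 0-1 test cited above maps a scalar time series onto driven translation variables (p, q) and asks whether their mean-square displacement grows linearly in time; the resulting statistic K is close to 1 for chaotic dynamics and close to 0 for regular dynamics. A sketch of the test (with the oscillatory correction of the modified version), exercised here on the logistic map rather than on the paper's network data:

```python
import numpy as np

def zero_one_test(phi, n_c=20, seed=0):
    """Gottwald-Melbourne 0-1 test (modified version): K ~ 1 for chaotic
    series, K ~ 0 for regular ones."""
    rng = np.random.default_rng(seed)
    N = len(phi)
    j = np.arange(1, N + 1)
    n = np.arange(1, N // 10 + 1)
    Ks = []
    for c in rng.uniform(np.pi / 5, 4 * np.pi / 5, n_c):
        p = np.cumsum(phi * np.cos(j * c))   # driven translation variables
        q = np.cumsum(phi * np.sin(j * c))
        M = np.array([np.mean((p[k:] - p[:-k]) ** 2 + (q[k:] - q[:-k]) ** 2)
                      for k in n])
        # subtract the bounded oscillatory term (modified mean-square displacement)
        D = M - np.mean(phi) ** 2 * (1 - np.cos(n * c)) / (1 - np.cos(c))
        Ks.append(np.corrcoef(n, D)[0, 1])
    return float(np.median(Ks))

def logistic(r, n=3000, x0=0.3, burn=100):
    x = x0
    for _ in range(burn):                # discard the transient
        x = r * x * (1 - x)
    out = np.empty(n)
    for i in range(n):
        x = r * x * (1 - x)
        out[i] = x
    return out

K_chaotic = zero_one_test(logistic(4.0))   # fully chaotic logistic map
K_regular = zero_one_test(logistic(3.2))   # period-2 orbit
print(K_chaotic, K_regular)
```

Taking the median over many random frequencies c avoids spurious resonances; applied to population activity instead of the logistic map, the same statistic distinguishes collective chaos from regular network oscillations.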
Death and rebirth of neural activity in sparse inhibitory networks
In this paper, we clarify the mechanisms underlying a general phenomenon
present in pulse-coupled heterogeneous inhibitory networks: inhibition can
induce not only suppression of the neural activity, as expected, but it can
also promote neural reactivation. In particular, for globally coupled systems,
the number of firing neurons monotonically reduces upon increasing the strength
of inhibition (neurons' death). However, the random pruning of the connections
is able to reverse the action of inhibition, i.e. in a sparse network a
sufficiently strong synaptic strength can surprisingly promote, rather than
depress, the activity of the neurons (neurons' rebirth). Thus the number of
firing neurons exhibits a minimum at some intermediate synaptic strength. We
show that this minimum signals a transition from a regime dominated by the
neurons with higher firing activity to a phase where all neurons are
effectively sub-threshold and their irregular firing is driven by current
fluctuations. We explain the origin of the transition by deriving an analytic
mean-field formulation of the problem that provides the fraction of active
neurons as well as the first two moments of their firing statistics. The
introduction of a synaptic time scale does not modify the main aspects of the
reported phenomenon. However, for sufficiently slow synapses the transition
becomes dramatic: the system passes from a perfectly regular evolution to
irregular bursting dynamics. In this latter regime the model provides
predictions consistent with experimental findings for a specific class of
neurons, namely the medium spiny neurons in the striatum. Comment: 19 pages, 10 figures, submitted to NJ
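The model class in this abstract, pulse-coupled inhibitory LIF neurons with heterogeneous excitability and randomly pruned connections, can be sketched as follows. All parameter values and the measurement window are illustrative assumptions; the sketch sets up the measurement (fraction of active neurons versus inhibition strength g, globally coupled versus sparse) rather than reproducing the paper's quantitative results:

```python
import numpy as np

def active_fraction(g, sparse, N=200, K=20, T=4000, seed=2):
    """Fraction of neurons that fire during the second half of a run of a
    pulse-coupled inhibitory LIF network with heterogeneous drive.
    All parameter values here are illustrative, not taken from the paper."""
    rng = np.random.default_rng(seed)
    dt, tau = 0.1, 10.0
    I = rng.uniform(0.95, 1.3, N)      # some neurons supra-, some sub-threshold
    v = rng.uniform(0.0, 1.0, N)
    if sparse:
        A = (rng.random((N, N)) < K / N).astype(float)  # randomly pruned links
        k_in = K
    else:
        A = np.ones((N, N))            # globally coupled
        k_in = N
    np.fill_diagonal(A, 0.0)
    fired = np.zeros(N, dtype=bool)
    for t in range(T):
        v += dt * (I - v) / tau        # leaky integration towards the drive I
        sp = v >= 1.0                  # threshold crossings
        if t >= T // 2:                # count activity only after a transient
            fired |= sp
        v[sp] = 0.0                    # reset
        v -= (g / k_in) * (A @ sp.astype(float))  # inhibitory pulse coupling
    return fired.mean()

for g in (0.0, 5.0, 20.0):
    print(g, active_fraction(g, sparse=False), active_fraction(g, sparse=True))
```

In the globally coupled case inhibition can only silence neurons; the rebirth reported in the paper is a fluctuation effect that appears once the connections are randomly pruned.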
Dynamical principles in neuroscience
Dynamical modeling of neural systems and brain functions has a history of success over the last half century. This includes, for example, the explanation and prediction of some features of neural rhythmic behaviors. Many interesting dynamical models of learning and memory based on physiological experiments have been suggested over the last two decades. Dynamical models even of consciousness now exist. Usually these models and results are based on traditional approaches and paradigms of nonlinear dynamics including dynamical chaos. Neural systems are, however, an unusual subject for nonlinear dynamics for several reasons: (i) Even the simplest neural network, with only a few neurons and synaptic connections, has an enormous number of variables and control parameters. These make neural systems adaptive and flexible, and are critical to their biological function. (ii) In contrast to traditional physical systems described by well-known basic principles, first principles governing the dynamics of neural systems are unknown. (iii) Many different neural systems exhibit similar dynamics despite having different architectures and different levels of complexity. (iv) The network architecture and connection strengths are usually not known in detail and therefore the dynamical analysis must, in some sense, be probabilistic. (v) Since nervous systems are able to organize behavior based on sensory inputs, the dynamical modeling of these systems has to explain the transformation of temporal information into combinatorial or combinatorial-temporal codes, and vice versa, for memory and recognition. In this review these problems are discussed in the context of addressing the stimulating questions: What can neuroscience learn from nonlinear dynamics, and what can nonlinear dynamics learn from neuroscience? This work was supported by NSF Grant No. NSF/EIA-0130708, and Grant No. PHY 0414174; NIH Grant No. 1 R01 NS50945 and Grant No. NS40110; MEC BFI2003-07276, and Fundación BBVA