Neural networks with dynamical synapses: from mixed-mode oscillations and spindles to chaos
Understanding short-term synaptic depression (STSD) and other forms of
synaptic plasticity is a topical problem in neuroscience. Here we study the
role of STSD in the formation of complex patterns of brain rhythms. We use a
cortical circuit model of neural networks composed of irregularly spiking
excitatory and inhibitory neurons with type 1 and type 2 excitability and
stochastic dynamics. In the model, neurons form a sparsely connected network
and their spontaneous activity is driven by random spikes representing synaptic
noise. Using simulations and analytical calculations, we found that if STSD
is absent, the neural network shows either asynchronous behavior or regular
network oscillations depending on the noise level. In networks with STSD,
changing parameters of synaptic plasticity and the noise level, we observed
transitions to complex patterns of collective activity: mixed-mode and spindle
oscillations, bursts of collective activity, and chaotic behavior.
Interestingly, these patterns are stable in a certain range of the parameters
and separated by critical boundaries. Thus, the parameters of synaptic
plasticity can play the role of control parameters or switches between different
network states. However, changes of the parameters caused by a disease may lead
to dramatic impairment of ongoing neural activity. We analyze the chaotic
neural activity using the 0-1 test for chaos (Gottwald, G. & Melbourne, I.,
2004) and show that it has a collective nature.
Comment: 7 pages, Proceedings of 12th Granada Seminar, September 17-21, 201
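For readers unfamiliar with the 0-1 test for chaos cited above, a minimal sketch follows. The choice of observable (a generic scalar time series such as the population rate), the range of test frequencies, and the use of the modified mean-square displacement variant are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def zero_one_test(phi, n_c=50, rng=None):
    """0-1 test for chaos (Gottwald & Melbourne, 2004): returns K in [0, 1],
    with K ~ 0 for regular dynamics and K ~ 1 for chaos. `phi` is a scalar
    observable sampled at regular intervals, e.g. the population rate."""
    rng = np.random.default_rng() if rng is None else rng
    phi = np.asarray(phi, dtype=float)
    N = len(phi)
    j = np.arange(1, N + 1)
    n = np.arange(1, N // 10 + 1)      # displacement lags must satisfy n << N
    ks = []
    for c in rng.uniform(np.pi / 5, 4 * np.pi / 5, size=n_c):
        p = np.cumsum(phi * np.cos(j * c))     # translation variables
        q = np.cumsum(phi * np.sin(j * c))
        # mean-square displacement of the (p, q) trajectory at each lag
        M = np.array([np.mean((p[m:] - p[:-m]) ** 2 + (q[m:] - q[:-m]) ** 2)
                      for m in n])
        # subtract the oscillatory component (the 'modified' MSD variant)
        D = M - np.mean(phi) ** 2 * (1 - np.cos(n * c)) / (1 - np.cos(c))
        # bounded D(n) gives correlation ~ 0; linear growth gives ~ 1
        ks.append(np.corrcoef(n, D)[0, 1])
    return float(np.median(ks))
```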
Storage of phase-coded patterns via STDP in fully-connected and sparse networks: a study of the network capacity
We study the storage and retrieval of phase-coded patterns as stable
dynamical attractors in recurrent neural networks, for both an analog and an
integrate-and-fire spiking model. The synaptic strength is determined by a
learning rule based on spike-time-dependent plasticity, with an asymmetric time
window depending on the relative timing between pre- and post-synaptic
activity. We store multiple patterns and study the network capacity.
For the analog model, we find that the network capacity scales linearly with
the network size, and that both capacity and the oscillation frequency of the
retrieval state depend on the asymmetry of the learning time window. In
addition to fully-connected networks, we study sparse networks, where each
neuron is connected only to a small number z << N of other neurons. Connections
can be short range, between neighboring neurons placed on a regular lattice, or
long range, between randomly chosen pairs of neurons. We find that a small
fraction of long range connections is able to amplify the capacity of the
network. This implies that a small-world network topology is optimal, as a
compromise between the cost of long range connections and the capacity
increase.
The crucial result, the storage and retrieval of multiple phase-coded
patterns, is also observed in the spiking integrate-and-fire model. The
capacity of the fully-connected spiking network is investigated, together with
the relation between the oscillation frequency of the retrieval state and the
asymmetry of the learning window.
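As a sketch of how an asymmetric STDP window can imprint phase-coded patterns into a coupling matrix, consider the following rate-based reading of the rule. The window shape, all parameter values, and the phase-to-time mapping are illustrative assumptions, not the paper's exact kernel.

```python
import numpy as np

def stdp_kernel(dt, a_plus=1.0, a_minus=0.5, tau_plus=10.0, tau_minus=20.0):
    """Asymmetric STDP window A(dt), with dt = t_post - t_pre in ms:
    potentiation for pre-before-post (dt > 0), depression otherwise.
    Amplitudes and time constants are illustrative."""
    return np.where(dt >= 0,
                    a_plus * np.exp(-dt / tau_plus),
                    -a_minus * np.exp(dt / tau_minus))

def store_phase_patterns(phases, period=100.0):
    """Couplings J[i, j] learned from phase-coded patterns: pattern mu
    assigns phase phases[mu, i] (radians) to neuron i, each pair
    contributes the window evaluated at the time lag implied by the
    phase lag, summed over patterns and normalized by N."""
    P, N = phases.shape
    J = np.zeros((N, N))
    for mu in range(P):
        dphi = phases[mu][:, None] - phases[mu][None, :]   # phi_i - phi_j
        # map the phase lag to a time lag within one oscillation cycle
        dt = (dphi % (2 * np.pi)) / (2 * np.pi) * period
        dt = np.where(dt > period / 2, dt - period, dt)    # wrap to (-T/2, T/2]
        J += stdp_kernel(dt)
    np.fill_diagonal(J, 0.0)
    return J / N

# usage: store 3 random patterns in a network of 200 neurons
rng = np.random.default_rng(0)
J = store_phase_patterns(rng.uniform(0, 2 * np.pi, size=(3, 200)))
```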
Learning intrinsic excitability in medium spiny neurons
We present an unsupervised, local activation-dependent learning rule for
intrinsic plasticity (IP), which affects the composition of ion channel
conductances for single neurons in a use-dependent way. We use a
single-compartment conductance-based model for medium spiny striatal neurons in
order to show the effects of parametrization of individual ion channels on the
neuronal activation function. We show that parameter changes within the
physiological ranges are sufficient to create an ensemble of neurons with
significantly different activation functions. We emphasize that the effects of
intrinsic neuronal variability on spiking behavior require a distributed mode
of synaptic input and can be eliminated by strongly correlated input. We show
how variability and adaptivity in ion channel conductances can be utilized to
store patterns without an additional contribution by synaptic plasticity (SP).
The adaptation of the spike response may result in either "positive" or
"negative" pattern learning. However, read-out of stored information depends on
a distributed pattern of synaptic activity to let intrinsic variability
determine the spike response. We briefly discuss the implications of this
conditional memory for learning and addiction.
Comment: 20 pages, 8 figures
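The abstract does not spell out the update itself, so the following is a minimal sketch of an activation-dependent intrinsic-plasticity step under the assumption of a homeostatic target rate; the channel names, signs, and parameter values are hypothetical.

```python
import numpy as np

def ip_update(g, rate, target=5.0, eta=1e-3, sign=None, bounds=None):
    """One activation-dependent intrinsic-plasticity step (hypothetical form).

    g      : dict of maximal conductances, e.g. {"KIR": 0.1, "NaP": 0.05}
    rate   : the neuron's recent firing rate (Hz)
    sign   : +1 for channels whose increase makes the cell more excitable,
             -1 for channels that dampen it
    bounds : physiological range per channel; clipping keeps every
             parameter inside it, as the abstract stresses."""
    sign = sign if sign is not None else {k: 1.0 for k in g}
    bounds = bounds if bounds is not None else {k: (0.0, np.inf) for k in g}
    out = {}
    for k, gk in g.items():
        gk += eta * sign[k] * (target - rate)   # push the rate toward target
        lo, hi = bounds[k]
        out[k] = float(np.clip(gk, lo, hi))
    return out
```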
Network Plasticity as Bayesian Inference
General results from statistical learning theory suggest that not only brain
computations but also brain plasticity should be understood as probabilistic
inference; a model for this, however, has been missing. We propose that inherently stochastic
features of synaptic plasticity and spine motility enable cortical networks of
neurons to carry out probabilistic inference by sampling from a posterior
distribution of network configurations. This model provides a viable
alternative to existing models that propose convergence of parameters to
maximum likelihood values. It explains how priors on weight distributions and
connection probabilities can be merged optimally with learned experience, how
cortical networks can generalize learned information so well to novel
experiences, and how they can compensate continuously for unforeseen
disturbances of the network. The resulting new theory of network plasticity
explains from a functional perspective a number of experimental data on
stochastic aspects of synaptic plasticity that previously appeared to be quite
puzzling.
Comment: 33 pages, 5 figures; the supplement is available on the author's web
page http://www.igi.tugraz.at/kappe
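The proposal can be sketched as Langevin dynamics whose stationary distribution is the posterior over network parameters; the zero-mean Gaussian prior, step size, and temperature below are assumptions chosen for illustration, not the authors' exact update.

```python
import numpy as np

def synaptic_sampling_step(theta, grad_log_lik, sigma_prior=1.0,
                           beta=1e-3, T=1.0, rng=None):
    """One Langevin step of posterior sampling over network parameters:
    the parameters drift up the log posterior (Gaussian prior plus data
    likelihood) while the diffusion term keeps the network exploring, so
    the stationary distribution is the posterior over configurations
    rather than a maximum-likelihood point estimate."""
    rng = np.random.default_rng() if rng is None else rng
    theta = np.asarray(theta, dtype=float)
    grad_log_prior = -theta / sigma_prior ** 2      # d/dtheta log N(0, sigma^2)
    drift = beta * (grad_log_prior + grad_log_lik(theta))
    noise = np.sqrt(2.0 * beta * T) * rng.standard_normal(theta.shape)
    return theta + drift + noise
```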
Learning to Discriminate Through Long-Term Changes of Dynamical Synaptic Transmission
Short-term synaptic plasticity is modulated by long-term synaptic
changes. There is, however, no general agreement on the computational
role of this interaction. Here, we derive a learning rule for the release
probability and the maximal synaptic conductance in a circuit model
with combined recurrent and feedforward connections that allows learning
to discriminate among natural inputs. Short-term synaptic plasticity
thereby provides a nonlinear expansion of the input space of a linear
classifier, whereas the random recurrent network serves to decorrelate
the expanded input space. Computer simulations reveal that the twofold
increase in the number of input dimensions through short-term synaptic
plasticity improves the performance of a standard perceptron by up to 100%.
The distributions of release probabilities and maximal synaptic conductances
at the capacity limit strongly depend on the balance between excitation
and inhibition. The model also suggests a new computational
interpretation of spikes evoked by stimuli outside the classical receptive
field. These neuronal activities may reflect decorrelation of the expanded
stimulus space by intracortical synaptic connections.
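The nonlinear expansion described here can be sketched with steady-state Tsodyks-Markram synapses: each input rate is seen once through a depressing and once through a facilitating synapse, doubling the input dimension for a downstream linear classifier. The parameter values are illustrative, not the learned release probabilities of the paper.

```python
import numpy as np

def tm_steady_state(r, U=0.5, tau_rec=0.8, tau_fac=0.0):
    """Steady-state drive of a Tsodyks-Markram synapse to a Poisson input
    of rate r (Hz). With tau_fac = 0 the synapse is purely depressing;
    tau_fac > 0 adds facilitation."""
    u = U * (1 + tau_fac * r) / (1 + U * tau_fac * r)   # facilitated release prob.
    x = 1.0 / (1.0 + u * r * tau_rec)                   # depleted resources
    return u * x * r

def stp_expand(rates):
    """Twofold expansion of the input space: each input rate passes
    through a depressing and a facilitating synapse, so the perceptron
    downstream sees 2*D nonlinear features instead of D linear ones."""
    dep = tm_steady_state(rates, U=0.5, tau_rec=0.8, tau_fac=0.0)
    fac = tm_steady_state(rates, U=0.1, tau_rec=0.2, tau_fac=1.0)
    return np.concatenate([dep, fac], axis=-1)
```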
A novel plasticity rule can explain the development of sensorimotor intelligence
Grounding autonomous behavior in the nervous system is a fundamental
challenge for neuroscience. In particular, self-organized behavioral
development raises more questions than answers. Are there special functional
units for curiosity, motivation, and creativity? This paper argues that these
features can be grounded in synaptic plasticity itself, without requiring any
higher level constructs. We propose differential extrinsic plasticity (DEP) as
a new synaptic rule for self-learning systems and apply it to a number of
complex robotic systems as a test case. Without any specified purpose or goal,
the systems develop seemingly purposeful and adaptive behavior, displaying a
certain level of sensorimotor intelligence. These surprising results require no
system-specific modifications of the DEP rule but arise rather from the underlying
mechanism of spontaneous symmetry breaking due to the tight
brain-body-environment coupling. The new synaptic rule is biologically
plausible and would be an interesting target for neurobiological
investigation. We also argue that this neuronal mechanism may have been a
catalyst in natural evolution.
Comment: 18 pages, 5 figures, 7 videos
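A heavily simplified sketch of a DEP-style update follows: the controller weights track the correlation between current and time-lagged sensor derivatives. Taking the inverse model to be the identity is a simplifying assumption for illustration, not the paper's general setting.

```python
import numpy as np

def dep_update(W, x_dot_now, x_dot_lag, eta=0.01, kappa=0.1):
    """One DEP-style step: a velocity-based Hebbian update correlating
    the current sensor derivatives with a time-lagged copy (inverse
    model taken as the identity). The decay term merely keeps W bounded.

    W         : controller weight matrix, shape (n, n)
    x_dot_now : sensor-derivative vector at time t
    x_dot_lag : sensor-derivative vector at time t - theta"""
    dW = np.outer(x_dot_lag, x_dot_now)   # extrinsic, derivative-driven term
    return W + eta * (dW - kappa * W)
```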
Complexity without chaos: Plasticity within random recurrent networks generates robust timing and motor control
It is widely accepted that the complex dynamics characteristic of recurrent
neural circuits contributes in a fundamental manner to brain function. Progress
has been slow in understanding and exploiting the computational power of
recurrent dynamics for two main reasons: nonlinear recurrent networks often
exhibit chaotic behavior, and most known learning rules do not work robustly
in recurrent networks. Here we address both of these problems by
demonstrating how random recurrent networks (RRN) that initially exhibit
chaotic dynamics can be tuned through a supervised learning rule to generate
locally stable neural patterns of activity that are both complex and robust to
noise. The outcome is a novel neural network regime that exhibits both
transiently stable and chaotic trajectories. We further show that the recurrent
learning rule dramatically increases the ability of RRNs to generate complex
spatiotemporal motor patterns, and accounts for recent experimental data
showing a decrease in neural variability in response to stimulus onset.
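In FORCE-style training of this kind, the supervised rule is typically a recursive least-squares (RLS) update applied to the recurrent weights; a minimal per-unit sketch follows, with the target ("innate") trajectory and the initialization left to the caller. This form of the rule is an assumption about the method, not code from the paper.

```python
import numpy as np

def rls_step(w, P, r, target):
    """One recursive-least-squares update for a single unit: `w` are its
    incoming weights, `r` the vector of presynaptic firing rates, `P`
    the running estimate of the inverse rate-correlation matrix
    (initialize as np.eye(len(r)) / alpha), and `target` the unit's
    innate/target rate at this time step."""
    Pr = P @ r
    k = Pr / (1.0 + r @ Pr)        # gain vector
    P = P - np.outer(k, Pr)        # rank-1 update of the inverse correlation
    err = w @ r - target           # error against the target trajectory
    w = w - err * k                # correct the weights along the gain
    return w, P
```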
A geographically distributed bio-hybrid neural network with memristive plasticity
Throughout evolution the brain has mastered the art of processing real-world
inputs through networks of interlinked spiking neurons. Synapses have emerged
as key elements that, owing to their plasticity, merge neuron-to-neuron
signalling with memory storage and computation. Electronics has made important
steps in emulating neurons through neuromorphic circuits and synapses with
nanoscale memristors, yet novel applications that interlink them in
heterogeneous bio-inspired and bio-hybrid architectures are just beginning to
materialise. The use of memristive technologies in brain-inspired architectures
for computing, or for sensing the spiking activity of biological neurons, is a
recent example; however, interlinking brain and electronic neurons through
plasticity-driven synaptic elements has so far remained in the realm of the
imagination. Here, we demonstrate a bio-hybrid neural network (bNN) where
memristors work as "synaptors" between rat neural circuits and VLSI neurons.
The two fundamental synaptors, from artificial-to-biological (ABsyn) and from
biological-to-artificial (BAsyn), are interconnected over the Internet. The
bNN extends across Europe, collapsing spatial boundaries existing in natural
brain networks and laying the foundations of a new geographically distributed
and evolving architecture: the Internet of Neuro-electronics (IoN).
Comment: 16 pages, 10 figures
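As a purely hypothetical illustration of the ABsyn data path, the sketch below forwards time-stamped spikes from the artificial node over the Internet and lets a memristor conductance set the stimulation amplitude on the biological side. The wire format, address, and linear conductance-to-amplitude mapping are all invented for illustration.

```python
import json
import socket

def relay_spikes(timestamps_ms, host="bio-lab.example.org", port=9000):
    """Forward time-stamped spikes from the artificial node as one UDP
    datagram (wire format and address are invented for this sketch)."""
    payload = json.dumps({"spikes_ms": timestamps_ms}).encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))

def stimulation_amplitude(g, g_max=1e-3, a_max=1.0):
    """Memristive 'synaptic weight' on the receiving side: the device
    conductance g scales the delivered stimulus, so plasticity of g
    changes the effective coupling between the two halves of the bNN."""
    return a_max * min(g, g_max) / g_max
```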
Memory and information processing in neuromorphic systems
A striking difference between brain-inspired neuromorphic processors and
current von Neumann processor architectures is the way in which memory and
processing are organized. As Information and Communication Technologies continue
to address the need for increased computational power by increasing the number
of cores within a digital processor, neuromorphic engineers and scientists can
complement this need by building processor architectures where memory is
distributed with the processing. In this paper we present a survey of
brain-inspired processor architectures that support models of cortical networks
and deep neural networks. These architectures range from serial clocked
implementations of multi-neuron systems to massively parallel asynchronous ones
and from purely digital systems to mixed analog/digital systems which implement
more biological-like models of neurons and synapses together with a suite of
adaptation and learning mechanisms analogous to the ones found in biological
nervous systems. We describe the advantages of the different approaches being
pursued and present the challenges that need to be addressed for building
artificial neural processing systems that can display the richness of behaviors
seen in biological systems.
Comment: Submitted to Proceedings of IEEE, review of recently proposed
neuromorphic computing platforms and systems