Event-driven simulations of a plastic, spiking neural network
We consider a fully-connected network of leaky integrate-and-fire neurons
with spike-timing-dependent plasticity. The plasticity is controlled by a
parameter representing the expected weight of a synapse between neurons that
are firing randomly with the same mean frequency. For low values of the
plasticity parameter, the activities of the system are dominated by noise,
while large values of the plasticity parameter lead to self-sustaining activity
in the network. We perform event-driven simulations on finite-size networks
with up to 128 neurons to find the stationary synaptic weight conformations for
different values of the plasticity parameter. In both the low and high activity
regimes, the synaptic weights are narrowly distributed around the plasticity
parameter value, consistent with the predictions of mean-field theory. However,
the distribution broadens in the transition region between the two regimes,
representing emergent network structures. Using a pseudophysical approach for
visualization, we show that the emergent structures are of "path" or "hub"
type, observed at different values of the plasticity parameter in the
transition region.
Comment: 9 pages, 6 figures
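The plasticity rule driving such models can be sketched as a minimal pair-based STDP update with exponential windows; the parameter values (a_plus, a_minus, tau) are illustrative assumptions, not the authors' exact event-driven implementation:

```python
import math

def stdp_weight(pre_spikes, post_spikes, w0=0.5, a_plus=0.05,
                a_minus=0.05, tau=20.0):
    """Apply an exponential pair-based STDP window to all spike pairs and
    return the resulting synaptic weight, clipped to [0, 1]."""
    w = w0
    for t_pre in pre_spikes:
        for t_post in post_spikes:
            dt = t_post - t_pre
            if dt > 0:                        # pre before post: potentiation
                w += a_plus * math.exp(-dt / tau)
            elif dt < 0:                      # post before pre: depression
                w -= a_minus * math.exp(dt / tau)
    return max(0.0, min(1.0, w))
```

Causal pre-before-post pairings push the weight above its initial value, while the reversed ordering pushes it below; this asymmetry is what lets the network weights organize around the plasticity parameter.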
Influence of synaptic depression on memory storage capacity
Synaptic efficacy between neurons is known to change dynamically on short time
scales. Neurophysiological experiments show that high-frequency presynaptic
inputs decrease synaptic efficacy between neurons. This phenomenon, called
synaptic depression, is a form of short-term synaptic plasticity. Many
researchers have investigated how synaptic depression affects the memory
storage capacity. However, noise has not been taken into consideration in
their analyses. By introducing a "temperature", which controls the level of
the noise, into the update rule of the neurons, we investigate the effects of
synaptic depression on the memory storage capacity in the presence of noise.
We analytically compute the storage capacity using a statistical-mechanics
technique called Self-Consistent Signal-to-Noise Analysis (SCSNA). We find
that synaptic depression decreases the storage capacity at finite temperature,
in contrast to the low-temperature limit, where the storage capacity does not
change.
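The frequency dependence of synaptic depression described above can be sketched with a standard phenomenological resource model; the parameter names `use` and `tau_rec` are illustrative assumptions, not the paper's notation, and the noise ("temperature") part of the analysis is omitted:

```python
import math

def depressed_efficacy(spike_times, use=0.4, tau_rec=100.0):
    """Return the synaptic efficacy (use * available resources) at each
    presynaptic spike; each spike depletes resources, which then recover
    exponentially with time constant tau_rec."""
    x, t_prev = 1.0, None
    effs = []
    for t in spike_times:
        if t_prev is not None:
            # resources recover toward 1 between spikes
            x = 1.0 - (1.0 - x) * math.exp(-(t - t_prev) / tau_rec)
        effs.append(use * x)
        x -= use * x   # this spike uses a fraction of the resources
        t_prev = t
    return effs
```

Running this on a fast versus a slow spike train shows the depression effect in the abstract: high-frequency input leaves less time for recovery, so the steady-state efficacy is lower.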
Adaptive self-organization in a realistic neural network model
Information processing in complex systems is often found to be maximally
efficient close to critical states associated with phase transitions. It is
therefore conceivable that also neural information processing operates close to
criticality. This is further supported by the observation of power-law
distributions, which are a hallmark of phase transitions. An important open
question is how neural networks could remain close to a critical point while
undergoing a continual change in the course of development, adaptation,
learning, and more. An influential contribution was made by Bornholdt and
Rohlf, introducing a generic mechanism of robust self-organized criticality in
adaptive networks. Here, we address the question of whether this mechanism is
relevant for real neural networks. We show in a realistic model that
spike-time-dependent synaptic plasticity can self-organize neural networks
robustly toward criticality. Our model reproduces several empirical
observations and makes testable predictions on the distribution of synaptic
strength, relating them to the critical state of the network. These results
suggest that the interplay between dynamics and topology may be essential for
neural information processing.
Comment: 6 pages, 4 figures
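The self-organization mechanism can be caricatured with a toy rule in the spirit of Bornholdt and Rohlf: nodes whose activity is frozen gain a link, nodes whose activity changes lose one. This is a deliberately simplified single-node sketch with assumed parameters, not the realistic spiking model of the paper:

```python
import random

def self_organize(steps=20000, k_c=4.0, k_max=12, seed=1):
    """Toy self-organization of a node's degree k: the chance that the node's
    state changes grows with k; changing nodes lose a link, frozen nodes gain
    one. Returns the mean degree after the transient."""
    rng = random.Random(seed)
    k = 0
    history = []
    for _ in range(steps):
        changed = rng.random() < min(1.0, k / k_c)
        k = max(0, k - 1) if changed else min(k_max, k + 1)
        history.append(k)
    return sum(history[-5000:]) / 5000.0
```

The degree settles where losing and gaining links balance, analogous to the network self-tuning its connectivity toward the critical point rather than having it imposed from outside.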
Continuous Attractors with Morphed/Correlated Maps
Continuous attractor networks are used to model the storage and representation of analog quantities, such as the position of a visual stimulus. The storage of multiple continuous attractors in the same network has previously been studied in the context of self-position coding. Several uncorrelated maps of environments are stored in the synaptic connections, and a position in a given environment is represented by a localized pattern of neural activity in the corresponding map, driven by a spatially tuned input. Here we analyze networks storing a pair of correlated maps, or a morph sequence between two uncorrelated maps. We find a novel state in which the network activity is simultaneously localized in both maps. In this state, a fixed cue presented to the network does not uniquely determine the location of the bump, i.e. the response is unreliable, with neurons not always responding when their preferred input is present. When the tuned input varies smoothly in time, the neuronal responses become reliable and selective for the environment: the subset of neurons responsive to a moving input in one map changes almost completely in the other map. This form of remapping is a non-trivial transformation between the tuned input to the network and the resulting tuning curves of the neurons. The new state of the network could be related to the formation of direction selectivity in one-dimensional environments and to hippocampal remapping. The applicability of the model is not confined to self-position representations; we show an instance of the network solving a simple delayed discrimination task.
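A single-map version of such a network can be sketched as a ring attractor with cosine recurrent weights and a spatially tuned input; the rate dynamics and the parameters j0, j1 are illustrative assumptions, and the correlated/morphed-map structure of the paper is not included:

```python
import math

def bump_response(n=60, cue=1.0, steps=200, j0=-0.5, j1=1.5, dt=0.1):
    """Rate dynamics on a ring: each unit follows the rectified recurrent
    field plus a tuned input peaked at `cue`; returns the units' preferred
    angles and their final rates."""
    theta = [2.0 * math.pi * i / n for i in range(n)]
    w = [[(j0 + j1 * math.cos(theta[i] - theta[j])) / n for j in range(n)]
         for i in range(n)]
    r = [0.1] * n
    for _ in range(steps):
        h = [sum(w[i][j] * r[j] for j in range(n)) + 0.5 * math.cos(theta[i] - cue)
             for i in range(n)]
        r = [ri + dt * (-ri + max(0.0, hi)) for ri, hi in zip(r, h)]
    return theta, r
```

The activity settles into a localized bump whose peak sits at the preferred angle nearest the cue, which is the basic representation the abstract builds on.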
Self-control in Sparsely Coded Networks
A complete self-control mechanism is proposed in the dynamics of neural
networks through the introduction of a time-dependent threshold, determined as a
function of both the noise and the pattern activity in the network. Especially
for sparsely coded models this mechanism is shown to considerably improve the
storage capacity, the basins of attraction and the mutual information content
of the network.
Comment: 4 pages, 6 PostScript figures
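The self-control idea, a threshold adjusted online so that the network activity tracks the coding level, can be sketched as a simple feedback loop; here the local fields are random stand-ins for the actual synaptic inputs, and the proportional update rule is an assumption for illustration, not the paper's exact expression:

```python
import random

def run_self_control(a=0.1, n=500, steps=60, rate=1.0, seed=0):
    """Feedback loop: raise the threshold when too many units are active,
    lower it when too few, so the activity tracks the coding level a."""
    rng = random.Random(seed)
    theta = 0.0
    activity = 0.0
    for _ in range(steps):
        h = [rng.gauss(0.0, 1.0) for _ in range(n)]   # stand-in local fields
        s = [1 if hi > theta else 0 for hi in h]
        activity = sum(s) / n
        theta += rate * (activity - a)                # self-control update
    return activity, theta
```

For Gaussian fields the threshold converges near the quantile that leaves a fraction a of units active, without any externally tuned value.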
Pulse-coupled relaxation oscillators: from biological synchronization to Self-Organized Criticality
It is shown that globally coupled oscillators with pulse interaction can
synchronize under broader conditions than widely believed from a theorem of
Mirollo and Strogatz. This behavior is stable against frozen disorder. Besides
its relevance to biology, it is argued that synchronization in relaxation-
oscillator models is related to Self-Organized Criticality in stick-slip-like
models.
Comment: 4 pages, RevTeX, 1 uuencoded PostScript figure in separate file;
accepted for publication in Phys. Rev. Lett.
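The absorption mechanism behind pulse-coupled synchronization can be sketched with a concave-down state function x = sqrt(phi), as in integrate-and-fire oscillators; as a simplification of the per-oscillator pulses in the original model, each cascade round here delivers a single lumped increment eps from the units that fired in the previous round:

```python
import random

def simulate_sync(n=20, eps=0.2, events=400, seed=3):
    """Count distinct phase groups left after `events` firing events;
    oscillators absorbed into the same group keep identical phases forever."""
    rng = random.Random(seed)
    phi = [rng.random() for _ in range(n)]
    for _ in range(events):
        shift = 1.0 - max(phi)              # advance time until the leader fires
        phi = [p + shift for p in phi]
        fired = [p >= 1.0 - 1e-9 for p in phi]
        changed = True
        while changed:                      # cascade: pulses trigger more firings
            changed = False
            for i in range(n):
                if not fired[i]:
                    x = min(1.0, phi[i] ** 0.5 + eps)  # concave state fn + pulse
                    phi[i] = x * x
                    if phi[i] >= 1.0 - 1e-9:
                        fired[i] = True
                        changed = True
        phi = [0.0 if f else p for f, p in zip(fired, phi)]  # fired units reset
    return len({round(p, 9) for p in phi})
```

Because units that fire together are reset together, groups can only merge, and the concave state function keeps pulling trailing units into the leading group, which is the route to global synchrony.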
A unifying principle underlying the extracellular field potential spectral responses in the human cortex
Electrophysiological mass potentials show complex spectral changes upon neuronal activation. However, it is unknown to what extent these complex band-limited changes are interrelated or, alternatively, reflect separate neuronal processes. To address this question, intracranial electrocorticogram (ECoG) responses were recorded in patients engaged in visuomotor tasks. We found that in the 10- to 100-Hz frequency range there was a significant reduction in the exponent χ of the 1/f^χ component of the spectrum associated with neuronal activation. In a minority of electrodes showing particularly high activations, the exponent reduction was associated with specific band-limited power modulations: the emergence of a high-gamma (80-100 Hz) peak and a decrease in the alpha (9-12 Hz) peak. Importantly, the peaks' heights were correlated with the 1/f^χ exponent on activation. Control simulations ruled out the possibility that the change in the 1/f^χ exponent was a consequence of the analysis procedure. These results reveal a new global, cross-frequency (10-100 Hz) neuronal process reflected in a significant reduction of the power-spectrum slope of the ECoG signal.
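Estimating the exponent χ of a 1/f^χ spectrum is commonly done by a linear fit in log-log coordinates; the sketch below shows that generic approach, which may differ from the paper's exact fitting procedure:

```python
import math

def fit_exponent(freqs, power):
    """Least-squares slope of log(power) versus log(frequency);
    returns chi such that power ~ 1/f**chi."""
    xs = [math.log(f) for f in freqs]
    ys = [math.log(p) for p in power]
    n = float(len(xs))
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope   # power-law spectra are straight lines in log-log space
```

A flatter log-log slope on activation corresponds to a smaller χ, which is the cross-frequency signature the abstract describes.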
Analysis of Oscillator Neural Networks for Sparsely Coded Phase Patterns
We study a simple extended model of oscillator neural networks capable of
storing sparsely coded phase patterns, in which information is encoded both in
the mean firing rate and in the timing of spikes. Applying the methods of
statistical neurodynamics to our model, we theoretically investigate the
model's associative memory capability by evaluating its maximum storage
capacities and deriving its basins of attraction. It is shown that, as in the
Hopfield model, the storage capacity diverges as the activity level decreases.
We consider various practically and theoretically important cases. For example,
it is revealed that a dynamically adjusted threshold mechanism enhances the
retrieval ability of the associative memory. It is also found that, under
suitable conditions, the network can recall patterns even when patterns with
different activity levels are stored at the same time. In addition, we examine
the robustness with respect to damage of the synaptic connections. The validity
of these theoretical results is confirmed by reasonable agreement with
numerical simulations.
Comment: 23 pages, 11 figures
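The basic retrieval dynamics of an oscillator associative memory can be sketched for a single, non-sparse stored phase pattern; the Hebbian phase couplings below are the standard construction, while the paper's sparse-coding and dynamic-threshold extensions are omitted:

```python
import math
import random

def retrieve(n=50, steps=200, dt=0.1, seed=7):
    """Relax noisy phases toward a stored phase pattern xi via Hebbian
    phase couplings; return the overlap with the pattern (1 = perfect
    retrieval up to a global phase shift)."""
    rng = random.Random(seed)
    xi = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n)]   # stored pattern
    phi = [x + rng.gauss(0.0, 0.5) for x in xi]                # noisy initial state
    for _ in range(steps):
        dphi = [sum(math.sin(phi[j] - phi[i] - (xi[j] - xi[i])) for j in range(n)) / n
                for i in range(n)]
        phi = [p + dt * d for p, d in zip(phi, dphi)]
    re = sum(math.cos(p - x) for p, x in zip(phi, xi)) / n
    im = sum(math.sin(p - x) for p, x in zip(phi, xi)) / n
    return math.hypot(re, im)
```

In the pattern's rotating frame this is a Kuramoto system of identical oscillators, so a noisy initial state relaxes toward the stored phase relations and the overlap approaches one.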