185 research outputs found
Adaptive self-organization in a realistic neural network model
Information processing in complex systems is often found to be maximally
efficient close to critical states associated with phase transitions. It is
therefore conceivable that neural information processing, too, operates close to
criticality. This is further supported by the observation of power-law
distributions, which are a hallmark of phase transitions. An important open
question is how neural networks could remain close to a critical point while
undergoing a continual change in the course of development, adaptation,
learning, and more. An influential contribution was made by Bornholdt and
Rohlf, introducing a generic mechanism of robust self-organized criticality in
adaptive networks. Here, we address the question whether this mechanism is
relevant for real neural networks. We show in a realistic model that
spike-time-dependent synaptic plasticity can self-organize neural networks
robustly toward criticality. Our model reproduces several empirical
observations and makes testable predictions on the distribution of synaptic
strength, relating them to the critical state of the network. These results
suggest that the interplay between dynamics and topology may be essential for
neural information processing.
Comment: 6 pages, 4 figures
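The plasticity mechanism invoked here can be illustrated with a minimal sketch of a pairwise STDP rule; the exponential window, amplitudes, and time constants below are generic textbook choices, not the parameters of the model described in the abstract:

```python
import numpy as np

def stdp_dw(dt_ms, a_plus=0.010, a_minus=0.012, tau_ms=20.0):
    """Weight change for a pre/post spike pair separated by
    dt_ms = t_post - t_pre. Pre-before-post (dt_ms > 0) potentiates,
    post-before-pre (dt_ms < 0) depresses, both with an exponential window."""
    dt_ms = np.asarray(dt_ms, dtype=float)
    return np.where(dt_ms > 0,
                    a_plus * np.exp(-dt_ms / tau_ms),
                    -a_minus * np.exp(dt_ms / tau_ms))

print(stdp_dw(5.0))    # pre leads post: potentiation (positive)
print(stdp_dw(-5.0))   # post leads pre: depression (negative)
```

Making the depression amplitude slightly larger than the potentiation amplitude, as here, is a common choice that biases the rule toward overall weight depression and helps keep the dynamics bounded.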
Self-control in Sparsely Coded Networks
A complete self-control mechanism is proposed in the dynamics of neural
networks through the introduction of a time-dependent threshold, determined as a
function of both the noise and the pattern activity in the network. Especially
for sparsely coded models this mechanism is shown to considerably improve the
storage capacity, the basins of attraction and the mutual information content
of the network.
Comment: 4 pages, 6 Postscript figures
Mean-field theory of globally coupled integrate-and-fire neural oscillators with dynamic synapses
This is a pre-print. The definitive version: BRESSLOFF, P.C., 1999. Mean-field theory of globally coupled integrate-and-fire neural oscillators with dynamic synapses. Physical Review E, 60(2), pp. 2160-2170, Part B, is available at: http://pre.aps.org/.
We analyze the effects of synaptic depression or facilitation on the existence
and stability of the splay or asynchronous state in a population of all-to-all,
pulse-coupled neural oscillators. We use mean-field techniques to derive
conditions for the local stability of the splay state and determine how stability
depends on the degree of synaptic depression or facilitation. We also consider
the effects of noise. Extensions of the mean-field results to finite networks are
developed in terms of the nonlinear firing time map.
Analysis of Oscillator Neural Networks for Sparsely Coded Phase Patterns
We study a simple extended model of oscillator neural networks capable of
storing sparsely coded phase patterns, in which information is encoded both in
the mean firing rate and in the timing of spikes. Applying the methods of
statistical neurodynamics to our model, we theoretically investigate the
model's associative memory capability by evaluating its maximum storage
capacities and deriving its basins of attraction. It is shown that, as in the
Hopfield model, the storage capacity diverges as the activity level decreases.
We consider various practically and theoretically important cases. For example,
it is revealed that a dynamically adjusted threshold mechanism enhances the
retrieval ability of the associative memory. It is also found that, under
suitable conditions, the network can recall patterns even in the case that
patterns with different activity levels are stored at the same time. In
addition, we examine the robustness with respect to damage of the synaptic
connections. The validity of these theoretical results is confirmed by
reasonable agreement with numerical simulations.
Comment: 23 pages, 11 figures
Noise Induced Coherence in Neural Networks
We investigate numerically the dynamics of large networks of globally
pulse-coupled integrate and fire neurons in a noise-induced synchronized state.
The power spectrum of an individual element within the network is shown to
exhibit, in the thermodynamic limit (N -> infinity), a broadband peak and an
additional delta-function peak that is absent from the power spectrum of an
isolated element. The power spectrum of the mean output signal exhibits only the
delta-function peak. These results are explained analytically in an exactly
soluble oscillator model with global phase coupling.
Comment: 4 pages ReVTeX and 3 Postscript figures
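A minimal sketch of noise-induced firing in globally pulse-coupled integrate-and-fire neurons: the drive is subthreshold (threshold = 1), so any spiking is produced by the noise. The Euler scheme and all parameters are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def simulate_lif_network(n=200, steps=5000, dt=0.1, tau=10.0,
                         drive=0.9, noise=0.5, coupling=0.1, seed=0):
    """Euler simulation of n globally pulse-coupled leaky integrate-and-fire
    neurons with subthreshold drive (drive < threshold = 1), so that any
    firing is induced by the noise. Returns the population rate per step."""
    rng = np.random.default_rng(seed)
    v = rng.random(n)                      # membrane potentials in [0, 1)
    rates = np.empty(steps)
    for t in range(steps):
        fired = v >= 1.0
        rates[t] = fired.mean()
        v[fired] = 0.0                     # reset after a spike
        v += (dt / tau) * (drive - v)      # leaky integration toward the drive
        v += coupling * fired.mean()       # global (mean-field) pulse coupling
        v += noise * np.sqrt(dt) * rng.normal(size=n)
    return rates

print(simulate_lif_network().mean() > 0.0)           # noise-induced firing
print(simulate_lif_network(noise=0.0).max() == 0.0)  # silent without noise
```

With the noise switched off, every potential relaxes toward the subthreshold drive and the network stays silent, which makes the noise-induced character of the synchronized firing explicit.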
The storage capacity of Potts models for semantic memory retrieval
We introduce and analyze a minimal network model of semantic memory in the
human brain. The model is a global associative memory structured as a
collection of N local modules, each coding a feature, which can take S possible
values, with a global sparseness a (the average fraction of features describing
a concept). We show that, under optimal conditions, the number c of modules
connected on average to a module can range widely between very sparse
connectivity (c/N -> 0) and full connectivity (c = N), maintaining a global
network storage capacity (the maximum number p of stored and retrievable
concepts) that scales like c*S^2/a, with logarithmic corrections consistent
with the constraint that each synapse may store up to a fraction of a bit.
Comment: Accepted for publication in J-STAT, July 200
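The quoted scaling can be turned into a back-of-the-envelope estimate; the unit prefactor and the example numbers below are arbitrary, since the abstract gives only the scaling law (up to logarithmic corrections):

```python
def potts_capacity_estimate(c, s, a, prefactor=1.0):
    """Order-of-magnitude storage capacity p ~ prefactor * c * S^2 / a,
    ignoring the logarithmic corrections mentioned in the abstract."""
    return prefactor * c * s ** 2 / a

# Hypothetical numbers: c = 1000 connections per module, S = 7 feature
# values, global sparseness a = 0.25.
print(potts_capacity_estimate(c=1000, s=7, a=0.25))  # 196000.0
```

Note that the estimate is linear in the connectivity c, so doubling the number of connections per module doubles the number of retrievable concepts at fixed S and a.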
Stochastic learning in a neural network with adapting synapses
We consider a neural network with adapting synapses whose dynamics can be
analytically computed. The model consists of neurons, each connected to a
fixed number of input neurons chosen at random in the network. The synapses
are discrete multi-state variables which evolve in time according to
stochastic learning rules; a parallel stochastic dynamics is assumed for the
neurons. Since the network maintains the same dynamics whether it is engaged
in computation or in learning new memories, a very low probability of
synaptic transitions is assumed. In the limit of a large network with a
finite number of inputs per neuron, the correlations of neurons and synapses
can be neglected and the dynamics can be analytically calculated by flow
equations for the macroscopic parameters of the system.
Comment: 25 pages, LaTeX file
An associative network with spatially organized connectivity
We investigate the properties of an autoassociative network of
threshold-linear units whose synaptic connectivity is spatially structured and
asymmetric. Since the methods of equilibrium statistical mechanics cannot be
applied to such a network due to the lack of a Hamiltonian, we approach the
problem through a signal-to-noise analysis, which we adapt to spatially
organized networks. The conditions are analyzed for the appearance of stable,
spatially non-uniform profiles of activity with large overlaps with one of the
stored patterns. It is also shown, with simulations and analytic results, that
the storage capacity does not decrease much when the connectivity of the
network becomes short range. In addition, the method used here enables us to
calculate exactly the storage capacity of a randomly connected network with
arbitrary degree of dilution.
Comment: 27 pages, 6 figures; Accepted for publication in JSTAT
Neural network model of the primary visual cortex: From functional architecture to lateral connectivity and back
The role of intrinsic cortical dynamics is a debatable issue. A recent optical imaging study (Kenet et al., 2003) found that activity patterns similar to orientation maps (OMs) emerge in the primary visual cortex (V1) even in the absence of sensory input, suggesting an intrinsic mechanism of OM activation. To better understand these results and shed light on intrinsic V1 processing, we suggest a neural network model in which OMs are encoded by the intrinsic lateral connections. The proposed connectivity pattern depends on the preferred orientation and, unlike previous models, on the degree of orientation selectivity of the interconnected neurons. We prove that the network has a ring attractor composed of an approximated version of the OMs. Consequently, OMs emerge spontaneously when the network is presented with an unstructured noisy input. Simulations show that the model can be applied to experimental data and generate realistic OMs. We study a variation of the model with spatially restricted connections, and show that it gives rise to states composed of several OMs. We hypothesize that these states can represent local properties of the visual scene.
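The emergence of a tuned state from unstructured noisy input can be sketched with a standard threshold-linear ring network with cosine lateral connectivity; this is a common simplification, and it deliberately ignores the selectivity-dependent connections that distinguish the model proposed in the abstract:

```python
import numpy as np

def ring_network_response(n=100, steps=500, dt=0.1, j0=-1.0, j2=3.0, seed=0):
    """Threshold-linear rate network on a ring with cosine lateral
    connectivity (uniform inhibition j0, orientation-tuned excitation j2).
    Driven by unstructured noisy input, the activity settles into a bump."""
    rng = np.random.default_rng(seed)
    theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    w = (j0 + j2 * np.cos(theta[:, None] - theta[None, :])) / n
    r = 0.1 * rng.random(n)                      # small random initial rates
    noisy_input = rng.normal(0.5, 0.05, size=n)  # unstructured input
    for _ in range(steps):
        r = r + dt * (-r + np.maximum(w @ r + noisy_input, 0.0))
    return r

r = ring_network_response()
# A bump state: some units are strongly active while others are silent.
print(r.max() > 0.0 and r.min() < 0.05 * r.max())
```

The tuned excitation amplifies one Fourier mode of the noise until rectification stabilizes a localized activity profile, which is the ring-attractor mechanism the abstract refers to, here in its simplest form.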
- …