Memory Capacity of a novel optical neural net architecture
A new associative memory neural network which can be constructed using optical matched filters is described. It has three layers, the centre one being iterative with its weights set prior to training. The other two layers are feedforward nets whose weights are set during training. The best choice of central-layer weights, or in optical terms, of pairs of images associated in a hologram, is considered. The stored images or codes are carefully selected from an orthogonal set using a novel algorithm. This enables the net to have a high memory capacity, equal to half the number of neurons, with a low probability of error. 17-18 October 1989
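As a rough illustration of the idea of storing mutually orthogonal codes in an outer-product (matched-filter-like) kernel, the sketch below builds such a code set from a Hadamard matrix and checks recall from a noisy cue. This is not the paper's algorithm; the Hadamard-based selection and all parameters are assumptions for illustration only.

```python
# Illustrative sketch (not the paper's algorithm): mutually orthogonal bipolar
# codes taken from a Hadamard matrix, stored via an outer-product kernel.
import numpy as np
from scipy.linalg import hadamard

N = 64                      # number of neurons (assumed, must be a power of 2)
P = N // 2                  # store N/2 codes, the capacity quoted above

H = hadamard(N)             # rows are mutually orthogonal +/-1 codes
codes = H[1:P + 1]          # skip the all-ones row, keep P orthogonal codes

W = codes.T @ codes / N     # outer-product kernel; orthogonality removes crosstalk
np.fill_diagonal(W, 0.0)    # drop self-couplings to improve error correction

# Recall from a corrupted cue: flip ~10% of the bits and take one update step.
rng = np.random.default_rng(0)
cue = codes[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
cue[flip] *= -1
recalled = np.sign(W @ cue)
print("overlap with stored code:", recalled @ codes[0] / N)
```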
Dreaming neural networks: forgetting spurious memories and reinforcing pure ones
The standard Hopfield model for associative neural networks accounts for
biological Hebbian learning and acts as the harmonic oscillator for pattern
recognition; however, its maximal storage capacity is α ∼ 0.14, far
from the theoretical bound for symmetric networks, i.e. α = 1. Inspired
by sleeping and dreaming mechanisms in mammal brains, we propose an extension
of this model displaying the standard on-line (awake) learning mechanism (that
allows the storage of external information in terms of patterns) and an
off-line (sleep) unlearning-and-consolidating mechanism (that allows
spurious-pattern removal and pure-pattern reinforcement): the resulting daily
prescription is able to saturate the theoretical bound α = 1, remaining
also extremely robust against thermal noise. Both neural and synaptic features
are analyzed both analytically and numerically. In particular, beyond obtaining
a phase diagram for neural dynamics, we focus on synaptic plasticity and we
give explicit prescriptions on the temporal evolution of the synaptic matrix.
We analytically prove that our algorithm makes the Hebbian kernel converge with
high probability to the projection matrix built over the pure stored patterns.
Furthermore, we obtain a sharp and explicit estimate for the "sleep rate" in
order to ensure such a convergence. Finally, we run extensive numerical
simulations (mainly Monte Carlo sampling) to check the approximations
underlying the analytical investigations (e.g., we developed the whole theory
at the so called replica-symmetric level, as standard in the
Amit-Gutfreund-Sompolinsky reference framework) and possible finite-size
effects, finding overall full agreement with the theory.Comment: 31 pages, 12 figure
On the Storage Capacity of the Hopfield Model
We give a review of the rigorous results concerning the storage capacity of the Hopfield model. We distinguish between two different concepts of storage, both of them guided by the idea that the retrieval dynamics is a Monte Carlo dynamics (possibly at zero temperature). We recall the results of McEliece et al. [MPRV87] as well as those by Newman [N88] for the storage capacity of the Hopfield model with unbiased i.i.d. patterns, and summarize some recent developments concerning the Hopfield model with semantically correlated or biased patterns.
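The zero-temperature retrieval dynamics alluded to above can be probed with a short Monte Carlo sketch like the one below. Network size, the grid of load values α, and the sweep count are arbitrary illustrative choices, not taken from the review.

```python
# Zero-temperature retrieval check for the Hopfield model with Hebbian couplings:
# store P = alpha*N unbiased i.i.d. patterns, start on a stored pattern, run
# asynchronous sign updates, and report the final overlap with that pattern.
import numpy as np

def final_overlap(N, alpha, sweeps=20, seed=0):
    rng = np.random.default_rng(seed)
    P = max(1, int(alpha * N))
    xi = rng.choice([-1, 1], size=(P, N))
    J = xi.T @ xi / N
    np.fill_diagonal(J, 0.0)                 # no self-coupling
    s = xi[0].copy()                         # start exactly on pattern 0
    for _ in range(sweeps):
        for i in rng.permutation(N):         # asynchronous zero-temperature update
            s[i] = 1 if J[i] @ s >= 0 else -1
    return s @ xi[0] / N

for alpha in (0.05, 0.10, 0.14, 0.20):
    print(f"alpha = {alpha:.2f}  overlap = {final_overlap(500, alpha):.3f}")
```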
Hierarchical neural networks perform both serial and parallel processing
In this work we study a Hebbian neural network, where neurons are arranged
according to a hierarchical architecture such that their couplings scale with
their reciprocal distance. As a full statistical mechanics solution is not yet
available, after a streamlined introduction to the state of the art via that
route, the problem is consistently approached through the signal-to-noise
technique and extensive numerical simulations. Focusing on the low-storage
regime, where the number of stored patterns grows at most logarithmically with
the system size, we prove that these non-mean-field Hopfield-like networks
display a richer phase diagram than their classical counterparts. In
particular, these networks are able to perform serial processing (i.e. retrieve
one pattern at a time through a complete rearrangement of the whole ensemble of
neurons) as well as parallel processing (i.e. retrieve several patterns
simultaneously, delegating the management of different patterns to the diverse
communities that build up the network). The tuning between the two regimes is given by
the rate of the coupling decay and by the level of noise affecting the system.
The price to pay for these remarkable capabilities is a network capacity
smaller than that of the mean-field counterpart, thus yielding a new budget principle:
the wider the multitasking capabilities, the lower the network load and
vice versa. This may have important implications for our understanding of
biological complexity.
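A toy sketch of the parallel-processing regime is given below. It assumes only two communities and a single cross-community damping factor w, which is a deliberate simplification of the hierarchical, distance-decaying couplings described above; all parameters are illustrative.

```python
# Toy sketch of parallel retrieval: Hebbian couplings damped across communities,
# each community cued with a different pattern, synchronous noiseless updates.
import numpy as np

rng = np.random.default_rng(2)
N = 200                                    # neurons, split into two communities
half = N // 2
P = 2                                      # low-storage regime: two patterns
xi = rng.choice([-1, 1], size=(P, N))

# Full-strength couplings inside a community, damped by w across communities
# (w -> 1 recovers the mean-field network).
w = 0.1
mask = np.ones((N, N))
mask[:half, half:] = w
mask[half:, :half] = w
J = mask * (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

# Cue each community with a different pattern, then iterate.
s = np.concatenate([xi[0, :half], xi[1, half:]])
for _ in range(50):
    s = np.sign(J @ s)

m0 = s[:half] @ xi[0, :half] / half        # overlap of community 1 with pattern 0
m1 = s[half:] @ xi[1, half:] / half        # overlap of community 2 with pattern 1
print(f"community overlaps: {m0:.2f} (pattern 0), {m1:.2f} (pattern 1)")
```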