Replica Symmetry Breaking in Attractor Neural Network Models
The phenomenon of replica symmetry breaking is investigated for the retrieval
phases of Hopfield-type network models. The basic calculation is done for the
generalized version of the standard model introduced by Horner [1] and by
Perez-Vicente and Amit [2], which can exhibit low mean levels of neural
activity. For unbiased mean activity the standard Hopfield model is recovered. In
this case, surprisingly enough, we cannot confirm the well known one step
replica symmetry breaking (1RSB) result for the storage capacity which was
presented by Crisanti, Amit and Gutfreund [3] ($\alpha_c^{\mathrm{1RSB}} \simeq 0.144$).
Rather, we find that 1RSB and 2RSB Ans\"atze yield only slightly increased
capacities compared to the replica-symmetric value
($\alpha_c^{\mathrm{1RSB}} \simeq 0.138\,186$ and
$\alpha_c^{\mathrm{2RSB}} \simeq 0.138\,187$ versus
$\alpha_c^{\mathrm{RS}} \simeq 0.137\,905$), also significantly smaller than the
value $\alpha_c^{\mathrm{sim}} = 0.145 \pm 0.009$ reported from simulation studies. These values still
lie within the recently discovered reentrant phase [4]. We conjecture that in
the infinite Parisi-scheme the reentrant behaviour disappears as is the case in
the SK spin-glass model (Parisi-Toulouse hypothesis). The same qualitative
results are obtained in the low-activity range.
Comment: LaTeX file, 20 pages, 8 figures available from the authors upon request, HD-TVP-94-
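A minimal sketch of the Hopfield setup these capacity values refer to: random binary patterns stored via the Hebb rule and retrieved by zero-temperature dynamics; at a load $\alpha = P/N$ well below $\alpha_c \simeq 0.138$, noisy cues are cleaned up (all parameter choices here are illustrative, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 10                 # neurons and stored patterns; load alpha = P/N = 0.05
xi = rng.choice([-1, 1], size=(P, N))   # random binary patterns

# Hebbian couplings J_ij = (1/N) sum_mu xi_i^mu xi_j^mu, no self-interaction
J = (xi.T @ xi).astype(float) / N
np.fill_diagonal(J, 0.0)

def retrieve(s, sweeps=20):
    """Zero-temperature asynchronous dynamics: s_i <- sign(sum_j J_ij s_j)."""
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if J[i] @ s >= 0 else -1
    return s

# corrupt pattern 0 by flipping 10% of its spins, then let the dynamics clean it up
probe = xi[0].copy()
probe[rng.choice(N, size=N // 10, replace=False)] *= -1
recalled = retrieve(probe)
overlap = recalled @ xi[0] / N   # overlap m = 1 means perfect retrieval
```

At this low load retrieval is essentially perfect; the capacity values discussed in the abstract concern the much harder regime $\alpha$ near 0.138.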
Dreaming neural networks: forgetting spurious memories and reinforcing pure ones
The standard Hopfield model for associative neural networks accounts for
biological Hebbian learning and acts as the harmonic oscillator for pattern
recognition; however, its maximal storage capacity is $\alpha \simeq 0.14$, far
from the theoretical bound for symmetric networks, i.e. $\alpha = 1$. Inspired
by sleeping and dreaming mechanisms in mammal brains, we propose an extension
of this model displaying the standard on-line (awake) learning mechanism (that
allows the storage of external information in terms of patterns) and an
off-line (sleep) unlearning-and-consolidating mechanism (that allows
spurious-pattern removal and pure-pattern reinforcement): the resulting daily
prescription is able to saturate the theoretical bound $\alpha = 1$, remaining
also extremely robust against thermal noise. Both neural and synaptic features
are analyzed both analytically and numerically. In particular, beyond obtaining
a phase diagram for neural dynamics, we focus on synaptic plasticity and we
give explicit prescriptions on the temporal evolution of the synaptic matrix.
We analytically prove that our algorithm makes the Hebbian kernel converge with
high probability to the projection matrix built over the pure stored patterns.
Furthermore, we obtain a sharp and explicit estimate for the "sleep rate" in
order to ensure such a convergence. Finally, we run extensive numerical
simulations (mainly Monte Carlo sampling) to check the approximations
underlying the analytical investigations (e.g., we developed the whole theory
at the so called replica-symmetric level, as standard in the
Amit-Gutfreund-Sompolinsky reference framework) and possible finite-size
effects, finding overall full agreement with the theory.
Comment: 31 pages, 12 figures
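The convergence of the Hebbian kernel to the projection matrix over the pure patterns can be illustrated numerically. The interpolation below, $J(t) = \frac{1}{N}\,\xi^{T}(1+t)(\mathbb{1}+tC)^{-1}\xi$ with $C$ the pattern-correlation matrix and $t$ a "sleep time", is one common dreaming-style parametrization consistent with the abstract's description; the authors' exact update rule may differ:

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 100, 20
xi = rng.choice([-1.0, 1.0], size=(P, N))   # pure stored patterns

C = xi @ xi.T / N          # P x P pattern-correlation matrix
I = np.eye(P)

def dreaming_kernel(t):
    """Synaptic matrix after 'sleep time' t (hypothetical parametrization:
    J(t) = (1/N) xi^T (1+t)(I + t C)^{-1} xi; t = 0 gives the Hebbian kernel)."""
    return xi.T @ ((1.0 + t) * np.linalg.inv(I + t * C)) @ xi / N

J_hebb = dreaming_kernel(0.0)               # plain Hebbian kernel
J_proj = xi.T @ np.linalg.inv(C) @ xi / N   # projection matrix over the patterns
J_long = dreaming_kernel(1e6)               # very long sleep

# a long sleep drives the Hebbian kernel toward the projection matrix
gap = np.max(np.abs(J_long - J_proj))
```

As $t \to \infty$ the factor $(1+t)(\mathbb{1}+tC)^{-1}$ tends to $C^{-1}$, so the kernel interpolates between pure Hebbian learning and the projection (pseudo-inverse) rule.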
Neural Networks retrieving Boolean patterns in a sea of Gaussian ones
Restricted Boltzmann Machines are key tools in Machine Learning and are
described by the energy function of bipartite spin-glasses. From a statistical
mechanical perspective, they share the same Gibbs measure of Hopfield networks
for associative memory. In this equivalence, weights in the former play as
patterns in the latter. As Boltzmann machines usually require real weights to
be trained with gradient-descent-like methods, while Hopfield networks
typically store binary patterns in order to retrieve them, the investigation of a
mixed Hebbian network, equipped with both real (e.g., Gaussian) and discrete
(e.g., Boolean) patterns naturally arises. We prove that, in the challenging
regime of a high storage of real patterns, where retrieval is forbidden, an
extra load of Boolean patterns can still be retrieved, as long as the ratio
between the overall load and the network size does not exceed a critical
threshold, which turns out to be the same as in the standard
Amit-Gutfreund-Sompolinsky theory. Assuming replica symmetry, we study the case
of a low load of Boolean patterns combining the stochastic stability and
Hamilton-Jacobi interpolating techniques. The result can be extended to the
high load by a non-rigorous but standard replica computation argument.
Comment: 16 pages, 1 figure
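The weights-as-patterns equivalence invoked here can be made explicit by integrating out a Gaussian hidden layer. In schematic notation (ours, not necessarily the paper's), for visible spins $\sigma_i$, hidden units $z_\mu$ with standard Gaussian prior, and RBM weights $\xi_i^\mu$:

```latex
\int \prod_{\mu=1}^{P} \frac{dz_\mu}{\sqrt{2\pi}}\,
  \exp\!\Big( -\frac{z_\mu^2}{2}
              + \frac{\beta}{\sqrt{N}} \sum_{i=1}^{N} \xi_i^\mu \sigma_i z_\mu \Big)
= \exp\!\Big( \frac{\beta^2}{2N} \sum_{\mu=1}^{P}
              \Big( \sum_{i=1}^{N} \xi_i^\mu \sigma_i \Big)^{2} \Big)
```

which, up to an additive constant, is the Gibbs weight of a Hopfield network with Hebbian couplings $J_{ij} = \frac{1}{N}\sum_\mu \xi_i^\mu \xi_j^\mu$: the RBM weights $\xi_i^\mu$ play the role of the stored patterns.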
Neural Distributed Autoassociative Memories: A Survey
Introduction. Neural network models of autoassociative, distributed memory
allow storage and retrieval of many items (vectors) where the number of stored
items can exceed the vector dimension (the number of neurons in the network).
This opens the possibility of a sublinear time search (in the number of stored
items) for approximate nearest neighbors among vectors of high dimension. The
purpose of this paper is to review models of autoassociative, distributed
memory that can be naturally implemented by neural networks (mainly with local
learning rules and iterative dynamics based on information locally available to
neurons). Scope. The survey is focused mainly on the networks of Hopfield,
Willshaw and Potts, that have connections between pairs of neurons and operate
on sparse binary vectors. We discuss not only autoassociative memory, but also
the generalization properties of these networks. We also consider neural
networks with higher-order connections and networks with a bipartite graph
structure for non-binary data with linear constraints. Conclusions. In
conclusion we discuss the relations to similarity search, advantages and
drawbacks of these techniques, and topics for further research. An interesting
and still not completely resolved question is whether neural autoassociative
memories can search for approximate nearest neighbors faster than other index
structures for similarity search, in particular for the case of very high
dimensional vectors.
Comment: 31 pages
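Of the models in the survey's scope, the Willshaw network admits a particularly compact sketch: binary clipped-Hebbian weights over sparse binary vectors, with winner-take-all retrieval. All parameters below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
N, P, K = 256, 40, 8    # neurons, stored items, active units per sparse pattern

# sparse binary patterns with exactly K active units each
patterns = np.zeros((P, N), dtype=np.uint8)
for p in patterns:
    p[rng.choice(N, size=K, replace=False)] = 1

# Willshaw (clipped-Hebbian) binary weights: W_ij = 1 iff units i, j co-active somewhere
W = (patterns.T @ patterns > 0).astype(np.int64)

def recall(cue):
    """One-shot retrieval: keep the K units receiving the largest summed input."""
    h = W @ cue.astype(np.int64)
    out = np.zeros(N, dtype=np.uint8)
    out[np.argsort(h)[-K:]] = 1
    return out

cue = patterns[0].copy()
cue[np.flatnonzero(cue)[: K // 2]] = 0   # erase half of the active units
recalled = recall(cue)
```

Because the weights are clipped to one bit, the number of storable sparse items can exceed the vector dimension, which is the regime the survey's sublinear-search question concerns.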
Analysis of Oscillator Neural Networks for Sparsely Coded Phase Patterns
We study a simple extended model of oscillator neural networks capable of
storing sparsely coded phase patterns, in which information is encoded both in
the mean firing rate and in the timing of spikes. Applying the methods of
statistical neurodynamics to our model, we theoretically investigate the
model's associative memory capability by evaluating its maximum storage
capacities and deriving its basins of attraction. It is shown that, as in the
Hopfield model, the storage capacity diverges as the activity level decreases.
We consider various practically and theoretically important cases. For example,
it is revealed that a dynamically adjusted threshold mechanism enhances the
retrieval ability of the associative memory. It is also found that, under
suitable conditions, the network can recall patterns even in the case that
patterns with different activity levels are stored at the same time. In
addition, we examine the robustness with respect to damage of the synaptic
connections. The validity of these theoretical results is confirmed by
reasonable agreement with numerical simulations.
Comment: 23 pages, 11 figures
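A minimal sketch of oscillator associative memory (ignoring the sparse-coding and firing-rate degrees of freedom specific to this paper): phase patterns are stored in complex Hebbian couplings and retrieved by a phasor fixed-point iteration. Parameters and the discrete-time update are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
N, P = 200, 3
theta = rng.uniform(0.0, 2.0 * np.pi, size=(P, N))   # stored phase patterns

# generalized Hebbian couplings C_ij = (1/N) sum_mu exp(i (theta_i^mu - theta_j^mu))
z = np.exp(1j * theta)
C = z.T @ z.conj() / N

def retrieve(phi, steps=50):
    """Phasor fixed-point iteration phi_i <- arg(sum_j C_ij exp(i phi_j)),
    a discrete-time analogue of Kuramoto-type phase dynamics."""
    for _ in range(steps):
        phi = np.angle(C @ np.exp(1j * phi))
    return phi

phi = retrieve(theta[0] + 0.3 * rng.standard_normal(N))  # noisy cue for pattern 0
# overlap with the stored pattern, invariant under a global phase shift
m = np.abs(np.mean(np.exp(1j * (phi - theta[0]))))
```

The overlap is measured modulo a global phase rotation, since the dynamics are invariant under a uniform shift of all phases.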
Non-Convex Multi-species Hopfield models
In this work we introduce a multi-species generalization of the Hopfield
model for associative memory, where neurons are divided into groups and both
inter-groups and intra-groups pair-wise interactions are considered, with
different intensities. Thus, this system contains two of the main ingredients
of modern Deep neural network architectures: Hebbian interactions to store
patterns of information and multiple layers coding different levels of
correlations. The model is completely solvable in the low-load regime with a
suitable generalization of the Hamilton-Jacobi technique, even though the
Hamiltonian can be a non-definite quadratic form of the magnetizations. The
family of multi-species Hopfield models includes, as special cases, the three-layer
Restricted Boltzmann Machine (RBM) with Gaussian hidden layer and the
Bidirectional Associative Memory (BAM) model.
Comment: This is a pre-print of an article published in J. Stat. Phys.
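Schematically (our notation, not necessarily the paper's), in the low-load regime with a single pattern $\xi$ and $k$ species the Hamiltonian is a quadratic form in the group magnetizations:

```latex
H_N(\sigma) = -\frac{N}{2} \sum_{a,b=1}^{k} \lambda_{ab}\, m_a(\sigma)\, m_b(\sigma),
\qquad
m_a(\sigma) = \frac{1}{N_a} \sum_{i \in a} \xi_i\, \sigma_i
```

where $\lambda_{ab}$ sets the intra-group ($a=b$) and inter-group ($a\neq b$) interaction intensities. When the matrix $(\lambda_{ab})$ is not positive semi-definite, the quadratic form is non-definite, which is precisely the case the generalized Hamilton-Jacobi technique is needed for.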
Statistical physics of neural systems with non-additive dendritic coupling
How neurons process their inputs crucially determines the dynamics of
biological and artificial neural networks. In such neural and neural-like
systems, synaptic input is typically considered to be merely transmitted
linearly or sublinearly by the dendritic compartments. Yet, single-neuron
experiments report pronounced supralinear dendritic summation of sufficiently
synchronous and spatially close-by inputs. Here, we provide a statistical
physics approach to study the impact of such non-additive dendritic processing
on single neuron responses and the performance of associative memory tasks in
artificial neural networks. First, we compute the effect of random input to a
neuron incorporating nonlinear dendrites. This approach is independent of the
details of the neuronal dynamics. Second, we use those results to study the
impact of dendritic nonlinearities on the network dynamics in a paradigmatic
model for associative memory, both numerically and analytically. We find that
dendritic nonlinearities maintain network convergence and increase the
robustness of memory performance against noise. Interestingly, an intermediate
number of dendritic branches is optimal for memory functionality.
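The non-additive processing described above can be sketched in a Hopfield-type network by giving each neuron several dendritic branches whose summed input is amplified supralinearly before being combined. The random branch assignment and the power-law branch nonlinearity below are illustrative assumptions, not the paper's specific model:

```python
import numpy as np

rng = np.random.default_rng(4)
N, P, B = 120, 6, 10    # neurons, stored patterns, dendritic branches per neuron
xi = rng.choice([-1, 1], size=(P, N))
J = (xi.T @ xi).astype(float) / N
np.fill_diagonal(J, 0.0)

# randomly assign each afferent synapse of neuron i to one of its B branches
branch_of = rng.integers(0, B, size=(N, N))

def dendritic_field(s, gain=1.5):
    """Non-additive local field: each branch sums its synaptic input linearly,
    then the branch output is amplified as f(x) = sign(x) |x|**gain
    (gain > 1: supralinear summation of co-located input)."""
    h = np.empty(N)
    for i in range(N):
        branch_sums = np.bincount(branch_of[i], weights=J[i] * s, minlength=B)
        h[i] = np.sum(np.sign(branch_sums) * np.abs(branch_sums) ** gain)
    return h

def update(s, sweeps=10):
    for _ in range(sweeps):
        s = np.where(dendritic_field(s) >= 0.0, 1, -1)
    return s

probe = xi[0].copy()
probe[rng.choice(N, size=N // 20, replace=False)] *= -1   # flip 5% of the spins
overlap = update(probe) @ xi[0] / N
```

With gain = 1 this reduces to the standard additive Hopfield field; varying the branch count B is the kind of experiment behind the optimal-branch-number finding.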
Phase Transitions of an Oscillator Neural Network with a Standard Hebb Learning Rule
Studies have been made on the phase-transition phenomena of an oscillator
network model based on a standard Hebb learning rule, as in the Hopfield model.
Relative phase information (in-phase and anti-phase) can be embedded in the
network. By self-consistent signal-to-noise analysis (SCSNA), the storage
capacity was found to be better
than that of Cook's model, although the retrieval quality is worse. In
addition, an investigation was made into an acceleration effect caused by
asymmetry of the phase dynamics. Finally, it was numerically shown that the
storage capacity can be improved by modifying the shape of the coupling
function.
Comment: 10 pages, 6 figures