
    Neural Distributed Autoassociative Memories: A Survey

    Introduction. Neural network models of autoassociative, distributed memory allow storage and retrieval of many items (vectors), where the number of stored items can exceed the vector dimension (the number of neurons in the network). This opens the possibility of sublinear-time search (in the number of stored items) for approximate nearest neighbors among high-dimensional vectors. The purpose of this paper is to review models of autoassociative, distributed memory that can be naturally implemented by neural networks (mainly with local learning rules and iterative dynamics based on information locally available to neurons). Scope. The survey focuses mainly on the networks of Hopfield, Willshaw, and Potts, which have connections between pairs of neurons and operate on sparse binary vectors. We discuss not only autoassociative memory but also the generalization properties of these networks. We also consider neural networks with higher-order connections and networks with a bipartite graph structure for non-binary data with linear constraints. Conclusions. We conclude by discussing the relation to similarity search, the advantages and drawbacks of these techniques, and topics for further research. An interesting and still not completely resolved question is whether neural autoassociative memories can search for approximate nearest neighbors faster than other index structures for similarity search, particularly for very high-dimensional vectors.
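
    As a rough illustration of the class of models surveyed (a minimal sketch of my own, not code from the paper), the following implements a Willshaw-style binary autoassociative memory on sparse binary vectors: a one-shot, local (clipped Hebbian) learning rule and a single thresholded retrieval step. All sizes (n, k, m) are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, m = 1000, 20, 200              # neurons, active bits per pattern, stored patterns

# Generate m sparse binary patterns with exactly k active bits each.
patterns = np.zeros((m, n), dtype=int)
for mu in range(m):
    patterns[mu, rng.choice(n, size=k, replace=False)] = 1

# Clipped Hebbian (Willshaw) rule: W_ij = 1 if units i and j were ever co-active.
W = (patterns.T @ patterns > 0).astype(int)

def retrieve(cue):
    # One-step retrieval: a unit fires if it receives input from every active cue bit.
    return (W @ cue >= cue.sum()).astype(int)

cue = patterns[0].copy()
cue[np.flatnonzero(cue)[:5]] = 0     # corrupt the cue: delete 5 of its 20 active bits
recalled = retrieve(cue)
print("bits recovered:", int(recalled @ patterns[0]),
      "spurious bits:", int(recalled.sum() - recalled @ patterns[0]))
```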

    Transient dynamics for sequence processing neural networks

    An exact solution of the transient dynamics for a sequential associative memory model is discussed through both the path-integral method and statistical neurodynamics. Although the path-integral method can in principle give an exact solution of the transient dynamics, only stationary properties had previously been discussed for sequential associative memory. We have succeeded in deriving an exact macroscopic description of the transient dynamics by analyzing the correlations of the crosstalk noise. Surprisingly, the order-parameter equations of this exact solution are completely equivalent to those of statistical neurodynamics, an approximation theory that assumes the crosstalk noise obeys a Gaussian distribution. To examine our theoretical findings, we numerically obtain cumulants of the crosstalk noise: we verify that the third- and fourth-order cumulants are zero and that the crosstalk noise is normally distributed even in the non-retrieval case. The results obtained by our theory agree with those obtained by computer simulations. We have also found that the macroscopic unstable state coincides exactly with the separatrix.
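
    To make the setting concrete (a hedged sketch under my own parameter choices, not the paper's calculation), a sequential associative memory stores a cyclic sequence of patterns via asymmetric Hebbian couplings, and a synchronous update steps the network through the sequence; the printed overlaps are the macroscopic order parameters whose transient evolution the paper solves exactly.

```python
import numpy as np

rng = np.random.default_rng(1)
N, p = 2000, 5                          # neurons, length of the stored sequence
xi = rng.choice([-1, 1], size=(p, N))   # random +/-1 patterns

# Asymmetric Hebbian couplings J_ij = (1/N) sum_mu xi^{mu+1}_i xi^{mu}_j
# map each pattern onto its successor (cyclically).
J = sum(np.outer(xi[(mu + 1) % p], xi[mu]) for mu in range(p)) / N

s = xi[0].copy()                        # start the network on the first pattern
for t in range(p + 1):
    overlaps = xi @ s / N               # order parameters m_mu(t)
    print(f"t={t}:", np.round(overlaps, 2))
    s = np.where(J @ s >= 0, 1, -1)     # synchronous deterministic update
```

    At this low loading the overlap with each successive pattern stays near 1, so the run simply visits the stored sequence in order; the paper's analysis concerns how these overlaps evolve when crosstalk noise is no longer negligible.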

    An associative network with spatially organized connectivity

    We investigate the properties of an autoassociative network of threshold-linear units whose synaptic connectivity is spatially structured and asymmetric. Since the methods of equilibrium statistical mechanics cannot be applied to such a network due to the lack of a Hamiltonian, we approach the problem through a signal-to-noise analysis, which we adapt to spatially organized networks. We analyze the conditions for the appearance of stable, spatially non-uniform activity profiles with large overlaps with one of the stored patterns. We also show, with simulations and analytic results, that the storage capacity does not decrease much when the connectivity of the network becomes short-range. In addition, the method used here enables us to calculate exactly the storage capacity of a randomly connected network with an arbitrary degree of dilution.
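
    A minimal sketch of the ingredients described above (my own toy construction, not the authors' model): threshold-linear units on a ring, a distance-dependent and independently drawn (hence asymmetric) dilution mask, and a covariance Hebbian rule on sparse patterns; the activity-dependent threshold stands in for global inhibition, and no quantitative conclusion should be read from this toy run.

```python
import numpy as np

rng = np.random.default_rng(2)
N, p, a, sigma = 500, 10, 0.1, 50.0     # units, patterns, sparsity, connectivity range

# Ring geometry: connection probability decays with distance, and each
# direction is drawn independently, so the connectivity is asymmetric.
d = np.abs(np.arange(N)[:, None] - np.arange(N)[None, :])
d = np.minimum(d, N - d)
C = rng.random((N, N)) < np.exp(-d**2 / (2 * sigma**2))
np.fill_diagonal(C, False)

# Sparse binary patterns, covariance Hebbian rule on the existing connections.
eta = (rng.random((p, N)) < a).astype(float)
J = C * ((eta - a).T @ (eta - a)) / (a * (1 - a) * C.sum(axis=1, keepdims=True))

# Threshold-linear dynamics; the threshold is set each step so that a
# fraction a of units stays active (a crude stand-in for global inhibition).
v = eta[0].copy()                        # cue with the first stored pattern
for _ in range(30):
    h = J @ v
    v = np.maximum(h - np.quantile(h, 1 - a), 0.0)

active = v > 0
print("fraction of active units belonging to the cued pattern:",
      round((active & (eta[0] > 0)).sum() / active.sum(), 2))
```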

    Comparing Information-Theoretic Measures of Complexity in Boltzmann Machines

    In the past three decades, many theoretical measures of complexity have been proposed to help understand complex systems. In this work, for the first time, we place these measures on a level playing field to explore their qualitative similarities, differences, and shortcomings. Specifically, using the Boltzmann machine architecture (a fully connected recurrent neural network) with uniformly distributed weights as our model of study, we numerically measure how complexity changes as a function of network dynamics and network parameters. We then apply an extension of one such information-theoretic measure of complexity to understand incremental Hebbian learning in Hopfield networks, a fully recurrent model of autoassociative memory. In the course of Hebbian learning, the total information flow reflects a natural upward trend in complexity as the network attempts to learn more and more patterns.
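
    For intuition only (a sketch under my own assumptions; the paper compares several complexity measures, not necessarily this one): at small sizes the equilibrium distribution of a Boltzmann machine with uniformly distributed weights can be enumerated exactly, and a simple information-theoretic quantity such as the multi-information (total correlation) can be computed from it.

```python
import itertools
import numpy as np

rng = np.random.default_rng(3)
n = 10
W = rng.uniform(-1, 1, size=(n, n))     # uniformly distributed weights
W = (W + W.T) / 2                       # symmetric couplings
np.fill_diagonal(W, 0)

states = np.array(list(itertools.product([-1, 1], repeat=n)))   # all 2^n states
E = -0.5 * np.einsum('si,ij,sj->s', states, W, states)          # state energies
prob = np.exp(-E)
prob /= prob.sum()                      # exact Boltzmann distribution

H_joint = -(prob * np.log2(prob)).sum()                         # joint entropy
H_marg = 0.0
for i in range(n):
    pi = prob[states[:, i] == 1].sum()                          # P(x_i = 1)
    H_marg += -(pi * np.log2(pi) + (1 - pi) * np.log2(1 - pi))

# Multi-information: how far the joint distribution is from independence.
print("multi-information (bits):", H_marg - H_joint)
```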

    Localized activity profiles and storage capacity of rate-based autoassociative networks

    We study analytically the effect of metrically structured connectivity on the behavior of autoassociative networks. We focus on three simple rate-based model neurons: threshold-linear, binary, and smoothly saturating units. For sufficiently short-range connectivity, the threshold-linear network shows localized retrieval states. The saturating and binary models also exhibit spatially modulated retrieval states if the highest activity level they can achieve is above the maximum activity of the units in the stored patterns. In the zero quenched-noise limit, we derive an analytical formula for the critical value of the connectivity width below which one observes spatially non-uniform retrieval states. Localization reduces storage capacity, but only by a factor of 2-3. The approach presented here is generic, in the sense that it makes no specific assumptions about the single-unit input-output function or the exact connectivity structure.
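
    A hedged sketch of the localization effect (my own parameters; it does not reproduce the paper's critical-width formula): a ring network of threshold-linear units, cued with a spatially windowed piece of a stored pattern. With short-range connectivity the retrieved activity can remain a localized bump, while long-range connectivity lets it spread over the whole pattern; whether and where a bump stabilizes depends on the parameters.

```python
import numpy as np

rng = np.random.default_rng(4)
N, p, a = 500, 5, 0.1                   # units, patterns, sparsity

d = np.abs(np.arange(N)[:, None] - np.arange(N)[None, :])
d = np.minimum(d, N - d)                # distances on a ring
eta = (rng.random((p, N)) < a).astype(float)

def retrieval_extent(sigma):
    K = np.exp(-d**2 / (2 * sigma**2))  # distance-dependent coupling strength
    np.fill_diagonal(K, 0.0)
    J = K * ((eta - a).T @ (eta - a)) / (a * (1 - a) * K.sum(axis=1, keepdims=True))
    v = eta[0] * (d[0] < N // 10)       # cue: pattern 0 inside a local window
    for _ in range(50):                 # threshold-linear updates, fixed mean activity
        h = J @ v
        v = np.maximum(h - np.quantile(h, 1 - a), 0.0)
    pos = np.sort(np.flatnonzero(v))
    gaps = np.diff(np.concatenate([pos, [pos[0] + N]]))
    return N - gaps.max()               # span of the active region on the ring

for sigma in (25.0, 250.0):
    print(f"sigma={sigma}: active region spans ~{retrieval_extent(sigma)} sites")
```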