
    Neural Distributed Autoassociative Memories: A Survey

    Introduction. Neural network models of autoassociative, distributed memory allow storage and retrieval of many items (vectors), where the number of stored items can exceed the vector dimension (the number of neurons in the network). This opens the possibility of a sublinear-time search (in the number of stored items) for approximate nearest neighbors among high-dimensional vectors. The purpose of this paper is to review models of autoassociative, distributed memory that can be naturally implemented by neural networks (mainly with local learning rules and iterative dynamics based on information locally available to neurons). Scope. The survey focuses mainly on the networks of Hopfield, Willshaw and Potts, which have connections between pairs of neurons and operate on sparse binary vectors. We discuss not only autoassociative memory but also the generalization properties of these networks. We also consider neural networks with higher-order connections and networks with a bipartite graph structure for non-binary data with linear constraints. Conclusions. In conclusion, we discuss the relations to similarity search, the advantages and drawbacks of these techniques, and topics for further research. An interesting and still not completely resolved question is whether neural autoassociative memories can search for approximate nearest neighbors faster than other index structures for similarity search, in particular for very high-dimensional vectors. Comment: 31 pages
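
    The survey's core setting, storage and retrieval of sparse binary vectors, is easy to make concrete. Below is a minimal sketch of a Willshaw-style memory, one of the model families reviewed; the dimension, sparsity, retrieval threshold, and corruption level are illustrative choices, not values from the paper.

```python
# Minimal Willshaw-style autoassociative memory sketch (illustrative parameters).
import numpy as np

rng = np.random.default_rng(0)
n, k, num_patterns = 256, 8, 40          # dimension, active bits per pattern, stored items

# Sparse binary patterns: exactly k ones per vector.
patterns = np.zeros((num_patterns, n), dtype=np.uint8)
for p in patterns:
    p[rng.choice(n, size=k, replace=False)] = 1

# Willshaw (clipped Hebbian) storage: a synapse is 1 if any stored pattern
# activates both of its neurons.
W = np.zeros((n, n), dtype=np.uint8)
for p in patterns:
    W |= np.outer(p, p)

def retrieve(cue, theta):
    """One-step retrieval: a unit fires if at least theta active inputs
    reach it through potentiated synapses."""
    return (W.astype(int) @ cue >= theta).astype(np.uint8)

# Probe with a corrupted cue: drop two of the pattern's active bits.
cue = patterns[0].copy()
cue[np.flatnonzero(cue)[:2]] = 0
out = retrieve(cue, theta=k - 2)         # threshold lowered: 2 bits are missing
print("bits matching stored pattern:", int((out == patterns[0]).sum()), "of", n)
```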

    Implementing the SCAN language by neural networks

    Real-time encryption of pictures is an important subject for many applications, e.g., television broadcast stations and network security. The paper shows how the previously introduced SCAN encryption method can be easily implemented using a binary neural network autoassociative memory.

    Improved Autoassociative Neural Networks

    Improved autoassociative neural networks, denoted nexi, have been proposed for use in controlling autonomous robots, including mobile exploratory robots of the biomorphic type. In comparison with conventional autoassociative neural networks, nexi would be more complex but more capable, in that they could be trained to do more complex tasks. A nexus would use bit weights and simple arithmetic in a manner that would enable training and operation without a central processing unit, programs, weight registers, or large amounts of memory. Only a relatively small amount of memory (to hold the bit weights) and a simple logic application-specific integrated circuit would be needed. A description of autoassociative neural networks is a prerequisite to a meaningful description of a nexus. An autoassociative network is a set of neurons that are completely connected in the sense that each neuron receives input from, and sends output to, all the other neurons. (In some instantiations, a neuron could also send output back to its own input terminal.) The state of a neuron is completely determined by the inner product of its inputs with the weights associated with its input channel. Setting the weights sets the behavior of the network. The neurons of an autoassociative network are usually regarded as comprising a row or vector. Time is a quantized phenomenon for most autoassociative networks, in the sense that time proceeds in discrete steps. At each time step, the row of neurons forms a pattern: some neurons are firing, some are not. Hence, the current state of an autoassociative network can be described by a single binary vector. As time goes by, the network changes the vector. Autoassociative networks move vectors over hyperspace landscapes of possibilities.
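
    As a rough illustration of the conventional dynamics described above (not of the proposed nexus hardware, whose bit-weight scheme the abstract only outlines), here is a minimal Hopfield-style sketch: each neuron's next state is fixed by the inner product of the current binary state vector with its weight row, and time advances in discrete synchronous steps. Sizes and noise level are illustrative.

```python
# Conventional autoassociative (Hopfield-style) dynamics: discrete time,
# binary state vector, inner-product update per neuron.
import numpy as np

rng = np.random.default_rng(1)
n = 64
patterns = rng.choice([-1, 1], size=(3, n))          # +/-1 coding of stored patterns

# Hebbian outer-product weights; no self-connections in this variant.
W = (patterns.T @ patterns).astype(float) / n
np.fill_diagonal(W, 0.0)

state = patterns[0] * rng.choice([1, -1], size=n, p=[0.9, 0.1])  # noisy cue
for _ in range(10):                                  # synchronous updates
    state = np.where(W @ state >= 0, 1, -1)
print("overlap with stored pattern:", float(state @ patterns[0]) / n)
```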

    Transient dynamics for sequence processing neural networks

    An exact solution of the transient dynamics for a sequential associative memory model is discussed through both the path-integral method and statistical neurodynamics. Although the path-integral method has the ability to give an exact solution of the transient dynamics, only stationary properties have been discussed for the sequential associative memory. We have succeeded in deriving an exact macroscopic description of the transient dynamics by analyzing the correlation of the crosstalk noise. Surprisingly, the order-parameter equations of this exact solution are completely equivalent to those of statistical neurodynamics, an approximation theory that assumes the crosstalk noise to obey a Gaussian distribution. In order to examine our theoretical findings, we numerically obtain cumulants of the crosstalk noise. We verify that the third- and fourth-order cumulants are equal to zero, and that the crosstalk noise is normally distributed even in the non-retrieval case. We show that the results obtained by our theory agree with those obtained by computer simulations. We have also found that the macroscopic unstable state completely coincides with the separatrix. Comment: 21 pages, 4 figures
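
    For context, the sequential associative memory model under analysis stores a sequence of patterns in an asymmetric Hebbian matrix so that the dynamics step through them in order. A minimal sketch, with illustrative sizes and none of the paper's path-integral machinery:

```python
# Sequence-processing associative memory: asymmetric couplings
# W = (1/N) * sum_mu xi^{mu+1} (xi^mu)^T map each pattern onto its successor.
import numpy as np

rng = np.random.default_rng(2)
N, P = 500, 10
xi = rng.choice([-1, 1], size=(P, N))                # cyclic pattern sequence

W = sum(np.outer(xi[(mu + 1) % P], xi[mu]) for mu in range(P)) / N

s = xi[0].copy()
for t in range(5):                                   # parallel (synchronous) dynamics
    s = np.sign(W @ s)
    overlaps = xi @ s / N                            # overlap with every stored pattern
    print(f"t={t + 1}: best match = pattern {int(np.argmax(overlaps))}")
```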

    An associative network with spatially organized connectivity

    We investigate the properties of an autoassociative network of threshold-linear units whose synaptic connectivity is spatially structured and asymmetric. Since the methods of equilibrium statistical mechanics cannot be applied to such a network due to the lack of a Hamiltonian, we approach the problem through a signal-to-noise analysis, which we adapt to spatially organized networks. We analyze the conditions for the appearance of stable, spatially non-uniform profiles of activity with large overlaps with one of the stored patterns. It is also shown, with simulations and analytic results, that the storage capacity does not decrease much when the connectivity of the network becomes short-range. In addition, the method used here enables us to calculate exactly the storage capacity of a randomly connected network with an arbitrary degree of dilution. Comment: 27 pages, 6 figures; accepted for publication in JSTAT
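
    A toy version of such a spatially structured network is easy to set up. The sketch below places threshold-linear units on a ring with a Gaussian connectivity footprint and random dilution (which makes the connectivity asymmetric); the covariance learning rule, sparsity, and crude gain control are illustrative assumptions, not the paper's exact model.

```python
# Threshold-linear autoassociative network with short-range, diluted
# (asymmetric) connectivity on a ring. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(3)
N, P, sigma = 400, 5, 20.0                           # units, patterns, connectivity range

pos = np.arange(N)
d = np.abs(pos[:, None] - pos[None, :])
d = np.minimum(d, N - d)                             # distance on the ring
connect = rng.random((N, N)) < np.exp(-d**2 / (2 * sigma**2))  # asymmetric mask
np.fill_diagonal(connect, False)

a = 0.2                                              # pattern sparsity
eta = rng.choice([0.0, 1.0], size=(P, N), p=[1 - a, a])        # sparse stored rates
W = connect * ((eta - a).T @ (eta - a)) / N                    # covariance rule

rate = eta[0] + 0.1 * rng.random(N)                  # cue near pattern 0
for _ in range(30):
    h = W @ rate
    rate = np.maximum(h - h.mean(), 0.0)             # threshold-linear + uniform inhibition
    rate /= rate.max() + 1e-12                       # crude gain control (stand-in for global inhibition)

overlap = rate @ eta[0] / (np.linalg.norm(rate) * np.linalg.norm(eta[0]) + 1e-12)
print(f"cosine overlap with cued pattern: {overlap:.2f}")
```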

    Stochastic transitions of attractors in associative memory models with correlated noise

    We investigate the dynamics of recurrent neural networks with correlated noise in order to analyze the noise's effect. The mechanism of correlated firing has been analyzed in various models, but its functional roles have not been discussed in sufficient detail. Aoyagi and Aoki have shown that the state transition of a network is invoked by synchronous spikes. We introduce two types of noise to each neuron: independent thermal noise and correlated noise. Due to the effects of correlated noise, the correlation between neural inputs cannot be ignored, so the behavior of the network becomes sample-dependent. We discuss two types of associative memory models: one with auto- and weak cross-correlation connections and one with hierarchically correlated patterns. The former is similar in structure to Aoyagi and Aoki's model. We show that stochastic transitions can be induced by correlated rather than thermal noise. In the latter, we show a stochastic transition from a memory state to a mixture state using correlated noise. To analyze the stochastic transitions, we derive a macroscopic dynamic description, in the form of a recurrence relation for a probability density function, in the presence of correlated noise. Computer simulations agree with the theoretical results. Comment: 21 pages
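
    One simple way to realize the two noise types in simulation is to give every neuron its own thermal sample plus a component shared across the population. The sketch below does exactly that; the shared scalar term is only one crude way to correlate inputs, and the amplitudes are arbitrary rather than taken from the paper.

```python
# Recurrent associative memory driven by independent thermal noise plus a
# population-wide (correlated) noise component. Illustrative parameters only.
import numpy as np

rng = np.random.default_rng(4)
N, P = 300, 4
xi = rng.choice([-1, 1], size=(P, N))
W = (xi.T @ xi) / N                                  # autoassociative couplings
np.fill_diagonal(W, 0.0)

def step(s, a_thermal=0.3, a_corr=0.6):
    thermal = a_thermal * rng.standard_normal(N)     # independent sample per neuron
    shared = a_corr * rng.standard_normal()          # one sample shared by all neurons
    return np.sign(W @ s + thermal + shared)

s = xi[0].copy()
for _ in range(200):                                 # watch for jumps between attractors
    s = step(s)
print("final overlaps with stored patterns:", np.round(xi @ s / N, 2))
```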

    Comparing Information-Theoretic Measures of Complexity in Boltzmann Machines

    In the past three decades, many theoretical measures of complexity have been proposed to help understand complex systems. In this work, for the first time, we place these measures on a level playing field, to explore the qualitative similarities and differences between them, and their shortcomings. Specifically, using the Boltzmann machine architecture (a fully connected recurrent neural network) with uniformly distributed weights as our model of study, we numerically measure how complexity changes as a function of network dynamics and network parameters. We apply an extension of one such information-theoretic measure of complexity to understand incremental Hebbian learning in Hopfield networks, a fully recurrent model of autoassociative memory. In the course of Hebbian learning, the total information flow reflects a natural upward trend in complexity as the network attempts to learn more and more patterns. Comment: 16 pages, 7 figures; appears in Entropy, Special Issue "Information Geometry II"
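
    The incremental Hebbian protocol probed in the second part is straightforward to state in code: patterns are added to a Hopfield network's weights one at a time, and some quantity is evaluated after each addition. A minimal sketch follows, with a simple recall score standing in for the paper's information-theoretic complexity measure, which is beyond a few lines.

```python
# Incremental Hebbian learning in a Hopfield network: add one pattern at a
# time and evaluate the network after each addition.
import numpy as np

rng = np.random.default_rng(5)
N = 50
W = np.zeros((N, N))

def recall_quality(W, pattern, flips=5, steps=20):
    """Overlap recovered from a corrupted cue under the network dynamics."""
    s = pattern.copy()
    s[rng.choice(N, size=flips, replace=False)] *= -1
    for _ in range(steps):
        s = np.where(W @ s >= 0, 1, -1)
    return float(s @ pattern) / N

patterns = rng.choice([-1, 1], size=(8, N))
for k, p in enumerate(patterns, start=1):
    W += np.outer(p, p) / N                          # one incremental Hebbian step
    np.fill_diagonal(W, 0.0)
    print(f"{k} patterns stored, recall overlap on first: "
          f"{recall_quality(W, patterns[0]):.2f}")
```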

    Localized activity profiles and storage capacity of rate-based autoassociative networks

    We study analytically the effect of metrically structured connectivity on the behavior of autoassociative networks. We focus on three simple rate-based model neurons: threshold-linear, binary, or smoothly saturating units. For connectivity that is sufficiently short-range, the threshold-linear network shows localized retrieval states. The saturating and binary models also exhibit spatially modulated retrieval states if the highest activity level they can achieve is above the maximum activity of the units in the stored patterns. In the zero quenched-noise limit, we derive an analytical formula for the critical value of the connectivity width below which one observes spatially non-uniform retrieval states. Localization reduces the storage capacity, but only by a factor of 2-3. The approach we present here is generic in the sense that there are no specific assumptions on the single-unit input-output function or on the exact connectivity structure. Comment: 4 pages, 4 figures
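
    The three single-unit input-output functions compared in the paper can be written down directly; the gains, thresholds, and the particular smooth saturating form below are illustrative assumptions rather than the paper's exact choices.

```python
# The three rate-based single-unit transfer functions (illustrative forms).
import numpy as np

def threshold_linear(h, g=1.0, theta=0.0):
    # Zero below threshold, linear with gain g above it.
    return g * np.maximum(h - theta, 0.0)

def binary(h, theta=0.0):
    # All-or-none response.
    return (h > theta).astype(float)

def smoothly_saturating(h, r_max=1.0, g=1.0, theta=0.0):
    # One common smooth choice: rises from threshold, saturates at r_max.
    return r_max * np.tanh(np.maximum(g * (h - theta), 0.0))

h = np.linspace(-1, 3, 5)
for f in (threshold_linear, binary, smoothly_saturating):
    print(f.__name__, np.round(f(h), 2))
```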