
    Maximum Likelihood Associative Memories

    Associative memories are structures that store data in such a way that it can later be retrieved given only part of its content -- a sort of error/erasure-resilience property. They are used in applications ranging from caches and memory management in CPUs to database engines. In this work we study associative memories built on the maximum likelihood principle. First, we derive minimum residual error rates when the stored data comes from a uniform binary source. Second, we determine the minimum amount of memory required to store the same data. Finally, we bound the computational complexity of message retrieval. We then compare these bounds with two existing associative memory architectures: the celebrated Hopfield neural networks and a neural network architecture introduced more recently by Gripon and Berrou.
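
    The maximum-likelihood rule itself is easy to state. A minimal sketch (the names and data layout here are illustrative, not the paper's construction): under a uniform binary source, ML retrieval from a partially erased query reduces to returning the stored message that best matches the observed positions.

        def ml_retrieve(stored, partial):
            # None marks an erased position; with a uniform source (and a
            # symmetric error model), maximum likelihood reduces to picking
            # the stored message with the fewest mismatches on observed bits.
            known = [i for i, b in enumerate(partial) if b is not None]
            return min(stored, key=lambda m: sum(m[i] != partial[i] for i in known))

        stored = [(0, 0, 0, 0), (1, 1, 1, 1), (1, 0, 1, 0)]
        print(ml_retrieve(stored, (1, 1, None, 1)))  # -> (1, 1, 1, 1)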

    Comment on "Probabilistic Quantum Memories"

    This is a comment on two incorrect Phys. Rev. Letters papers by C.A. Trugenberger. Trugenberger claimed that quantum registers could be used as exponentially large "associative" memories. We show that his scheme is no better than one in which the quantum register is replaced with a classical one of equal size. We also point out that the Holevo bound, and more recent bounds on "quantum random access codes", essentially rule out powerful memories (for classical information) based on quantum states. Comment: REVTeX4, 1 page, published version.
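
    The Holevo bound invoked here can be stated in one line: for an ensemble of states $\rho_x$ prepared with probabilities $p_x$ on $n$ qubits, the classical information any measurement can extract satisfies

        $$ I(X{:}Y) \;\le\; S(\rho) - \sum_x p_x\, S(\rho_x) \;\le\; \log_2 \dim \mathcal{H} \;=\; n, \qquad \rho = \sum_x p_x \rho_x, $$

    so an $n$-qubit register can reliably convey at most $n$ classical bits, which is what precludes an exponentially large classical memory.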

    A Comparative Study of Sparse Associative Memories

    We study various models of associative memories with sparse information, i.e. a pattern to be stored is a random string of 0s and 1s with only about $\log N$ 1s. We compare different synaptic weights, architectures and retrieval mechanisms to shed light on the influence of the various parameters on the storage capacity. Comment: 28 pages, 2 figures.
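
    One classical model in this family makes the sparse regime concrete. A minimal sketch, assuming the Willshaw network with clipped binary weights (the parameter choices below are illustrative, not the paper's): patterns with roughly $\log N$ active units are stored in a binary weight matrix and recalled by thresholding.

        import numpy as np

        N, k, M = 256, 8, 40                 # neurons, ~log2(N) active bits, patterns
        rng = np.random.default_rng(0)
        patterns = np.zeros((M, N), dtype=np.uint8)
        for p in patterns:                   # sparse patterns: exactly k ones
            p[rng.choice(N, k, replace=False)] = 1

        # Willshaw (clipped-Hebbian) storage: a weight is on iff some stored
        # pattern activates both of its endpoints.
        W = (patterns.T @ patterns > 0).astype(np.uint8)

        cue = patterns[0].copy()
        cue[np.flatnonzero(cue)[:2]] = 0     # erase two of the k active bits
        recalled = (W @ cue >= cue.sum()).astype(np.uint8)
        print(np.array_equal(recalled, patterns[0]))  # -> True (w.h.p. at this load)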

    Adiabatic Quantum Optimization for Associative Memory Recall

    Hopfield networks are a variant of associative memory that recall information stored in the couplings of an Ising model. Stored memories are fixed points of the network dynamics that correspond to energetic minima of the spin state. We formulate the recall of memories stored in a Hopfield network as energy minimization by adiabatic quantum optimization (AQO). Numerical simulations of the quantum dynamics allow us to quantify the AQO recall accuracy with respect to the number of stored memories and the noise in the input key. We also investigate AQO performance with respect to how memories are stored in the Ising model using different learning rules. Our results indicate that AQO performance varies strongly with the learning rule due to the resulting changes in the energy landscape. Consequently, learning rules offer indirect methods for investigating changes to the computational complexity of the recall task and the computational efficiency of AQO. Comment: 22 pages, 11 figures. Updated for clarity and figures, to appear in Frontiers of Physics.
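
    The classical objective handed to AQO is the Ising energy $E(s) = -\tfrac{1}{2} s^{\top} W s$. A minimal sketch, assuming the standard Hebbian learning rule (one of several rules the paper compares; sizes are illustrative), shows stored memories sitting at low energy in this landscape:

        import numpy as np

        rng = np.random.default_rng(1)
        N, M = 64, 4
        xi = rng.choice([-1, 1], size=(M, N))   # memories as +/-1 spin patterns

        W = (xi.T @ xi) / N                     # Hebbian couplings
        np.fill_diagonal(W, 0.0)

        def energy(s):
            # The Ising energy whose minima AQO is asked to locate.
            return -0.5 * s @ W @ s

        # A stored memory lies far below a random spin configuration.
        print(energy(xi[0]), energy(rng.choice([-1, 1], size=N)))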

    Neural Distributed Autoassociative Memories: A Survey

    Introduction. Neural network models of autoassociative, distributed memory allow storage and retrieval of many items (vectors), where the number of stored items can exceed the vector dimension (the number of neurons in the network). This opens the possibility of a sublinear-time search (in the number of stored items) for approximate nearest neighbors among high-dimensional vectors. The purpose of this paper is to review models of autoassociative, distributed memory that can be naturally implemented by neural networks (mainly with local learning rules and iterative dynamics based on information locally available to neurons). Scope. The survey focuses mainly on the networks of Hopfield, Willshaw and Potts, which have connections between pairs of neurons and operate on sparse binary vectors. We discuss not only autoassociative memory but also the generalization properties of these networks. We also consider neural networks with higher-order connections and networks with a bipartite graph structure for non-binary data with linear constraints. Conclusions. In conclusion we discuss the relations to similarity search, the advantages and drawbacks of these techniques, and topics for further research. An interesting and still not completely resolved question is whether neural autoassociative memories can search for approximate nearest neighbors faster than other index structures for similarity search, in particular for very high-dimensional vectors. Comment: 31 pages.
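
    The "iterative dynamics based on information locally available to neurons" common to these models can be made concrete. A minimal sketch, assuming asynchronous Hopfield sign updates with parameters chosen well below capacity (illustrative, not from the survey), recovers a pattern from a corrupted cue:

        import numpy as np

        rng = np.random.default_rng(2)
        N, M = 100, 5
        xi = rng.choice([-1, 1], size=(M, N))
        W = (xi.T @ xi) / N                        # local Hebbian learning
        np.fill_diagonal(W, 0.0)

        s = xi[0] * rng.choice([1, -1], N, p=[0.9, 0.1])  # cue, ~10% bits flipped
        for _ in range(5):                         # iterate toward a fixed point
            for i in rng.permutation(N):           # each neuron sees only its input
                s[i] = 1 if W[i] @ s >= 0 else -1
        print((s == xi[0]).mean())                 # -> 1.0 (typically, at this load)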

    The sparse Blume-Emery-Griffiths model of associative memories

    We analyze the Blume-Emery-Griffiths (BEG) associative memory with sparse patterns, at zero temperature. We give bounds on its storage capacity under the requirement that the stored patterns be fixed points of the retrieval dynamics. We compare our results to those of other models of sparse neural networks and show that the BEG model outperforms them. Comment: 23 pages.
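
    For orientation (the abstract does not restate the model): the general Blume-Emery-Griffiths Hamiltonian uses three-state spins, and the extra quiescent state $S_i = 0$ is what makes the model natural for patterns that are mostly zero,

        $$ H \;=\; -J \sum_{\langle ij \rangle} S_i S_j \;-\; K \sum_{\langle ij \rangle} S_i^2 S_j^2 \;+\; D \sum_i S_i^2, \qquad S_i \in \{-1, 0, +1\}; $$

    the associative-memory variant replaces the uniform couplings with pattern-dependent ones, so this is the generic form rather than the paper's specific construction.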

    Harmonic analysis of neural networks

    Neural network models have attracted a lot of interest in recent years, mainly because they were perceived as a new idea for computing. These models can be described as networks in which every node computes a linear threshold function. One of the main difficulties in analyzing the properties of these networks is that they consist of nonlinear elements. I will present a novel approach, based on harmonic analysis of Boolean functions, to analyzing neural networks. In particular, I will show how this technique can be applied to answer the following two fundamental questions: (i) what is the computational power of a polynomial threshold element with respect to linear threshold elements? (ii) is it possible to get exponentially many spurious memories when we use the outer-product method for programming the Hopfield model?
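
    The harmonic-analysis toolkit in question is the Fourier expansion of Boolean functions over the parity basis: every $f : \{-1,1\}^n \to \{-1,1\}$ decomposes uniquely as

        $$ f(x) \;=\; \sum_{S \subseteq \{1,\dots,n\}} \hat f(S)\, \chi_S(x), \qquad \chi_S(x) = \prod_{i \in S} x_i, \qquad \hat f(S) = \mathbb{E}_x\big[ f(x)\, \chi_S(x) \big], $$

    and properties of linear and polynomial threshold elements can then be read off the spectrum $\hat f$.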

    Storage capacity of holographic associative memories

    The storage capacity of holographic associative memories is estimated. An argument based on the available degrees of freedom shows that the number of patterns that can be stored is limited by the space-bandwidth product of the hologram divided by the number of pixels in each pattern. A statistical calculation shows that if we attempt to store associations by multiple exposures of the hologram, the cross talk among the stored items severely degrades the output fidelity. This confirms the storage capacity predicted by the degrees-of-freedom argument.
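
    Written out, the degrees-of-freedom argument is a simple counting bound: with $\mathrm{SBP}$ the space-bandwidth product of the hologram (its number of independent degrees of freedom) and $N_p$ pixels per pattern, the number of storable associations $M$ obeys

        $$ M \;\lesssim\; \frac{\mathrm{SBP}}{N_p}, $$

    since each stored association must claim roughly $N_p$ of the hologram's degrees of freedom.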