
    Neural Distributed Autoassociative Memories: A Survey

    Introduction. Neural network models of autoassociative, distributed memory allow storage and retrieval of many items (vectors), where the number of stored items can exceed the vector dimension (the number of neurons in the network). This opens the possibility of sublinear-time search (in the number of stored items) for approximate nearest neighbors among high-dimensional vectors. The purpose of this paper is to review models of autoassociative, distributed memory that can be naturally implemented by neural networks (mainly with local learning rules and iterative dynamics based on information locally available to neurons). Scope. The survey focuses mainly on the networks of Hopfield, Willshaw, and Potts, which have connections between pairs of neurons and operate on sparse binary vectors. We discuss not only autoassociative memory but also the generalization properties of these networks. We also consider neural networks with higher-order connections and networks with a bipartite graph structure for non-binary data with linear constraints. Conclusions. In conclusion, we discuss the relations to similarity search, the advantages and drawbacks of these techniques, and topics for further research. An interesting and still not completely resolved question is whether neural autoassociative memories can search for approximate nearest neighbors faster than other index structures for similarity search, in particular for very high-dimensional vectors. Comment: 31 pages.
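
    To make the class of models concrete, the following is a minimal Python sketch of the kind of network the survey covers: a Hopfield-style memory with a local Hebbian learning rule and iterative sign dynamics. The sizes, the dense ±1 encoding, and the synchronous update are illustrative assumptions, not details taken from the paper.

        import numpy as np

        rng = np.random.default_rng(0)
        n, p = 256, 20                       # neurons and stored patterns (illustrative sizes)
        patterns = rng.choice([-1, 1], size=(p, n))

        # Hebbian learning: each weight depends only on the activity of the
        # two neurons it connects, i.e. the rule is local.
        W = (patterns.T @ patterns) / n
        np.fill_diagonal(W, 0.0)

        def retrieve(probe, steps=20):
            """Iterative dynamics: every neuron updates from input that is
            locally available to it; a fixed point is an attractor."""
            s = probe.copy()
            for _ in range(steps):
                s_new = np.sign(W @ s)
                s_new[s_new == 0] = 1        # break ties deterministically
                if np.array_equal(s_new, s):
                    return s_new
                s = s_new
            return s

        # Query with a corrupted copy of a stored pattern (10% of bits flipped).
        probe = patterns[0].copy()
        flip = rng.choice(n, size=n // 10, replace=False)
        probe[flip] *= -1
        out = retrieve(probe)
        print("overlap with stored pattern:", out @ patterns[0] / n)

    The connection to similarity search is visible here: the dynamics map a noisy query to the nearest stored item without an explicit scan over all p items, which is what motivates the sublinear-time question raised in the conclusions.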

    Optimizing Associative Information Transfer within Content-addressable Memory

    Original article can be found at: http://www.oldcitypublishing.com/IJUC/IJUC.html. Peer reviewed.

    Maximum Likelihood Associative Memories

    Associative memories are structures that store data in such a way that it can later be retrieved given only part of its content -- a sort of error/erasure-resilience property. They are used in applications ranging from caches and memory management in CPUs to database engines. In this work we study associative memories built on the maximum likelihood principle. First, we derive minimum residual error rates when the stored data comes from a uniform binary source. Second, we determine the minimum amount of memory required to store the same data. Finally, we bound the computational complexity of message retrieval. We then compare these bounds with two existing associative memory architectures: the celebrated Hopfield neural networks and a neural network architecture introduced more recently by Gripon and Berrou.
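
    As a point of reference for the maximum likelihood principle, here is a hedged sketch of the brute-force baseline: with uniformly drawn binary messages and independent bit errors, ML retrieval from partial content reduces to a minimum Hamming distance search over the observed positions. The store size, message length, and noise model are illustrative assumptions; the paper's bounds concern far more efficient constructions than this exhaustive scan.

        import numpy as np

        rng = np.random.default_rng(1)
        m, k = 1000, 64                          # stored messages and their length (illustrative)
        store = rng.integers(0, 2, size=(m, k))  # uniform binary source

        def ml_retrieve(partial, observed):
            """ML retrieval: erased positions carry no likelihood, so the ML
            message minimizes Hamming distance on the observed positions."""
            dists = ((store != partial) & observed).sum(axis=1)
            return store[np.argmin(dists)]

        # Query: a stored message with roughly half its bits erased and 3 flipped.
        msg = store[42]
        observed = rng.random(k) < 0.5           # True = observed, False = erased
        noisy = msg.copy()
        noisy[rng.choice(np.flatnonzero(observed), size=3, replace=False)] ^= 1
        print("recovered:", np.array_equal(ml_retrieve(noisy, observed), msg))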

    Challenges in interface and interaction design for context-aware augmented memory systems

    Human long-term memory is astonishingly powerful but fallible at the same time, which makes it easy to forget information one is sure one actually knows. We propose context-aware augmented memory systems as a solution to this problem. In this paper, we analyse the user interface and interaction design challenges that need to be overcome to build such a system. We hope for fruitful interdisciplinary discussions on how best to address these challenges.

    Real time unsupervised learning of visual stimuli in neuromorphic VLSI systems

    Neuromorphic chips embody, in microelectronic devices, computational principles operating in the nervous system. In this domain it is important to identify computational primitives that theory and experiments suggest as generic and reusable cognitive elements. One such element is provided by attractor dynamics in recurrent networks. Point attractors are equilibrium states of the dynamics (up to fluctuations), determined by the synaptic structure of the network; a `basin' of attraction comprises all initial states leading to a given attractor upon relaxation, which makes attractor dynamics suitable for implementing robust associative memory. The initial network state is dictated by the stimulus, and relaxation to the attractor state implements the retrieval of the corresponding memorized prototypical pattern. In a previous work we demonstrated that a neuromorphic recurrent network of spiking neurons and suitably chosen, fixed synapses supports attractor dynamics. Here we focus on learning: activating on-chip synaptic plasticity and using a theory-driven strategy for choosing network parameters, we show that autonomous learning, following repeated presentation of simple visual stimuli, shapes a synaptic connectivity supporting stimulus-selective attractors. Associative memory develops on chip as the result of the coupled stimulus-driven neural activity and ensuing synaptic dynamics, with no artificial separation between learning and retrieval phases. Comment: submitted to Scientific Reports.
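
    The coupling of stimulus-driven activity and ongoing plasticity can be caricatured in a few lines of rate-based Python. This is only a toy abstraction of the idea (no spiking neurons, no chip dynamics), with all sizes and the learning rate chosen for illustration.

        import numpy as np

        rng = np.random.default_rng(2)
        n, eta = 200, 0.1                            # neurons, learning rate (illustrative)
        stimuli = rng.choice([-1, 1], size=(3, n))   # simple binary "visual" stimuli
        W = np.zeros((n, n))

        # No separate learning and retrieval phases: each presentation drives
        # the activity, and that same activity drives a local weight update.
        for _ in range(10):                          # repeated presentations
            for stim in stimuli:
                act = np.sign(W @ stim + stim)       # recurrent input plus stimulus
                act[act == 0] = 1
                W += eta * np.outer(act, act) / n    # online Hebbian plasticity
                np.fill_diagonal(W, 0.0)

        # After learning, a degraded stimulus relaxes to the learned attractor.
        probe = stimuli[0].copy()
        probe[rng.choice(n, size=n // 5, replace=False)] *= -1
        for _ in range(10):
            probe = np.sign(W @ probe)
            probe[probe == 0] = 1
        print("overlap with stimulus 0:", probe @ stimuli[0] / n)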

    Semantic similarity dissociates short- from long-term recency effects: testing a neurocomputational model of list memory

    The finding that recency effects can occur not only in immediate free recall (i.e., short-term recency) but also in the continuous-distractor task (i.e., long-term recency) has led many theorists to reject the distinction between short- and long-term memory stores. Recently, we have argued that long-term recency effects do not undermine the concept of a short-term store, and we have presented a neurocomputational model that accounts for both short- and long-term recency and for a series of dissociations between these two effects. Here, we present a new dissociation between short- and long-term recency based on semantic similarity, which is predicted by our model. This dissociation is due to the mutual support between associated items in the short-term store, which takes place in immediate free recall and delayed free recall but not in continuous-distractor free recall.
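
    The mutual-support mechanism can be illustrated with a deliberately crude toy (not the authors' model): items that co-reside in a small short-term store boost one another if they are semantically associated, and interleaved distractor activity flushes the store so that associates never co-reside. The capacity, boost size, and item lists below are made up for illustration.

        BUFFER = 4   # short-term store capacity (illustrative)

        def recall_strengths(items, pairs, distractor_between):
            sts, strength = [], {it: 1.0 for it in items}
            for it in items:
                if distractor_between:
                    sts.clear()                   # distractor task displaces STS contents
                for other in sts:
                    if (it, other) in pairs or (other, it) in pairs:
                        strength[it] += 0.5       # mutual support between associates
                        strength[other] += 0.5
                sts.append(it)
                if len(sts) > BUFFER:
                    sts.pop(0)
            return strength

        items = ["cat", "dog", "pen", "ink", "sun", "sky"]
        pairs = {("cat", "dog"), ("pen", "ink"), ("sun", "sky")}
        print(recall_strengths(items, pairs, distractor_between=False))  # immediate recall
        print(recall_strengths(items, pairs, distractor_between=True))   # continuous distractor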

    Neural Networks retrieving Boolean patterns in a sea of Gaussian ones

    Restricted Boltzmann Machines are key tools in Machine Learning and are described by the energy function of bipartite spin-glasses. From a statistical mechanical perspective, they share the same Gibbs measure as Hopfield networks for associative memory. In this equivalence, the weights of the former play the role of the patterns of the latter. As Boltzmann machines usually require real weights to be trained with gradient-descent-like methods, while Hopfield networks typically store binary patterns in order to retrieve them, the investigation of a mixed Hebbian network, equipped with both real (e.g., Gaussian) and discrete (e.g., Boolean) patterns, naturally arises. We prove that, in the challenging regime of a high load of real patterns, where retrieval is forbidden, an extra load of Boolean patterns can still be retrieved, as long as the ratio between the overall load and the network size does not exceed a critical threshold, which turns out to be the same as in the standard Amit-Gutfreund-Sompolinsky theory. Assuming replica symmetry, we study the case of a low load of Boolean patterns by combining stochastic stability and Hamilton-Jacobi interpolation techniques. The result can be extended to the high-load case by a non-rigorous but standard replica computation argument. Comment: 16 pages, 1 figure.
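
    A quick numerical illustration of the mixed Hebbian construction (our sketch, not the paper's replica analysis): both Gaussian and Boolean patterns enter the same Hebbian couplings, and at low total load a Boolean pattern remains stable under zero-temperature dynamics. All sizes are arbitrary choices.

        import numpy as np

        rng = np.random.default_rng(3)
        N = 500
        p_bool, p_gauss = 5, 20                       # illustrative loads (total load 0.05)
        xi = rng.choice([-1, 1], size=(p_bool, N))    # Boolean patterns
        z = rng.standard_normal((p_gauss, N))         # Gaussian patterns (RBM-weight analogues)

        # Mixed Hebbian couplings: both pattern types enter the same quadratic form.
        J = (xi.T @ xi + z.T @ z) / N
        np.fill_diagonal(J, 0.0)

        # Zero-temperature dynamics started at a Boolean pattern: below the
        # critical total load the pattern should stay retrievable.
        s = xi[0].copy()
        for _ in range(50):
            s = np.sign(J @ s)
            s[s == 0] = 1
        print("overlap with the Boolean pattern:", s @ xi[0] / N)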

    Quantum Pattern Retrieval by Qubit Networks with Hebb Interactions

    Qubit networks with long-range interactions inspired by the Hebb rule can be used as quantum associative memories. Starting from a uniform superposition, the unitary evolution generated by these interactions drives the network through a quantum phase transition at a critical computation time, after which ferromagnetic order guarantees that a measurement retrieves the stored memory. The maximum memory capacity p of these qubit networks is reached at a memory density p/n = 1, i.e. one stored pattern per qubit. Comment: To appear in Physical Review Letters.
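
    The abstract does not spell out the Hamiltonian, so the following exact small-n simulation is only a loosely inspired toy: an Ising term with Hebb-rule couplings plus a transverse field that we add so the dynamics from the uniform superposition are nontrivial. The qubit count, field strength, and single stored pattern are all our assumptions; the printed probabilities merely let one watch whether amplitude concentrates on the stored pattern and its global flip, not the paper's phase transition.

        import numpy as np
        from functools import reduce

        n = 8                                         # qubits (kept small for exact simulation)
        rng = np.random.default_rng(4)
        xi = rng.choice([-1, 1], size=n)              # one stored pattern (Hebb rule with p = 1)

        I2 = np.eye(2)
        Z = np.diag([1.0, -1.0])
        X = np.array([[0.0, 1.0], [1.0, 0.0]])

        def op(single, site):                         # embed a single-qubit operator at `site`
            return reduce(np.kron, [single if k == site else I2 for k in range(n)])

        H = np.zeros((2 ** n, 2 ** n))
        for i in range(n):
            for j in range(i + 1, n):
                H -= (xi[i] * xi[j] / n) * op(Z, i) @ op(Z, j)   # Hebb-rule couplings
            H -= 0.5 * op(X, i)                                  # transverse field (our assumption)

        evals, evecs = np.linalg.eigh(H)
        psi0 = np.full(2 ** n, 2 ** (-n / 2))         # uniform superposition over all basis states

        # Basis index of the stored pattern (+1 -> bit 0) and of its global flip.
        target = int("".join("0" if b == 1 else "1" for b in xi), 2)
        flipped = (2 ** n - 1) ^ target
        for t in (0.0, 1.0, 2.0, 4.0):
            psi_t = evecs @ (np.exp(-1j * evals * t) * (evecs.conj().T @ psi0))
            prob = abs(psi_t[target]) ** 2 + abs(psi_t[flipped]) ** 2
            print(f"t={t}: P(pattern or its flip) = {prob:.4f}")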

    Trails of experiences: Navigating personal memories

    Systems that augment personal memory aim to support people in remembering both past experiences and specific information associated with those experiences. These types of information go beyond those supported in systems for Personal Information Management, making it necessary to develop new user interface and interaction techniques. Our approach is based on characteristics of human memory. Its major contribution is the combination of a graph-based data model with navigation mechanisms based on various types of context and on associations.
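
    A hypothetical miniature of such a graph-based model (names and fields are ours, not the paper's): memory items are nodes carrying context attributes, associations are edges, and navigation follows either shared context or explicit associations.

        from dataclasses import dataclass, field

        @dataclass
        class MemoryItem:
            label: str
            context: dict                      # e.g. {"place": "Berlin", "when": "2006-05"}
            associations: list = field(default_factory=list)

        def associate(a, b):                   # undirected association edge
            a.associations.append(b)
            b.associations.append(a)

        def navigate_by_context(items, key, value):
            """Follow a context 'trail': all items sharing a context attribute."""
            return [it for it in items if it.context.get(key) == value]

        talk = MemoryItem("conference talk", {"place": "Berlin", "when": "2006-05"})
        dinner = MemoryItem("dinner with colleagues", {"place": "Berlin", "when": "2006-05"})
        paper = MemoryItem("paper draft", {"when": "2006-04"})
        associate(talk, paper)                 # the talk presented the paper

        print([it.label for it in navigate_by_context([talk, dinner, paper], "place", "Berlin")])
        print([it.label for it in talk.associations])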