
    Effect of connectivity in an associative memory model

    We investigate how geometric properties translate into functional properties in sparse networks of computing elements. Specifically, we determine how the eigenvalues of the interconnection graph (which in turn reflect its connectivity properties) relate to the number of items stored, the amount of error correction, the radius of attraction, and the rate of convergence in an associative memory model consisting of a sparse network of threshold elements, or neurons.
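
    As a rough illustration of this setting, the sketch below builds a sparse random interconnection graph, inspects its leading eigenvalues, and runs a Hebbian-trained network of threshold units from a corrupted pattern. The graph model (Erdős–Rényi), the Hebbian rule, and all sizes are illustrative assumptions, not the paper's construction.

```python
# Minimal sketch: sparse Hopfield-style associative memory on a random graph,
# relating the adjacency spectrum to retrieval quality.  All sizes, densities
# and the Hebbian rule are illustrative assumptions, not the paper's exact model.
import numpy as np

rng = np.random.default_rng(0)
N, P, density = 500, 10, 0.1          # neurons, stored patterns, connection density

# Sparse symmetric interconnection graph (Erdos-Renyi), no self-connections.
A = (rng.random((N, N)) < density).astype(float)
A = np.triu(A, 1)
A = A + A.T

# Leading eigenvalues of the graph; their separation reflects connectivity quality.
eigs = np.linalg.eigvalsh(A)
print("largest eigenvalue %.2f, second largest %.2f" % (eigs[-1], eigs[-2]))

# Hebbian weights restricted to existing edges.
patterns = rng.choice([-1.0, 1.0], size=(P, N))
W = A * (patterns.T @ patterns) / N

def recall(probe, steps=20):
    """Asynchronous threshold updates starting from a corrupted probe."""
    s = probe.copy()
    for _ in range(steps):
        for i in rng.permutation(N):
            s[i] = 1.0 if W[i] @ s >= 0 else -1.0
    return s

# Corrupt 10% of one pattern and measure the recovered overlap (radius of attraction).
probe = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
probe[flip] *= -1
overlap = recall(probe) @ patterns[0] / N
print("overlap with stored pattern after recall: %.3f" % overlap)
```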

    An associative network with spatially organized connectivity

    We investigate the properties of an autoassociative network of threshold-linear units whose synaptic connectivity is spatially structured and asymmetric. Since the methods of equilibrium statistical mechanics cannot be applied to such a network due to the lack of a Hamiltonian, we approach the problem through a signal-to-noise analysis, which we adapt to spatially organized networks. We analyze the conditions for the appearance of stable, spatially non-uniform activity profiles with large overlap with one of the stored patterns. It is also shown, with simulations and analytic results, that the storage capacity does not decrease much when the connectivity of the network becomes short-range. In addition, the method used here enables us to calculate exactly the storage capacity of a randomly connected network with an arbitrary degree of dilution. Comment: 27 pages, 6 figures; accepted for publication in JSTA
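
    A minimal sketch of the kind of network described above, assuming a ring geometry, a distance-dependent connection probability, a covariance learning rule, and threshold-linear (ReLU) units with a crude activity regulation; the paper's signal-to-noise analysis itself is not reproduced here.

```python
# Minimal sketch (illustrative parameters, not the paper's analysis): an
# autoassociative network of threshold-linear units on a ring, with connection
# probability decaying with distance, storing sparse patterns via a
# covariance (Hebbian-like) rule and retrieving one of them from a partial cue.
import numpy as np

rng = np.random.default_rng(1)
N, P, sparsity, sigma = 400, 5, 0.2, 30.0   # units, patterns, coding level, connection range

# Distance on the ring, and distance-dependent, independently drawn
# (hence generally asymmetric) connectivity.
idx = np.arange(N)
d = np.abs(idx[:, None] - idx[None, :])
d = np.minimum(d, N - d)
C = (rng.random((N, N)) < np.exp(-d / sigma)).astype(float)
np.fill_diagonal(C, 0.0)

# Sparse binary patterns and a covariance learning rule on existing connections.
patterns = (rng.random((P, N)) < sparsity).astype(float)
W = C * ((patterns - sparsity).T @ (patterns - sparsity)) / (N * sparsity * (1 - sparsity))

def retrieve(cue, steps=50, gain=1.0, threshold=0.0):
    """Iterate threshold-linear (ReLU) dynamics from a partial cue."""
    r = cue.copy()
    for _ in range(steps):
        r = np.maximum(gain * (W @ r) - threshold, 0.0)
        r /= max(r.mean() / sparsity, 1e-9)     # crude regulation of mean activity
    return r

cue = patterns[0] * (rng.random(N) < 0.5)       # half of one pattern as a cue
r = retrieve(cue)
overlap = r @ patterns[0] / (np.linalg.norm(r) * np.linalg.norm(patterns[0]) + 1e-9)
print("cosine overlap of retrieved activity with stored pattern: %.3f" % overlap)
```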

    High Performance Associative Memories and Structured Weight Dilution

    Copyright Springer. The consequences of two techniques for symmetrically diluting the weights of the standard Hopfield-architecture associative memory model, trained using a non-Hebbian learning rule, are examined. This paper reports experimental investigations into the effect of dilution on factors such as pattern stability and attractor performance. It is concluded that these networks maintain a reasonable level of performance at fairly high dilution rates.
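
    The sketch below illustrates the general idea under stated assumptions: a perceptron-style (non-Hebbian) rule stands in for the paper's learning rule, and two simple symmetric dilution schemes (random pruning versus pruning the weakest weights) stand in for its structured dilution techniques.

```python
# Minimal sketch (an illustrative stand-in, not the paper's exact procedure):
# train a Hopfield-type memory with a perceptron-style, non-Hebbian rule, then
# symmetrically dilute the weights either at random or by pruning the
# smallest-magnitude weights, and check how well the stored patterns stay stable.
import numpy as np

rng = np.random.default_rng(2)
N, P = 200, 30
patterns = rng.choice([-1.0, 1.0], size=(P, N))

# Perceptron-style training of each row of W (local fields must agree with the patterns).
W = np.zeros((N, N))
for _ in range(100):                      # training sweeps
    for p in patterns:
        h = W @ p
        wrong = (h * p) <= 0              # units whose field disagrees with the pattern
        W[wrong] += np.outer(p[wrong], p) / N
np.fill_diagonal(W, 0.0)
W = (W + W.T) / 2                         # simplification: enforce symmetry after training

def dilute(W, frac, structured):
    """Zero out a fraction of the weights, symmetrically."""
    Wd = W.copy()
    iu = np.triu_indices_from(Wd, k=1)
    n_cut = int(frac * len(iu[0]))
    if structured:                        # prune the weakest weights first
        order = np.argsort(np.abs(Wd[iu]))[:n_cut]
    else:                                 # prune uniformly at random
        order = rng.choice(len(iu[0]), size=n_cut, replace=False)
    rows, cols = iu[0][order], iu[1][order]
    Wd[rows, cols] = Wd[cols, rows] = 0.0
    return Wd

def stable_bits(W):
    """Mean fraction of bits whose local field agrees with the stored pattern."""
    return np.mean([(np.sign(W @ p) == p).mean() for p in patterns])

for frac in (0.3, 0.6):
    print("dilution %.1f  random: %.3f  structured: %.3f"
          % (frac,
             stable_bits(dilute(W, frac, structured=False)),
             stable_bits(dilute(W, frac, structured=True))))
```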

    Associative memory on a small-world neural network

    We study a model of associative memory based on a neural network with small-world structure. The efficacy of the network in retrieving one of the stored patterns exhibits a phase transition at a finite value of the disorder. The more ordered networks are unable to recover the patterns and are always attracted to mixture states. Moreover, for a range of numbers of stored patterns, the efficacy has a maximum at an intermediate value of the disorder. We also give a statistical characterization of the attractors for all values of the disorder of the network. Comment: 5 pages, 4 figures (eps)
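
    A minimal sketch of this kind of experiment, assuming a Watts-Strogatz graph (built with networkx) whose rewiring probability plays the role of the disorder parameter, Hebbian couplings on the existing edges, and illustrative sizes.

```python
# Minimal sketch (assuming the Watts-Strogatz rewiring probability as the
# "disorder" parameter; sizes and pattern counts are illustrative): a Hebbian
# associative memory whose couplings live on a small-world graph, with retrieval
# overlap measured as the rewiring probability is varied.
import numpy as np
import networkx as nx

rng = np.random.default_rng(3)
N, K, P = 300, 10, 5                       # neurons, neighbours per node, stored patterns
patterns = rng.choice([-1.0, 1.0], size=(P, N))

def overlap_after_recall(rewire_p, flips=30, steps=10):
    G = nx.watts_strogatz_graph(N, K, rewire_p, seed=0)
    A = nx.to_numpy_array(G)
    W = A * (patterns.T @ patterns) / N    # Hebbian weights on existing edges only
    s = patterns[0].copy()
    s[rng.choice(N, size=flips, replace=False)] *= -1
    for _ in range(steps):
        for i in rng.permutation(N):       # asynchronous threshold updates
            s[i] = 1.0 if W[i] @ s >= 0 else -1.0
    return s @ patterns[0] / N

for p in (0.0, 0.1, 0.3, 1.0):
    print("rewiring p=%.1f  overlap=%.3f" % (p, overlap_after_recall(p)))
```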

    Connection Strategies in Associative Memory Models

    “The original publication is available at www.springerlink.com”. Copyright Springer. The problem we address in this paper is that of finding effective and parsimonious patterns of connectivity in sparse associative memories. This problem must be addressed in real neuronal systems, so results in artificial systems could throw light on real ones. We show that there are efficient patterns of connectivity and that these patterns are effective in models with either spiking or non-spiking neurons. This suggests that there may be some underlying general principles governing good connectivity in such networks. Peer reviewed
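
    The following sketch contrasts three illustrative wiring strategies (purely local, uniformly random, and mostly local with a few long-range shortcuts) at an identical connection budget; these strategies and the stability measure are assumptions for illustration, not the paper's.

```python
# Minimal sketch (the three wiring strategies below are illustrative, not the
# paper's): compare sparse connection patterns that all use the same number of
# incoming connections per unit, and measure how stable the stored patterns remain.
import numpy as np

rng = np.random.default_rng(4)
N, K, P = 400, 40, 4                       # units, incoming connections per unit, patterns
patterns = rng.choice([-1.0, 1.0], size=(P, N))

def adjacency(strategy):
    """Build a sparse 0/1 connection matrix with K inputs per unit."""
    A = np.zeros((N, N))
    for i in range(N):
        if strategy == "local":            # nearest neighbours on a ring
            targets = [(i + o) % N for o in range(-K // 2, K // 2 + 1) if o != 0][:K]
        elif strategy == "random":         # uniformly random partners
            targets = rng.choice([j for j in range(N) if j != i], size=K, replace=False)
        else:                              # "mixed": mostly local plus a few long-range shortcuts
            local = [(i + o) % N for o in range(-K // 2, K // 2 + 1) if o != 0][:K - 4]
            far = rng.choice([j for j in range(N) if j != i and j not in local],
                             size=4, replace=False)
            targets = list(local) + list(far)
        A[i, targets] = 1.0
    return A

def bit_stability(A):
    """Mean fraction of bits whose local field agrees with the stored pattern."""
    W = A * (patterns.T @ patterns) / N    # Hebbian weights on the chosen connections
    return np.mean([(np.sign(W @ p) == p).mean() for p in patterns])

for strategy in ("local", "random", "mixed"):
    print(strategy, "fraction of stable bits: %.3f" % bit_stability(adjacency(strategy)))
```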

    Extracting Spooky-activation-at-a-distance from Considerations of Entanglement

    Following an early claim by Nelson and McEvoy (2007) suggesting that word associations can display 'spooky action at a distance' behaviour, a serious investigation of the potentially quantum nature of such associations is currently underway. This paper presents a simple quantum model of a word association system. It is shown that a quantum model of word entanglement can recover aspects of both the Spreading Activation equation and the Spooky-activation-at-a-distance equation, both of which are used to model the activation level of words in human memory. Comment: 13 pages, 2 figures; to appear in Proceedings of the Third Quantum Interaction Symposium, Lecture Notes in Artificial Intelligence, vol. 5494, Springer, 200
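
    As a toy illustration only (not the paper's model), the sketch below contrasts the joint "all associates active" probability under a separable product state with that under a GHZ-like entangled state of word qubits; the marginal activation probabilities are made up for the example.

```python
# Toy sketch (not the paper's model): contrast the joint "all associates active"
# probability for a separable product state versus a GHZ-like entangled state of
# word-association qubits.  Individual activation probabilities are illustrative.
import numpy as np

probs = np.array([0.6, 0.5, 0.4])          # marginal activation of three associates

# Product state: each associate is an independent qubit sqrt(1-p)|0> + sqrt(p)|1>.
single = [np.array([np.sqrt(1 - p), np.sqrt(p)]) for p in probs]
product_state = single[0]
for q in single[1:]:
    product_state = np.kron(product_state, q)
p_all_product = product_state[-1] ** 2      # amplitude of |111>, squared

# Entangled state: alpha|000> + beta|111>, with one shared activation amplitude
# (an assumption made for this toy example).
beta = np.sqrt(probs.mean())
entangled = np.zeros(2 ** len(probs))
entangled[0], entangled[-1] = np.sqrt(1 - beta ** 2), beta
p_all_entangled = entangled[-1] ** 2

print("P(all active), product state:   %.3f" % p_all_product)   # product of the marginals
print("P(all active), entangled state: %.3f" % p_all_entangled) # equals a single marginal
```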