    On the Storage Capacity of the Hopfield Model

    We review the rigorous results concerning the storage capacity of the Hopfield model. We distinguish between two different concepts of storage, both guided by the idea that the retrieval dynamics is a Monte-Carlo dynamics (possibly at zero temperature). We recall the results of McEliece et al. [MPRV87] as well as those of Newman [N88] for the storage capacity of the Hopfield model with unbiased i.i.d. patterns, and summarize some recent developments concerning the Hopfield model with semantically correlated or biased patterns.
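
    The setup the review analyzes can be made concrete with a short sketch (ours, not the paper's): Hebbian storage of unbiased i.i.d. ±1 patterns and zero-temperature Monte-Carlo (asynchronous) retrieval dynamics. All names and parameter choices below are illustrative.

    import numpy as np

    rng = np.random.default_rng(0)
    N, M = 200, 10                      # N neurons, M unbiased i.i.d. patterns
    patterns = rng.choice([-1, 1], size=(M, N))

    # Hebbian couplings J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, no self-coupling
    J = patterns.T @ patterns / N
    np.fill_diagonal(J, 0.0)

    def retrieve(state, sweeps=20):
        # zero-temperature dynamics: align each spin with its local field
        s = state.copy()
        for _ in range(sweeps):
            for i in rng.permutation(N):
                s[i] = 1 if J[i] @ s >= 0 else -1
        return s

    # corrupt a stored pattern, then check that the dynamics restores it
    noisy = patterns[0] * rng.choice([1, -1], size=N, p=[0.9, 0.1])
    print("overlap after retrieval:", retrieve(noisy) @ patterns[0] / N)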

    Storage of Natural Language Sentences in a Hopfield Network

    This paper looks at how the Hopfield neural network can be used to store and recall patterns constructed from natural language sentences. As a pattern recognition and storage tool, the Hopfield neural network has received much attention; this attention, however, has been mainly in the field of statistical physics, due to the model's simple abstraction of spin-glass systems. We discuss the differences, expressed as bias and correlation, between natural language sentence patterns and the randomly generated ones used in previous experiments. Results are given for numerical simulations which show the auto-associative competence of the network when trained with natural language patterns.
    Comment: latex, 10 pages with 2 tex figures and a .bib file, uses nemlap.sty, to appear in Proceedings of NeMLaP-
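
    The bias and correlation statistics discussed above can be sketched as follows (the encoding is our illustrative choice; the paper's scheme is not reproduced here): pack sentences into ±1 patterns and compare their statistics with those expected of unbiased random patterns.

    import numpy as np

    def encode(sentence, N=256):
        # illustrative encoding (not the paper's): pack the sentence's
        # ASCII bits into a +/-1 pattern of fixed length N
        bits = np.unpackbits(np.frombuffer(sentence.encode("ascii"), dtype=np.uint8))
        bits = np.resize(bits, N)      # truncate or repeat to length N
        return 2 * bits.astype(int) - 1

    sentences = ["the cat sat on the mat", "the dog sat on the log"]
    pats = np.array([encode(s) for s in sentences])

    bias = pats.mean()                            # ~0 for unbiased random patterns
    overlap = pats[0] @ pats[1] / pats.shape[1]   # ~0 for independent patterns
    print(f"bias = {bias:.3f}, pairwise overlap = {overlap:.3f}")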

    The capacity and attractor basins of associative memory models

    The performance characteristics of five variants of the Hopfield network are examined. Two performance metrics are used: memory capacity, and a measure of the size of the basins of attraction. We find that the post-training adjustment of processor thresholds has, at best, little or no effect on performance, and at worst a significant negative effect. The adoption of a local learning rule can, however, give rise to significant performance gains.
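
    A rough version of the second metric can be sketched like this (our construction, not the paper's exact protocol): flip ever more bits of a stored pattern and record the largest perturbation from which the network still settles back to it.

    import numpy as np

    rng = np.random.default_rng(1)
    N, M = 100, 5
    pats = rng.choice([-1, 1], size=(M, N))
    J = pats.T @ pats / N              # Hebbian couplings
    np.fill_diagonal(J, 0.0)

    def settle(s, sweeps=30):
        # zero-temperature asynchronous updates until (approximate) fixation
        s = s.copy()
        for _ in range(sweeps):
            for i in rng.permutation(N):
                s[i] = 1 if J[i] @ s >= 0 else -1
        return s

    def basin_radius(p, trials=20):
        # largest number of flipped bits from which p is still recovered
        # in every trial -- one rough proxy for basin size
        for k in range(1, N):
            for _ in range(trials):
                probe = p.copy()
                probe[rng.choice(N, size=k, replace=False)] *= -1
                if not np.array_equal(settle(probe), p):
                    return k - 1
        return N - 1

    print("basin radius of pattern 0:", basin_radius(pats[0]))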

    A Comparative Study of Sparse Associative Memories

    We study various models of associative memories with sparse information, i.e. a pattern to be stored is a random string of 0s and 1s with only about log N 1s. We compare different synaptic weights, architectures, and retrieval mechanisms to shed light on the influence of the various parameters on the storage capacity.
    Comment: 28 pages, 2 figures
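
    One classic architecture for such sparse patterns is the Willshaw model with clipped binary synapses; the sketch below (our illustrative parameter choices, not necessarily one of the paper's variants) stores 0/1 patterns with about log N active units and retrieves them by rank thresholding.

    import numpy as np

    rng = np.random.default_rng(2)
    N = 1000
    k = int(np.log(N))                 # about log N active units per pattern
    M = 50

    # sparse 0/1 patterns with exactly k ones each
    pats = np.zeros((M, N), dtype=np.uint8)
    for mu in range(M):
        pats[mu, rng.choice(N, size=k, replace=False)] = 1

    # Willshaw-style binary synapses: W_ij = 1 if units i and j were ever co-active
    W = (pats.T @ pats > 0).astype(np.uint8)
    np.fill_diagonal(W, 0)

    def recall(cue):
        # one-step retrieval: keep the k units with the largest dendritic sums
        h = W @ cue.astype(int)
        out = np.zeros(N, dtype=np.uint8)
        out[np.argsort(h)[-k:]] = 1
        return out

    # cue with half the active units of pattern 0 switched off
    cue = pats[0].copy()
    on = np.flatnonzero(cue)
    cue[on[: len(on) // 2]] = 0
    print("correctly recalled active units:", (recall(cue) & pats[0]).sum(), "of", k)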

    Mapping correlated Gaussian patterns in a perceptron

    The authors study the performance of a single-layer perceptron in realising a binary mapping of Gaussian input patterns. By introducing non-trivial correlations among the patterns, they generate a family of mappings, including easier ones where similar inputs are mapped into the same output, and more difficult ones where similar inputs are mapped into different classes. The difficulty of the problem is gauged by the storage capacity of the network, which is higher for the easier problems.
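
    The flavour of the construction can be sketched as follows (our simplified version: patterns correlated through a shared Gaussian prototype, with the "easy" mapping defined by that prototype; the paper's exact correlation structure is not reproduced here).

    import numpy as np

    rng = np.random.default_rng(3)
    N, P, c = 50, 100, 0.4
    proto = rng.standard_normal(N)

    # correlated Gaussian patterns: shared prototype plus independent noise,
    # with the degree of correlation controlled by c
    X = c * proto + np.sqrt(1 - c**2) * rng.standard_normal((P, N))

    # an "easy" mapping: similar inputs (same prototype component) share a label
    y = np.sign(X @ proto)

    # perceptron learning rule: update on every misclassified pattern
    w = np.zeros(N)
    for epoch in range(200):
        errors = 0
        for x, t in zip(X, y):
            if np.sign(w @ x) != t:
                w += t * x
                errors += 1
        if errors == 0:
            break
    print(f"training errors in final epoch: {errors}")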