
    Storage of Natural Language Sentences in a Hopfield Network

    This paper looks at how the Hopfield neural network can be used to store and recall patterns constructed from natural language sentences. As a pattern recognition and storage tool, the Hopfield neural network has received much attention; this attention, however, has been mainly in the field of statistical physics, owing to the model's simple abstraction of spin-glass systems. The differences between natural language sentence patterns and the randomly generated ones used in previous experiments are discussed in terms of bias and correlation. Results are given for numerical simulations which show the auto-associative competence of the network when trained with natural language patterns.
    Comment: LaTeX, 10 pages with 2 TeX figures and a .bib file, uses nemlap.sty, to appear in Proceedings of NeMLaP-
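    The storage and recall procedure the abstract refers to can be sketched with the classical Hebbian outer-product rule (a generic illustration under standard Hopfield assumptions, not the paper's exact setup; the patterns below are made up):

    ```python
    import numpy as np

    def store(patterns):
        # Hebbian outer-product rule: W = (1/N) * sum_mu x^mu (x^mu)^T,
        # with the diagonal zeroed (no self-connections).
        P = np.array(patterns, dtype=float)
        n = P.shape[1]
        W = P.T @ P / n
        np.fill_diagonal(W, 0.0)
        return W

    def recall(W, probe, steps=10):
        # Synchronous sign updates until (hopefully) a stored fixed point.
        s = np.array(probe, dtype=float)
        for _ in range(steps):
            s = np.sign(W @ s)
            s[s == 0] = 1.0  # break ties consistently
        return s

    # Store two orthogonal +/-1 patterns, then recall from a corrupted probe.
    p1 = [1, -1, 1, -1, 1, -1, 1, -1]
    p2 = [1, 1, -1, -1, 1, 1, -1, -1]
    W = store([p1, p2])
    noisy = list(p1)
    noisy[0] = -1          # flip one bit
    out = recall(W, noisy)  # converges back to p1
    ```

    Encoding sentences as such +/-1 vectors introduces the bias and correlation the paper discusses, which is what degrades capacity relative to random patterns.
    
    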

    Sequence Learning using Equilibrium Propagation

    Equilibrium Propagation (EP) is a powerful and more bio-plausible alternative to conventional learning frameworks such as backpropagation. The effectiveness of EP stems from the fact that it relies only on local computations and requires solely one kind of computational unit during both of its training phases, thereby enabling greater applicability in domains such as bio-inspired neuromorphic computing. The dynamics of the model in EP are governed by an energy function, and the internal states of the model consequently converge to a steady state following the state transition rules defined by that function. However, by definition, EP requires the input to the model (a convergent RNN) to be static in both phases of training. Thus, it is not possible to design a model for sequence classification using EP with an LSTM- or GRU-like architecture. In this paper, we leverage recent developments in modern Hopfield networks to further understand energy-based models and develop solutions for complex sequence classification tasks using EP, while satisfying its convergence criteria and maintaining its theoretical similarities with recurrent backpropagation. We explore the possibility of integrating modern Hopfield networks as an attention mechanism with the convergent RNN models used in EP, thereby extending its applicability for the first time to two different sequence classification tasks in natural language processing, viz. sentiment analysis (IMDB dataset) and natural language inference (SNLI dataset).
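    The modern-Hopfield-as-attention connection mentioned here comes from the retrieval update of continuous modern Hopfield networks, which is exactly a softmax attention read-out. A minimal sketch (illustrative values; `beta` and the stored patterns are assumptions, not the paper's configuration):

    ```python
    import numpy as np

    def softmax(v):
        e = np.exp(v - v.max())  # shift for numerical stability
        return e / e.sum()

    def modern_hopfield_retrieve(X, query, beta=4.0, steps=3):
        # X: (d, N) matrix whose columns are stored continuous patterns.
        # One update is xi <- X softmax(beta * X^T xi), i.e. attention over
        # the stored patterns with the current state as the query.
        xi = np.array(query, dtype=float)
        for _ in range(steps):
            xi = X @ softmax(beta * (X.T @ xi))
        return xi

    # Two stored patterns (columns); a query near the first retrieves it.
    X = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [0.0, 0.0]])
    xi = modern_hopfield_retrieve(X, [0.9, 0.1, 0.0])
    ```

    Because this update is itself the fixed-point iteration of an energy function, it composes naturally with EP's requirement that the network relax to a steady state.
    
    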

    NASA JSC neural network survey results

    A survey of artificial neural systems was conducted in support of NASA Johnson Space Center's Automatic Perception for Mission Planning and Flight Control research program. Several of the world's leading researchers contributed papers containing their most recent results on artificial neural systems. These papers were grouped into categories, and descriptive accounts of the results make up a large part of this report. Also included is material on sources of information on artificial neural systems, such as books, technical reports, and software tools.

    A Peer-to-Peer Associative Memory Network for Intelligent Information Systems

    The paper describes a highly scalable associative memory network capable of handling multiple streams of input, which are processed and matched against the historical data available within the network. The essence of the associative memory algorithm lies within its highly parallel structure, which shifts the emphasis from high-speed CPU-based processing to network processing, capable of utilising a large number of low-performance processors in a fully connected configuration. The approach is expected to facilitate the development of information systems capable of correlating multi-dimensional data inputs into constructs resembling human thought, and thus exhibiting a level of self-awareness.
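    The matching step such a network might perform can be sketched as follows: each peer holds a shard of historical patterns, an incoming input is sent to every peer, and the network combines the peers' independent best matches. The shard structure and similarity measure here are illustrative assumptions, not details from the paper:

    ```python
    import numpy as np

    def best_local_match(shard, query):
        # Each peer scores the query against every pattern it stores
        # (cosine similarity) and returns its best local candidate.
        sims = shard @ query / (np.linalg.norm(shard, axis=1) * np.linalg.norm(query))
        i = int(np.argmax(sims))
        return sims[i], shard[i]

    def network_match(shards, query):
        # "Network processing": take the best answer across all peers.
        return max((best_local_match(s, query) for s in shards),
                   key=lambda t: t[0])[1]

    # Two peers, each holding a shard of stored patterns.
    shard1 = np.array([[1.0, 0.0, 0.0],
                       [0.0, 1.0, 0.0]])
    shard2 = np.array([[0.0, 0.0, 1.0]])
    match = network_match([shard1, shard2], np.array([0.9, 0.1, 0.0]))
    ```

    The per-peer work is embarrassingly parallel, which is the property the paper's fully connected, many-low-performance-processor configuration exploits.
    
    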