3 research outputs found

    Hebbian learning in recurrent neural networks for natural language processing

    This research project examines Hebbian learning in recurrent neural networks for natural language processing and attempts to interpret language at the level of a two-and-a-half-year-old child. Five neural networks were built to interpret natural language: a Simple Recurrent Network with Hebbian learning, a Jordan network with Hebbian learning and one hidden layer, a Jordan network with Hebbian learning and no hidden layers, a Simple Recurrent Network with backpropagation learning, and a non-recurrent neural network with backpropagation learning. It is known that Hebbian learning works well when the input vectors are orthogonal, but, as this project shows, it does not perform well in recurrent neural networks for natural language processing even when the input vectors for the individual words are approximately orthogonal. This project shows that, given approximately orthogonal vectors to represent each word in the vocabulary, the input vectors for a given command are not approximately orthogonal, and the internal representations that the neural network builds are similar for different commands. As the data show, the Hebbian learning neural networks were unable to perform the natural language interpretation task, while the backpropagation neural networks were much more successful. Therefore, Hebbian learning does not work well in recurrent neural networks for natural language processing even when the input vectors for the individual words are approximately orthogonal.
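    A minimal sketch, not taken from the project itself, of the point this abstract makes: a plain Hebbian outer-product update, plus a toy demonstration that word vectors which are approximately orthogonal individually stop being orthogonal once they are combined into command representations, so the Hebbian associations for different commands interfere. The vector dimensionality, the summed command representation, the example commands, and the learning rate are all illustrative assumptions (Python/NumPy).

```python
# Toy illustration only: none of these names or parameters come from the project.
import numpy as np

rng = np.random.default_rng(0)
dim, vocab_size = 64, 20

# Random high-dimensional unit vectors are approximately orthogonal to each other.
word_vectors = rng.normal(size=(vocab_size, dim))
word_vectors /= np.linalg.norm(word_vectors, axis=1, keepdims=True)

def hebbian_update(weights, pre, post, lr=0.1):
    """Classic Hebbian rule: delta W = lr * outer(post, pre)."""
    return weights + lr * np.outer(post, pre)

def command_vector(word_ids):
    """A crude command representation: the normalised sum of its word vectors."""
    v = word_vectors[word_ids].sum(axis=0)
    return v / np.linalg.norm(v)

cmd_a = command_vector([0, 1, 2])   # hypothetical command A
cmd_b = command_vector([0, 1, 3])   # hypothetical command B, sharing two words with A

# Individual word vectors are close to orthogonal ...
print("word 2 . word 3 =", round(float(word_vectors[2] @ word_vectors[3]), 3))
# ... but the command vectors are not, because the commands share words.
print("cmd A . cmd B   =", round(float(cmd_a @ cmd_b), 3))

# Store a single Hebbian association for command A, then probe with command B:
# the response to B leaks A's target in proportion to the overlap between commands.
weights = np.zeros((dim, dim))
target_a = word_vectors[4]                      # arbitrary target pattern for A
weights = hebbian_update(weights, pre=cmd_a, post=target_a)
print("response to B along A's target =", round(float((weights @ cmd_b) @ target_a), 3))
```

    The leakage term is proportional to the cosine between the two command vectors, which is the interference the abstract attributes to non-orthogonal command representations.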

    Exploring the adaptive structure of the mental lexicon

    The mental lexicon is a complex structure organised in terms of phonology, semantics and syntax, among other levels. In this thesis I propose that this structure can be explained in terms of the pressures acting on it: every aspect of the organisation of the lexicon is an adaptation ultimately related to the function of language as a tool for human communication, or to the fact that language has to be learned by subsequent generations of people. A collection of methods, most of which are applied to a Spanish speech corpus, reveals structure at different levels of the lexicon.
    • The patterns of intra-word distribution of phonological information may be a consequence of pressures for optimal representation of the lexicon in the brain, and of the pressure to facilitate speech segmentation.
    • An analysis of perceived phonological similarity between words shows that the sharing of different aspects of phonological similarity is related to different functions. Phonological similarity perception sometimes relates to morphology (the stressed final vowel determines verb tense and person) and at other times shows processing biases (similarity in the word-initial and word-final segments is more readily perceived than in word-internal segments).
    • Another similarity analysis focuses on cooccurrence in speech to create a representation of the lexicon where the position of a word is determined by the words that tend to occur in its close vicinity. Variations of this context-based lexical space naturally categorise words syntactically and semantically.
    • A higher level of lexicon structure is revealed by examining the relationships between the phonological and the cooccurrence similarity spaces. A study in Spanish supports the universality of the small but significant correlation between these two spaces found in English by Shillcock, Kirby, McDonald and Brew (2001). This systematicity across levels of representation adds an extra layer of structure that may help lexical acquisition and recognition. I apply it to a new paradigm to determine the function of parameters of phonological similarity based on their relationships with the syntactic-semantic level. I find that while some aspects of a language's phonology maintain systematicity, others work against it, perhaps responding to the opposed pressure for word identification.
    This thesis is an exploratory approach to the study of the mental lexicon structure that uses existing and new methodology to deepen our understanding of the relationships between language use and language structure.
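    A minimal sketch, not taken from the thesis, of the kind of cross-space comparison described above: build a small cooccurrence-based similarity space from a toy Spanish word sequence, build a crude phonological similarity space from string similarity, and correlate the two over all word pairs. The toy corpus, the window size, the string-similarity stand-in for perceived phonological similarity, and the plain Pearson correlation (the real analysis would call for something like a permutation or Mantel-style test over similarity matrices) are all illustrative assumptions (Python/NumPy).

```python
# Toy illustration only: corpus, window size and similarity measures are invented here.
import itertools
from difflib import SequenceMatcher

import numpy as np

corpus = "el gato come pescado el perro come carne el gato duerme".split()
words = sorted(set(corpus))
index = {w: i for i, w in enumerate(words)}

# Cooccurrence space: count neighbours within a +/-2 word window.
window = 2
cooc = np.zeros((len(words), len(words)))
for i, w in enumerate(corpus):
    for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
        if i != j:
            cooc[index[w], index[corpus[j]]] += 1

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

def phonological_similarity(a, b):
    # Crude orthographic stand-in for perceived phonological similarity.
    return SequenceMatcher(None, a, b).ratio()

pairs = list(itertools.combinations(range(len(words)), 2))
cooc_sims = np.array([cosine(cooc[i], cooc[j]) for i, j in pairs])
phon_sims = np.array([phonological_similarity(words[i], words[j]) for i, j in pairs])

# Correlation between the two similarity spaces over all word pairs.
r = np.corrcoef(cooc_sims, phon_sims)[0, 1]
print(f"phonology / cooccurrence correlation: {r:.3f}")
```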