    Recursive Neural Networks for Semantic Sentence Representation

    Semantic representation has a rich history spanning both linguistic theory and computational modeling. Though this history stretches back almost 50 years (Salton, 1971), the field has recently undergone an unexpected paradigm shift thanks to the work of Mikolov et al. (2013a,b), which showed that vector-space semantic models can capture large amounts of semantic information. To date, these semantic representations are computed at the word level; finding a semantic representation of a phrase is a much harder challenge. Mikolov et al. (2013a,b) showed that their word vectors can be composed arithmetically to achieve reasonable representations of phrases, but this ignores syntactic information: because the arithmetic composition functions (addition, element-wise multiplication, etc.) are commutative, the representations of the phrases "man bites dog" and "dog bites man" come out identical. This work introduces a way of computing word-level semantic representations alongside a parse-tree-based approach to composing those word vectors, yielding a joint word-phrase semantic vector space. All associated code for this thesis was written in Python and can be found at https://github.com/liamge/Pytorch_ReNN
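
    To make the commutativity point concrete, here is a minimal Python sketch. The three-dimensional vectors and the compose_additive helper are invented for illustration (real word vectors have hundreds of dimensions); the point is only that a sum-based phrase representation cannot distinguish word orders:

    import numpy as np

    # Hypothetical toy word vectors (illustration only; not from any trained model).
    vectors = {
        "man":   np.array([0.2, 0.5, 0.1]),
        "bites": np.array([0.7, 0.1, 0.3]),
        "dog":   np.array([0.4, 0.6, 0.2]),
    }

    def compose_additive(words):
        """Compose a phrase vector by summing its word vectors."""
        return sum(vectors[w] for w in words)

    # Addition is commutative, so reordering the words changes nothing:
    a = compose_additive(["man", "bites", "dog"])
    b = compose_additive(["dog", "bites", "man"])
    print(np.allclose(a, b))  # True: "man bites dog" == "dog bites man"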
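    By contrast, a parse-tree-based recursive composition is sensitive to structure. The sketch below is an assumption-laden illustration of the general technique, not the thesis's actual implementation (see the linked repository for that): the ReNNComposer class, the tanh composition rule parent = tanh(W[left; right] + b), and the nested-tuple tree encoding are all choices made here for brevity.

    import torch
    import torch.nn as nn

    class ReNNComposer(nn.Module):
        """Minimal recursive composition over a binarized parse tree."""
        def __init__(self, dim):
            super().__init__()
            # One shared composition layer: parent = tanh(W [left; right] + b)
            self.compose = nn.Linear(2 * dim, dim)

        def forward(self, node, embeddings):
            # A node is either a word (str) or a (left, right) pair.
            if isinstance(node, str):
                return embeddings[node]
            left = self.forward(node[0], embeddings)
            right = self.forward(node[1], embeddings)
            return torch.tanh(self.compose(torch.cat([left, right], dim=-1)))

    dim = 4
    emb = {w: torch.randn(dim) for w in ["man", "bites", "dog"]}
    model = ReNNComposer(dim)
    # Different trees yield different vectors, so word order is no longer lost:
    v1 = model(("man", ("bites", "dog")), emb)
    v2 = model(("dog", ("bites", "man")), emb)
    print(torch.allclose(v1, v2))  # False (with overwhelming probability)

    Because word vectors and phrase vectors here live in the same dimensional space, composed phrases can be compared directly against single words, which is what makes a joint word-phrase semantic vector space possible.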