Enhancing Semantic Word Representations by Embedding Deeper Word Relationships
Word representations are created from analogy, context-based statistics, and
lexical relations between words, and serve as inputs to the learning models in
Natural Language Understanding (NLU) tasks. However, context alone is not
sufficient to understand language; reading between the lines is a key component
of NLU. Embedding deeper word relationships that are not represented in the
context enhances the word representation. This paper presents a word embedding
that combines analogy and context-based statistics from Word2Vec with deeper
word relationships from ConceptNet to create an expanded word representation.
A Self-Organizing Map is then used to fine-tune and optimize the
representation. The proposed word representation is compared with semantic
word representations on SimLex-999. Furthermore, the
use of 3D visual representations has been shown to be capable of representing
the similarity and association between words. The proposed word representation
achieves a Spearman correlation score of 0.886, provides the best results
compared to the current state-of-the-art methods, and exceeds the human
performance of 0.78.

Comment: Accepted for the International Conference on Computer and Automation
Engineering (ICCAE) 201