Learning text representation using recurrent convolutional neural network with highway layers
Recently, the rapid development of word embeddings and neural networks has
brought new inspiration to various NLP and IR tasks. In this paper, we describe
a staged hybrid model combining Recurrent Convolutional Neural Networks (RCNN)
with highway layers. The highway network module, placed in the middle stage,
takes the output of the bi-directional Recurrent Neural Network (Bi-RNN) module
in the first stage and provides the input to the Convolutional Neural Network
(CNN) module in the last stage. Experiments show that our model
outperforms common neural network models (CNN, RNN, Bi-RNN) on a sentiment
analysis task. In addition, an analysis of how sequence length influences the
RCNN with highway layers shows that our model can learn good representations of
long text.
Comment: Neu-IR '16 SIGIR Workshop on Neural Information Retrieval
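
As a rough illustration of the staged design described in the abstract, the
sketch below wires a Bi-LSTM first stage into highway layers and then a 1-D
convolutional stage with a classifier on top. All layer sizes (embed_dim,
hidden_dim, n_filters, kernel_size, number of highway layers) and the choice
of LSTM cells are illustrative assumptions, not the configuration reported in
the paper.

# Minimal PyTorch sketch of the staged architecture:
# Bi-RNN (first stage) -> highway layers (middle stage) -> CNN (last stage).
import torch
import torch.nn as nn
import torch.nn.functional as F


class Highway(nn.Module):
    """y = t * H(x) + (1 - t) * x, with H a ReLU layer and t a sigmoid gate."""

    def __init__(self, dim):
        super().__init__()
        self.transform = nn.Linear(dim, dim)
        self.gate = nn.Linear(dim, dim)

    def forward(self, x):
        h = F.relu(self.transform(x))
        t = torch.sigmoid(self.gate(x))
        return t * h + (1.0 - t) * x


class RCNNHighway(nn.Module):
    # Hyperparameter defaults below are assumed for illustration only.
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128,
                 n_filters=100, kernel_size=3, n_classes=2, n_highway=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Stage 1: bi-directional RNN over word embeddings.
        self.birnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                             bidirectional=True)
        # Stage 2: highway layers applied to each timestep's Bi-RNN output.
        self.highways = nn.ModuleList(
            [Highway(2 * hidden_dim) for _ in range(n_highway)])
        # Stage 3: 1-D convolution + max-pooling over time, then a classifier.
        self.conv = nn.Conv1d(2 * hidden_dim, n_filters, kernel_size)
        self.fc = nn.Linear(n_filters, n_classes)

    def forward(self, tokens):                      # tokens: (batch, seq_len)
        x = self.embed(tokens)                      # (batch, seq_len, embed_dim)
        x, _ = self.birnn(x)                        # (batch, seq_len, 2*hidden_dim)
        for hw in self.highways:
            x = hw(x)
        x = x.transpose(1, 2)                       # (batch, 2*hidden_dim, seq_len)
        x = F.relu(self.conv(x))                    # (batch, n_filters, seq_len-k+1)
        x = F.max_pool1d(x, x.size(2)).squeeze(2)   # (batch, n_filters)
        return self.fc(x)                           # class logits


# Example: score a batch of two padded token-id sequences of length 40.
model = RCNNHighway(vocab_size=10000)
logits = model(torch.randint(0, 10000, (2, 40)))
print(logits.shape)  # torch.Size([2, 2])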