Distilling Word Embeddings: An Encoding Approach
Distilling knowledge from a well-trained, cumbersome network into a small one has recently become a new research topic, as lightweight neural networks with high performance are particularly needed in various resource-restricted systems. This paper addresses the problem of distilling word embeddings for NLP
tasks. We propose an encoding approach to distill task-specific knowledge from
a set of high-dimensional embeddings, which reduces model complexity by a large margin while retaining high accuracy, striking a good compromise between efficiency and performance. Experiments on two tasks show that distilling knowledge from cumbersome embeddings outperforms directly training neural networks with small embeddings.

Comment: Accepted by CIKM-16 as a short paper, and by the Representation Learning for Natural Language Processing (RL4NLP) Workshop @ACL-16 for presentation.
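The encoding idea can be pictured as a small learned projection on top of frozen, high-dimensional embeddings, trained end-to-end with the downstream task. The sketch below is a minimal PyTorch illustration under that assumption; the class name DistilledEmbedding, the 300-to-50 dimensions, the tanh activation, and the toy data are illustrative choices, not the paper's exact configuration.

import torch
import torch.nn as nn

class DistilledEmbedding(nn.Module):
    """Minimal sketch: frozen cumbersome embeddings compressed by a
    learned linear encoder into small task-specific vectors.
    All dimensions and names here are illustrative assumptions."""

    def __init__(self, big_embeddings: torch.Tensor, small_dim: int = 50):
        super().__init__()
        big_dim = big_embeddings.shape[1]
        # Frozen high-dimensional pretrained embeddings (e.g., 300-d).
        self.big = nn.Embedding.from_pretrained(big_embeddings, freeze=True)
        # Learned encoder that distills them down to small_dim.
        self.encode = nn.Linear(big_dim, small_dim)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        return torch.tanh(self.encode(self.big(token_ids)))

# Usage: compress 300-d vectors for a 10k vocabulary down to 50-d.
pretrained = torch.randn(10_000, 300)        # stand-in for real embeddings
layer = DistilledEmbedding(pretrained, small_dim=50)
small_vecs = layer(torch.tensor([[1, 5, 42]]))
print(small_vecs.shape)                      # torch.Size([1, 3, 50])

Because the large embedding table is frozen and only the small encoder plus the task model are trained, the deployed model needs only the compressed vectors, which is where the complexity reduction comes from.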
Classifying Relations via Long Short Term Memory Networks along Shortest Dependency Path
Relation classification is an important task in the field of natural language processing (NLP). In this paper, we present SDP-LSTM, a novel neural network that classifies the relation between two entities in a sentence. Our architecture leverages the shortest dependency path (SDP) between the two entities; multichannel recurrent neural networks, with long short-term memory (LSTM) units, pick up heterogeneous information along the SDP. Our proposed model has several distinct features: (1) The shortest dependency path retains the most relevant information for relation classification while eliminating irrelevant words in the sentence. (2) The multichannel LSTM networks allow
effective information integration from heterogeneous sources over the
dependency paths. (3) A customized dropout strategy regularizes the neural
network to alleviate overfitting. We test our model on the SemEval-2010 relation classification task and achieve an $F_1$-score of 83.7%, higher than
competing methods in the literature.

Comment: EMNLP '15
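The multichannel idea can be sketched as one LSTM per information channel running along the SDP tokens, with pooled channel states concatenated for classification. The PyTorch sketch below assumes that structure; the channel set (words and POS tags here), hidden size, max-pooling, plain 0.5 dropout, and the SDPLSTM name are simplifying assumptions, not the paper's exact architecture or its customized dropout strategy.

import torch
import torch.nn as nn

class SDPLSTM(nn.Module):
    """Minimal sketch of the multichannel idea: each channel (e.g., words,
    POS tags, dependency relations) runs its own LSTM along the shortest
    dependency path; pooled channel states are concatenated and classified.
    Sizes and pooling here are illustrative simplifications."""

    def __init__(self, vocab_sizes, emb_dims, hidden: int = 100,
                 n_classes: int = 19):  # SemEval-2010 Task 8 has 19 classes
        super().__init__()
        self.embs = nn.ModuleList(
            nn.Embedding(v, d) for v, d in zip(vocab_sizes, emb_dims))
        self.lstms = nn.ModuleList(
            nn.LSTM(d, hidden, batch_first=True) for d in emb_dims)
        self.dropout = nn.Dropout(0.5)  # stand-in for the customized strategy
        self.out = nn.Linear(hidden * len(emb_dims), n_classes)

    def forward(self, channels):
        pooled = []
        for ids, emb, lstm in zip(channels, self.embs, self.lstms):
            h, _ = lstm(emb(ids))               # (batch, path_len, hidden)
            pooled.append(h.max(dim=1).values)  # max-pool along the SDP
        return self.out(self.dropout(torch.cat(pooled, dim=-1)))

# Usage: word and POS channels over a 4-token dependency path.
model = SDPLSTM(vocab_sizes=[10_000, 50], emb_dims=[200, 25])
words = torch.tensor([[3, 17, 256, 9]])
pos = torch.tensor([[4, 12, 4, 7]])
logits = model([words, pos])
print(logits.shape)  # torch.Size([1, 19])

Keeping each channel in its own LSTM lets heterogeneous sources (lexical, syntactic) contribute independently before the pooled states are merged, which is the integration property the abstract describes.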