Contextualized Non-local Neural Networks for Sequence Learning
Recently, a large number of neural mechanisms and models have been proposed
for sequence learning, of which self-attention, as exemplified by the
Transformer model, and graph neural networks (GNNs) have attracted much
attention. In this paper, we propose an approach that combines and draws on the
complementary strengths of these two methods. Specifically, we propose
contextualized non-local neural networks (CN³), which can both
dynamically construct a task-specific structure of a sentence and leverage rich
local dependencies within a particular neighborhood.
Experimental results on ten NLP tasks in text classification, semantic
matching, and sequence labeling show that our proposed model outperforms
competitive baselines and discovers task-specific dependency structures, thus
providing better interpretability to users. Comment: Accepted by AAAI201
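The core idea of the abstract, combining the global, dynamically constructed structure of self-attention with dependencies in a local neighborhood, can be sketched roughly as follows. This is a minimal NumPy illustration under stated assumptions, not the authors' CN architecture; all function and parameter names here are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def nonlocal_block(X):
    """Global (non-local) interaction: every token attends to every other token."""
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)          # (n, n) pairwise affinities
    return softmax(scores) @ X             # globally contextualized features

def local_block(X, window=1):
    """Local dependencies: average each token with its +/- window neighbors."""
    n = len(X)
    out = np.zeros_like(X)
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        out[i] = X[lo:hi].mean(axis=0)
    return out

def contextualized_layer(X, window=1):
    """Toy fusion of the global and local views of the sequence."""
    return nonlocal_block(X) + local_block(X, window)

X = rng.standard_normal((6, 8))            # 6 tokens, 8-dim embeddings
Y = contextualized_layer(X)
print(Y.shape)                             # (6, 8)
```

The sum here is only the simplest possible fusion; the actual model learns how to combine the two signals.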
Ranking and Selecting Multi-Hop Knowledge Paths to Better Predict Human Needs
To make machines better understand sentiments, research needs to move from
polarity identification to understanding the reasons that underlie the
expression of sentiment. Categorizing the goals or needs of humans is one way
to explain the expression of sentiment in text. Humans are good at
understanding situations described in natural language and can easily connect
them to the character's psychological needs using commonsense knowledge. We
present a novel method to extract, rank, filter and select multi-hop relation
paths from a commonsense knowledge resource to interpret the expression of
sentiment in terms of their underlying human needs. We efficiently integrate
the acquired knowledge paths in a neural model that interfaces context
representations with knowledge using a gated attention mechanism. We assess the
model's performance on a recently published dataset for categorizing human
needs. Selectively integrating knowledge paths boosts performance and
establishes a new state-of-the-art. Our model offers interpretability through
the learned attention map over commonsense knowledge paths. Human evaluation
highlights the relevance of the encoded knowledge.
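The gating described in the abstract, interfacing a context representation with knowledge, can be sketched as a learned element-wise interpolation. This is a hypothetical NumPy sketch, not the paper's implementation; `W`, `b`, and the function name are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_knowledge_fusion(context, knowledge, W, b):
    """Gate decides, per dimension, how much knowledge to mix into the context."""
    gate = sigmoid(np.concatenate([context, knowledge]) @ W + b)  # values in (0, 1)
    return gate * context + (1.0 - gate) * knowledge

d = 16
context = rng.standard_normal(d)            # sentence/context representation
knowledge = rng.standard_normal(d)          # aggregated knowledge-path vector
W = rng.standard_normal((2 * d, d)) * 0.1   # toy gate parameters
b = np.zeros(d)

fused = gated_knowledge_fusion(context, knowledge, W, b)
print(fused.shape)                          # (16,)
```

Because the gate is conditioned on both inputs, the model can suppress knowledge paths that are irrelevant to the current context, which is what makes selective integration possible.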
A Novel Distributed Representation of News (DRNews) for Stock Market Predictions
In this study, a novel Distributed Representation of News (DRNews) model is
developed and applied in deep learning-based stock market predictions. With the
merit of integrating contextual information and cross-documental knowledge, the
DRNews model creates news vectors that describe both the semantic information
and potential linkages among news events through an attributed news network.
Two stock market prediction tasks, namely the short-term stock movement
prediction and stock crises early warning, are implemented in the framework of
the attention-based Long Short-Term Memory (LSTM) network. The results suggest
that DRNews substantially improves both tasks compared with five baseline news
embedding models. Further, the attention maps indicate that both short-term
stock trends and stock market crises are influenced by daily news, with the
former responding more strongly to information about the stock market per se,
while the latter attends more to news on the banking sector and economic
policies. Comment: 25 page
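The attention mechanism over daily news that both findings rely on can be sketched as additive attention pooling over a sequence of hidden states. This is a simplified NumPy stand-in (the hidden states would come from the LSTM in the actual framework); `w`, `v`, and the function name are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, w, v):
    """Additive attention over hidden states H of shape (T, d)."""
    scores = np.tanh(H @ w) @ v            # (T,) unnormalized relevance per day
    alpha = softmax(scores)                # attention weights over news days
    return alpha, alpha @ H                # weights and pooled representation

T, d = 10, 32                              # 10 days of news, 32-dim states
H = rng.standard_normal((T, d))            # stand-in for LSTM hidden states
w = rng.standard_normal((d, d)) * 0.1
v = rng.standard_normal(d)

alpha, pooled = attention_pool(H, w, v)
print(alpha.shape, pooled.shape)
```

Inspecting `alpha` is what lets the authors read off which news days, and hence which topics, drive each prediction.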