17 research outputs found
Strong Baselines for Simple Question Answering over Knowledge Graphs with and without Neural Networks
We examine the problem of question answering over knowledge graphs, focusing
on simple questions that can be answered by the lookup of a single fact.
Adopting a straightforward decomposition of the problem into entity detection,
entity linking, relation prediction, and evidence combination, we explore
simple yet strong baselines. On the popular SimpleQuestions dataset, we find
that basic LSTMs and GRUs plus a few heuristics yield accuracies that approach
the state of the art, and techniques that do not use neural networks also
perform reasonably well. These results show that gains from sophisticated deep
learning techniques proposed in the literature are quite modest and that some
previous models exhibit unnecessary complexity.
Comment: Published in NAACL HLT 2018
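The four-stage decomposition described in the abstract can be illustrated with a toy sketch. Everything below (the miniature knowledge graph, the keyword lists, the span-matching and overlap heuristics) is an assumption for illustration, not the paper's actual models; it merely shows how entity detection, entity linking, relation prediction, and evidence combination fit together for single-fact questions, in the spirit of the abstract's non-neural baselines.

```python
# Illustrative sketch of the pipeline: entity detection -> entity linking ->
# relation prediction -> evidence combination. The toy KB and heuristics are
# hypothetical stand-ins for the paper's learned components.

TOY_KB = {
    # (subject, relation) -> object
    ("barack_obama", "place_of_birth"): "honolulu",
    ("barack_obama", "profession"): "politician",
    ("honolulu", "contained_by"): "hawaii",
}

# Hypothetical keyword lists standing in for a learned relation classifier.
RELATION_KEYWORDS = {
    "place_of_birth": {"born", "birthplace"},
    "profession": {"profession", "job", "do"},
    "contained_by": {"located", "where", "in"},
}

def detect_entity(tokens, entity_names):
    """Entity detection + trivial linking: longest span matching a KB name."""
    best = None
    for i in range(len(tokens)):
        for j in range(len(tokens), i, -1):
            span = "_".join(tokens[i:j])
            if span in entity_names and (
                best is None or j - i > len(best.split("_"))
            ):
                best = span
    return best

def predict_relation(tokens, candidates):
    """Relation prediction: keyword overlap in place of an LSTM/GRU scorer."""
    scores = {r: len(RELATION_KEYWORDS.get(r, set()) & set(tokens))
              for r in candidates}
    return max(scores, key=scores.get)

def answer(question):
    """Evidence combination: look up the single fact selected upstream."""
    tokens = question.lower().replace("?", "").split()
    entities = {s for s, _ in TOY_KB}
    subject = detect_entity(tokens, entities)
    candidates = [r for s, r in TOY_KB if s == subject]
    relation = predict_relation(tokens, candidates)
    return TOY_KB[(subject, relation)]
```

For example, `answer("Where was Barack Obama born?")` walks all four stages and returns `"honolulu"` from the toy graph; the real systems replace the span matcher and keyword scorer with BiLSTM/GRU taggers and classifiers.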
Improved Neural Relation Detection for Knowledge Base Question Answering
Relation detection is a core component for many NLP applications including
Knowledge Base Question Answering (KBQA). In this paper, we propose a
hierarchical recurrent neural network enhanced by residual learning that
detects KB relations given an input question. Our method uses deep residual
bidirectional LSTMs to compare questions and relation names via different
hierarchies of abstraction. Additionally, we propose a simple KBQA system that
integrates entity linking with our proposed relation detector, enabling each to
enhance the other. Experimental results show that our approach not only
achieves outstanding relation detection performance but, more importantly,
helps our KBQA system reach state-of-the-art accuracy on both single-relation
(SimpleQuestions) and multi-relation (WebQSP) QA benchmarks.
Comment: Accepted by ACL 2017 (updated for camera-ready)
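The core idea of matching a question against relation names at different levels of granularity can be sketched with a simplified stand-in. The scoring below is an assumption for illustration: plain token overlap (Jaccard) replaces the paper's deep residual bidirectional LSTM encoders, but the two granularities mirror the hierarchy the abstract describes, i.e. comparing against the whole relation name as one symbol and against the words inside it.

```python
# Simplified illustration of hierarchical relation detection: each candidate
# relation is scored at two granularities and the scores are combined.
# Jaccard overlap is a hypothetical stand-in for the residual BiLSTM encoders.

def jaccard(a, b):
    """Token-set overlap between two iterables of strings."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def score_relation(question_tokens, relation):
    # Word level: decompose the relation name into its component words.
    word_level = jaccard(question_tokens, relation.split("_"))
    # Relation level: treat the full name as a single atomic symbol.
    relation_level = 1.0 if relation in question_tokens else 0.0
    # Summation stands in for combining the hierarchies in the real network.
    return word_level + relation_level

def detect_relation(question, candidate_relations):
    """Pick the candidate relation that best matches the question."""
    tokens = question.lower().replace("?", "").split()
    return max(candidate_relations, key=lambda r: score_relation(tokens, r))
```

For instance, `detect_relation("what is the place of birth of obama?", ["place_of_birth", "profession"])` selects `place_of_birth` because its component words overlap the question; the paper's model makes the same kind of comparison with learned representations at each level.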