KQA Pro: A Large-Scale Dataset with Interpretable Programs and Accurate SPARQLs for Complex Question Answering over Knowledge Base
Complex question answering over knowledge base (Complex KBQA) is challenging
because it requires various compositional reasoning capabilities, such as
multi-hop inference, attribute comparison, and set operations. Existing
benchmarks have some shortcomings that limit the development of Complex KBQA:
1) they only provide QA pairs without explicit reasoning processes; 2)
questions are either generated from templates, leading to poor diversity, or
available only at a small scale. To address this, we introduce KQA Pro, a large-scale dataset for
Complex KBQA. We define a compositional and highly-interpretable formal format,
named Program, to represent the reasoning process of complex questions. We
propose compositional strategies to generate questions, corresponding SPARQLs,
and Programs with a small number of templates, and then paraphrase the
generated questions to natural language questions (NLQ) by crowdsourcing,
giving rise to around 120K diverse instances. SPARQL and Program depict two
complementary solutions to answer complex questions, which can benefit a large
spectrum of QA methods. Besides the QA task, KQA Pro can also serve the
semantic parsing task. As far as we know, it is currently the largest corpus of
NLQ-to-SPARQL and NLQ-to-Program. We conduct extensive experiments to evaluate
whether machines can learn to answer our complex questions in different cases,
that is, with only QA supervision or with intermediate SPARQL/Program
supervision. We find that state-of-the-art KBQA methods trained on only QA
pairs perform poorly on our dataset, implying our questions are more
challenging than those of previous datasets. However, pretrained models trained on our
NLQ-to-SPARQL and NLQ-to-Program annotations surprisingly achieve about 90%
answering accuracy, which is even close to human expert performance.
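The abstract's "Program" representation can be pictured as a sequence of composable reasoning steps executed against a knowledge base. The sketch below is illustrative only: the function names (Find, Relate, FilterNum, Count) and the toy knowledge base are assumptions for exposition, not the paper's actual function inventory or data.

```python
# Illustrative sketch of a Program-style interpretable representation for a
# complex question, executed step by step against a toy knowledge base.
# Function names and KB contents are hypothetical, not taken from KQA Pro.

# Toy KB: entity -> {"attrs": {...}, "relations": {predicate: [objects]}}
KB = {
    "France": {"attrs": {}, "relations": {"borders": ["Spain", "Germany", "Belgium"]}},
    "Spain": {"attrs": {"population": 47_000_000}, "relations": {}},
    "Germany": {"attrs": {"population": 83_000_000}, "relations": {}},
    "Belgium": {"attrs": {"population": 11_500_000}, "relations": {}},
}

def find(name):                                # locate a seed entity
    return [name] if name in KB else []

def relate(entities, predicate):               # one hop along a relation
    return [o for e in entities for o in KB[e]["relations"].get(predicate, [])]

def filter_num(entities, attr, value, op):     # numeric attribute comparison
    ops = {">": lambda a, b: a > b, "<": lambda a, b: a < b}
    return [e for e in entities if ops[op](KB[e]["attrs"][attr], value)]

def count(entities):                           # set-level aggregation
    return len(entities)

# Program for: "How many neighbours of France have over 40 million people?"
program = [
    ("Find", ("France",)),
    ("Relate", ("borders",)),
    ("FilterNum", ("population", 40_000_000, ">")),
    ("Count", ()),
]

FUNCS = {"Find": find, "Relate": relate, "FilterNum": filter_num, "Count": count}

result = None
for fn, args in program:
    # The first step seeds the entity set; later steps consume it.
    result = FUNCS[fn](*args) if fn == "Find" else FUNCS[fn](result, *args)

print(result)  # 2  (Spain and Germany pass the population filter)
```

Because each intermediate entity set is inspectable, such a program is a transparent, auditable solution trace, which is the interpretability advantage the abstract claims over bare QA pairs.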
Using domain specific language and sequence to sequence models as a hybrid framework for a natural language interface to a database solution
This thesis was submitted for the award of Doctor of Philosophy and was awarded by Brunel University London.
The aim of this project is to provide a new approach to solving the problem of
converting natural language into a language capable of querying a database or data
repository. This problem has been around for a while: in the 1970s the US Navy
developed a solution called LADDER, and since then an array of
solutions, approaches, and tweaks has kept the research community busy. The
introduction of electronic assistants on smartphones in 2010 has given new
impetus to this problem.
With the increasingly pervasive nature of data and its ever-expanding use to answer
questions in business, science, and medicine, extracting data is becoming more important.
The idea behind this project is to make data more democratised by allowing access to it
without the need for specialist languages. The performance and reliability of converting
natural language into Structured Query Language (SQL) can be undermined by the nuances
that are prevalent in natural language; relational databases are not designed to understand
language nuance.
This project introduces the following components as part of a holistic approach to improving
the conversion of a natural language statement into a language capable of querying a data
repository.
● The idea proposed in this project combines sequence-to-sequence models
with natural-language part-of-speech technologies and domain-specific
languages to convert natural language queries into SQL. The proposed
approach uses natural language processing to perform an
initial shallow pass over the incoming query and then uses Google's TensorFlow to
refine the query with a sequence-to-sequence model.
● This thesis also proposes using a Domain Specific Language (DSL) as part of the
conversion process. The DSL has the potential to allow the natural
language query to be translated into more than just an SQL statement: it could target
any query language, such as NoSQL or XQuery.
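The two components above can be sketched as a pipeline: a shallow first pass over the question, a small DSL intermediate form, and interchangeable back ends that render that DSL as SQL or a NoSQL-style filter. This is a minimal sketch under stated assumptions: the DSL fields and the trivial keyword-based "shallow pass" are illustrative stand-ins, and the thesis's actual refinement step (a trained sequence-to-sequence model) is omitted here.

```python
# Hypothetical sketch of the thesis's hybrid pipeline: shallow parse -> DSL
# intermediate -> pluggable query-language back ends. The Query dataclass and
# the pattern-matched shallow pass are assumptions, not the thesis's design.

from dataclasses import dataclass, field

@dataclass
class Query:                      # the DSL: one node per query intent
    table: str
    columns: list = field(default_factory=lambda: ["*"])
    conditions: list = field(default_factory=list)   # (column, op, value)

def shallow_pass(text):
    # Stand-in for the POS-based first pass: match one known question shape,
    # "show <what> from <table> where <column> is <value>".
    words = text.lower().split()
    q = Query(table=words[words.index("from") + 1])
    if "where" in words:
        i = words.index("where")
        q.conditions.append((words[i + 1], "=", words[i + 3]))
    return q

def to_sql(q):                    # one back end: render the DSL as SQL
    sql = f"SELECT {', '.join(q.columns)} FROM {q.table}"
    if q.conditions:
        sql += " WHERE " + " AND ".join(
            f"{c} {op} '{v}'" for c, op, v in q.conditions)
    return sql

def to_mongo(q):                  # another back end: a NoSQL-style filter
    return {"collection": q.table, "filter": {c: v for c, _, v in q.conditions}}

q = shallow_pass("show everything from employees where city is London")
print(to_sql(q))    # SELECT * FROM employees WHERE city = 'london'
print(to_mongo(q))  # {'collection': 'employees', 'filter': {'city': 'london'}}
```

The design point the DSL illustrates is decoupling: the parser emits one intermediate form, and adding a new target language means adding one renderer rather than rewriting the parser.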