2 research outputs found
Context-Dependent Semantic Parsing over Temporally Structured Data
We describe a new semantic parsing setting that allows users to query the
system using both natural language questions and actions within a graphical
user interface. Multiple time series belonging to an entity of interest are
stored in a database and the user interacts with the system to obtain a better
understanding of the entity's state and behavior, entailing sequences of
actions and questions whose answers may depend on previous factual or
navigational interactions. We design an LSTM-based encoder-decoder architecture
that models context dependency through copying mechanisms and multiple levels
of attention over inputs and previous outputs. When trained to predict tokens
using supervised learning, the proposed architecture substantially outperforms
standard sequence generation baselines. Training the architecture using policy
gradient leads to further improvements in performance, reaching a
sequence-level accuracy of 88.7% on artificial data and 74.8% on real data.
Comment: Accepted by NAACL 2019 (Oral presentation)
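The copy-with-attention idea described in this abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: at each decoding step, dot-product attention over encoder states yields a context vector, a generation distribution is computed over the vocabulary, and the attention weights are scattered onto the source tokens' vocabulary ids to form a copy distribution; a scalar gate mixes the two. All names and shapes here (`copy_attention_step`, `W_gen`, the sigmoid gate) are hypothetical simplifications.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array.
    e = np.exp(x - x.max())
    return e / e.sum()

def copy_attention_step(enc_states, dec_state, W_gen, vocab_size, src_token_ids):
    """One decoding step mixing generation and copy distributions.

    enc_states:    (src_len, hidden) encoder hidden states
    dec_state:     (hidden,) current decoder state
    W_gen:         (vocab_size, 2*hidden) generation projection (hypothetical)
    src_token_ids: (src_len,) vocabulary id of each source token
    """
    # Dot-product attention over encoder states.
    scores = enc_states @ dec_state          # (src_len,)
    attn = softmax(scores)                   # attention weights, sum to 1
    context = attn @ enc_states              # (hidden,) context vector

    # Generation distribution from decoder state and context.
    p_gen_vocab = softmax(W_gen @ np.concatenate([dec_state, context]))

    # Scalar gate choosing between copying and generating
    # (a simple sigmoid here; real models learn this gate).
    copy_gate = 1.0 / (1.0 + np.exp(-(dec_state @ context)))

    # Scatter attention mass onto the source tokens' vocabulary ids.
    p_copy = np.zeros(vocab_size)
    np.add.at(p_copy, src_token_ids, attn)   # repeated ids accumulate

    # Mixture is a valid distribution: both parts sum to 1.
    return (1.0 - copy_gate) * p_gen_vocab + copy_gate * p_copy
```

Because the attention weights and the generation softmax each sum to one, the gated mixture is itself a proper probability distribution over the vocabulary, which lets out-of-vocabulary source tokens receive probability mass through the copy path.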
Context Dependent Semantic Parsing: A Survey
Semantic parsing is the task of translating natural language utterances into
machine-readable meaning representations. Currently, most semantic parsing
methods cannot exploit contextual information (e.g., dialogue and
comment history), which has great potential to boost semantic parsing
performance. To address this issue, context-dependent semantic parsing has
recently drawn considerable attention. In this survey, we review progress on
methods for context-dependent semantic parsing, together with the
current datasets and tasks. We then point out open problems and challenges for
future research in this area. The collected resources for this topic are
available at: https://github.com/zhuang-li/Contextual-Semantic-Parsing-Paper-List.
Comment: 10 pages, accepted by COLING'2020