660 research outputs found
A Stack-Propagation Framework with Token-Level Intent Detection for Spoken Language Understanding
Intent detection and slot filling are the two main tasks in building a spoken
language understanding (SLU) system. The two tasks are closely tied, and the
slots often depend heavily on the intent. In this paper, we propose a novel
framework for SLU that better incorporates the intent information, which in
turn guides slot filling. In our framework, we adopt a joint model with
Stack-Propagation, which can directly use the intent information as input for
slot filling, thus capturing the intent semantic knowledge. In addition, to
further alleviate error propagation, we perform token-level intent
detection within the Stack-Propagation framework. Experiments on two public
datasets show that our model achieves state-of-the-art performance and
outperforms previous methods by a large margin. Finally, we use the
Bidirectional Encoder Representations from Transformers (BERT) model in our
framework, which further boosts our performance on the SLU task.

Comment: Accepted at EMNLP 2019
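The abstract's two key ideas, token-level intent detection and feeding the intent result into slot filling, can be illustrated with a minimal NumPy sketch. This is not the authors' code: random weights stand in for the trained encoder and classifiers, and all names and dimensions here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: T tokens, hidden size H, intent/slot label counts.
T, H, N_INTENTS, N_SLOTS = 5, 8, 3, 4

# Stand-in for the encoder (BiLSTM or BERT): one hidden vector per token.
hidden = rng.normal(size=(T, H))

# Token-level intent detection: every token predicts its own intent label.
W_intent = rng.normal(size=(H, N_INTENTS))
token_intents = (hidden @ W_intent).argmax(axis=-1)       # shape (T,)

# The utterance intent is obtained by voting over token predictions,
# so a few wrong tokens do not decide the whole sentence.
utterance_intent = np.bincount(token_intents, minlength=N_INTENTS).argmax()

# Stack-Propagation: the intent output is used directly as input to slot
# filling, here by concatenating a one-hot intent vector to each hidden state.
intent_onehot = np.eye(N_INTENTS)[token_intents]          # (T, N_INTENTS)
slot_input = np.concatenate([hidden, intent_onehot], axis=-1)

W_slot = rng.normal(size=(H + N_INTENTS, N_SLOTS))
slot_preds = (slot_input @ W_slot).argmax(axis=-1)        # one slot per token
```

The one-hot concatenation is the simplest way to realize "intent as input"; the actual model learns an intent embedding instead.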
A Scope Sensitive and Result Attentive Model for Multi-Intent Spoken Language Understanding
Multi-Intent Spoken Language Understanding (SLU), a novel and more complex
scenario of SLU, is attracting increasing attention. Unlike traditional SLU,
each intent in this scenario has its own specific scope, and semantic
information outside that scope can even hinder the prediction, which greatly
increases the difficulty of intent detection. More seriously, guiding slot
filling with these inaccurate intent labels suffers from error propagation,
resulting in unsatisfactory overall performance. To address these challenges,
in this paper we propose a novel Scope-Sensitive Result Attention Network
(SSRAN) based on Transformer, which contains a Scope Recognizer (SR) and a
Result Attention Network (RAN). The Scope Recognizer assigns scope information
to each token, reducing the distraction of out-of-scope tokens. The Result
Attention Network exploits the bidirectional interaction between the results
of slot filling and intent detection, mitigating the error propagation
problem. Experiments on two public datasets show that our model significantly
improves SLU performance (by 5.4\% and 2.1\% overall accuracy) over the
state-of-the-art baselines.
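The two components named in the abstract, scope-sensitive gating and bidirectional interaction between the two tasks' results, can be sketched as follows. Again this is a toy NumPy illustration under assumed names and dimensions, not the SSRAN implementation; random matrices replace the trained Transformer.

```python
import numpy as np

rng = np.random.default_rng(1)
T, H, N_INTENTS, N_SLOTS = 6, 8, 3, 4
hidden = rng.normal(size=(T, H))          # per-token encoder states

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Scope Recognizer (SR): soft-assign each token to an intent scope and
# down-weight tokens that belong confidently to no scope, reducing the
# distraction of out-of-scope tokens.
scope = softmax(hidden @ rng.normal(size=(H, N_INTENTS)))  # (T, N_INTENTS)
gate = scope.max(axis=-1, keepdims=True)                   # in-scope confidence
scoped_hidden = gate * hidden

# Preliminary results of both tasks on the scoped representation.
intent_logits = scoped_hidden @ rng.normal(size=(H, N_INTENTS))
slot_logits = scoped_hidden @ rng.normal(size=(H, N_SLOTS))

# Result Attention Network (RAN): each task's result attends to the other's,
# so the two predictions refine each other instead of errors flowing one way.
refined_intent = intent_logits + softmax(
    slot_logits @ rng.normal(size=(N_SLOTS, N_INTENTS)))   # slots -> intents
refined_slots = slot_logits + softmax(
    intent_logits @ rng.normal(size=(N_INTENTS, N_SLOTS))) # intents -> slots
```

The residual additions keep the preliminary predictions while letting each task's result nudge the other, which is one plausible reading of the abstract's "bidirectional interaction".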