204 research outputs found

    A Recurrent Neural Model with Attention for the Recognition of Chinese Implicit Discourse Relations

    We introduce an attention-based Bi-LSTM for Chinese implicit discourse relations and demonstrate that modeling argument pairs as a joint sequence can outperform word order-agnostic approaches. Our model benefits from a partial sampling scheme and is conceptually simple, yet achieves state-of-the-art performance on the Chinese Discourse Treebank. We also visualize its attention activity to illustrate the model's ability to selectively focus on the relevant parts of an input sequence. Comment: To appear at ACL 2017, code available at https://github.com/sronnqvist/discourse-ablst
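    The attention mechanism this abstract describes can be sketched roughly as follows. This is a minimal NumPy illustration of additive attention pooling over the hidden states a Bi-LSTM would produce for the joint argument-pair sequence; the parameter names (`W`, `v`), dimensions, and random values are stand-ins for illustration only, not the authors' released model (see the linked repository for that).

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, W, v):
    """Additive attention over hidden states H (T x d):
    score_t = v . tanh(W h_t); weights = softmax(scores);
    returns the attention-weighted context vector and the weights."""
    scores = np.tanh(H @ W) @ v   # (T,) one relevance score per time step
    weights = softmax(scores)     # (T,) normalized attention distribution
    context = weights @ H         # (d,) weighted sum of hidden states
    return context, weights

rng = np.random.default_rng(0)
T, d, a = 6, 8, 4                 # time steps, hidden size, attention size
H = rng.normal(size=(T, d))       # stand-in for Bi-LSTM outputs over arg1;arg2
W = rng.normal(size=(d, a))
v = rng.normal(size=a)
context, weights = attention_pool(H, W, v)
assert np.isclose(weights.sum(), 1.0)
```

    Visualizing `weights` over the input tokens is what produces the attention plots mentioned in the abstract: each weight says how strongly that position contributed to the pooled representation.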

    Learning of text-level discourse parsing

    Understanding the sense of discourse relations that appear between segments of text is essential to truly comprehend any natural language text. Several automated approaches have been suggested, but all rely on external resources and linguistic feature engineering, and their processing pipelines are built from substantially different models. Instead of designing a system specifically for a given language and task, we pursue a language-independent approach to sense classification of shallow discourse relations. In this dissertation we first present our focused recurrent neural networks (focused RNNs) layer, the first multi-dimensional RNN-attention mechanism for constructing sentence/argument embeddings. It consists of a filtering RNN with a filtering/gating mechanism that enables downstream RNNs to focus on different aspects of each argument of a discourse relation and project it into several embedding subspaces. On top of the proposed mechanism we build our FR system, a novel method for sense classification of shallow discourse relations. In contrast to existing systems, the FR system consists of a single end-to-end trainable model that handles all types and specific situations of discourse relations, requires no feature engineering or external resources, can be used almost out-of-the-box on any language or set of sense labels, and can be applied at the word- or character-level representation. We evaluate the proposed FR system using the official datasets and methodology of the CoNLL 2016 Shared Task. It falls only slightly behind state-of-the-art performance on English, but it outperforms systems without a focused RNNs layer by 8% on the Chinese dataset. Afterwards we perform a detailed analysis on both languages.
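    The focused RNNs layer described above can be sketched in NumPy as follows. This is only an illustrative reconstruction under simplifying assumptions: plain tanh RNN cells, sigmoid gates, and invented names (`filt`, `heads`, `Wg`); the dissertation's actual cell types, gate parameterization, and dimensions may differ. The key idea shown is that a filtering RNN emits one gate vector per subspace, and each gated copy of the input feeds its own downstream RNN.

```python
import numpy as np

def rnn(X, Wx, Wh, b):
    """Plain tanh RNN over a sequence X (T x dx); returns all hidden states."""
    h = np.zeros(Wh.shape[0])
    out = []
    for x in X:
        h = np.tanh(Wx @ x + Wh @ h + b)
        out.append(h)
    return np.stack(out)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def focused_rnns(X, filt, heads):
    """Filtering RNN emits K gate vectors per step; each gated copy of the
    input feeds its own downstream RNN, yielding one embedding subspace each."""
    F = rnn(X, *filt["rnn"])                 # (T, h_f) filtering RNN states
    K = len(heads)
    gates = sigmoid(F @ filt["Wg"])          # (T, K*dx), split into K gates
    gates = gates.reshape(len(X), K, X.shape[1])
    subspaces = []
    for k, head in enumerate(heads):
        Hk = rnn(gates[:, k, :] * X, *head)  # downstream RNN on gated input
        subspaces.append(Hk[-1])             # last state = subspace embedding
    return np.concatenate(subspaces)         # argument embedding (K * h_d,)

rng = np.random.default_rng(1)
T, dx, hf, hd, K = 5, 6, 4, 3, 2
X = rng.normal(size=(T, dx))                 # stand-in word/char embeddings
filt = {"rnn": (rng.normal(size=(hf, dx)), rng.normal(size=(hf, hf)),
                np.zeros(hf)),
        "Wg": rng.normal(size=(hf, K * dx))}
heads = [(rng.normal(size=(hd, dx)), rng.normal(size=(hd, hd)), np.zeros(hd))
         for _ in range(K)]
emb = focused_rnns(X, filt, heads)
assert emb.shape == (K * hd,)
```

    Running this on both arguments of a relation and feeding the concatenated embeddings to a classifier is the rough shape of the FR system's end-to-end pipeline described in the abstract.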

    Resource-lean modeling of coherence in commonsense stories

    We present a resource-lean neural recognizer for modeling coherence in commonsense stories. Our lightweight system is inspired by successful attempts at modeling discourse relations and stands out due to its simplicity and easy optimization compared to prior approaches to narrative script learning. We evaluate our approach on the Story Cloze Test, demonstrating an absolute improvement in accuracy of 4.7% over state-of-the-art implementations.

    Review of coreference resolution in English and Persian

    Coreference resolution (CR) is one of the most challenging areas of natural language processing. This task seeks to identify all textual references to the same real-world entity. Research in this field is divided into coreference resolution and anaphora resolution. Due to its application in textual comprehension and its utility in other tasks such as information extraction systems, document summarization, and machine translation, this field has attracted considerable interest. Consequently, it has a significant effect on the quality of these systems. This article reviews the existing corpora and evaluation metrics in this field. Then, an overview of the coreference algorithms, from rule-based methods to the latest deep learning techniques, is provided. Finally, coreference resolution and pronoun resolution systems in Persian are investigated. Comment: 44 pages, 11 figures, 5 tables