
    KNEWS: Using Logical and Lexical Semantics to Extract Knowledge from Natural Language

    We present KNEWS, a pipeline of NLP tools that accepts natural language text as input and outputs knowledge in a machine-readable format. The tool outputs frame-based knowledge as RDF triples or XML, including word-level alignment with the surface form, as well as first-order logical formulae. KNEWS is freely available for download. Moreover, thanks to its versatility, KNEWS has already been employed in a number of different applications for information extraction and automatic reasoning.
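    To make the frame-based output concrete, here is a minimal Python sketch of how a sentence might be flattened into subject-predicate-object triples in the spirit of the KNEWS description above. The frame name, role names, and prefixes below are illustrative assumptions, not the tool's actual vocabulary or output schema.

    ```python
    def frame_triples(frame, roles):
        """Flatten one frame instance into (subject, predicate, object) triples.

        NOTE: the "frame:" / "role:" prefixes and the blank-node naming scheme
        are hypothetical, chosen only to illustrate frame-based RDF output.
        """
        inst = f"_:f_{frame.lower()}"  # blank node for this frame instance
        triples = [(inst, "rdf:type", f"frame:{frame}")]
        for role, filler in roles.items():
            triples.append((inst, f"role:{role}", filler))
        return triples

    # "Alice sold a car" rendered as a hypothetical Commerce_sell frame instance.
    triples = frame_triples("Commerce_sell", {"Seller": "Alice", "Goods": "car"})
    for t in triples:
        print(t)
    ```

    A real pipeline would additionally carry the word-level alignment the abstract mentions, e.g. by attaching token offsets to each role filler.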

    Improving Commonsense Causal Reasoning by Adversarial Training and Data Augmentation

    Determining the plausibility of causal relations between clauses is a commonsense reasoning task that requires complex inference ability. The general approach to this task is to train a large pretrained language model on a specific dataset. However, the available training data for the task is often scarce, which leads to unstable model training or reliance on shallow features of the dataset. This paper presents several techniques for making models more robust in the domain of causal reasoning. First, we perform adversarial training by generating perturbed inputs through synonym substitution. Second, drawing on a linguistic theory of discourse connectives, we perform data augmentation using a discourse parser to detect causally linked clauses in large text corpora and a generative language model to generate distractors. Both methods boost model performance on the Choice of Plausible Alternatives (COPA) dataset, as well as on Balanced COPA, a modified version of the original data developed to avoid superficial cues, yielding a more challenging benchmark. We show a statistically significant improvement in performance and robustness on both datasets, even with only a small number of additionally generated data points.
    Comment: 7 pages + references, 4 figures, 3 tables, paper accepted at AAAI202
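    The two techniques in the abstract — synonym-substitution perturbation and causal-clause extraction via discourse connectives — can be sketched as follows. This is a toy Python illustration under stated assumptions: the synonym table and the connective list are tiny stand-ins for the lexical resources and discourse parser the paper actually uses.

    ```python
    import random

    # Toy lexical resources (assumptions, not the paper's actual data).
    SYNONYMS = {"glad": ["happy", "pleased"], "big": ["large", "huge"]}
    CAUSAL_CONNECTIVES = ("because", "so", "therefore")

    def synonym_perturb(tokens, rng=None):
        """Adversarial-style input perturbation: swap words for synonyms where known."""
        rng = rng or random.Random(0)
        return [rng.choice(SYNONYMS[t]) if t in SYNONYMS else t for t in tokens]

    def extract_causal_pair(sentence):
        """Naive stand-in for a discourse parser: split a sentence on the first
        causal connective into a (cause-side, effect-side) clause pair."""
        for conn in CAUSAL_CONNECTIVES:
            marker = f" {conn} "
            if marker in sentence:
                left, right = sentence.split(marker, 1)
                return left.strip(), right.strip()
        return None  # no causal link detected

    print(synonym_perturb(["she", "was", "glad"]))
    print(extract_causal_pair("the ground is wet because it rained"))
    ```

    In the paper's setting, pairs mined this way would be paired with model-generated distractors to form additional COPA-style training instances.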