
    Underspecified Universal Dependency Structures as Inputs for Multilingual Surface Realisation

    In this paper, we present the datasets used in the Shallow and Deep Tracks of the First Multilingual Surface Realisation Shared Task (SR’18). For the Shallow Track, data in ten languages has been released: Arabic, Czech, Dutch, English, Finnish, French, Italian, Portuguese, Russian and Spanish. For the Deep Track, data in three languages is made available: English, French and Spanish. We describe in detail how the datasets were derived from the Universal Dependencies V2.0, and report on an evaluation of the Deep Track input quality. In addition, we examine the motivation for, and likely usefulness of, deriving NLG inputs from annotations in resources originally developed for Natural Language Understanding (NLU), and assess whether the resulting inputs supply enough information of the right kind for the final stage in the NLG process.
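
    Below is a minimal, illustrative sketch (not the SR’18 derivation scripts) of how a Shallow Track input might be obtained from a CoNLL-U sentence: the linear order of the tokens is discarded and the inflected forms are reduced to lemmas, while the dependency structure and morphological features are kept. The token layout and function names are assumptions made for the example.

```python
# Illustrative sketch only: derive a shallow-track-style input from one
# CoNLL-U sentence by discarding linear order and inflected word forms.
# The column layout follows Universal Dependencies; everything else is
# a simplification, not the official SR'18 pipeline.

import random

CONLLU_SENTENCE = """\
1\tThe\tthe\tDET\tDT\tDefinite=Def\t2\tdet\t_\t_
2\tcat\tcat\tNOUN\tNN\tNumber=Sing\t3\tnsubj\t_\t_
3\tsleeps\tsleep\tVERB\tVBZ\tNumber=Sing|Person=3\t0\troot\t_\t_
"""

def parse_conllu(block):
    """Parse one sentence block into a list of token dicts."""
    tokens = []
    for line in block.strip().splitlines():
        cols = line.split("\t")
        tokens.append({
            "id": int(cols[0]), "form": cols[1], "lemma": cols[2],
            "upos": cols[3], "feats": cols[5],
            "head": int(cols[6]), "deprel": cols[7],
        })
    return tokens

def shallow_input(tokens, seed=0):
    """Drop surface order (shuffle) and inflection (keep lemma + feats)."""
    nodes = [
        {"id": t["id"], "lemma": t["lemma"], "upos": t["upos"],
         "feats": t["feats"], "head": t["head"], "deprel": t["deprel"]}
        for t in tokens
    ]
    random.Random(seed).shuffle(nodes)  # remove word-order information
    return nodes

if __name__ == "__main__":
    for node in shallow_input(parse_conllu(CONLLU_SENTENCE)):
        print(node)
```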

    Multilingual Surface Realization Using Universal Dependency Trees

    We propose a shared task on multilingual Surface Realization, i.e., on mapping unordered and uninflected universal dependency trees to correctly ordered and inflected sentences in a number of languages. A second deeper input will be available in which, in addition, functional words, fine-grained PoS and morphological information will be removed from the input trees. The first shared task on Surface Realization was carried out in 2011 with a similar setup, with a focus on English. We think that it is time for relaunching such a shared task effort in view of the arrival of Universal Dependencies annotated treebanks for a large number of languages on the one hand, and the increasing dominance of Deep Learning, which proved to be a game changer for NLP, on the other hand.
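
    As a rough, self-contained illustration of the deeper input described above (not the organisers' code), the sketch below additionally removes functional words, fine-grained PoS tags and morphological features from an unordered dependency representation. Which UPOS tags count as "functional" is an assumption made here for demonstration.

```python
# Illustrative sketch of a "deep" input: keep content words only, stripped
# to lemma, coarse PoS and dependency relation. The FUNCTIONAL_UPOS set and
# the node format are assumptions for this example.

FUNCTIONAL_UPOS = {"DET", "ADP", "AUX", "PART", "SCONJ", "CCONJ", "PUNCT"}

SHALLOW_NODES = [  # unordered, lemmatised nodes (hypothetical example)
    {"lemma": "sleep", "upos": "VERB", "xpos": "VBZ",
     "feats": "Number=Sing|Person=3", "deprel": "root"},
    {"lemma": "the", "upos": "DET", "xpos": "DT",
     "feats": "Definite=Def", "deprel": "det"},
    {"lemma": "cat", "upos": "NOUN", "xpos": "NN",
     "feats": "Number=Sing", "deprel": "nsubj"},
]

def deep_input(nodes):
    """Drop functional words, fine-grained PoS (xpos) and morphology (feats)."""
    return [
        {"lemma": n["lemma"], "upos": n["upos"], "deprel": n["deprel"]}
        for n in nodes
        if n["upos"] not in FUNCTIONAL_UPOS
    ]

if __name__ == "__main__":
    print(deep_input(SHALLOW_NODES))
```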

    Building reliable surface realization systems with sentence plans

    Neural network-based language models have been shown to generate remarkably fluent and human-like text. Our goal is to incorporate these language models into real-life applications, such as surface realization in task-oriented dialogue systems. However, these language models cannot be trusted to produce outputs with 100% accuracy. Even in the best-case scenario, with large datasets and relatively simple tasks, neural network-based language models communicate incorrect information in 5-10% of cases. Therefore, our research focuses on how to guarantee accurate output. We present experiments and analysis on the use of sentence plans, which we believe are key to improving the performance of neural network-based language models on surface realization tasks. These insights are a key contribution towards the development of more reliable surface realization systems in task-oriented dialogue.
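
    The abstract does not specify the paper's sentence-plan format, but a sketch of one plausible shape such a plan could take in a task-oriented dialogue system is shown below: a dialogue act plus slots, split into sentence-sized units whose realisations can then be checked individually. All names and the splitting heuristic are hypothetical.

```python
# Hypothetical sketch of sentence plans for task-oriented dialogue: split a
# meaning representation into sentence-sized units so that each realised
# sentence can be verified against the slots it must express. Not the
# paper's actual formalism.

from dataclasses import dataclass, field

@dataclass
class SentencePlan:
    dialogue_act: str                   # e.g. "inform", "recommend"
    slots: dict = field(default_factory=dict)

def plan_response(mr: dict) -> list[SentencePlan]:
    """Naive planner: name/food in one sentence, price range in another."""
    plans = []
    core = {k: v for k, v in mr.items() if k in ("name", "food")}
    if core:
        plans.append(SentencePlan("inform", core))
    if "price_range" in mr:
        plans.append(SentencePlan("inform", {"price_range": mr["price_range"]}))
    return plans

if __name__ == "__main__":
    mr = {"name": "Aki Sushi", "food": "Japanese", "price_range": "moderate"}
    for plan in plan_response(mr):
        # each plan would be handed to the neural realiser, and the output
        # checked for the slot values the plan requires
        print(plan)
```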

    MindSpaces: Art-driven Adaptive Outdoors and Indoors Design

    MindSpaces provides solutions for creating functionally and emotionally appealing architectural designs in urban spaces. Social media services, physiological sensing devices and video cameras provide data from sensing environments. State-of-the-art technology, including VR, 3D design tools, emotion extraction, visual behaviour analysis and textual analysis, will be incorporated into the MindSpaces platform for analysing data and adapting the design of spaces.

    Approximate text generation from non-hierarchical representations in a declarative framework

    This thesis is on Natural Language Generation. It describes a linguistic realisation system that translates the semantic information encoded in a conceptual graph into an English language sentence. The use of a non-hierarchically structured semantic representation (conceptual graphs) and an approximate matching between semantic structures allows us to investigate a more general version of the sentence generation problem, where one is not pre-committed to a choice of the syntactically prominent elements in the initial semantics. We show clearly how the semantic structure is declaratively related to a linguistically motivated syntactic representation: we use D-Tree Grammars, which stem from work on Tree-Adjoining Grammars. The declarative specification of the mapping between semantics and syntax allows different processing strategies to be exploited. A number of generation strategies have been considered: a pure top-down strategy and a chart-based generation technique which allows partially successful computations to be reused in other branches of the search space. Having a generator with increased paraphrasing power, as a consequence of using non-hierarchical input and approximate matching, raises the issue of whether certain 'better' paraphrases can be generated before others. We investigate preference-based processing in the context of generation.
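
    To illustrate the general idea of chart-based generation mentioned above, the toy sketch below generates from a flat semantics and keeps partial edges in a chart so they can be reused. The grammar, lexicon and semantics are invented for the example; this is not the thesis's D-Tree Grammar or conceptual-graph machinery.

```python
# Toy chart-generation sketch: edges cover subsets of a flat semantics and
# partial results stored in the chart are reused when building larger
# phrases. Grammar and lexicon are illustrative only.

# Flat input semantics: a set of facts to be expressed.
SEMANTICS = frozenset({"cat(x)", "def(x)", "sleep(e,x)"})

# Lexicon: category, facts covered, surface string.
LEXICON = [
    ("Det", frozenset({"def(x)"}), "the"),
    ("N",   frozenset({"cat(x)"}), "cat"),
    ("V",   frozenset({"sleep(e,x)"}), "sleeps"),
]

# Binary rules: parent <- left + right (surface string concatenated).
RULES = [
    ("NP", "Det", "N"),
    ("S",  "NP",  "V"),
]

def generate(semantics):
    """Agenda-driven chart generation over a flat semantics."""
    chart = []                                 # (cat, facts, string) edges
    agenda = [(c, f, s) for c, f, s in LEXICON if f <= semantics]
    while agenda:
        edge = agenda.pop()
        if edge in chart:
            continue                           # already derived: reuse only
        chart.append(edge)
        cat, facts, string = edge
        for parent, lcat, rcat in RULES:
            for ocat, ofacts, ostring in chart:
                # combine with chart edges; covered facts must not overlap
                if cat == lcat and ocat == rcat and not (facts & ofacts):
                    agenda.append((parent, facts | ofacts, f"{string} {ostring}"))
                if cat == rcat and ocat == lcat and not (facts & ofacts):
                    agenda.append((parent, facts | ofacts, f"{ostring} {string}"))
    return [s for c, f, s in chart if c == "S" and f == semantics]

if __name__ == "__main__":
    print(generate(SEMANTICS))   # -> ['the cat sleeps']
```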

    Head-Driven Phrase Structure Grammar

    Head-Driven Phrase Structure Grammar (HPSG) is a constraint-based or declarative approach to linguistic knowledge, which analyses all descriptive levels (phonology, morphology, syntax, semantics, pragmatics) with feature value pairs, structure sharing, and relational constraints. In syntax it assumes that expressions have a single relatively simple constituent structure. This volume provides a state-of-the-art introduction to the framework. Various chapters discuss basic assumptions and formal foundations, describe the evolution of the framework, and go into the details of the main syntactic phenomena. Further chapters are devoted to non-syntactic levels of description. The book also considers related fields and research areas (gesture, sign languages, computational linguistics) and includes chapters comparing HPSG with other frameworks (Lexical Functional Grammar, Categorial Grammar, Construction Grammar, Dependency Grammar, and Minimalism).
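
    As a rough illustration of the feature-value machinery mentioned above (not an example from the volume), the sketch below unifies two feature structures represented as nested dicts. Real HPSG feature structures are typed and support structure sharing (reentrancy), which this plain-dict toy omits.

```python
# Toy unification of feature structures as nested dicts: combining
# constraints succeeds when feature values are compatible and fails
# (returns None) on a clash. Untyped and without reentrancy, unlike
# full HPSG implementations.

def unify(a, b):
    """Unify two feature structures; return the merged structure or None."""
    if isinstance(a, dict) and isinstance(b, dict):
        result = dict(a)
        for feat, bval in b.items():
            if feat in result:
                sub = unify(result[feat], bval)
                if sub is None:
                    return None               # conflicting values: failure
                result[feat] = sub
            else:
                result[feat] = bval
        return result
    return a if a == b else None              # atomic values must match

if __name__ == "__main__":
    verb = {"HEAD": {"POS": "verb"}, "SUBJ": {"HEAD": {"AGR": "3sg"}}}
    subj_constraint = {"SUBJ": {"HEAD": {"POS": "noun", "AGR": "3sg"}}}
    print(unify(verb, subj_constraint))
    # a clash, e.g. AGR "3sg" vs "plur", would make unify return None
```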

