
    Linking flat predicate argument structures

    This report presents an approach to enriching flat, robust predicate argument structures with more fine-grained semantic information, extracted from underspecified semantic representations encoded in Minimal Recursion Semantics (MRS). Such representations are provided by a hand-built HPSG grammar with wide linguistic coverage. A specific semantic representation, called linked predicate argument structure (LPAS), has been worked out, which makes explicit the embedding relationships among predicate argument structures. LPASs can be used as a generic interface language for integrating semantic representations of different granularities. Some initial experiments have been conducted to convert MRS expressions into LPASs. A simple constraint solver has been developed to resolve the underspecified dominance relations between the predicates and their arguments in MRS expressions. Because of their fine-grained semantic structure, LPASs are useful for high-precision information extraction and question answering tasks. In addition, I have attempted to extend the lexicon of the HPSG English Resource Grammar (ERG) by exploiting WordNet, and to disambiguate HPSG parses with the help of a probabilistic parser, in order to process texts from application domains. Following the presented approach, the ERG can be used to annotate a standard treebank, e.g. the Penn Treebank, with fine-grained semantics. In this vein, I point out opportunities for a fruitful cooperation between the HPSG-annotated Redwoods Treebank and the Penn PropBank. In my current work, I exploit HPSG as an additional knowledge resource for the automatic learning of LPASs from dependency structures.
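
    As an illustration only, the following Python sketch shows how such a constraint solver might enumerate the scope trees licensed by underspecified dominance ("qeq") constraints over a toy, MRS-inspired structure. The predicate names, handles, and constraints are hypothetical simplifications, and a full resolver would also have to enforce variable-binding constraints; this is a minimal sketch of the idea, not the system described in the report.

        from itertools import permutations

        # Toy, MRS-inspired structure for "Every dog chased some cat" (hypothetical handles).
        # Each elementary predication (EP) has a label and zero or more scopal holes.
        eps = {
            "h4":  ("_every_q", ["h5", "h6"]),   # RSTR=h5, BODY=h6
            "h7":  ("_dog_n",   []),
            "h1":  ("_chase_v", []),
            "h8":  ("_some_q",  ["h9", "h10"]),  # RSTR=h9, BODY=h10
            "h11": ("_cat_n",   []),
        }
        top = "h0"
        holes = [top] + [h for _, hs in eps.values() for h in hs]
        labels = list(eps)

        # Underspecified dominance ("qeq") constraints: each hole must end up
        # at or above its label in every fully resolved scope tree.
        qeq = [("h0", "h1"), ("h5", "h7"), ("h9", "h11")]

        def collect(plugging, hole, seen):
            """Gather the EP labels reachable below `hole`; False if a label repeats (cycle)."""
            label = plugging[hole]
            if label in seen:
                return False
            seen.add(label)
            return all(collect(plugging, h, seen) for h in eps[label][1])

        def dominates(plugging, hole, label):
            """In a tree-shaped plugging, does `hole` sit at or above `label`?"""
            seen = set()
            collect(plugging, hole, seen)
            return label in seen

        def resolve():
            """Enumerate pluggings (hole -> EP label) that form a single scope tree
            rooted at `top` and satisfy every qeq constraint."""
            for perm in permutations(labels):
                plugging = dict(zip(holes, perm))
                seen = set()
                if not collect(plugging, top, seen) or seen != set(labels):
                    continue                      # not one tree covering all EPs
                if all(dominates(plugging, h, l) for h, l in qeq):
                    yield plugging

        for reading in resolve():
            print(reading)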

    Licensing Constraint of Negative Polarity Items in English

    The purpose of this paper is to investigate the NPI-licensing constraint, focusing on English interrogative sentences, and to clarify the behavior of negative polarity items within the formalism of HPSG. The Polarity Operator in questions and imperatives will be examined to clarify its syntactic and semantic properties. The analysis of Tonhauser (2001), with which I largely agree, will be modified to gain greater power by incorporating all relevant NPI-licensing phenomena into one single principle: (non)veridicality. Supported by the Grant for the Reform of University Education under the BK21 Project of SNU.

    Children always go beyond the input: The Maximise Minimal Means perspective


    Syntactic and Semantic Underspecification in the Verb Phrase.

    This thesis is concerned with verbs and the relation between verbs and their complements. Syntactic evidence is presented which shows that the distinction between arguments and adjuncts reflects the optionality of adjuncts, but that adjuncts, once introduced, behave as arguments of the verb. An analysis is proposed which reflects this observation by assuming that verbal subcategorization is underspecified, so that optional constituents can be introduced into the verb phrase. The analysis is developed within a formal model of utterance interpretation, Labelled Deductive Systems for Natural Language (LDSNL), proposed in Kempson, Meyer-Viol & Gabbay (1999), which models the structural aspect of utterance interpretation as a dynamic process of tree growth during which lexical information is combined into more complex structures that provide vehicles for interpretation: propositional forms. The contribution of this thesis from the perspective of utterance interpretation is that it explores the notion of structural underspecification with respect to predicate-argument structure. After providing a formalization of underspecified verbal subcategorization, the thesis explores the consequences this analysis of verbs and verb phrases has for the process of tree growth, and how underspecified verbs are interpreted. The main argument developed is that verbs syntactically encode the possibility for pragmatic enrichment; verbs address mental concepts only indirectly, so that the establishment of their eventual meaning, and therefore their eventual arity, is mediated by the cognitive process of concept formation. Additional support for this view is provided by an analysis of applied verbs in Swahili which, from the perspective adopted here, can be seen to encode an explicit instruction for concept strengthening: an instruction to the hearer to derive additional inferential effects. The analysis presented in this thesis thus supports the view that natural language interpretation is a process in which structural properties and inferential activity are thoroughly intertwined.
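
    Purely as an illustration of the central observation (a minimal sketch with hypothetical names, not the LDSNL formalization itself), the following Python fragment shows what it means for verbal subcategorization to be underspecified: the verb fixes only its obligatory arguments, its argument list stays open, and optional constituents, once introduced, end up in the same predicate-argument structure as the arguments.

        def build_pas(verb, obligatory, optional=()):
            """Return a flat predicate-argument structure for `verb`.
            The verb's arity is only fixed once all optional constituents
            have been combined, so adjuncts end up as ordinary arguments."""
            return (verb, tuple(obligatory) + tuple(optional))

        # "Kim cut the bread"  vs.  "Kim cut the bread with a knife"
        print(build_pas("cut", ["Kim", "the bread"]))
        print(build_pas("cut", ["Kim", "the bread"], ["with a knife"]))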

    Modeling information structure in a cross-linguistic perspective

    This study makes substantial contributions to both the theoretical and computational treatment of information structure, with a specific focus on natural language processing applications such as multilingual machine translation systems. The study first provides cross-linguistic findings regarding information structure meanings and markings. Building upon these findings, the proposed model represents information structure within the HPSG/MRS framework using Individual Constraints. The primary goal is to create a multilingual grammar model of information structure for the LinGO Grammar Matrix system. The study explores the construction of a grammar library for creating customized grammars that incorporate information structure, and illustrates how the information structure-based model improves the performance of transfer-based machine translation.
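
    The following Python sketch is only meant to illustrate the underlying idea as I read it from the abstract: information structure is encoded as constraints that tie individuals (semantic indices) to information-structure values, and a transfer-based MT step can then be asked to preserve these constraints. The class name, field names, and indices are hypothetical, not the actual HPSG/MRS or Grammar Matrix encoding.

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class InfoConstraint:
            target: str    # semantic index of the marked constituent, e.g. "x4"
            clause: str    # index of the clause it belongs to, e.g. "e2"
            value: str     # e.g. "focus", "topic", "contrast-focus"

        # "KIM read the book." with narrow focus on the subject (toy indices).
        source_side = [InfoConstraint(target="x4", clause="e2", value="focus")]

        def transfer_ok(source, target):
            """Accept a transfer output only if every source-side
            information-structure constraint has a counterpart on the target side."""
            return all(c in target for c in source)

        print(transfer_ok(source_side, [InfoConstraint("x4", "e2", "focus")]))  # True
        print(transfer_ok(source_side, []))                                     # False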

    Semantics-based Question Generation and Implementation

    This paper presents a question generation system based on the approach of semantic rewriting. State-of-the-art deep linguistic parsing and generation tools are employed to convert, back and forth, between natural language sentences and their meaning representations in the form of Minimal Recursion Semantics (MRS). By carefully operating on the semantic structures, we show a principled way of generating questions without ad hoc manipulation of the syntactic structures. Based on a (partial) understanding of the sentence meaning, the system generates questions that are semantically grounded and purposeful. With the support of deep linguistic grammars, the grammaticality of the generated output is ensured. Furthermore, a specialized ranking model refines the linguistic realizations produced by the general-purpose generation model for the question generation task. The evaluation results from QGSTEC2010 show the promising prospects of the proposed approach.
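
    As a rough illustration of the semantic-rewriting idea (a minimal sketch with hypothetical predicate names and variables, not the actual MRS machinery or the system's rules), the following Python fragment rewrites a simplified, MRS-inspired predication list for a declarative sentence so that one argument is questioned; the rewritten semantics would then be handed to a generator to realise the question.

        from dataclasses import dataclass
        from typing import List, Tuple

        @dataclass(frozen=True)
        class EP:                   # simplified elementary predication
            pred: str
            args: Tuple[str, ...]   # argument variables, e.g. ("e2", "x4", "x9")

        def ask_about(eps: List[EP], target: str, wh_pred: str = "which_q") -> List[EP]:
            """Rewrite the semantics so that `target` is questioned: drop the
            quantifier and restriction EPs over the target variable and add a
            wh-quantifier over it instead."""
            kept = [ep for ep in eps if ep.args != (target,)]
            return kept + [EP(wh_pred, (target,))]

        # "The dog chased the cat."  ->  question the object x9 ("What did the dog chase?")
        declarative = [
            EP("_the_q",     ("x4",)),
            EP("_dog_n_1",   ("x4",)),
            EP("_chase_v_1", ("e2", "x4", "x9")),
            EP("_the_q",     ("x9",)),
            EP("_cat_n_1",   ("x9",)),
        ]
        print(ask_about(declarative, "x9"))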

    Factors 2 and 3: Towards a principled approach

    This paper seeks to make progress in our understanding of the non-UG components of Chomsky's (2005) Three Factors model. In relation to the input (Factor 2), I argue for the need to formulate a suitably precise hypothesis about which aspects of the input will qualify as 'intake' and, hence, serve as the basis for grammar construction. In relation to Factor 3, I highlight a specific cognitive bias that appears well motivated outside of language, while also having wide-ranging consequences for our understanding of how I-language grammars are constructed, and why they should have the crosslinguistically comparable form that generativists have always argued human languages have. This is Maximise Minimal Means (MMM). I demonstrate how its incorporation into our model of grammar acquisition facilitates understanding of diverse facts about natural language typology, acquisition in both "stable" and "unstable" contexts, and the ways in which linguistic systems may change over time.

    A compositional and constraint-based approach to non-sentential utterances

    Schlangen D, Lascarides A. A compositional and constraint-based approach to non-sentential utterances. In: Müller S, ed. Proceedings of the 10th International Conference on Head-Driven Phrase Structure Grammar. East Lansing, Michigan, USA: CSLI Publications, Stanford, USA; 2003: 123-124.

    Meaning versus Grammar

    This volume investigates the complicated relationship between grammar, computation, and meaning in natural languages. It details conditions under which meaning-driven processing of natural language is feasible, discusses an operational and accessible implementation of the grammatical cycle for Dutch, and offers analyses of a number of further conjectures about constituency and entailment in natural language.