5 research outputs found

    Parsing Strategies With 'Lexicalized' Grammars: Application to Tree Adjoining Grammars

    In this paper, we present a parsing strategy that arose from the development of an Earley-type parsing algorithm for TAGs (Schabes and Joshi 1988) and from some recent linguistic work in TAGs (Abeillé 1988a). In our approach, each elementary structure is systematically associated with a lexical head. These structures specify extended domains of locality (as compared to a context-free grammar) over which constraints can be stated. These constraints either hold within the elementary structure itself or specify what other structures can be composed with a given elementary structure. The 'grammar' consists of a lexicon where each lexical item is associated with a finite number of structures for which that item is the head. There are no separate grammar rules. There are, of course, 'rules' which tell us how these structures are composed. A grammar of this form will be said to be 'lexicalized'. We show that in general context-free grammars cannot be 'lexicalized'. We then show how a 'lexicalized' grammar naturally follows from the extended domain of locality of TAGs and briefly examine some of the linguistic implications of our approach. A general parsing strategy for 'lexicalized' grammars is discussed. In the first stage, the parser selects a set of elementary structures associated with the lexical items in the input sentence, and in the second stage the sentence is parsed with respect to this set. The strategy is independent of the nature of the elementary structures in the underlying grammar; however, we focus our attention on TAGs. Since the set of trees selected at the end of the first stage is finite, the parser can in principle use any search strategy. In particular, a top-down strategy can be used, since problems due to recursive structures are eliminated. We then explain how the Earley-type parser for TAGs can be modified to take advantage of this approach.
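
    The following is a minimal sketch of the two-stage strategy described in this abstract, assuming a toy lexicon and a placeholder composition step; the words, tree names, and the compose() function are illustrative, not the authors' implementation.

```python
from typing import Dict, List, Set

# Given: each lexical item selects a finite set of elementary structures
# (represented here just by tree names) for which it is the head.
LEXICON: Dict[str, Set[str]] = {
    "john":  {"NP_john"},
    "loves": {"T_loves_transitive"},
    "mary":  {"NP_mary"},
}

def select_structures(sentence: List[str]) -> Set[str]:
    """Stage 1: collect the elementary structures anchored by the input words.
    The result is always finite, so recursion poses no termination problem."""
    selected: Set[str] = set()
    for word in sentence:
        selected |= LEXICON.get(word, set())
    return selected

def compose(structures: Set[str], sentence: List[str]) -> bool:
    """Placeholder for Stage 2: a real TAG parser would try substitutions and
    adjunctions among the selected trees and check the resulting spans."""
    return bool(structures)

def parse(sentence: List[str]) -> bool:
    """Run both stages; any search strategy (even top-down) may be used in
    Stage 2 because the candidate set produced by Stage 1 is finite."""
    return compose(select_structures(sentence), sentence)

print(parse(["john", "loves", "mary"]))   # True with this toy lexicon
```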

    Type-driven natural language analysis

    The purpose of this thesis is to show how recent developments in logic programming can be exploited to encode, in a computational environment, the features of certain linguistic theories. In this way we make available for natural language processing sophisticated capabilities of linguistic analysis that are directly justified by well-developed grammatical frameworks. More specifically, we exploit hypothetical reasoning, recently proposed as one of the possible directions for extending logic programming, to account for the syntax of filler-gap dependencies along the lines of linguistic theories such as Generalized Phrase Structure Grammar and Categorial Grammar. Moreover, for the semantic analysis of the same kind of phenomena, we make use of another recently proposed extension, interestingly related to the previous one: the replacement of first-order terms with the more expressive λ-terms of the λ-calculus.
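
    As a small illustration of the last point (not taken from the thesis itself), the sketch below models lexical meanings directly as λ-terms, written here as Python functions, so that semantic composition in a categorial-style derivation is just function application; the tiny lexicon is an assumption made for the example.

```python
# Lexical meanings as λ-terms (modelled as Python functions); individual
# constants are plain strings. All entries below are illustrative assumptions.
john, mary = "john", "mary"
loves = lambda obj: (lambda subj: f"loves({subj},{obj})")   # λo.λs.loves(s,o)

# Categorial-style composition is just function application:
vp = loves(mary)    # (S\NP): λs.loves(s,mary)
s = vp(john)        # S:      loves(john,mary)
print(s)            # -> loves(john,mary)
```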

    A definite clause version of Categorial Grammar

    No full text

    Proof nets for linguistic analysis

    This book investigates the possible linguistic applications of proof nets, redundancy-free representations of proofs introduced by Girard for linear logic. We will adapt the notion of proof net to allow the formulation of a proof net calculus which is sound and complete for the multimodal Lambek calculus. Finally, we will investigate the computational and complexity-theoretic consequences of this calculus and give an introduction to a practical grammar development tool based on proof nets.