
    Disambiguation of Super Parts of Speech (or Supertags): Almost Parsing

    In a lexicalized grammar formalism such as Lexicalized Tree-Adjoining Grammar (LTAG), each lexical item is associated with at least one elementary structure (supertag) that localizes syntactic and semantic dependencies. A parser for a lexicalized grammar must therefore search a large set of supertags to choose the right ones to combine into a parse of the sentence. We present techniques for disambiguating supertags using local information such as lexical preference and local lexical dependencies. The similarity between LTAG and dependency grammars is exploited in the dependency model of supertag disambiguation. Performance results are presented for several models of supertag disambiguation: unigram, trigram and dependency-based.
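
    A minimal sketch (not from the paper) of how n-gram supertag disambiguation can be decoded: supertags are treated as hidden states and the best sequence is found with Viterbi search. The toy lexicon, probabilities and bigram transition table below are illustrative assumptions; a trigram model as described in the abstract would condition on the two preceding supertags instead.

    import math

    # Hypothetical lexicon: P(word | supertag) for a toy three-word sentence.
    emission = {
        "the":   {"DET": 1.0},
        "dog":   {"N_subj": 0.6, "N_obj": 0.4},
        "barks": {"V_intrans": 0.8, "V_trans": 0.2},
    }

    # Hypothetical bigram transitions P(tag_i | tag_{i-1}).
    transition = {
        ("<s>", "DET"): 0.9,
        ("DET", "N_subj"): 0.7, ("DET", "N_obj"): 0.3,
        ("N_subj", "V_intrans"): 0.8, ("N_subj", "V_trans"): 0.2,
        ("N_obj", "V_intrans"): 0.1, ("N_obj", "V_trans"): 0.4,
    }

    def viterbi_supertag(words):
        """Return the most probable supertag sequence under the toy model."""
        # Each chart cell maps a supertag to (log-probability, backpointer).
        chart = [{"<s>": (0.0, None)}]
        for word in words:
            cell = {}
            for tag, p_emit in emission[word].items():
                best = None
                for prev_tag, (prev_lp, _) in chart[-1].items():
                    p_trans = transition.get((prev_tag, tag), 1e-6)
                    lp = prev_lp + math.log(p_trans) + math.log(p_emit)
                    if best is None or lp > best[0]:
                        best = (lp, prev_tag)
                cell[tag] = best
            chart.append(cell)
        # Follow backpointers from the best-scoring final supertag.
        tag = max(chart[-1], key=lambda t: chart[-1][t][0])
        tags = []
        for cell in reversed(chart[1:]):
            tags.append(tag)
            tag = cell[tag][1]
        return list(reversed(tags))

    print(viterbi_supertag(["the", "dog", "barks"]))  # ['DET', 'N_subj', 'V_intrans']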

    A syntactified direct translation model with linear-time decoding

    Recent syntactic extensions of statistical translation models work with a synchronous context-free or tree-substitution grammar extracted from an automatically parsed parallel corpus. The decoders accompanying these extensions typically exceed quadratic time complexity. This paper extends the Direct Translation Model 2 (DTM2) with syntax while maintaining linear-time decoding. We employ a linear-time parsing algorithm based on an eager, incremental interpretation of Combinatory Categorial Grammar (CCG). As each input word is processed, local parsing decisions resolve ambiguity eagerly by selecting a single supertag–operator pair that extends the dependency parse incrementally. Alongside translation features extracted from the derived parse tree, we explore syntactic features extracted from the incremental derivation process. Our empirical experiments show that our model significantly outperforms the state-of-the-art DTM2 system.
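
    A minimal sketch (not the paper's decoder) of the eager, incremental idea: as each word is read, a single (supertag, operator) pair is chosen greedily to extend the partial derivation, so decoding stays linear in sentence length. The CCG categories, operator names and scores below are illustrative assumptions, not the paper's model.

    from dataclasses import dataclass

    @dataclass
    class Choice:
        supertag: str   # CCG category assigned to the word
        operator: str   # how the word combines with the partial derivation
        score: float    # hypothetical model score

    # Hypothetical per-word candidates produced by a supertagger.
    CANDIDATES = {
        "we":    [Choice("NP", "shift", 0.9)],
        "saw":   [Choice(r"(S\NP)/NP", "forward-apply", 0.7),
                  Choice(r"S\NP", "backward-apply", 0.2)],
        "stars": [Choice("NP", "forward-apply", 0.8),
                  Choice(r"NP\NP", "backward-apply", 0.1)],
    }

    def eager_parse(words):
        """Resolve ambiguity word by word, with no backtracking."""
        derivation = []
        for word in words:
            # Eager disambiguation: commit to the single best-scoring pair now
            # instead of keeping alternatives, giving linear-time decoding.
            best = max(CANDIDATES[word], key=lambda c: c.score)
            derivation.append((word, best.supertag, best.operator))
        return derivation

    for step in eager_parse(["we", "saw", "stars"]):
        print(step)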

    Anaphora and Discourse Structure

    We argue in this paper that many common adverbial phrases generally taken to signal a discourse relation between syntactically connected units within discourse structure instead work anaphorically to contribute relational meaning, with only indirect dependence on discourse structure. This allows a simpler discourse structure to provide scaffolding for compositional semantics, and reveals multiple ways in which the relational meaning conveyed by adverbial connectives can interact with that associated with discourse structure. We conclude by sketching out a lexicalised grammar for discourse that facilitates discourse interpretation as a product of compositional rules, anaphor resolution and inference.

    Rethinking Overspecification in Terms of Incremental Processing


    Completions, Coordination, and Alignment in Dialogue

    Collaborative completions are among the strongest evidence that dialogue requires coordination even at the sub-sentential level; the study of sentence completions may thus shed light on a number of central issues both at the 'macro' level of dialogue management and at the 'micro' level of the semantic interpretation of utterances. We propose a treatment of collaborative completions in PTT, a theory of interpretation in dialogue that provides some of the necessary ingredients for a formal account of completions at the 'micro' level, such as a theory of incremental utterance interpretation and an account of grounding. We argue that an account of semantic interpretation in completions can be provided through relatively straightforward generalizations of existing theories of syntax, such as Lexicalized Tree Adjoining Grammar (LTAG), and of semantics, such as (Compositional) DRT and Situation Semantics. At the 'macro' level, we provide an intentional account of completions, as well as a preliminary account within Pickering and Garrod's alignment theory.

    Strong connectivity hypothesis and generative power in TAG
