
    Incremental constraint-based parsing: an efficient approach for head-final languages

    In this dissertation, I provide a left-to-right incremental parsing approach for Head-driven Phrase Structure Grammar (HPSG; Pollard and Sag 1987, 1994). HPSG is a lexicalized, constraint-based theory of grammar that has also been widely used in computational linguistics in recent years. Head-final languages are known to pose problems for the incrementality of the head-driven parsing models proposed for constraint-based grammar formalisms, in both psycholinguistics and computational linguistics. I therefore focus on processing a head-final language, specifically Turkish, to highlight the challenges that such a language raises. The dissertation makes two principal contributions, the first part mainly providing the theoretical treatment required for the computational approach presented in the second part. The first part of the dissertation is concerned with the analysis of certain phenomena in Turkish grammar within the frame…
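    As an illustration of the constraint-based machinery that HPSG-style parsing rests on, the following is a minimal sketch of unification over feature structures encoded as nested dicts. The feature names and the `unify` helper are hypothetical assumptions for exposition, not the parser described in the dissertation.

```python
# Minimal sketch of feature-structure unification, the core operation in
# constraint-based grammar formalisms such as HPSG. Feature structures are
# nested dicts; atomic values must match exactly. Illustrative toy only.

def unify(fs1, fs2):
    """Return the unification of two feature structures, or None on clash."""
    if isinstance(fs1, dict) and isinstance(fs2, dict):
        result = dict(fs1)
        for feat, val in fs2.items():
            if feat in result:
                sub = unify(result[feat], val)
                if sub is None:
                    return None          # conflicting constraints
                result[feat] = sub
            else:
                result[feat] = val       # new constraint, just add it
        return result
    return fs1 if fs1 == fs2 else None   # atomic values must agree


if __name__ == "__main__":
    # A head's requirements unify with an argument's actual features.
    head = {"CAT": "v", "SUBJ": {"CASE": "nom"}}
    arg = {"SUBJ": {"CASE": "nom", "NUM": "sg"}}
    print(unify(head, arg))
    # {'CAT': 'v', 'SUBJ': {'CASE': 'nom', 'NUM': 'sg'}}
    print(unify(head, {"SUBJ": {"CASE": "acc"}}))  # None: case clash
```

    Unification either merges compatible constraints or fails outright, which is what lets an incremental parser discard an analysis as soon as a conflicting constraint arrives, rather than waiting for the head at the end of a head-final clause.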

    Discovery of Linguistic Relations Using Lexical Attraction

    This work has been motivated by two long-term goals: to understand how humans learn language and to build programs that can understand language. Using a representation that makes the relevant features explicit is a prerequisite for successful learning and understanding. Therefore, I chose to represent relations between individual words explicitly in my model. Lexical attraction is defined as the likelihood of such relations. I introduce a new class of probabilistic language models named lexical attraction models, which can represent long-distance relations between words, and I formalize this new class of models using information theory. Within the framework of lexical attraction, I developed an unsupervised language acquisition program that learns to identify linguistic relations in a given sentence. The only explicitly represented linguistic knowledge in the program is lexical attraction. There is no initial grammar or lexicon built in, and the only input is raw text. Learning and processing are interdigitated. The processor uses the regularities detected by the learner to impose structure on the input. This structure enables the learner to detect higher-level regularities. Using this bootstrapping procedure, the program was trained on 100 million words of Associated Press material and was able to achieve 60% precision and 50% recall in finding relations between content words. Using knowledge of lexical attraction, the program can identify the correct relations in syntactically ambiguous sentences such as "I saw the Statue of Liberty flying over New York."
    Comment: dissertation, 56 pages
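    The abstract formalizes lexical attraction information-theoretically; one standard way to estimate "the likelihood of a relation" between two words is pointwise mutual information, sketched below on a toy corpus. The corpus, the all-pairs candidate heuristic, and the `lexical_attraction` function are illustrative assumptions, not Yuret's actual system.

```python
# Minimal sketch of lexical attraction as pointwise mutual information (PMI)
# between word pairs, following the abstract's information-theoretic framing.
import math
from collections import Counter
from itertools import combinations

sentences = [
    "i saw the statue of liberty".split(),
    "i saw a plane flying over new york".split(),
    "the statue of liberty is in new york".split(),
]

word_count = Counter()
pair_count = Counter()
for sent in sentences:
    word_count.update(sent)
    # Candidate relations: all unordered word pairs within a sentence.
    pair_count.update(frozenset(p) for p in combinations(sent, 2) if p[0] != p[1])

n_words = sum(word_count.values())
n_pairs = sum(pair_count.values())

def lexical_attraction(w1, w2):
    """PMI estimate of the attraction between two words (in bits)."""
    p_pair = pair_count[frozenset((w1, w2))] / n_pairs
    p_w1 = word_count[w1] / n_words
    p_w2 = word_count[w2] / n_words
    return math.log2(p_pair / (p_w1 * p_w2)) if p_pair > 0 else float("-inf")

print(lexical_attraction("statue", "liberty"))  # strongly attracted
print(lexical_attraction("saw", "liberty"))     # weakly attracted
```

    In the disambiguation example from the abstract, a model of this kind would prefer linking "flying" to "I" over "Statue of Liberty" because the former pair has higher attraction in the training data.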

    A derivational model of discontinuous parsing

    The notion of latent-variable probabilistic context-free derivation of syntactic structures is enhanced to allow heads and unrestricted discontinuities. The chosen formalization covers both constituency parsing and dependency parsing. Within the new framework, one obtains a probability distribution over the space of all discontinuous parses, which lends itself to intrinsic evaluation in terms of cross-entropy. The derivational model is accompanied by an equivalent automaton model, which can be used for deterministic parsing.
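    To make the intrinsic-evaluation claim concrete, here is a minimal sketch under assumed toy rule probabilities: a derivation's probability is the product of its rule probabilities, and cross-entropy is the average negative log-probability over a corpus of derivations. All rules and numbers are invented for illustration and are not the paper's model.

```python
# Minimal sketch of intrinsic evaluation by cross-entropy: a derivational
# model assigns each parse a probability (here, a product of rule
# probabilities), and model fit is the average negative log-probability.
import math

# Toy probabilistic rules: P(rule | left-hand side), summing to 1 per LHS.
rule_prob = {
    ("S", ("NP", "VP")): 1.0,
    ("NP", ("she",)): 0.5,
    ("NP", ("NP", "PP")): 0.5,
    ("VP", ("saw", "NP")): 1.0,
}

def derivation_prob(rules):
    """Probability of a derivation given as a sequence of rules."""
    p = 1.0
    for rule in rules:
        p *= rule_prob[rule]
    return p

corpus = [
    [("S", ("NP", "VP")), ("NP", ("she",)), ("VP", ("saw", "NP")), ("NP", ("she",))],
]

# Cross-entropy in bits per parse: lower means a better fit to the corpus.
cross_entropy = -sum(math.log2(derivation_prob(d)) for d in corpus) / len(corpus)
print(f"cross-entropy: {cross_entropy:.2f} bits/parse")  # 2.00 bits/parse
```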