8 research outputs found

    Proceedings of the Fifth Meeting on Mathematics of Language: MOL5

    Extraction and Coordination in Phrase Structure Grammar and Categorial Grammar

    A large proportion of computationally-oriented theories of grammar operate within the confines of monostratality (i.e. there is only one level of syntactic analysis), compositionality (i.e. the meaning of an expression is determined by the meanings of its syntactic parts, plus their manner of combination), and adjacency (i.e. the only operation on terminal strings is concatenation). This thesis looks at two major approaches falling within these bounds: that based on phrase structure grammar (e.g. Gazdar), and that based on categorial grammar (e.g. Steedman). The theories are examined with reference to extraction and coordination constructions; crucially, a range of 'compound' extraction and coordination phenomena are brought to bear. It is argued that the early phrase structure grammar metarules can characterise operations generating compound phenomena, but in so doing require a categorial-like category system. It is also argued that while categorial grammar contains an adequate category apparatus, Steedman's primitives such as composition do not extend to cover the full range of data. A theory is therefore presented integrating the approaches of Gazdar and Steedman. The central issue as regards processing is derivational equivalence: the grammars under consideration typically generate many semantically equivalent derivations of an expression. This problem is addressed by showing how to axiomatise derivational equivalence, and a parser is presented which employs the axiomatisation to avoid following equivalent paths.
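
    The derivational-equivalence problem mentioned at the end of this abstract can be seen with two standard categorial rules, forward application and forward composition: composing first and then applying yields the same result category as applying twice. The sketch below is illustrative only, not the thesis's own formalisation or parser; the names (Cat, apply_fwd, compose_fwd) and the toy categories are assumptions.

        # A minimal sketch (not the thesis's own formalisation) of two categorial
        # combination rules and the "derivational equivalence" problem the abstract
        # ends on. All names here (Cat, apply_fwd, compose_fwd) are assumptions.

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Cat:
            """A category: either atomic (e.g. NP) or a forward functor X/Y."""
            result: object = None    # result category of a functor
            arg: object = None       # argument category of a functor
            atom: str = None         # name of an atomic category

            def __repr__(self):
                return self.atom if self.atom else f"({self.result}/{self.arg})"

        def atom(name):            return Cat(atom=name)
        def fwd(result, argument): return Cat(result=result, arg=argument)

        def apply_fwd(x, y):
            """Forward application: X/Y  Y  =>  X."""
            if x.atom is None and x.arg == y:
                return x.result
            return None

        def compose_fwd(x, y):
            """Forward composition (Steedman's B): X/Y  Y/Z  =>  X/Z."""
            if x.atom is None and y.atom is None and x.arg == y.result:
                return fwd(x.result, y.arg)
            return None

        # Two derivations of "A/B  B/C  C" that differ only in bracketing:
        A, B, C = atom("A"), atom("B"), atom("C")
        left_first  = apply_fwd(compose_fwd(fwd(A, B), fwd(B, C)), C)  # compose, then apply
        right_first = apply_fwd(fwd(A, B), apply_fwd(fwd(B, C), C))    # apply twice
        assert left_first == right_first == A  # same category: the two derivations are equivalent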

    Learning Functional Prepositions

    In first language acquisition, what does it mean for a grammatical category to have been acquired, and what are the mechanisms by which children learn functional categories in general? In the context of prepositions (Ps), if the lexical/functional divide cuts through the P category, as has been suggested in the theoretical literature, then constructivist accounts of language acquisition would predict that children develop adult-like competence with the more abstract units, functional Ps, at a slower rate compared to their acquisition of lexical Ps. Nativists instead assume that the features of functional P are made available by Universal Grammar (UG), and are mapped as quickly as, if not faster than, the semantic features of their lexical counterparts. Conversely, if Ps are either all lexical or all functional, on both accounts of acquisition we should observe few differences in learning. Three empirical studies of the development of P were conducted via computer analysis of the English and Spanish sub-corpora of the CHILDES database. Study 1 analyzed errors in child usage of Ps, finding almost no errors of commission in either language, but that the English learners lag in their production of functional Ps relative to lexical Ps. That no such delay was found in the Spanish data suggests that the English pattern is not universal. Studies 2 and 3 applied novel measures of phrasal (P head + nominal complement) productivity to the data. Study 2 examined prepositional phrases (PPs) whose head-complement pairs appeared in both child and adult speech, while Study 3 considered PPs produced by children that never occurred in adult speech. In both studies the productivity of functional Ps for English children developed faster than that of lexical Ps. In Spanish there were few differences, suggesting that children had already mastered both orders of Ps early in acquisition. These empirical results suggest that, at least in English, P is indeed a split category, and that children acquire the syntax of the functional subset very quickly, committing almost no errors. The UG position is thus supported. Next, the dissertation investigates a 'soft nativist' acquisition strategy that combines the distributional analysis of input, minimal a priori knowledge of the possible co-occurrence of morphosyntactic features associated with functional elements, and linguistic knowledge that is presumably acquired via the experience of pragmatic, communicative situations. The output of the analysis consists in a mapping of morphemes to the feature bundles of nominative pronouns for English and Spanish, plus specific claims about the sort of knowledge required from experience. The acquisition model is then extended to adpositions, to examine what, if anything, distributional analysis can tell us about the functional sequences of PPs. The results confirm the theoretical position according to which spatiotemporal Ps are lexical in character, rooting their own extended projections, and that functional Ps express an aspectual sequence in the functional superstructure of the PP.
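
    As a rough illustration of the head + complement productivity counts described for Studies 2 and 3, one can tabulate, for each preposition, the distinct nominal complements it heads in child versus adult speech, then take the shared pairs (Study 2-style) and the child-only pairs (Study 3-style). This is a simplified sketch, not the dissertation's actual measures; the toy data and the function name below are invented.

        # A simplified sketch of the head + complement productivity counts described
        # for Studies 2 and 3. The dissertation's actual measures are more elaborate;
        # the toy data and the function name below are assumptions.

        from collections import defaultdict

        def pp_productivity(pairs):
            """Map each preposition to the set of distinct nominal complements it heads."""
            table = defaultdict(set)
            for prep, complement in pairs:
                table[prep].add(complement)
            return table

        child_pps = [("in", "box"), ("in", "car"), ("of", "juice"), ("to", "park")]
        adult_pps = [("in", "box"), ("of", "juice"), ("of", "milk"), ("to", "park")]

        child = pp_productivity(child_pps)
        adult = pp_productivity(adult_pps)

        # Study-2-style overlap: head-complement pairs attested in both child and adult speech.
        shared = {p: child[p] & adult[p] for p in child if p in adult}

        # Study-3-style novelty: child pairs never observed in adult speech.
        novel = {p: child[p] - adult.get(p, set()) for p in child}

        print(shared)  # {'in': {'box'}, 'of': {'juice'}, 'to': {'park'}}
        print(novel)   # {'in': {'car'}, 'of': set(), 'to': set()}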

    Classification-based phrase structure grammar: an extended revised version of HPSG

    This thesis is concerned with a presentation of Classification-based Phrase Structure Grammar (or cPSG), a grammatical theory that has grown out of extensive revisions of, and extensions to, HPSG. The fundamental difference between this theory and HPSG concerns the central role that classification plays in the grammar: the grammar classifies strings, according to their feature structure descriptions, as being of various types. Apart from the role of classification, the theory bears a close resemblance to HPSG, though it is by no means a direct translation, including numerous revisions and extensions. A central goal in the development of the theory has been its computational implementation, which is included in the thesis. The presentation may be divided into four parts. In the first, chapters 1 and 2, we present the grammatical formalism within which the theory is stated. This consists of a development of the notion of a classificatory system (chapter 1), and the incorporation of hierarchality into that notion (chapter 2). The second part concerns syntactic issues. Chapter 3 revises the HPSG treatment of specifiers, complements and adjuncts, incorporating ideas that specifiers and complements should be distinguished and presenting a treatment of adjuncts whereby the head is selected for by the adjunct. Chapter 4 presents several options for an account of unbounded dependencies. The accounts are based loosely on that of GPSG, and a reconstruction of GPSG's Foot Feature Principle is presented which does not involve a notion of default. Chapter 5 discusses coordination, employing an extension of Rounds-Kasper logic to allow a treatment of cross-categorial coordination. In the third part, chapters 6, 7 and 8, we turn to semantic issues. We begin (Chapter 6) with a discussion of Situation Theory, the background semantic theory, attempting to establish a precise and coherent version of the theory within which to work. Chapter 7 presents the bulk of the treatment of semantics, and can be seen as an extensive revision of the HPSG treatment of semantics. The aim is to provide a semantic treatment which is faithful to the version of Situation Theory presented in Chapter 6. Chapter 8 deals with quantification, discussing the nature of quantification in Situation Theory before presenting a treatment of quantification in cPSG. Some residual questions about the semantics of coordinated noun phrases are also addressed in this chapter. The final part, Chapter 9, concerns the actual computational implementation of the theory. A parsing algorithm based on hierarchical classification is presented, along with four strategies that might be adopted given that algorithm. Also discussed are some implementation details. A concluding chapter summarises the arguments of the thesis and outlines some avenues for future research.
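
    The idea that the grammar classifies strings by descending a hierarchy of types, each adding constraints on the feature structure description, can be sketched roughly as follows. The hierarchy, feature names, and function names are invented for illustration and are not taken from cPSG itself.

        # A rough sketch of classification by descent through a type hierarchy, in the
        # spirit of "the grammar classifies strings, according to their feature structure
        # descriptions, as being of various types". The hierarchy, features and names
        # below are invented for illustration, not taken from cPSG.

        HIERARCHY = {
            "sign":   {"constraints": {}, "subtypes": ["phrase", "word"]},
            "word":   {"constraints": {"LEXICAL": True},  "subtypes": []},
            "phrase": {"constraints": {"LEXICAL": False}, "subtypes": ["head-comp-phrase"]},
            "head-comp-phrase": {"constraints": {"LEXICAL": False, "COMPS": "saturated"},
                                 "subtypes": []},
        }

        def satisfies(description, constraints):
            """True if the feature description is consistent with every constraint."""
            return all(description.get(f) == v for f, v in constraints.items())

        def classify(description, node="sign"):
            """Descend the hierarchy to the most specific type the description satisfies."""
            for sub in HIERARCHY[node]["subtypes"]:
                if satisfies(description, HIERARCHY[sub]["constraints"]):
                    return classify(description, sub)
            return node

        print(classify({"LEXICAL": True}))                         # word
        print(classify({"LEXICAL": False, "COMPS": "saturated"}))  # head-comp-phrase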

    Reflexives and tree unification grammar

    Incremental constraint-based parsing: an efficient approach for head-final languages

    In this dissertation, I provide a left-to-right incremental parsing approach for Head-driven Phrase Structure Grammar (HPSG; Pollard and Sag (1987, 1994)). HPSG is a lexicalized, constraint-based theory of grammar, which has also been widely exploited in computational linguistics in recent years. Head-final languages are known to pose problems for the incrementality of head-driven parsing models, proposed for parsing with constraint-based grammar formalisms, in both psycholinguistics and computational linguistics. Therefore, here I further focus my attention on processing a head-final language, specifically Turkish, to highlight any challenges that may arise in the case of such a language. The dissertation makes two principal contributions, the first part mainly providing the theoretical treatment required for the computational approach presented in the second part. The first part of the dissertation is concerned with the analysis of certain phenomena in Turkish grammar within the frame…
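
    The head-final incrementality problem referred to here can be illustrated with a toy SOV example: under a strictly head-driven strategy, nothing combines until the clause-final verb arrives, so every dependent must be buffered. The mini-grammar, category labels, and reduce step below are assumptions for illustration only, not the dissertation's own parsing approach.

        # A toy illustration of the head-final problem: in a Turkish SOV clause the
        # verbal head arrives last, so a strictly head-driven parser must buffer every
        # dependent before anything can combine. The mini-grammar, category labels and
        # reduce step below are assumptions for illustration, not the dissertation's
        # own algorithm.

        SENTENCE = [("Ali", "NP"), ("kitabı", "NP"), ("okudu", "V")]
        SUBCAT = {"okudu": ["NP", "NP"]}      # the clause-final verb selects two NPs

        def head_driven_parse(words):
            stack = []
            for form, cat in words:
                if cat == "V":
                    needed = SUBCAT[form]
                    args = stack[-len(needed):]    # dependents combine only once the head arrives
                    assert [c for _, c in args] == needed
                    return ("S", args + [(form, cat)])
                stack.append((form, cat))          # until then, everything just waits
            raise ValueError("no head found")

        print(head_driven_parse(SENTENCE))
        # ('S', [('Ali', 'NP'), ('kitabı', 'NP'), ('okudu', 'V')])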