8,052 research outputs found

    Logical model of competence and performance in the human sentence processor


    Degraded acceptability and markedness in syntax, and the stochastic interpretation of optimality theory

    The argument that I tried to elaborate in this paper is that the conceptual problem behind the traditional competence/performance distinction does not go away, even if we abandon its original Chomskyan formulation. It returns as the question about the relation between the model of the grammar and the results of empirical investigations – the question of empirical verification. The theoretical concept of markedness is argued to be an ideal correlate of gradience. Optimality Theory, being based on markedness, is a promising framework for the task of bridging the gap between model and empirical world. However, this task requires not only a model of grammar, but also a theory of the methods that are chosen in empirical investigations and of how their results are interpreted, and a theory of how to derive predictions for these particular empirical investigations from the model. Stochastic Optimality Theory is one possible formulation of a proposal that derives empirical predictions from an OT model. However, I hope to have shown that it is not enough to take frequency distributions and relative acceptabilities at face value and simply construct some Stochastic OT model that fits the facts. These facts first need to be interpreted, and the factors that the grammar has to account for must be sorted out from those about which the grammar should have nothing to say. This task, to my mind, is more complicated than the picture that a simplistic application of (not only) Stochastic OT might draw.
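    As a rough illustration of the kind of mechanism the abstract discusses, the following minimal sketch implements a Boersma-style Stochastic OT evaluation: each constraint carries a ranking value, Gaussian evaluation noise is added at every evaluation, and the resulting noisy hierarchy selects a winner, so repeated evaluations yield an output frequency distribution rather than a single categorical verdict. The constraint names, violation profiles, and ranking values are hypothetical placeholders, not data or analyses from the paper.

import random

def stochastic_ot_winner(candidates, ranking_values, noise_sd=2.0):
    """Pick one winner under a single noisy evaluation (Stochastic OT sketch).

    candidates: dict mapping candidate -> {constraint: violation count}
    ranking_values: dict mapping constraint -> ranking value on a continuous scale
    """
    # Perturb each ranking value with Gaussian evaluation noise, then order
    # constraints from highest (most dominant) to lowest.
    noisy = {c: r + random.gauss(0.0, noise_sd) for c, r in ranking_values.items()}
    hierarchy = sorted(noisy, key=noisy.get, reverse=True)

    # Standard OT evaluation: walk down the hierarchy, keeping only candidates
    # with the fewest violations of the current constraint.
    remaining = list(candidates)
    for constraint in hierarchy:
        fewest = min(candidates[c].get(constraint, 0) for c in remaining)
        remaining = [c for c in remaining if candidates[c].get(constraint, 0) == fewest]
        if len(remaining) == 1:
            break
    return remaining[0]

# Hypothetical two-candidate example; repeated evaluations give the frequency
# distribution that, as the abstract argues, still has to be interpreted before
# it can be taken as evidence about the grammar.
cands = {"cand_A": {"Markedness": 1, "Faithfulness": 0},
         "cand_B": {"Markedness": 0, "Faithfulness": 1}}
ranks = {"Markedness": 100.0, "Faithfulness": 99.0}
counts = {}
for _ in range(10000):
    winner = stochastic_ot_winner(cands, ranks)
    counts[winner] = counts.get(winner, 0) + 1
print(counts)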

    Grammars and Processors

    The paper discusses the role of grammars in sentence processing and explores some consequences of the Strong Competence Hypothesis of Bresnan and Kaplan for combinatory theories of grammar.

    A Computational Model of Syntactic Processing: Ambiguity Resolution from Interpretation

    Syntactic ambiguity abounds in natural language, yet humans have no difficulty coping with it. In fact, the process of ambiguity resolution is almost always unconscious. It is not infallible, however, as example 1 demonstrates. 1. The horse raced past the barn fell. This sentence is perfectly grammatical, as is evident when it appears in the following context: 2. Two horses were being shown off to a prospective buyer. One was raced past a meadow, and the other was raced past a barn. ... Grammatical yet unprocessable sentences such as 1 are called 'garden-path sentences.' Their existence provides an opportunity to investigate the human sentence processing mechanism by studying how and when it fails. The aim of this thesis is to construct a computational model of language understanding which can predict processing difficulty. The data to be modeled are known examples of garden-path and non-garden-path sentences, and other results from psycholinguistics. It is widely believed that there are two distinct loci of computation in sentence processing: syntactic parsing and semantic interpretation. One longstanding controversy is which of these two modules bears responsibility for the immediate resolution of ambiguity. My claim is that it is the latter, and that the syntactic processing module is a very simple device which blindly and faithfully constructs all possible analyses for the sentence up to the current point of processing. The interpretive module serves as a filter, occasionally discarding certain of these analyses which it deems less appropriate for the ongoing discourse than their competitors. This document is divided into three parts. The first is introductory, and reviews a selection of proposals from the sentence processing literature. The second part explores a body of data which has been adduced in support of a theory of structural preferences – one that is inconsistent with the present claim. I show how the current proposal can be specified to account for the available data, and moreover to predict where structural preference theories will go wrong. The third part is a theoretical investigation of how well the proposed architecture can be realized using current conceptions of linguistic competence. In it, I present a parsing algorithm and a meaning-based ambiguity resolution method.
    Comment: 128 pages, LaTeX source compressed and uuencoded, figures separate; macros: rotate.sty, lingmacros.sty, psfig.tex. Dissertation, Computer and Information Science Dept., October 199
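    A minimal sketch of the architecture the abstract describes, under assumed interfaces: the syntactic module blindly extends every analysis compatible with the next word, and an interpretive module filters out analyses judged much less appropriate than their best competitor. The helper names extend_analyses and plausibility, and the threshold parameter, are hypothetical placeholders, not the thesis's actual parser or interpretation procedure.

def parse_incrementally(words, extend_analyses, plausibility, threshold=0.2):
    """Incremental parsing with an interpretive filter (sketch of the architecture above).

    extend_analyses(analysis, word) -> every syntactic continuation of analysis with word
    plausibility(analysis)          -> interpretive appropriateness score in [0, 1]
    """
    analyses = [()]  # start from a single empty analysis
    for word in words:
        # Syntactic module: faithfully construct all possible analyses so far.
        analyses = [ext for a in analyses for ext in extend_analyses(a, word)]
        if not analyses:
            # A garden path: the only viable analysis was discarded earlier.
            raise ValueError(f"processing breaks down at {word!r}")
        # Interpretive module: occasionally discard analyses deemed much less
        # appropriate for the ongoing discourse than their competitors.
        best = max(plausibility(a) for a in analyses)
        analyses = [a for a in analyses if plausibility(a) >= best - threshold]
    return analyses

    On this picture, 'The horse raced past the barn fell' breaks down because the reduced-relative analysis is filtered out before 'fell' arrives, whereas the supportive context in example 2 keeps that analysis plausible enough to survive.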

    The neurocognition of syntactic processing


    A New Perspective for Second Language Acquisition: Parsing

    The focus of this paper is on the possible contribution of research in Parsing Theory to the field of Second Language Acquisition. The aim of this paper is an examination of the relationship between the parsing mechanism and the process of language acquisition, and, more specifically, second language acquisition. The suggestion is made that this framework is a valid and interesting one to pursue in that it might provide evidence to support current hypotheses in Parsing Theory and in Second Language Acquisition.

    Natural Language Processing

    The subject of Natural Language Processing can be considered in both broad and narrow senses. In the broad sense, it covers processing issues at all levels of natural language understanding, including speech recognition, syntactic and semantic analysis of sentences, reference to the discourse context (including anaphora, inference of referents, and more extended relations of discourse coherence and narrative structure), conversational inference and implicature, and discourse planning and generation. In the narrower sense, it covers the syntactic and semantic processing of sentences to deliver semantic objects suitable for referring, inferring, and the like. Of course, the results of inference and reference may under some circumstances play a part in processing in the narrow sense. But the processes that are characteristic of these other modules are not the primary concern.

    G-Complexity, Quantum Computation and Anticipatory Processes
