    From hierarchies to features: person splits and direct-inverse alternations

    In the recent literature there is growing interest in the morpho-syntactic encoding of hierarchical effects. This paper investigates one domain where such effects are attested: ergative splits conditioned by person. This type of split is then compared to hierarchical effects in direct-inverse alternations. On the basis of two case studies (Lummi, instantiating a person-conditioned ergative split, and Passamaquoddy, an inverse language), we offer an account that makes no use of hierarchies as a primitive. We propose that the two language types differ in the location of person features: in inverse systems, person features are located exclusively in T, while in ergative split systems they are located both in T and in a particular type of v. A consequence of our analysis is that Case checking in split and inverse systems is guided by the presence or absence of specific phi-features. This in turn provides evidence for a close connection between Case and phi-features, reminiscent of Chomsky's (2000, 2001) Agree.

    A matter of time: Implicit acquisition of recursive sequence structures

    A dominant hypothesis in empirical research on the evolution of language is that the fundamental difference between animal and human communication systems is captured by the distinction between regular and more complex non-regular grammars. Studies reporting successful artificial grammar learning of nested recursive structures, and imaging studies of the same, have methodological shortcomings: they typically allow explicit problem-solving strategies, which subsequent behavioral studies have shown to account for the learning effect. The present study overcomes these shortcomings by using subtle violations of agreement structure in a preference classification task. In contrast to the studies conducted so far, we use an implicit learning paradigm, allowing the time needed for both abstraction processes and consolidation to take place. Our results demonstrate robust implicit learning of recursively embedded structures (context-free grammar) and recursive structures with cross-dependencies (context-sensitive grammar) in an artificial grammar learning task spanning 9 days. Keywords: implicit artificial grammar learning; centre embedding; cross-dependency; implicit learning; context-sensitive grammar; context-free grammar; regular grammar; non-regular grammar.
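    The two non-regular patterns the abstract contrasts can be made concrete with a small sketch (hypothetical illustration, not the study's actual stimuli): nested embedding pairs dependent elements in mirror order, while cross-serial dependencies pair them in the same order.

    ```python
    # Sketch of the two non-regular dependency patterns: center embedding
    # (context-free, mirror-image pairing) vs. cross-serial dependencies
    # (context-sensitive, same-order pairing). The "a1"/"b1" labels are
    # placeholders for agreeing elements.

    def center_embedded(pairs):
        """Context-free nesting: A1 A2 ... An Bn ... B2 B1."""
        a, b = zip(*pairs)
        return list(a) + list(reversed(b))

    def cross_dependency(pairs):
        """Context-sensitive cross-serial order: A1 A2 ... An B1 B2 ... Bn."""
        a, b = zip(*pairs)
        return list(a) + list(b)

    pairs = [("a1", "b1"), ("a2", "b2"), ("a3", "b3")]
    print(center_embedded(pairs))   # ['a1', 'a2', 'a3', 'b3', 'b2', 'b1']
    print(cross_dependency(pairs))  # ['a1', 'a2', 'a3', 'b1', 'b2', 'b3']
    ```

    A "subtle agreement violation" in the study's sense would be a string in which one A is paired with the wrong B, e.g. swapping b2 and b3 in either pattern.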

    Struggling for Structure: cognitive origins of grammatical diversity and their implications for the Human Faculty of Language

    There are between 5,000 and 8,000 distinct living languages spoken in the world today, characterized by both exceptional diversity and significant similarities. Many researchers believe that at least part of the human ability to communicate with language arises from a uniquely human Faculty of Language (cf. Hauser, Chomsky, & Fitch, 2002; Pinker & Jackendoff, 2005).

    An Introduction to Grammatical Inference for Linguists

    This paper is meant to be an introductory guide to Grammatical Inference (GI), i.e., the study of machine learning of formal languages. It is designed for non-specialists in Computer Science with a special interest in language learning. It covers basic concepts and models developed in the framework of GI and tries to point out the relevance of these studies for natural language acquisition.
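    As a minimal illustration of what grammatical inference involves, the sketch below builds a prefix tree acceptor (PTA) from positive sample sentences; a PTA is the standard starting point for classic GI algorithms such as RPNI, which then merge its states to generalize. The sample strings are hypothetical.

    ```python
    # Sketch: build a prefix tree acceptor (PTA) from positive samples.
    # The PTA accepts exactly the sample set; GI algorithms generalize
    # from it by merging compatible states.

    def build_pta(samples):
        """Return (transitions, accepting) for the PTA; state 0 is the root."""
        transitions = {}   # (state, symbol) -> state
        accepting = set()
        next_state = 1
        for word in samples:
            state = 0
            for sym in word:
                if (state, sym) not in transitions:
                    transitions[(state, sym)] = next_state
                    next_state += 1
                state = transitions[(state, sym)]
            accepting.add(state)
        return transitions, accepting

    def accepts(transitions, accepting, word):
        """Run the automaton on a word; reject on any missing transition."""
        state = 0
        for sym in word:
            if (state, sym) not in transitions:
                return False
            state = transitions[(state, sym)]
        return state in accepting

    trans, acc = build_pta(["ab", "abb", "b"])
    print(accepts(trans, acc, "ab"))  # True
    print(accepts(trans, acc, "a"))   # False
    ```

    The design choice here is deliberate: a PTA memorizes the data exactly, so all the interesting work in GI lies in deciding which of its states to merge.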

    Children as Models for Computers: Natural Language Acquisition for Machine Learning

    This paper focuses on a subfield of machine learning, so-called grammatical inference. Roughly speaking, grammatical inference deals with the problem of inferring a grammar that generates a given set of sample sentences, in a manner supposed to be realized by some inference algorithm. We discuss how the analysis and formalization of the main features of human natural language acquisition may improve results in the area of grammatical inference.

    On past participle agreement in transitive clauses in French

    This paper provides a Minimalist analysis of past participle agreement in transitive clauses in French. Our account posits that the head v of vP in such structures carries an (accusative-assigning) structural case feature which may apply (with or without concomitant agreement) to case-mark a clause-mate object, the subject of a defective complement clause, or an intermediate copy of a preposed subject in spec-CP. In structures where a goal is extracted from vP (e.g. via wh-movement), v also carries an edge feature, and may also carry a specificity feature and a set of (number and gender) agreement features. We show how these assumptions account for agreement of a participle with a preposed specific clause-mate object or defective-clause subject, and for the absence of agreement with an embedded object, with the complement of an impersonal verb, and with the subject of an embedded (finite or non-finite) CP complement. We also argue that the absence of agreement marking (in expected contexts) on the participles fait 'made' and laissé 'let' in infinitive structures is essentially viral in nature. Finally, we claim that obligatory participle agreement with reflexive and reciprocal objects arises because the derivation of reflexives involves A-movement and concomitant agreement.

    Language: The missing selection pressure

    Human beings are talkative. What advantage did their ancestors find in communicating so much? Numerous authors consider this advantage to be "obvious" and "enormous". If so, the problem of the evolutionary emergence of language amounts to explaining why none of the other primate species evolved anything even remotely similar to language. What I propose here is to reverse the picture. On closer examination, language resembles a losing strategy: competing to provide other individuals with information, sometimes striving to be heard, apparently makes no sense within a Darwinian framework. At face value, language as we can observe it should never have existed, or should have been counter-selected. In other words, the selection pressure that led to language is still missing. The solution I propose consists in regarding language as a social signaling device that developed in a context of generalized insecurity unique to our species. By talking, individuals advertise their alertness and their ability to get informed. This hypothesis is shown to be compatible with many characteristics of language that otherwise are left unexplained.