
    The Non-Hierarchical Nature of the Chomsky Hierarchy-Driven Artificial-Grammar Learning

    Recent artificial-grammar learning (AGL) paradigms driven by the Chomsky hierarchy have paved the way for direct comparisons between humans and animals in the learning of center embedding ([A[AB]B]). The AnBn grammars used by the first generation of such research lacked a crucial property of center embedding, namely that the pairs of elements are explicitly matched ([A1 [A2 B2] B1]). This type of indexing is implemented in the second-generation AnBn grammars. This paper reviews recent studies using such grammars. Contrary to the premises of these studies, we argue that even these newer AnBn grammars cannot test the learning of syntactic hierarchy. These studies nonetheless provide detailed information about the conditions under which human adults can learn an AnBn grammar with indexing. This knowledge serves to interpret recent animal studies, which make surprising claims about animals’ ability to handle center embedding.
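    To make the contrast concrete, here is a minimal Python sketch (with invented syllable inventories, not stimuli from any of the reviewed studies) that generates strings from an unindexed first-generation AnBn grammar and from a second-generation grammar in which each A element is explicitly paired with its B partner, as in [A1 [A2 B2] B1]:

```python
import random

A_SYLLABLES = ["ba", "di", "gu"]                 # hypothetical A-class elements
B_SYLLABLES = ["po", "ke", "mo"]                 # hypothetical B-class elements
PAIRING = dict(zip(A_SYLLABLES, B_SYLLABLES))    # explicit A-to-B match

def first_generation_anbn(n):
    """Unindexed AnBn: any n A elements followed by any n B elements."""
    return ([random.choice(A_SYLLABLES) for _ in range(n)]
            + [random.choice(B_SYLLABLES) for _ in range(n)])

def second_generation_anbn(n):
    """Indexed AnBn: the B elements mirror the A elements, [A1 [A2 B2] B1]."""
    a_part = [random.choice(A_SYLLABLES) for _ in range(n)]
    return a_part + [PAIRING[a] for a in reversed(a_part)]

print(first_generation_anbn(2))   # e.g. ['ba', 'gu', 'ke', 'po'] (no pairing required)
print(second_generation_anbn(2))  # e.g. ['ba', 'gu', 'mo', 'po'] (nested matched pairs)
```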

    Implicit learning of recursive context-free grammars

    Context-free grammars are fundamental for the description of linguistic syntax. However, most artificial grammar learning experiments have explored the learning of simpler finite-state grammars, while studies exploring context-free grammars have not assessed awareness and implicitness. This paper explores the implicit learning of context-free grammars employing features of hierarchical organization, recursive embedding and long-distance dependencies. The grammars also featured the distinction between left- and right-branching structures, as well as between centre- and tail-embedding, both distinctions found in natural languages. People acquired unconscious knowledge of relations between grammatical classes even for dependencies over long distances, in ways that went beyond learning simpler relations (e.g. n-grams) between individual words. The structural distinctions drawn from linguistics also proved important, as performance was greater for tail-embedded than for centre-embedded structures. The results suggest the plausibility of implicit learning of complex context-free structures, which model some features of natural languages. They support the relevance of artificial grammar learning for probing mechanisms of language learning and challenge existing theories and computational models of implicit learning.
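    As an illustration of the structural distinction (invented symbols, not the authors' actual materials), the sketch below generates dependency strings with centre-embedding, where pairs are nested and dependencies span long distances, versus tail-embedding, where each pair is closed before the next opens:

```python
import random

PAIRS = [("A1", "B1"), ("A2", "B2"), ("A3", "B3")]   # hypothetical dependency pairs

def centre_embedded(depth):
    """Nested dependencies, e.g. A1 A2 A3 B3 B2 B1 (mirror order, long-distance)."""
    chosen = [random.choice(PAIRS) for _ in range(depth)]
    return [a for a, _ in chosen] + [b for _, b in reversed(chosen)]

def tail_embedded(depth):
    """Right-branching dependencies, e.g. A1 B1 A2 B2 A3 B3 (each pair adjacent)."""
    sequence = []
    for _ in range(depth):
        a, b = random.choice(PAIRS)
        sequence.extend([a, b])
    return sequence

print(centre_embedded(3))  # e.g. ['A2', 'A1', 'A3', 'B3', 'B1', 'B2']
print(tail_embedded(3))    # e.g. ['A1', 'B1', 'A3', 'B3', 'A2', 'B2']
```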

    The learnability of center-embedded recursion: experimental studies with artificial and natural language

    In the present thesis, we first explain the principle of center-embedded (CE) recursion. Next, we briefly discuss animal studies on recursion learning, followed by a section on theories and experimental evidence about human learning, in which the complexity of the principle is contrasted with pragmatic learning strategies. We then discuss the features of the input that might help recursion learning. Finally, we discuss methodological issues regarding the use of artificial languages to study aspects of natural language learning.

    The language faculty that wasn't : a usage-based account of natural language recursion

    In the generative tradition, the language faculty has been shrinking—perhaps to include only the mechanism of recursion. This paper argues that even this view of the language faculty is too expansive. We first argue that a language faculty is difficult to reconcile with evolutionary considerations. We then focus on recursion as a detailed case study, arguing that our ability to process recursive structure does not rely on recursion as a property of the grammar, but instead emerges gradually by piggybacking on domain-general sequence learning abilities. Evidence from genetics, comparative work on non-human primates, and cognitive neuroscience suggests that humans have evolved complex sequence learning skills, which were subsequently pressed into service to accommodate language. Constraints on sequence learning have therefore played an important role in shaping the cultural evolution of linguistic structure, including our limited abilities for processing recursive structure. Finally, we re-evaluate some of the key considerations that have often been taken to require the postulation of a language faculty.

    Children’s Learning of a Semantics-Free Artificial Grammar with Center Embedding

    Whether non-human animals have an ability to learn and process center embedding, a core property of human language syntax, is still debated. Artificial-grammar learning (AGL) has been used to compare humans and animals in the learning of center embedding. However, up until now, human participants have only included adults, and data on children, who are the key players in natural language acquisition, are lacking. We created a novel game-like experimental paradigm combining the go/no-go procedure often used in animal research with the stepwise learning methods found effective in human adults’ center-embedding learning. Here we report that some children succeeded in learning a semantics-free artificial grammar with center embedding (A2B2 grammar) in the auditory modality. Although their success rate was lower than that of adults, the successful children appeared to learn as efficiently as adults. Where children struggled, their memory capacity seemed to have limited their AGL performance.
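    A hedged sketch of how go/no-go trials over an A2B2 grammar could be scored is shown below; the syllables and their pairing are hypothetical, not the study's stimuli:

```python
import random

# Hypothetical A-to-B pairing for an indexed A2B2 grammar
PAIR = {"ba": "po", "di": "ke"}

def grammatical_sequence():
    """Well-formed A2B2 string with nested pairing: [A1 [A2 B2] B1]."""
    a1, a2 = random.sample(list(PAIR), 2)
    return [a1, a2, PAIR[a2], PAIR[a1]]

def ungrammatical_sequence():
    """Violation: the B elements appear in the wrong (non-mirrored) order."""
    a1, a2 = random.sample(list(PAIR), 2)
    return [a1, a2, PAIR[a1], PAIR[a2]]

def go_no_go_response(sequence):
    """Respond 'go' only if both center-embedded dependencies are satisfied."""
    a1, a2, b2, b1 = sequence
    return "go" if PAIR.get(a2) == b2 and PAIR.get(a1) == b1 else "no-go"

print(go_no_go_response(grammatical_sequence()))    # go
print(go_no_go_response(ungrammatical_sequence()))  # no-go
```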

    The evolution of language: Proceedings of the Joint Conference on Language Evolution (JCoLE)
