
Self-Organized Artificial Grammar Learning in Spiking Neural Networks

Abstract

The Artificial Grammar Learning (AGL) paradigm provides a means to study the nature of syntactic processing and implicit sequence learning. With mere exposure and without performance feedback, human beings implicitly acquire knowledge about the structural regularities implemented by complex rule systems. We investigate to what extent a generic cortical microcircuit model can support formally explicit symbolic computations, instantiated by the same grammars used in the human AGL literature, and how a functional network emerges, in a self-organized manner, from exposure to this type of data. We use a concrete implementation of an input-driven recurrent network composed of noisy, spiking neurons, built according to the reservoir computing framework and dynamically shaped by a variety of synaptic and intrinsic plasticity mechanisms operating concomitantly. We show that, when shaped by plasticity, these models are capable of acquiring the structure of a simple grammar. When asked to judge string legality (in a manner similar to human subjects), the networks perform at a qualitatively comparable level.
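
As a rough illustration of the reservoir computing setup the abstract describes, the sketch below drives a fixed recurrent network with one-hot symbol inputs and trains a linear readout to judge string legality. It is a minimal, assumption-laden simplification: the paper's networks use noisy spiking neurons shaped by concurrent plasticity mechanisms, whereas this toy uses a fixed rate-based reservoir; the alphabet, the finite-state grammar, and all parameters below are hypothetical, not the authors' materials.

    # Minimal rate-based reservoir sketch for grammaticality judgement.
    # Everything here (grammar, alphabet, sizes) is a made-up illustration.
    import numpy as np

    rng = np.random.default_rng(0)
    SYMBOLS = "MSVXR"                      # toy alphabet (hypothetical)
    GRAMMAR = {0: [("M", 1), ("V", 2)],    # tiny finite-state grammar (made up)
               1: [("S", 1), ("X", 2)],
               2: [("V", 2), ("R", None)]}

    def legal_string(max_len=10):
        """Walk the finite-state grammar to emit a legal string."""
        s, state = "", 0
        while state is not None and len(s) < max_len:
            sym, state = GRAMMAR[state][rng.integers(len(GRAMMAR[state]))]
            s += sym
        return s

    def illegal_string():
        """Shuffle a legal string so it (almost surely) violates the grammar."""
        s = list(legal_string())
        rng.shuffle(s)
        return "".join(s)

    N = 200                                # reservoir size (arbitrary)
    W_in = rng.normal(0, 1, (N, len(SYMBOLS)))
    W = rng.normal(0, 1, (N, N))
    W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # spectral radius below 1

    def final_state(string):
        """Drive the reservoir with one-hot symbols; return the last state."""
        x = np.zeros(N)
        for ch in string:
            u = np.eye(len(SYMBOLS))[SYMBOLS.index(ch)]
            x = np.tanh(W @ x + W_in @ u)
        return x

    # Train a linear readout (ridge regression) to judge string legality.
    strings = ([legal_string() for _ in range(200)]
               + [illegal_string() for _ in range(200)])
    y = np.array([1] * 200 + [-1] * 200)
    X = np.stack([final_state(s) for s in strings])
    w = np.linalg.solve(X.T @ X + 1e-2 * np.eye(N), X.T @ y)

    acc = np.mean(np.sign(X @ w) == y)     # readout accuracy on training set
    print(f"grammaticality-judgement accuracy: {acc:.2f}")

Scaling the recurrent weights to a spectral radius below one is a standard way to keep such a reservoir's dynamics stable so that the final state retains usable information about the input sequence; it stands in for the self-organizing plasticity that the paper uses to shape the network instead.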
