Modulatory effects of linguistic aspects on cortical tracking of speech

Abstract

Comprehending speech is a challenging computational problem that the human brain solves. Phase alignment between low-frequency cortical oscillations and amplitude modulations in speech (known as 'speech tracking') may support neurocomputational mechanisms of speech perception, e.g., syllable extraction and phonemic processing. On this view, speech tracking is a bottom-up, stimulus-driven mechanism that reflects the processing of speech acoustics. However, efficient speech perception requires integrating sensory information embedded in the speech stimulus with top-down influences such as attention and complementary visual information. Yet the contribution of linguistic knowledge to speech tracking responses remains poorly investigated. We explored this question by comparing speech tracking responses, measured by electroencephalography, from listeners with differing prior experience with the English language. The results suggest that speech tracking responses do not result solely from bottom-up acoustic processing of the speech input but are also modulated by top-down mechanisms learned through deep familiarity with a language.

Funding: Natural Sciences and Engineering Research Council of Canada (NSERC); NSERC Collaborative Research and Training Experience (CREATE) program [Biological Information Processing: From Genome to Systems Level program].