    Event-Related Potentials Associated with Somatosensory Effect in Audio-Visual Speech Perception

    Speech perception often involves multisensory processing. Although previous studies have demonstrated visual [1, 2] and somatosensory [3, 4] interactions with auditory processing, it is not clear whether somatosensory information contributes to audiovisual speech perception. This study explored the neural consequences of somatosensory interactions in audiovisual speech processing. We assessed whether somatosensory orofacial stimulation influenced event-related potentials (ERPs) in response to an audiovisual speech illusion (the McGurk effect [1]). ERPs were recorded from 64 scalp sites in response to audiovisual and somatosensory stimulation. In the audiovisual conditions, an auditory stimulus /ba/ was synchronized with video of congruent facial motion (the production of /ba/) or incongruent facial motion (the production of /da/: the McGurk condition). These two audiovisual stimuli were presented in random order, with and without somatosensory stimulation associated with facial skin deformation. We found ERP differences associated with the McGurk effect in the presence of somatosensory stimulation: ERPs for the McGurk effect reliably diverged around 280 ms after auditory onset. The results demonstrate that somatosensory inputs alter the cortical potentials of audiovisual processing and suggest that somatosensory information encoding facial motion also influences speech processing.
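
    The core comparison in this design (McGurk versus congruent ERPs under somatosensory stimulation) amounts to a difference-wave analysis. Below is a minimal NumPy sketch of that comparison, not the authors' actual pipeline: the array names, sampling rate, trial counts, and amplitude threshold are all illustrative assumptions.

```python
import numpy as np

FS = 1000        # assumed sampling rate (Hz); time zero = auditory onset
N_CHANNELS = 64  # 64 scalp sites, as in the study

def erp(epochs):
    """Average single-trial epochs (trials, channels, samples) into an ERP."""
    return epochs.mean(axis=0)

def difference_wave(mcgurk, congruent):
    """McGurk-minus-congruent difference wave, shape (channels, samples)."""
    return erp(mcgurk) - erp(congruent)

def first_divergence_ms(diff, threshold_uv=0.5):
    """First latency (ms) at which any channel exceeds an (assumed)
    amplitude threshold; NaN if no sample does."""
    hits = np.argwhere(np.abs(diff).max(axis=0) > threshold_uv)
    return float(hits[0, 0]) / FS * 1000.0 if hits.size else float("nan")

# Placeholder random data standing in for pre-processed epochs from the
# somatosensory condition; real arrays would come from the EEG recording.
rng = np.random.default_rng(0)
mcgurk_somato = rng.normal(size=(40, N_CHANNELS, 600))
congruent_somato = rng.normal(size=(40, N_CHANNELS, 600))

diff = difference_wave(mcgurk_somato, congruent_somato)
print(f"first divergence at ~{first_divergence_ms(diff):.0f} ms")
```

    In practice, the reported ~280 ms divergence would be established statistically (for example, with cluster-based permutation tests across channels and time) rather than with a fixed amplitude threshold as in this sketch.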