No Need to Talk, I Know You: Familiarity Influences Early Multisensory Integration in a Songbird's Brain

Abstract

It is well known that visual information can affect auditory perception, as in the famous “McGurk effect,” but little is known about the processes involved. To address this issue, we used the best-developed animal model for studying language-related processes in the brain: songbirds. European starlings were exposed to audiovisual and auditory-only playback of conspecific songs while electrophysiological recordings were made in their primary auditory area (Field L). The results show that the audiovisual condition modulated the auditory responses. Both enhancement and suppression were observed, depending on stimulus familiarity. Seeing a familiar bird led to suppressed auditory responses, whereas seeing an unfamiliar bird led to response enhancement, suggesting that unisensory perception may suffice when the stimulus is familiar, while redundancy may be required for unfamiliar items. This is, to our knowledge, the first evidence that multisensory integration may occur in a low-level, putatively unisensory area of a non-mammalian vertebrate brain, and that stimulus familiarity may influence the modulation of auditory responses by vision.
