Auditory Cortex Tracks Both Auditory and Visual Stimulus Dynamics Using Low-Frequency Neuronal Phase Modulation
How is naturalistic multisensory information combined in the human brain? Based on MEG data, we show that low-frequency phase modulation of visual and auditory signals captures the dynamics of complex scenes.
Superior temporal sulcus - It's my area: or is it?
The superior temporal sulcus (STS) is the chameleon of the human brain. Several research areas claim the STS as the host brain region for their particular behavior of interest. Some see it as one of the core structures for theory of mind. For others, it is the main region for audiovisual integration. It plays an important role in biological motion perception, but is also claimed to be essential for speech processing and the processing of faces. We review the foci of activations in the STS from multiple functional magnetic resonance imaging studies, focusing on theory of mind, audiovisual integration, motion processing, speech processing, and face processing. The results indicate a differentiation of the STS region into an anterior portion, mainly involved in speech processing, and a posterior portion recruited by the cognitive demands of all these different research areas. The latter finding argues against a strict functional subdivision of the STS. In line with anatomical evidence from tracer studies, we propose that the function of the STS varies depending on the nature of network coactivations with different regions in the frontal cortex and medial temporal lobe. This view is in keeping with the notion that the same brain region can support different cognitive operations depending on task-dependent network connections, and it emphasizes the role of network connectivity analysis in neuroimaging.
Temporal characteristics of audiovisual information processing
In complex natural environments, auditory and visual information often have to be processed simultaneously. Previous functional magnetic resonance imaging (fMRI) studies focused on the spatial localization of brain areas involved in audiovisual (AV) information processing, but the temporal characteristics of AV information flow in these regions remained unclear. In this study, we used fMRI and a novel information-theoretic approach to study the flow of AV sensory information. Subjects passively perceived sounds and images of objects presented either alone or simultaneously. Applying the measure of mutual information, we computed for each voxel the latency at which the blood oxygenation level-dependent (BOLD) signal had the highest information content about the preceding stimulus. The results indicate that, after AV stimulation, the earliest informative activity occurs in right Heschl's gyrus, left primary visual cortex, and the posterior portion of the superior temporal gyrus, which is known as a region involved in object-related AV integration. Informative activity in the anterior portion of the superior temporal gyrus, middle temporal gyrus, right occipital cortex, and inferior frontal cortex was found at a later latency. Moreover, AV presentation resulted in shorter latencies in multiple cortical areas compared with isolated auditory or visual presentation. The results provide evidence for bottom-up processing from primary sensory areas into higher association areas during AV integration in humans and suggest that AV presentation shortens processing time in early sensory cortices.
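As a rough illustration of the peak-information-latency idea described above (not the authors' actual analysis pipeline; function names, binning choices, and the synthetic data are assumptions), one can estimate, for a single voxel, the lag at which its response carries the most mutual information about the preceding stimulus:

```python
import numpy as np

def mutual_information(labels, values, n_bins=8):
    """Plug-in estimate of MI (bits) between discrete stimulus labels
    and a continuous response, via histogram binning of the response."""
    edges = np.histogram_bin_edges(values, bins=n_bins)
    binned = np.digitize(values, edges[1:-1])  # bin indices 0..n_bins-1
    classes = np.unique(labels)
    joint = np.zeros((len(classes), n_bins))
    for i, lab in enumerate(classes):
        for b in range(n_bins):
            joint[i, b] = np.mean((labels == lab) & (binned == b))
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = joint * np.log2(joint / (px * py))
    return np.nansum(terms)  # empty cells contribute zero

def peak_info_latency(stimulus_labels, voxel_series, lags):
    """Return the lag (in samples) at which the voxel time series is
    most informative about the stimulus, plus the MI at each lag."""
    mi_per_lag = []
    for lag in lags:
        # pair each stimulus with the response `lag` samples later
        response = voxel_series[lag:lag + len(stimulus_labels)]
        mi_per_lag.append(mutual_information(stimulus_labels, response))
    return lags[int(np.argmax(mi_per_lag))], mi_per_lag

# Synthetic check: a voxel that responds 3 samples after the stimulus.
rng = np.random.default_rng(0)
stim = rng.integers(0, 2, 200)                 # binary stimulus sequence
series = rng.normal(0.0, 0.3, 210)             # baseline noise
series[3:203] += 2.0 * stim                    # delayed evoked response
best_lag, mi_per_lag = peak_info_latency(stim, series, list(range(8)))
print(best_lag)                                # the true delay, 3
```

Repeating this per voxel and comparing the resulting latency maps across unimodal and bimodal conditions mirrors, in miniature, the comparison reported in the abstract.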