
    Selective attention to sound features mediates cross-modal activation of visual cortices.

    Contemporary schemas of brain organization now include multisensory processes both in low-level cortices and at early stages of stimulus processing. Evidence has also accumulated showing that unisensory stimulus processing can result in cross-modal effects. For example, task-irrelevant and lateralised sounds can activate visual cortices, a phenomenon referred to as the auditory-evoked contralateral occipital positivity (ACOP). Some have claimed that this reflects automatic attentional capture in visual cortices; other results, however, indicate that context may play a determining role. Here, we investigated whether selective attention to spatial features of sounds is a determining factor in eliciting the ACOP. We recorded high-density auditory evoked potentials (AEPs) while participants selectively attended to and discriminated sounds according to one of four stimulus attributes: location, pitch, speaker identity or syllable. Sound acoustics were held constant, and sound location was always equiprobable (50% left, 50% right); the only manipulation was the sound dimension to which participants attended. We analysed the AEP data from healthy participants within an electrical neuroimaging framework. Sound-elicited activation of visual cortices depended on the to-be-discriminated, goal-based dimension: the ACOP was elicited only when participants were required to discriminate sound location, not when they attended to any of the non-spatial features. These results provide further indication that the ACOP is not automatic. Moreover, our findings showcase the interplay between task relevance and spatial (un)predictability in determining the presence of cross-modal activation of visual cortices.
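    For illustration only, the sketch below shows how an ACOP-like measure is commonly quantified from epoched EEG data: as the average voltage over the occipital electrode contralateral to the sound minus the ipsilateral one. The NumPy-based function, the electrode labels (PO7/PO8), and the array layout are assumptions made for the example, not the authors' analysis pipeline.

    import numpy as np

    def acop_difference(epochs, ch_names, left_sound_mask,
                        left_occ="PO7", right_occ="PO8"):
        """epochs: (n_trials, n_channels, n_times) baseline-corrected AEPs.
        left_sound_mask: boolean array, True where the sound came from the left.
        Returns the contralateral-minus-ipsilateral waveform (n_times,)."""
        li, ri = ch_names.index(left_occ), ch_names.index(right_occ)
        # Contralateral channel: right occipital for left sounds, and vice versa.
        contra = np.concatenate([epochs[left_sound_mask, ri, :],
                                 epochs[~left_sound_mask, li, :]])
        ipsi = np.concatenate([epochs[left_sound_mask, li, :],
                               epochs[~left_sound_mask, ri, :]])
        return contra.mean(axis=0) - ipsi.mean(axis=0)

    A positive deflection in the returned waveform over the typical ACOP latency window would correspond to the contralateral occipital positivity described above.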

    Electrocorticography Evidence of Tactile Responses in Visual Cortices

    There is ongoing debate regarding the extent to which human cortices are specialized for processing a given sensory input versus a given type of information, independently of the sensory source. Many neuroimaging and electrophysiological studies have reported that primary and extrastriate visual cortices respond to tactile and auditory stimulation, in addition to visual inputs, suggesting that these cortices are intrinsically multisensory. For tactile responses in particular, however, few studies have demonstrated neuronal processes in human visual cortex. Here, we assessed tactile responses in both low-level and extrastriate visual cortices using electrocorticography recordings in a human participant. Specifically, we observed significant increases in spectral power in the high-frequency band (30-100 Hz), reportedly associated with spiking neuronal activity, in response to tactile stimuli in both low-level visual cortex (i.e. V2) and the anterior part of the lateral occipital-temporal cortex. Both sites were involved in processing tactile information and were responsive to visual stimulation. More generally, the present results add to a mounting literature supporting task-sensitive, sensory-independent mechanisms underlying functions such as spatial, motion, and self-processing in the brain, extending from higher-level to low-level cortices.
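    As a rough illustration of the kind of measure described above (stimulus-related increases in 30-100 Hz spectral power relative to baseline), the sketch below band-passes single-electrode ECoG trials, extracts the Hilbert envelope, and expresses post-stimulus power in dB relative to a pre-stimulus baseline. The specific windows, filter order, and names are assumptions for the example, not the study's parameters.

    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def high_freq_power_change(trials, fs, stim_sample, base=(-0.3, 0.0),
                               post=(0.0, 0.3), band=(30.0, 100.0)):
        """trials: (n_trials, n_times) ECoG voltage for one electrode.
        Returns the post- vs. pre-stimulus power change in the band, in dB."""
        b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, trials, axis=1)
        power = np.abs(hilbert(filtered, axis=1)) ** 2   # instantaneous power
        t = (np.arange(trials.shape[1]) - stim_sample) / fs
        base_p = power[:, (t >= base[0]) & (t < base[1])].mean()
        post_p = power[:, (t >= post[0]) & (t < post[1])].mean()
        return 10 * np.log10(post_p / base_p)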

    Joint Encoding of Auditory Timing and Location in Visual Cortex

    Co-occurring sounds can facilitate perception of spatially and temporally correspondent visual events. Separate lines of research have identified two putatively distinct neural mechanisms underlying two types of crossmodal facilitation: whereas crossmodal phase resetting is thought to underlie enhancements based on temporal correspondences, lateralized occipital evoked potentials (ERPs) are thought to reflect enhancements based on spatial correspondences. Here, we sought to clarify the relationship between these two effects to assess whether they reflect two distinct mechanisms or, rather, two facets of the same underlying process. To identify the neural generators of each effect, we examined crossmodal responses to lateralized sounds in visually responsive cortex of 22 patients using electrocorticographic recordings. Auditory-driven phase reset and ERP responses in visual cortex displayed similar topography, revealing significant activity in pericalcarine, inferior occipital-temporal, and posterior parietal cortex, with maximal activity in lateral occipitotemporal cortex (potentially V5/hMT+). Laterality effects showed similar but less widespread topography. To test whether lateralized and nonlateralized components of crossmodal ERPs emerged from common or distinct neural generators, we compared responses throughout visual cortex. Visual electrodes responded to both contralateral and ipsilateral sounds with a contralateral bias, suggesting that previously observed laterality effects do not emerge from a distinct neural generator but rather reflect laterality-biased responses in the same neural populations that produce phase-resetting responses. These results suggest that crossmodal phase reset and ERP responses previously found to reflect spatial and temporal facilitation in visual cortex may reflect the same underlying mechanism. We propose a new unified model to account for these and previous results.
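    To make the distinction between the two measures concrete, the sketch below computes, for one visual-cortex electrode, both the evoked potential (the trial average) and an inter-trial phase coherence index of the kind often used to quantify crossmodal phase reset. The frequency band, filter settings, and names are illustrative assumptions, not the analysis reported here.

    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def erp_and_itc(trials, fs, band=(8.0, 12.0)):
        """trials: (n_trials, n_times) voltage traces aligned to sound onset.
        Returns the ERP and the inter-trial phase coherence over time."""
        erp = trials.mean(axis=0)                    # event-related potential
        b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
        phase = np.angle(hilbert(filtfilt(b, a, trials, axis=1), axis=1))
        # ITC: length of the mean phase vector across trials (0 = random phase,
        # 1 = perfectly consistent phase, as expected after a phase reset).
        itc = np.abs(np.exp(1j * phase).mean(axis=0))
        return erp, itc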