
    Contextual modulation of primary visual cortex by auditory signals

    Early visual cortex receives non-feedforward input from lateral and top-down connections (Muckli & Petro 2013 Curr. Opin. Neurobiol. 23, 195–201. (doi:10.1016/j.conb.2013.01.020)), including long-range projections from auditory areas. Early visual cortex can code for high-level auditory information, with neural patterns representing natural sound stimulation (Vetter et al. 2014 Curr. Biol. 24, 1256–1262. (doi:10.1016/j.cub.2014.04.020)). We discuss a number of questions arising from these findings. What is the adaptive function of bimodal representations in visual cortex? What type of information projects from auditory to visual cortex? What are the anatomical constraints on auditory information in V1, for example, periphery versus fovea, superficial versus deep cortical layers? Is there a putative neural mechanism we can infer from human neuroimaging data and recent theoretical accounts of cortex? We also present data showing that we can read out high-level auditory information from the activation patterns of early visual cortex even when visual cortex receives simple visual stimulation, suggesting independent channels for visual and auditory signals in V1. We speculate about which cellular mechanisms allow V1 to be contextually modulated by auditory input to facilitate perception, cognition and behaviour. Beyond cortical feedback that facilitates perception, we argue that there is also feedback serving counterfactual processing during imagery, dreaming and mind wandering, which is not relevant for immediate perception but for behaviour and cognition over a longer time frame. This article is part of the themed issue 'Auditory and visual scene analysis'.

    Review: Do the Different Sensory Areas within the Cat Anterior Ectosylvian Sulcal Cortex Collectively Represent a Network Multisensory Hub?

    Current theory holds that the numerous functional areas of the cerebral cortex are organized and function as a network. Using connectional databases and computational approaches, the cerebral network has been shown to exhibit a hierarchical structure composed of areas, clusters and, ultimately, hubs. Hubs are highly connected, higher-order regions that also facilitate communication between different sensory modalities. One region computationally identified as a network hub is the visual area of the Anterior Ectosylvian Sulcal cortex (AESc) of the cat. The Anterior Ectosylvian Visual area (AEV) is but one component of the AESc, which also includes auditory (Field of the Anterior Ectosylvian Sulcus - FAES) and somatosensory (Fourth somatosensory representation - SIV) areas. To better understand the nature of cortical network hubs, the present report reviews the biological features of the AESc. Within the AESc, each area has extensive external cortical connections as well as connections with one another. Each of these core representations is separated by a transition zone characterized by bimodal neurons that share the sensory properties of both adjoining core areas. Finally, core and transition zones are underlain by a continuous sheet of layer 5 neurons that project to common output structures. Altogether, these shared properties suggest that the collective AESc region represents a multiple sensory/multisensory cortical network hub. Ultimately, such an interconnected, composite structure adds complexity and biological detail to the understanding of cortical network hubs and their function in cortical processing.

    Top-down effects on early visual processing in humans: a predictive coding framework

    An increasing number of human electroencephalography (EEG) studies examining the earliest component of the visual evoked potential, the so-called C1, have cast doubt on the previously prevalent notion that this component is impermeable to top-down effects. This article reviews the original studies that (i) described the C1, (ii) linked it to primary visual cortex (V1) activity, and (iii) suggested that its electrophysiological characteristics are exclusively determined by low-level stimulus attributes, particularly the spatial position of the stimulus within the visual field. We then describe conflicting evidence from animal studies and human neuroimaging experiments and provide an overview of recent EEG and magnetoencephalography (MEG) work showing that initial V1 activity in humans may be strongly modulated by higher-level cognitive factors. Finally, we formulate a theoretical framework for understanding top-down effects on early visual processing in terms of predictive coding.

    Neural Correlates of Auditory Perceptual Awareness and Release from Informational Masking Recorded Directly from Human Cortex: A Case Study.

    In complex acoustic environments, even salient supra-threshold sounds sometimes go unperceived, a phenomenon known as informational masking. The neural basis of informational masking (and its release) has not been well characterized, particularly outside auditory cortex. We combined electrocorticography in a neurosurgical patient undergoing invasive epilepsy monitoring with trial-by-trial perceptual reports of isochronous target-tone streams embedded in random multi-tone maskers. Awareness of such masker-embedded target streams was associated with a focal negativity between 100 and 200 ms and high-gamma activity (HGA) between 50 and 250 ms (both in auditory cortex on the posterolateral superior temporal gyrus) as well as a broad P3b-like potential (between ~300 and 600 ms) with generators in ventrolateral frontal and lateral temporal cortex. Unperceived target tones elicited drastically reduced versions of such responses, if any at all. While it remains unclear whether these responses reflect conscious perception itself, as opposed to pre- or post-perceptual processing, the results suggest that conscious perception of target sounds in complex listening environments may engage diverse neural mechanisms in distributed brain areas.

    The sound of concepts: The link between auditory and conceptual brain systems

    Concepts in long-term memory are important building blocks of human cognition and are the basis for object recognition, language and thought. While it is well accepted that concepts comprise features related to sensory object attributes, it is still unclear how these features are represented in the brain. Of central interest is whether concepts are essentially grounded in perception. This would imply a common neuroanatomical substrate for perceptual and conceptual processing. Here we show, using functional magnetic resonance imaging and recordings of event-related potentials, that acoustic conceptual features rapidly recruit auditory areas even when implicitly presented through visual words. Recognizing words denoting objects for which acoustic features are highly relevant (e.g. "telephone") suffices to ignite cell assemblies in the posterior superior and middle temporal gyrus (pSTG/MTG) that were also activated by listening to real sounds. Activity in pSTG/MTG had an onset of 150 ms and increased parametrically as a function of acoustic feature relevance. Both findings suggest a conceptual origin of this effect rather than post-conceptual strategies such as imagery. The presently demonstrated link between auditory and conceptual brain systems parallels observations in other memory systems, suggesting that modality-specificity represents a general organizational principle in cortical memory representation. The understanding of concepts as a partial reinstatement of brain activity during perception stresses the necessity of rich sensory experiences for concept acquisition. The modality-specific nature of concepts could also explain the difficulty of achieving consensus on overall definitions of abstract concepts such as freedom or justice unless they are embedded in a concrete, experienced situation.

    The human 'pitch center' responds differently to iterated noise and Huggins pitch

    A magnetoencephalographic marker for pitch analysis (the pitch onset response) has been reported for different types of pitch-evoking stimuli, irrespective of whether the acoustic cues for pitch are monaurally or binaurally produced. It is claimed that the pitch onset response reflects a common cortical representation for pitch, putatively in lateral Heschl's gyrus. The results of this functional MRI study cast doubt on this assertion. We report a direct comparison between iterated ripple noise and Huggins pitch in which we reveal a different pattern of auditory cortical activation associated with each pitch stimulus, even when individual variability in structure-function relations is accounted for. Our results suggest it may be premature to assume that lateral Heschl's gyrus is a universal pitch center.

    High-frequency neural oscillations and visual processing deficits in schizophrenia

    Visual information is fundamental to how we understand our environment, make predictions, and interact with others. Recent research has underscored the importance of visuo-perceptual dysfunctions for cognitive deficits and pathophysiological processes in schizophrenia. In the current paper, we review evidence for the relevance of high-frequency (beta/gamma) oscillations to visuo-perceptual dysfunctions in schizophrenia. In the first part of the paper, we examine the relationship between beta/gamma band oscillations and visual processing during normal brain functioning. We then summarize EEG/MEG studies demonstrating reduced amplitude and synchrony of high-frequency activity during visual stimulation in schizophrenia. In the final part of the paper, we identify neurobiological correlates and offer perspectives for future research to stimulate further inquiry into the role of high-frequency oscillations in visual processing impairments in the disorder.