
    Changes in Early Cortical Visual Processing Predict Enhanced Reactivity in Deaf Individuals

    Individuals with profound deafness rely critically on vision to interact with their environment. Improvement of visual performance as a consequence of auditory deprivation is assumed to result from cross-modal changes occurring in late stages of visual processing. Here we measured reaction times and event-related potentials (ERPs) in profoundly deaf adults and hearing controls during a speeded visual detection task, to assess to what extent the enhanced reactivity of deaf individuals could reflect plastic changes in the early cortical processing of the stimulus. We found that deaf subjects were faster than hearing controls at detecting the visual targets, regardless of their location in the visual field (peripheral or peri-foveal). This behavioural facilitation was associated with ERP changes starting from the first detectable response in the striate cortex (C1 component) at about 80 ms after stimulus onset, and in the P1 complex (100–150 ms). In addition, we found that P1 peak amplitudes predicted the response times in deaf subjects, whereas in hearing individuals visual reactivity and ERP amplitudes correlated only at later stages of processing. These findings show that long-term auditory deprivation can profoundly alter visual processing from the earliest cortical stages. Furthermore, our results provide the first evidence of a co-variation between modified brain activity (cortical plasticity) and behavioural enhancement in this sensory-deprived population.
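    The P1–reaction-time relationship described above amounts to a per-subject correlation between a component's peak amplitude in a latency window and response speed. As a rough illustration only (the amplitudes, latencies, and reaction times below are synthetic, not the study's data), a minimal sketch:

    ```python
    import numpy as np

    def peak_amplitude(erps, times, t_min, t_max):
        """Peak amplitude of an ERP component within a latency window (e.g. P1: 100-150 ms)."""
        mask = (times >= t_min) & (times <= t_max)
        return erps[:, mask].max(axis=1)  # one value per subject

    def pearson_r(x, y):
        """Pearson correlation between component amplitude and reaction time."""
        x = (x - x.mean()) / x.std()
        y = (y - y.mean()) / y.std()
        return float(np.mean(x * y))

    # Synthetic example: build Gaussian-shaped P1 deflections peaking at 125 ms,
    # with larger amplitudes producing faster (smaller) reaction times.
    rng = np.random.default_rng(0)
    times = np.linspace(0.0, 0.3, 301)               # 0-300 ms post-stimulus, 1 ms steps
    amps = rng.uniform(2.0, 8.0, size=20)            # hypothetical P1 amplitudes (microvolts)
    erps = amps[:, None] * np.exp(-((times - 0.125) ** 2) / (2 * 0.015 ** 2))
    rts = 0.35 - 0.01 * amps + rng.normal(0, 0.005, size=20)  # reaction times (s)

    p1 = peak_amplitude(erps, times, 0.100, 0.150)
    r = pearson_r(p1, rts)                           # clearly negative here by construction
    ```

    In a pattern like the deaf group's, larger P1 amplitudes predict shorter reaction times, so the correlation comes out negative; in the hearing group the abstract reports this coupling only at later components.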

    Does congenital deafness affect the structural and functional architecture of primary visual cortex?

    Deafness results in greater reliance on the remaining senses. It is unknown whether the cortical architecture of the intact senses is optimized to compensate for lost input. Here we performed widefield population receptive field (pRF) mapping of primary visual cortex (V1) with functional magnetic resonance imaging (fMRI) in hearing and congenitally deaf participants, all of whom had learnt sign language after the age of 10 years. We found larger pRFs encoding the peripheral visual field of deaf compared to hearing participants. This was likely driven by larger facilitatory center zones of the pRF profile concentrated in the near and far periphery in the deaf group. pRF density was comparable between groups, indicating that, because the pRFs themselves were larger, they overlapped more in the deaf group. This could suggest that a coarse coding strategy underlies enhanced peripheral visual skills in deaf people. Cortical thickness was also decreased in V1 in the deaf group. These findings suggest deafness causes structural and functional plasticity at the earliest stages of visual cortex.
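    pRF mapping commonly models each voxel's receptive field as a 2D Gaussian in visual space. The inference above (with centre density held fixed, larger pRFs necessarily overlap more) can be sketched with hypothetical centres and sizes; the specific coordinates and sigmas below are invented for illustration:

    ```python
    import numpy as np

    def prf(x, y, x0, y0, sigma):
        """Circular 2D Gaussian population receptive field centred at (x0, y0)."""
        return np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))

    def mean_pairwise_overlap(centres, sigma, grid):
        """Mean pairwise overlap (Jaccard-like) between pRFs at a fixed centre layout."""
        x, y = grid
        fields = [prf(x, y, cx, cy, sigma) for cx, cy in centres]
        overlaps = []
        for i in range(len(fields)):
            for j in range(i + 1, len(fields)):
                inter = np.minimum(fields[i], fields[j]).sum()
                union = np.maximum(fields[i], fields[j]).sum()
                overlaps.append(inter / union)
        return float(np.mean(overlaps))

    # Same pRF centres (same density) under two pRF sizes: the larger sigma,
    # standing in for the deaf group's peripheral pRFs, yields more overlap.
    grid = np.meshgrid(np.linspace(-10, 10, 201), np.linspace(-10, 10, 201))
    centres = [(-2.0, 0.0), (0.0, 0.0), (2.0, 0.0), (0.0, 2.0)]
    small = mean_pairwise_overlap(centres, sigma=1.0, grid=grid)
    large = mean_pairwise_overlap(centres, sigma=2.0, grid=grid)
    ```

    Greater overlap at unchanged density is the defining feature of the coarse-coding interpretation the abstract proposes.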

    Auditory Cortex Basal Activity Modulates Cochlear Responses in Chinchillas

    Background: The auditory efferent system has unique neuroanatomical pathways that connect the cerebral cortex with sensory receptor cells. Pyramidal neurons located in layers V and VI of the primary auditory cortex constitute descending projections to the thalamus, inferior colliculus, and even directly to the superior olivary complex and to the cochlear nucleus. Efferent pathways are connected to the cochlear receptor by the olivocochlear system, which innervates outer hair cells and auditory nerve fibers. The functional role of the cortico-olivocochlear efferent system remains debated. We hypothesized that auditory cortex basal activity modulates cochlear and auditory-nerve afferent responses through the efferent system. Methodology/Principal Findings: Cochlear microphonics (CM), auditory-nerve compound action potentials (CAP), and auditory cortex evoked potentials (ACEP) were recorded in twenty anesthetized chinchillas, before, during, and after auditory cortex deactivation by two methods: lidocaine microinjections or cortical cooling with cryoloops. Auditory cortex deactivation induced a transient reduction in ACEP amplitudes in fifteen animals (deactivation experiments) and a permanent reduction in five chinchillas (lesion experiments). We found significant changes in the amplitude of CM in both types of experiments, the most common effect being a CM decrease, found in fifteen animals. Concomitant with the CM amplitude changes, we found CAP increases in seven chinchillas and CAP reductions in thirteen animals. Although ACE

    A Novel Interhemispheric Interaction: Modulation of Neuronal Cooperativity in the Visual Areas

    Background: The cortical representation of the visual field is split along the vertical midline, with the left and the right hemi-fields projecting to separate hemispheres. Connections between the visual areas of the two hemispheres are abundant near the representation of the visual midline. It was suggested that they re-establish the functional continuity of the visual field by controlling the dynamics of the responses in the two hemispheres. Methods/Principal Findings: To understand if and how the interactions between the two hemispheres participate in processing visual stimuli, the synchronization of responses to identical or different moving gratings in the two hemi-fields was studied in anesthetized ferrets. The responses were recorded by multiple electrodes in the primary visual areas, and the synchronization of local field potentials across the electrodes was analyzed with a recent method derived from dynamical systems theory. Inactivating the visual areas of one hemisphere modulated the synchronization of the stimulus-driven activity in the other hemisphere. The modulation was stimulus-specific and was consistent with the fine morphology of callosal axons, in particular with the spatio-temporal pattern of activity that axonal geometry can generate. Conclusions/Significance: These findings describe a new kind of interaction between the cerebral hemispheres and highlight the role of axonal geometry in modulating aspects of cortical dynamics responsible for stimulus detection and/or categorization.
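    The abstract does not name its synchronization measure beyond "a recent method derived from dynamical systems theory", but a standard phase-synchronization index in that family is the phase-locking value (PLV) computed from Hilbert-transform instantaneous phases. A sketch on synthetic LFP traces (the frequencies, noise levels, and phase lag are invented for illustration):

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def phase_locking_value(x, y):
        """Phase-locking value between two LFP traces: 1 = perfect phase sync, ~0 = none."""
        phase_x = np.angle(hilbert(x))
        phase_y = np.angle(hilbert(y))
        return float(np.abs(np.mean(np.exp(1j * (phase_x - phase_y)))))

    # Synthetic LFPs: a shared 10 Hz rhythm with a fixed phase lag vs. unrelated noise.
    rng = np.random.default_rng(1)
    t = np.arange(0.0, 2.0, 1e-3)                                   # 2 s sampled at 1 kHz
    lfp_a = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.normal(size=t.size)
    lfp_b = np.sin(2 * np.pi * 10 * t + 0.8) + 0.1 * rng.normal(size=t.size)
    lfp_c = rng.normal(size=t.size)                                 # independent signal

    plv_sync = phase_locking_value(lfp_a, lfp_b)    # high: constant phase relation
    plv_rand = phase_locking_value(lfp_a, lfp_c)    # low: drifting phase relation
    ```

    Note that the PLV is sensitive to a consistent phase relation, not to zero lag, which is why the lagged pair still scores high; this is the kind of measure that would register the stimulus-specific modulation described in the abstract.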

    New approaches to the study of human brain networks underlying spatial attention and related processes

    Cognitive processes, such as spatial attention, are thought to rely on extended networks in the human brain. Both clinical data from lesioned patients and fMRI data acquired when healthy subjects perform particular cognitive tasks typically implicate a wide expanse of potentially contributing areas, rather than just a single brain area. Conversely, evidence from more targeted interventions, such as transcranial magnetic stimulation (TMS) or invasive microstimulation of the brain, or selective study of patients with highly focal brain damage, can sometimes indicate that a single brain area may make a key contribution to a particular cognitive process. But this in turn raises questions about how such a brain area may interface with other interconnected areas within a more extended network to support cognitive processes. Here, we provide a brief overview of new approaches that seek to characterise the causal role of particular brain areas within networks of several interacting areas, by measuring the effects of manipulating a targeted area on function in remote interconnected areas. In human participants, these approaches include concurrent TMS-fMRI and TMS-EEG, as well as the combination of the focal lesion method in selected patients with fMRI and/or EEG measures of the functional impact of the lesion on interconnected intact brain areas. Such approaches shed new light on how frontal cortex and parietal cortex modulate sensory areas in the service of attention and cognition, for the normal and damaged human brain.

    Encoding of Temporal Information by Timing, Rate, and Place in Cat Auditory Cortex

    A central goal in auditory neuroscience is to understand the neural coding of species-specific communication and human speech sounds. Low-rate repetitive sounds are elemental features of communication sounds, and core auditory cortical regions have been implicated in processing these information-bearing elements. Repetitive sounds could be encoded by at least three neural response properties: 1) the event-locked spike-timing precision, 2) the mean firing rate, and 3) the interspike interval (ISI). To determine how well these response aspects capture information about stimulus repetition rate, we measured local group responses of cortical neurons in cat anterior auditory field (AAF) to click trains and calculated their mutual information based on these different codes. ISIs of the multiunit responses carried substantially higher information about low repetition rates than either spike-timing precision or firing rate. Combining firing rate and ISI codes was synergistic and captured modestly more repetition information. Spatial distribution analyses showed distinct local clustering properties for each encoding scheme for repetition information, indicative of a place code. Diversity in local processing emphasis and distribution of different repetition rate codes across AAF may give rise to concurrent feed-forward processing streams that contribute differently to higher-order sound analysis.
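    The comparison above, scoring ISI versus firing-rate codes by mutual information, can be sketched with a simple plug-in MI estimate on synthetic trials. The click rates, noise levels, and trial counts below are invented for illustration and are constructed so that the ISI tracks repetition rate tightly while the spike count does not:

    ```python
    import numpy as np

    def mutual_information(stim, resp, n_bins=8):
        """Plug-in mutual information (bits) between stimulus labels and a binned response."""
        resp_binned = np.digitize(resp, np.histogram(resp, bins=n_bins)[1][1:-1])
        stim_ids = {s: i for i, s in enumerate(sorted(set(stim)))}
        joint = np.zeros((len(stim_ids), n_bins + 1))
        for s, r in zip(stim, resp_binned):
            joint[stim_ids[s], r] += 1
        p = joint / joint.sum()
        px = p.sum(axis=1, keepdims=True)          # marginal over stimuli
        py = p.sum(axis=0, keepdims=True)          # marginal over response bins
        nz = p > 0
        return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

    # Synthetic trials: four click-train repetition rates, 400 presentations.
    rng = np.random.default_rng(2)
    rates = rng.choice([2, 4, 8, 16], size=400)                 # click rates (Hz)
    isi = 1.0 / rates + rng.normal(0, 0.005, size=400)          # ISI locked to rate (s)
    count = 10 + 0.1 * rates + rng.normal(0, 5, size=400)       # weakly rate-dependent count

    mi_isi = mutual_information(rates, isi)     # near the 2-bit ceiling here
    mi_rate = mutual_information(rates, count)  # far below it
    ```

    With four roughly equiprobable rates, the stimulus entropy caps MI at 2 bits; a response variable that cleanly separates the rates approaches that ceiling, which is the sense in which the study found ISI the more informative code. (A plug-in estimate like this is upward-biased at small trial counts; the study's estimates would need bias correction.)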

    What and how the deaf brain sees

    Over the past decade, there has been an unprecedented level of interest and progress in understanding visual processing in the brain of the deaf. Specifically, when the brain is deprived of input from one sensory modality (such as hearing), it often compensates with supranormal performance in one or more of the intact sensory systems (such as vision). Recent psychophysical, functional imaging, and reversible deactivation studies have converged to define the specific visual abilities that are enhanced in the deaf, as well as the cortical loci that undergo cross-modal plasticity in the deaf and are responsible for mediating these superior visual functions. Examination of these investigations reveals that central visual functions, such as object and facial discrimination, and peripheral visual functions, such as motion detection, visual localization, visuomotor synchronization, and Vernier acuity (measured in the periphery), are specifically enhanced in the deaf, compared with hearing participants. Furthermore, the cortical loci identified to mediate these functions reside in deaf auditory cortex: BA 41, BA 42, and BA 22, in addition to the rostral area, planum temporale, Te3, and temporal voice area in humans; primary auditory cortex, anterior auditory field, dorsal zone of auditory cortex, auditory field of the anterior ectosylvian sulcus, and posterior auditory field in cats; and primary auditory cortex and anterior auditory field in both ferrets and mice. Overall, the findings from these studies show that cross-modal reorganization in auditory cortex of the deaf is responsible for the superior visual abilities of the deaf.

    Functional specialization for auditory–spatial processing in the occipital cortex of congenitally blind humans

    The study of the congenitally blind (CB) represents a unique opportunity to explore experience-dependent plasticity in a sensory region deprived of its natural inputs since birth. Although several studies have shown occipital regions of CB to be involved in nonvisual processing, whether the functional organization of the visual cortex observed in sighted individuals (SI) is maintained in the rewired occipital regions of the blind has only been recently investigated. In the present functional MRI study, we compared the brain activity of CB and SI processing either the spatial or the pitch properties of sounds carrying information in both domains (i.e., the same sounds were used in both tasks), using an adaptive procedure specifically designed to adjust for performance level. In addition to showing a substantial recruitment of the occipital cortex for sound processing in CB, we also demonstrate that auditory–spatial processing mainly recruits the right cuneus and the right middle occipital gyrus, two regions of the dorsal occipital stream known to be involved in visuospatial/motion processing in SI. Moreover, functional connectivity analyses revealed that these reorganized occipital regions are part of an extensive brain network including regions known to underlie audiovisual spatial abilities (i.e., intraparietal sulcus, superior frontal gyrus). We conclude that some regions of the right dorsal occipital stream do not require visual experience to develop a specialization for the processing of spatial information and to be functionally integrated in a preexisting brain network dedicated to this ability.