24 research outputs found

    Crossmodal reorganisation in deafness: mechanisms for functional preservation and functional change

    The study of deafness and blindness has contributed unique knowledge to our understanding of the brain, showing that environmental experience critically shapes neural structure and function. Nevertheless, the most prevalent theories of crossmodal plasticity propose opposing views about the function of reorganised cortical regions. Some theories agree on functional preservation, where in the absence of early sensory stimulation, cortical regions respond to a different sensory modality but perform the same function. Others propose that the absence of sensory stimulation from birth results in cortical regions changing their “typical” sensory processing function to higher-order cognition. Both deafness and blindness have provided vast evidence in support of each of these theories. Here we use examples from the study of deafness to explore organisational mechanisms that would allow functional preservation and functional change to co-exist either in the same or adjacent regions. We provide a set of predictions and testable hypotheses that support each of these accounts, and lay out some steps that could move us towards more specific theories of cortical reorganisation.

    Monitoring different phonological parameters of sign language engages the same cortical language network but distinctive perceptual ones

    The study of signed languages allows the dissociation of sensorimotor and cognitive neural components of the language signal. Here we investigated the neurocognitive processes underlying the monitoring of two phonological parameters of sign languages: handshape and location. Our goal was to determine whether brain regions processing sensorimotor characteristics of different phonological parameters of sign languages were also involved in phonological processing, with their activity being modulated by the linguistic content of manual actions. We conducted an fMRI experiment using manual actions varying in phonological structure and semantics: (1) signs of a familiar sign language (British Sign Language), (2) signs of an unfamiliar sign language (Swedish Sign Language), and (3) invented nonsigns that violate the phonological rules of British Sign Language and Swedish Sign Language or consist of nonoccurring combinations of phonological parameters. Three groups of participants were tested: deaf native signers, deaf nonsigners, and hearing nonsigners. Results show that the linguistic processing of different phonological parameters of sign language is independent of the sensorimotor characteristics of the language signal. Handshape and location were processed by different perceptual and task-related brain networks but recruited the same language areas. The semantic content of the stimuli did not influence this process, but phonological structure did, with nonsigns being associated with longer RTs and stronger activations in an action observation network in all participants and in the supramarginal gyrus exclusively in deaf signers. These results suggest higher processing demands for stimuli that contravene the phonological rules of a signed language, independently of previous knowledge of signed languages. We suggest that the phonological characteristics of a language may arise as a consequence of more efficient neural processing for its perception and production.

    Sensory experience modulates the reorganization of auditory regions for executive processing

    Crossmodal plasticity refers to the reorganization of sensory cortices in the absence of their typical main sensory input. Understanding this phenomenon provides insights into brain function and its potential for change and enhancement. Using functional MRI, we investigated how early deafness influences crossmodal plasticity and the organization of executive functions in the adult human brain. Deaf (n = 25; age: mean = 41.68, range = 19–66, SD = 14.38; 16 female, 9 male) and hearing (n = 20; age: mean = 37.50, range = 18–66, SD = 16.85; 15 female, 5 male) participants performed four visual tasks tapping into different components of executive processing: task switching, working memory, planning and inhibition. Our results show that deaf individuals specifically recruit ‘auditory’ regions during task switching. Neural activity in superior temporal regions, most significantly in the right hemisphere, is a good predictor of behavioural performance during task switching in the group of deaf individuals, highlighting the functional relevance of the observed cortical reorganization. These findings demonstrate executive processing in typically sensory regions, suggesting that the development and ultimate role of brain regions are influenced by perceptual environmental experience.

    Human V6: functional characterisation and localisation.

    Human visual area V6, in the parieto-occipital sulcus, is thought to have an important role in the extraction of optic flow for the monitoring and guidance of self-motion (egomotion) because it responds differentially to egomotion-compatible optic flow when compared to: (a) coherent but egomotion-incompatible flow (Cardin & Smith, 2010), and (b) incoherent motion (Pitzalis et al., 2010). It is not clear, however, whether V6 responds more strongly to egomotion-incompatible global motion than to incoherent motion. This is relevant not only for determining the functional properties of V6, but also in order to choose optimal stimuli for localising V6 accurately with fMRI. Localisation with retinotopic mapping is difficult and there is a need for a simple, reliable method. We conducted an event-related 3T fMRI experiment in which participants viewed a display of dots which either: (a) followed a time-varying optic flow trajectory in a single, egomotion-compatible (EC) display; (b) formed an egomotion-incompatible (EI) 3 × 3 array of optic flow patches; or (c) moved randomly (RM). Results from V6 show an ordering of response magnitudes: EC > EI > RM. Neighbouring areas V3A and V7 responded more strongly to EC than to RM, but about equally to EC and EI. Our results suggest that although V6 may have a general role in the extraction of global motion, in clear contrast to neighbouring motion areas it is especially concerned with encoding EC stimuli. They suggest two strategies for localising V6: (1) contrasting EC and EI; or (2) contrasting EC and RM, which is more sensitive but carries a risk of including voxels from neighbouring regions that also show an EC > RM preference.

    Differential activity in Heschl's gyrus between deaf and hearing individuals is due to auditory deprivation rather than language modality

    Sensory cortices undergo crossmodal reorganisation as a consequence of sensory deprivation. Congenital deafness in humans represents a particular case with respect to other types of sensory deprivation, because cortical reorganisation is not only a consequence of auditory deprivation, but also of language-driven mechanisms. Visual crossmodal plasticity has been found in secondary auditory cortices of deaf individuals, but it is still unclear if reorganisation also takes place in primary auditory areas, and how this relates to language modality and auditory deprivation. Here, we dissociated the effects of language modality and auditory deprivation on crossmodal plasticity in Heschl's gyrus as a whole, and in cytoarchitectonic region Te1.0 (likely to contain the core auditory cortex). Using fMRI, we measured the BOLD response to viewing sign language in congenitally or early deaf individuals with and without sign language knowledge, and in hearing controls. Results show that differences between hearing and deaf individuals are due to a reduction in activation caused by visual stimulation in the hearing group, which is more significant in Te1.0 than in Heschl's gyrus as a whole. Furthermore, differences between deaf and hearing groups are due to auditory deprivation, and there is no evidence that the modality of language used by deaf individuals contributes to crossmodal plasticity in Heschl's gyrus.

    Top-down Modulations in the Visual Form Pathway Revealed with Dynamic Causal Modeling

    Perception entails interactions between activated brain visual areas and the records of previous sensations, allowing for processes like figure–ground segregation and object recognition. The aim of this study was to characterize top-down effects that originate in the visual cortex and that are involved in the generation and perception of form. We performed a functional magnetic resonance imaging experiment, where subjects viewed 3 groups of stimuli comprising oriented lines with different levels of recognizable high-order structure (none, collinearity, and meaning). Our results showed that recognizable stimuli cause larger activations in anterior visual and frontal areas. In contrast, when stimuli are random or unrecognizable, activations are greater in posterior visual areas, following a hierarchical organization where areas V1/V2 were less active with “collinearity” and the middle occipital cortex was less active with “meaning.” An effective connectivity analysis using dynamic causal modeling showed that high-order visual form engages higher visual areas that generate top-down signals from multiple levels of the visual hierarchy. These results are consistent with a model in which, if a stimulus has recognizable attributes such as collinearity and meaning, the areas specialized for processing these attributes send top-down messages to the lower levels to facilitate more efficient encoding of visual form.

    Effects of Aging and Adult-Onset Hearing Loss on Cortical Auditory Regions

    Hearing loss is a common feature in human aging. It has been argued that dysfunctions in central processing are important contributing factors to hearing loss during older age. Aging also has well documented consequences for neural structure and function, but it is not clear how these effects interact with those that arise as a consequence of hearing loss. This paper reviews the effects of aging and adult-onset hearing loss on the structure and function of cortical auditory regions. The evidence reviewed suggests that aging and hearing loss result in atrophy of cortical auditory regions and stronger engagement of networks involved in the detection of salient events, adaptive control and re-allocation of attention. These cortical mechanisms are engaged during listening in effortful conditions in normal hearing individuals. Therefore, as a consequence of aging and hearing loss, all listening becomes effortful and cognitive load is constantly high, reducing the amount of available cognitive resources. This constant effortful listening and reduced cognitive spare capacity could be what accelerates cognitive decline in older adults with hearing loss.

    What is the function of auditory cortex without auditory input?

    This scientific commentary refers to ‘Cross-modal activation of auditory regions during visuo-spatial working memory in early deafness’, by Ding et al. (doi:10.1093/brain/awv165).

    The Form Pathways in the Visual Brain

    No full text
    EThOS - Electronic Theses Online Service, United Kingdom
