
    Questioning The Modality Of The Occipital Lobe

    This dissertation explores the occipital lobe's response to non-visual inputs, and whether this responsivity partitions into separate localization and identification pathways as seen with visual inputs. We hypothesized that occipital areas may merely prefer visual inputs while maintaining similar task-based sensory recruitment in response to other senses. Our secondary hypothesis was that the robust occipital activation seen in late-blind participants stems, at least initially, from standard connections present even in the typically sighted, and that these standard connections are functionally utilized by the typically sighted in spatially relevant non-visual analyses. Our initial literature review supported our hypotheses that the occipital lobe is a highly plastic, cross-modally responsive area and that recruitment of occipital areas in the blind stems from the strengthening of existing multi-modal connections. To explore our topic further, we conducted meta-analyses of fMRI and PET studies reporting occipital responses to non-visual input in congenitally/early-blind participants and/or blindfolded but otherwise typically sighted participants. Through these analyses, we noted significant extrastriate activations for blind participants beyond those seen in sighted participants, which lent support to our task-based wiring hypothesis. We also observed common activations between blind and sighted participants, notably including activation in striate cortex, which supported the notion of functional connections to the occipital lobe from other sensory inputs regardless of the presence or absence of visual input. Finally, we conducted an fMRI study investigating the effects of short-term blindfolding on occipital responsivity to auditory stimuli in typically sighted participants. We did not observe greater activation in participants blindfolded for 45 minutes than in non-blindfolded participants, but our study did further highlight the functional connections between non-visual senses and the occipital lobe, and again supported our task-based wiring hypothesis. Overall, we found support for the occipital lobe being multi-modally reactive, even in typically sighted individuals. We also found evidence that task-based wiring is maintained regardless of the sensory modality being responded to, and that these functional non-visual connections are likely what, at least initially, give rise to the widespread occipital activation observed in blind participants in response to non-visual stimuli.
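
    The coordinate-based meta-analysis described above can be illustrated with a minimal sketch. The snippet below assumes a simple ALE-style (activation likelihood estimation) approach, a common way to pool fMRI/PET foci across studies; the grid dimensions, kernel width, and foci coordinates are illustrative placeholders, not the dissertation's data.

```python
# Illustrative ALE-style pooling of activation foci across studies.
# Grid size, kernel width, and foci below are made-up placeholders.
import numpy as np

GRID = (91, 109, 91)    # 2 mm MNI-space grid (standard dimensions)
SIGMA = 5.0             # illustrative Gaussian kernel width, in voxels

def modeled_activation(foci, shape=GRID, sigma=SIGMA):
    """One study's 'modeled activation' map: a Gaussian blob at each
    reported focus, combined by taking the voxelwise maximum."""
    ii, jj, kk = np.indices(shape)
    blobs = [np.exp(-((ii - i) ** 2 + (jj - j) ** 2 + (kk - k) ** 2)
                    / (2 * sigma ** 2)) for (i, j, k) in foci]
    return np.max(blobs, axis=0)

def ale(studies):
    """ALE map as the probabilistic union of per-study maps:
    ALE = 1 - prod_i (1 - MA_i)."""
    out = np.zeros(GRID)
    for foci in studies:
        out = 1.0 - (1.0 - out) * (1.0 - modeled_activation(foci))
    return out

# Two made-up 'studies', each a list of (i, j, k) voxel foci:
studies = [[(30, 30, 40), (60, 40, 35)], [(32, 28, 42)]]
print(ale(studies).max())   # peak activation likelihood across the grid
```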

    Tactile expectancy modulates occipital alpha oscillations in early blindness

    Alpha oscillatory activity is thought to contribute to visual expectancy through the engagement of task-relevant occipital regions. In early blindness, occipital alpha oscillations are systematically reduced, suggesting that occipital alpha depends on visual experience. However, it remains possible that alpha activity could serve expectancy in non-visual modalities in blind people, especially considering that previous research has shown recruitment of the occipital cortex for non-visual processing. To test this idea, we used electroencephalography to examine whether alpha oscillations reflected a differential recruitment of task-relevant regions between expected and unexpected conditions in two haptic tasks (texture and shape discrimination). As expected, sensor-level analyses showed that alpha suppression at parieto-occipital sites was significantly reduced in early blind individuals compared with sighted participants. The source reconstruction analysis revealed that group differences originated in the middle occipital cortex. In that region, expected trials evoked higher alpha desynchronization than unexpected trials in the early blind group only. Our results support the role of alpha rhythms in the recruitment of occipital areas in early blind participants, and for the first time we show that although posterior alpha activity is reduced in blindness, it remains sensitive to expectancy factors. Our findings therefore suggest that occipital alpha activity is involved in tactile expectancy in blind individuals, serving a function similar to visual anticipation in sighted populations but switched to the tactile modality. Altogether, our results indicate that expectancy-dependent modulation of alpha oscillatory activity does not depend on visual experience.
    Significance statement: Are posterior alpha oscillations and their role in expectancy and anticipation dependent on visual experience? Our results show that tactile expectancy can modulate posterior alpha activity in blind (but not sighted) individuals through the engagement of occipital regions, suggesting that in early blindness, alpha oscillations maintain their proposed anticipatory role but subserve tactile processing. Our findings bring a new understanding of the role that alpha oscillatory activity plays in blindness, contrasting with the view that alpha activity is task-unspecific in blind populations.
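
    As a rough illustration of the quantity compared between groups above, the sketch below computes alpha-band event-related desynchronization (ERD) with a standard bandpass-plus-Hilbert pipeline; the sampling rate, band edges, and synthetic data are assumptions, not the study's actual EEG parameters.

```python
# Illustrative alpha-band ERD computation for one channel's epochs.
# Sampling rate, band edges, and the synthetic data are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

FS = 500                      # sampling rate (Hz), illustrative
ALPHA = (8.0, 12.0)           # alpha band (Hz)

def alpha_erd(epochs, baseline, fs=FS, band=ALPHA):
    """epochs: (n_trials, n_samples) for one parieto-occipital channel.
    Returns percent power change from the pre-stimulus baseline;
    negative values indicate desynchronization (alpha suppression)."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, epochs, axis=-1)
    power = np.abs(hilbert(filtered, axis=-1)) ** 2
    base = power[:, baseline[0]:baseline[1]].mean(axis=-1, keepdims=True)
    return 100.0 * (power - base) / base

# Synthetic demo: 20 trials of 1 s epochs, baseline = first 200 ms.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((20, FS))
erd = alpha_erd(epochs, baseline=(0, int(0.2 * FS)))
print(erd.mean(axis=0).shape)   # (500,) trial-averaged ERD time course
```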

    Somatosensory processing in deaf and deafblind individuals: How does the brain adapt as a function of sensory and linguistic experience? A critical review

    How do deaf and deafblind individuals process touch? This question offers a unique model for understanding the prospects and constraints of neural plasticity. Our brain constantly receives and processes signals from the environment and combines them into the most reliable information content. The nervous system adapts its functional and structural organization according to the input, and perceptual processing develops as a function of individual experience. However, there are still many unresolved questions regarding the deciding factors for these changes in deaf and deafblind individuals, and findings so far are not consistent. To date, most studies have not taken the sensory and linguistic experiences of the included participants into account. As a result, the impact of sensory deprivation vs. language experience on somatosensory processing remains inconclusive. Even less is known about the impact of deafblindness on brain development. The resulting neural adaptations could be even more substantial, but no clear patterns have yet been identified. How do deafblind individuals process sensory input? Studies on deafblindness have mostly focused on single cases or groups of late-blind individuals. Importantly, the language backgrounds of deafblind communities are highly variable and include the usage of tactile languages. So far, this kind of linguistic experience and its consequences have not been considered in studies of basic perceptual functions. Here, we provide a critical review of the literature, aiming to identify determinants of neuroplasticity and gaps in our current knowledge of somatosensory processing in deaf and deafblind individuals.

    Space and time in the human brain


    Spatial Language Processing in the Blind: Evidence for a Supramodal Representation and Cortical Reorganization

    Neuropsychological and imaging studies have shown that the left supramarginal gyrus (SMG) is specifically involved in processing spatial terms (e.g. above, left of), which locate places and objects in the world. The current fMRI study focused on the nature and specificity of representing spatial language in the left SMG by combining behavioral and neuronal activation data in blind and sighted individuals. Data from the blind provide an elegant way to test the supramodal representation hypothesis, i.e. that abstract codes represent spatial relations and should therefore yield no activation differences between blind and sighted individuals. Indeed, the left SMG was activated during spatial language processing in both blind and sighted individuals, implying a supramodal representation of spatial and other dimensional relations that does not require visual experience to develop. However, in the absence of vision, functional reorganization of the visual cortex is known to take place. An important consideration with respect to our finding is therefore the amount of functional reorganization during language processing in our blind participants, so the participants also performed a verb generation task. We observed that occipital areas were activated during covert language generation only in the blind. Additionally, in the first task, functional reorganization was observed for processing language with a high linguistic load. As the visual cortex was not specifically active for spatial content in the first task, and no reorganization was observed in the SMG, the latter finding further supports the notion that the left SMG is the main node for a supramodal representation of verbal spatial relations.

    A thalamocortical pathway for fast rerouting of tactile information to occipital cortex in congenital blindness

    In congenitally blind individuals, the occipital cortex responds to various nonvisual inputs. Some animal studies raise the possibility that a subcortical pathway allows fast rerouting of tactile information to the occipital cortex, but this has not been shown in humans. Here we show, using magnetoencephalography (MEG), that tactile stimulation produces occipital cortex activations starting as early as 35 ms in congenitally blind individuals, but not in blindfolded sighted controls. Given our measured thalamic response latencies of 20 ms and a mean estimated lateral geniculate nucleus to primary visual cortex transfer time of 15 ms (20 ms + 15 ms = 35 ms, matching the observed onset), we claim that this early occipital response is mediated by a direct thalamo-cortical pathway. We also observed stronger directed connectivity in the alpha band range from posterior thalamus to occipital cortex in congenitally blind participants. Our results strongly suggest the contribution of a fast thalamo-cortical pathway to the cross-modal activation of the occipital cortex in congenitally blind humans.
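
    The latency argument reduces to a simple consistency check; the sketch below merely restates the reported numbers, with constant names of our own choosing.

```python
# Back-of-envelope timing check for the claimed direct thalamo-cortical route.
# The latencies are the values reported above; names are illustrative.
THALAMUS_ONSET_MS = 20      # measured posterior-thalamus response latency
LGN_TO_V1_MS = 15           # mean estimated geniculo-striate transfer time
OBSERVED_OCCIPITAL_MS = 35  # earliest occipital response in congenital blindness

predicted_direct = THALAMUS_ONSET_MS + LGN_TO_V1_MS
print(predicted_direct == OBSERVED_OCCIPITAL_MS)  # True: timing is consistent
```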

    How Does Experience Modulate Auditory Spatial Processing in Individuals with Blindness?


    Seeing shapes and hearing textures: Two neural categories of touch

    Touching for shape recognition has been shown to activate occipital areas in addition to somatosensory areas. In this study we asked whether this combination of somatosensory and other sensory processing areas also exists in other kinds of touch recognition. In particular, does touch for texture roughness matching activate other sensory processing areas apart from somatosensory areas? We addressed this question with functional magnetic resonance imaging (fMRI) using wooden abstract stimulus objects whose shape or texture was to be identified. The participants judged whether pairs of objects had the same shape or the same texture. We found that the activated brain areas for texture and shape matching have a similar underlying structure: a combination of the primary motor area and somatosensory areas. Areas associated with object-shape processing were activated between stimuli during shape matching but not during texture roughness matching, while auditory areas were activated during encoding of texture stimuli but not shape stimuli. Matching of textures also involves left BA47, an area associated with retrieval of relational information. We suggest that texture roughness is recognized within a framework of ordering. Left-lateralized activations favoring texture might reflect semantic processing associated with grading roughness quantitatively, as opposed to the more qualitative distinctions between shapes.

    Space, time and motion in a multisensory world

    When interacting with environmental events, humans acquire information from different senses and combine these inputs into a coherent representation of the world. The present doctoral thesis aims at investigating how humans represent space, time, and motion through the auditory and visual sensory modalities. A predisposition of different sensory systems towards processing different domains of representation has been widely demonstrated: hearing prevails in representing the time domain, while vision is the most reliable sense for processing the space domain. Given this strong link between sensory modality and domain of representation, one objective of this thesis is to deepen our knowledge of the neural organization of multisensory spatial and temporal skills in healthy adults. In addition, using blindness as a model to unravel the role of vision in the development of spatio-temporal abilities, this thesis explores the interaction of the spatial and temporal domains in the acoustic motion perception of early blind individuals. The interplay between space and time has also been explained as the result of humans performing actions in the surrounding environment, since to carry out goal-directed motor behaviors it is useful to associate the spatial and temporal information of one's target within a shared mental map. In this regard, the present project also asks how the brain processes the spatio-temporal cues of external events when manually intercepting moving objects with one hand. Finally, in light of the above results, this dissertation incorporates the development of a novel portable device, named MultiTab, for the behavioral evaluation of space, time, and motor-response processing through the visual and acoustic sensory modalities.
    For the purposes of this thesis, four methodological approaches were employed: i) the electroencephalogram (EEG) technique, to explore the cortical activation associated with multisensory spatial and temporal tasks; ii) psychophysical methods, to measure the relationship between stimuli in motion and the acoustic speed perception of blind and sighted individuals (see the sketch after this abstract); iii) motion capture techniques, to measure movement indices during an object-interception task; iv) the design and technical-behavioral validation of a new portable device. Studies of the present dissertation indicate the following results. First, this thesis highlights an early cortical gain modulation of sensory areas that depends on the domain of representation to be processed, with auditory areas mainly involved in the multisensory processing of temporal inputs, and visual areas in that of spatial inputs. Moreover, for the spatial domain specifically, the neural modulation of visual areas is also influenced by the kind of spatial layout representing the multisensory stimuli. Second, this project shows that lack of vision influences the ability to process the speed of moving sounds by altering how blind individuals make use of the sounds' temporal features. This result suggests that visual experience in the first years of life is a crucial factor when dealing with combined spatio-temporal information. Third, data from this thesis demonstrate that typically developing individuals manually intercepting a moving object with one hand take the item's spatio-temporal cues into consideration, adjusting their interceptive movements according to the object's speed. Finally, the design and validation of MultiTab show its utility in the evaluation of multisensory processing, such as the manual localization of audiovisual spatialized stimuli.
    Overall, findings from this thesis contribute to a more in-depth picture of how the human brain represents space, time, and motion through different senses. Moreover, they provide promising implications for exploring novel technological methods for the assessment and training of these dimensions in typical and atypical populations.
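
    As flagged under item ii) above, the psychophysical step can be illustrated with a hypothetical sketch: fitting a cumulative-Gaussian psychometric function to speed-discrimination responses. The speeds, response proportions, and parameter names below are invented for illustration, not taken from the thesis.

```python
# Hypothetical psychometric-function fit for a speed-discrimination task.
# Data values and parameter names are illustrative placeholders.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(speed, pse, jnd):
    """P(respond 'comparison faster') as a cumulative Gaussian;
    pse = point of subjective equality, jnd = discrimination threshold."""
    return norm.cdf(speed, loc=pse, scale=jnd)

# Comparison speeds (arbitrary units) and proportion 'faster' responses:
speeds = np.array([2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
p_faster = np.array([0.05, 0.15, 0.35, 0.55, 0.75, 0.90, 0.97])

(pse, jnd), _ = curve_fit(psychometric, speeds, p_faster, p0=[5.0, 1.0])
print(f"PSE = {pse:.2f}, JND = {jnd:.2f}")  # higher JND = poorer discrimination
```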