Teaching the Blind to Find Their Way by Playing Video Games
Computer-based video games are receiving great interest as a means to learn and acquire new skills. As a novel approach to teaching navigation skills in the blind, we have developed Audio-based Environment Simulator (AbES); a virtual reality environment set within the context of a video game metaphor. Despite the fact that participants were naïve to the overall purpose of the software, we found that early blind users were able to acquire relevant information regarding the spatial layout of a previously unfamiliar building using audio-based cues alone. This was confirmed by a series of behavioral performance tests designed to assess the transfer of acquired spatial information to a large-scale, real-world indoor navigation task. Furthermore, learning the spatial layout through a goal-directed gaming strategy allowed for the mental manipulation of spatial information as evidenced by enhanced navigation performance when compared to an explicit route learning strategy. We conclude that the immersive and highly interactive nature of the software greatly engages the blind user to actively explore the virtual environment. This in turn generates an accurate sense of a large-scale three-dimensional space and facilitates the learning and transfer of navigation skills to the physical world.
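The transfer idea described above can be illustrated with a toy sketch (hypothetical, and not the actual AbES implementation): a building is modeled as a graph of rooms, exploration records which rooms connect, and the learned map can then be queried for routes, mimicking the transfer of spatial knowledge from exploration to navigation.

```python
from collections import deque

# Hypothetical building layout: rooms are nodes, doorways are edges.
LAYOUT = {
    "lobby": ["hall"],
    "hall": ["lobby", "office", "stairs"],
    "office": ["hall"],
    "stairs": ["hall", "exit"],
    "exit": ["stairs"],
}

def explore(layout, start):
    """Walk the building exhaustively, recording discovered connections
    (standing in for audio cues revealing adjacent rooms)."""
    learned, seen, queue = {}, {start}, deque([start])
    while queue:
        room = queue.popleft()
        learned[room] = list(layout[room])
        for nxt in layout[room]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return learned

def find_route(learned_map, start, goal):
    """Plan a shortest route using only the learned map (BFS)."""
    queue, visited = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in learned_map.get(path[-1], []):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None

mental_map = explore(LAYOUT, "lobby")
print(find_route(mental_map, "office", "exit"))  # ['office', 'hall', 'stairs', 'exit']
```

The point of the sketch is that route planning operates on the internally learned map rather than on the environment itself, which is what the behavioral transfer tests probe.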
The effect of musical expertise on the representation of space
Consistent evidence suggests that pitch height may be represented in a spatial format, having both a vertical and a horizontal representation. The spatial representation of pitch height results in response compatibility effects whereby high pitch tones are preferentially associated with up-right responses, and low pitch tones are preferentially associated with down-left responses (i.e., the Spatial-Musical Association of Response Codes (SMARC) effect), with the strength of these associations depending on individuals’ musical skills. In this study we investigated whether listening to tones of different pitch affects the representation of external space, as assessed in a visual and haptic line bisection paradigm, in musicians and non-musicians. Low and high pitch tones affected the bisection performance in musicians differently, both when pitch was relevant and irrelevant for the task, and in both the visual and the haptic modality. No effect of pitch height was observed on the bisection performance of non-musicians. Moreover, our data also show that musicians present a (supramodal) rightward bisection bias in both the visual and the haptic modality, extending previous findings limited to the visual modality, and consistent with the idea that intense practice with musical notation and bimanual instrument training affects hemispheric lateralization.
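A SMARC-type compatibility effect is typically quantified as the mean reaction-time cost of incompatible over compatible pitch-response pairings. A minimal sketch, using invented reaction times (ms) purely for illustration:

```python
# Hypothetical illustration of quantifying a SMARC effect: the compatibility
# effect is the mean reaction time on incompatible trials (e.g. high pitch
# mapped to a down-left key) minus compatible trials (high pitch mapped to
# an up-right key). All numbers below are made up for illustration.
compatible_rts = [412, 398, 405, 420, 390]    # high->up-right, low->down-left
incompatible_rts = [455, 470, 448, 462, 450]  # high->down-left, low->up-right

def mean(xs):
    return sum(xs) / len(xs)

smarc_effect = mean(incompatible_rts) - mean(compatible_rts)
print(f"SMARC effect: {smarc_effect:.1f} ms")  # positive => spatial association
```

A positive difference indicates that responses are faster when the spatial mapping matches pitch height, which is the signature of the effect.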
Virtual environments for the transfer of navigation skills in the blind: a comparison of directed instruction vs. video game based learning approaches
For profoundly blind individuals, navigating in an unfamiliar building can represent a significant challenge. We investigated the use of an audio-based, virtual environment called Audio-based Environment Simulator (AbES) that can be explored for the purposes of learning the layout of an unfamiliar, complex indoor environment. Furthermore, we compared two modes of interaction with AbES. In one group, blind participants implicitly learned the layout of a target environment while playing an exploratory, goal-directed video game. By comparison, a second group was explicitly taught the same layout following a standard route and instructions provided by a sighted facilitator. As a control, a third group interacted with AbES while playing an exploratory, goal-directed video game; however, the explored environment did not correspond to the target layout. Following interaction with AbES, a series of route navigation tasks were carried out in the virtual and physical building represented in the training environment to assess the transfer of acquired spatial information. We found that participants from both modes of interaction were able to transfer the spatial knowledge gained as indexed by their successful route navigation performance. This transfer was not apparent in the control participants. Most notably, the game-based learning strategy was also associated with enhanced performance when participants were required to find alternate routes and shortcuts within the target building, suggesting that a ludic-based training approach may provide for a more flexible mental representation of the environment. Furthermore, outcome comparisons between early and late blind individuals suggested that greater prior visual experience did not have a significant effect on overall navigation performance following training. Finally, performance did not appear to be associated with other factors of interest such as age, gender, and verbal memory recall.
We conclude that the highly interactive and immersive exploration of the virtual environment greatly engages a blind user to develop skills consistent with positive near transfer of learning. Learning through a game play strategy appears to confer certain behavioral advantages with respect to how spatial information is acquired and ultimately manipulated for navigation.
Action video game play and transfer of navigation and spatial cognition skills in adolescents who are blind
For individuals who are blind, navigating independently in an unfamiliar environment represents a considerable challenge. Inspired by the rising popularity of video games, we have developed a novel approach to train navigation and spatial cognition skills in adolescents who are blind. Audio-based Environment Simulator (AbES) is a software application that allows for the virtual exploration of an existing building set in an action video game metaphor. Using this ludic-based approach to learning, we investigated the ability and efficacy of adolescents with early onset blindness to acquire spatial information gained from the exploration of a target virtual indoor environment. Following game play, participants were assessed on their ability to transfer and mentally manipulate acquired spatial information on a set of navigation tasks carried out in the real environment. Success in transfer of navigation skill performance was markedly high, suggesting that interacting with AbES leads to the generation of an accurate spatial mental representation. Furthermore, there was a positive correlation between success in game play and navigation task performance. The role of virtual environments and gaming in the development of mental spatial representations is also discussed. We conclude that this game based learning approach can facilitate the transfer of spatial knowledge and further, can be used by individuals who are blind for the purposes of navigation in real-world environments.
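The reported positive correlation between game-play success and navigation performance is typically a Pearson correlation across participants. A minimal sketch with invented per-participant scores (not the study's actual data):

```python
import math

# Hypothetical per-participant scores, invented for illustration only.
game_scores = [3, 5, 6, 8, 9, 10]   # success during AbES game play
nav_scores = [40, 55, 58, 70, 78, 85]  # real-world navigation performance

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson_r(game_scores, nav_scores)
print(f"r = {r:.3f}")  # a value near +1 indicates a strong positive correlation
```

In practice one would also report a significance test and sample size; the sketch only shows how the coefficient itself is computed.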
Neuroplasticity Associated with Tactile Language Communication in a Deaf-Blind Subject
A long-standing debate in cognitive neuroscience pertains to the innate nature of language development and the underlying factors that determine this faculty. We explored the neural correlates associated with language processing in a unique individual who is early blind, congenitally deaf, and possesses a high level of language function. Using functional magnetic resonance imaging (fMRI), we compared the neural networks associated with the tactile reading of words presented in Braille, Print on Palm (POP), and a haptic form of American Sign Language (haptic ASL or hASL). With all three modes of tactile communication, identifying words was associated with robust activation within occipital cortical regions as well as posterior superior temporal and inferior frontal language areas (lateralized within the left hemisphere). In a normally sighted and hearing interpreter, identifying words through hASL was associated with left-lateralized activation of inferior frontal language areas; however, robust occipital cortex activation was not observed. Diffusion tensor imaging-based tractography revealed differences consistent with enhanced occipital-temporal connectivity in the deaf-blind subject. Our results demonstrate that in the case of early onset of both visual and auditory deprivation, tactile-based communication is associated with an extensive cortical network implicating occipital as well as posterior superior temporal and frontal language areas. The cortical areas activated in this deaf-blind subject are consistent with characteristic cortical regions previously implicated with language. Finally, the resilience of language function within the context of early and combined visual and auditory deprivation may be related to enhanced connectivity between relevant cortical areas.
Tuning and disrupting the brain—modulating the McGurk illusion with electrical stimulation
In the so-called McGurk illusion, when the synchronized presentation of the visual stimulus /ga/ is paired with the auditory stimulus /ba/, people in general hear it as /da/. Multisensory integration processing underlying this illusion seems to occur within the Superior Temporal Sulcus (STS). Herein, we present evidence demonstrating that bilateral cathodal transcranial direct current stimulation (tDCS) of this area can decrease McGurk illusion-type responses. Additionally, we show that the manipulation of this audio-visual integrated output occurs irrespective of the number of eye-fixations on the mouth of the speaker. Bilateral anodal tDCS of the Parietal Cortex also modulates the illusion, but in the opposite manner, inducing more illusion-type responses. This is the first demonstration of using non-invasive brain stimulation to modulate multisensory speech perception in an illusory context (i.e., both increasing and decreasing illusion-type responses to a verbal audio-visual integration task). These findings provide clear evidence that both the superior temporal and parietal areas contribute to multisensory integration processing related to speech perception. Specifically, STS seems fundamental for the temporal synchronization and integration of auditory and visual inputs. For its part, posterior parietal cortex (PPC) may adjust the arrival of incoming audio and visual information to STS, thereby enhancing their interaction in this latter area.
Effects of Sensory Behavioral Tasks on Pain Threshold and Cortical Excitability
Background/objective: Transcutaneous electrical stimulation has been shown to modulate nervous system activity, leading to changes in pain perception, via the peripheral sensory system, in a bottom-up approach. We tested whether different sensory behavioral tasks induce significant effects in pain processing and whether these changes correlate with cortical plasticity. Methodology/principal findings: This randomized, parallel-design experiment included forty healthy right-handed males. Three different somatosensory tasks, including learning tasks with and without visual feedback and simple somatosensory input, were tested on pressure pain threshold and motor cortex excitability using transcranial magnetic stimulation (TMS). Sensory tasks induced hand-specific pain modulation effects. They increased pain thresholds of the left hand (which was the target of the sensory tasks) and decreased them in the right hand. TMS showed that somatosensory input decreased cortical excitability, as indexed by reduced motor evoked potential (MEP) amplitudes and increased short-interval intracortical inhibition (SICI). Although somatosensory tasks similarly altered pain thresholds and cortical excitability, there was no significant correlation between these variables and only the visual feedback task showed significant somatosensory learning. Conclusions/significance: The lack of correlation between cortical excitability and pain thresholds, and the lack of differential effects across tasks despite significant changes in pain thresholds, suggest that the analgesic effects of somatosensory tasks are not primarily associated with motor cortical neural mechanisms, and thus that subcortical neural circuits and/or the spinal cord are involved in the observed effects. Identifying the neural mechanisms of somatosensory stimulation on pain may open novel possibilities for combining different targeted therapies for pain control.
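The hand-specific modulation reported above amounts to computing, per hand, the mean pre-to-post change in pressure pain threshold. A minimal sketch with invented values (not the study's data), where a positive change means analgesia (a raised threshold):

```python
# Hypothetical per-subject pressure pain thresholds (kPa), invented for
# illustration: the left hand was targeted by the sensory tasks, the right
# hand was untrained.
pre_left = [320, 300, 310, 295]
post_left = [345, 330, 340, 318]
pre_right = [315, 305, 300, 298]
post_right = [300, 290, 288, 285]

def mean_change(pre, post):
    """Mean post-minus-pre change across subjects."""
    return sum(b - a for a, b in zip(pre, post)) / len(pre)

left_delta = mean_change(pre_left, post_left)     # expected > 0: threshold rose
right_delta = mean_change(post=post_right, pre=pre_right)  # expected < 0: threshold fell
print(left_delta, right_delta)
```

Opposite signs for the two hands are what the abstract describes as hand-specific modulation; a full analysis would of course add a paired statistical test.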
Perception and discrimination of real-life emotional vocalizations in early blind individuals
Introduction: The capacity to understand others’ emotions and react accordingly is a key social ability. However, it may be compromised in case of a profound sensory loss that limits the contribution of available contextual cues (e.g., facial expression, gestures, body posture) to interpret emotions expressed by others. In this study, we specifically investigated whether early blindness affects the capacity to interpret emotional vocalizations, whose valence may be difficult to recognize without a meaningful context. Methods: We asked a group of early blind (N = 22) and sighted controls (N = 22) to evaluate the valence and the intensity of spontaneous fearful and joyful non-verbal vocalizations. Results: Our data showed that emotional vocalizations presented alone (i.e., with no contextual information) are similarly ambiguous for blind and sighted individuals but are perceived as more intense by the former, possibly reflecting their higher saliency when visual experience is unavailable. Discussion: Our study contributes to a better understanding of how sensory experience shapes emotion recognition.
Characterizing Visual Field Deficits in Cerebral/Cortical Visual Impairment (CVI) Using Combined Diffusion Based Imaging and Functional Retinotopic Mapping: A Case Study
Cerebral versus Ocular Visual Impairment: The Impact on Developmental Neuroplasticity
Cortical/cerebral visual impairment (CVI) is clinically defined as significant visual dysfunction caused by injury to visual pathways and structures occurring during early perinatal development. Depending on the location and extent of damage, children with CVI often present with a myriad of visual deficits including decreased visual acuity and impaired visual field function. Most striking, however, are impairments in visual processing and attention, which have a significant impact on learning, development, and independence. Within the educational arena, current evidence suggests that strategies designed for individuals with ocular visual impairment are not effective in the case of CVI. We propose that this variance may be related to differences in compensatory neuroplasticity related to the type of visual impairment, as well as underlying alterations in brain structural connectivity. We discuss the etiology and nature of visual impairments related to CVI, and how advanced neuroimaging techniques (i.e., diffusion-based imaging) may help uncover differences between ocular and cerebral causes of visual dysfunction. Revealing these differences may help in developing future strategies for the education and rehabilitation of individuals living with visual impairment.