Rapid enhancement of touch from non-informative vision of the hand
Processing in one sensory modality may modulate processing in another. Here we investigate how simply viewing the hand can influence the sense of touch. Previous studies showed that non-informative vision of the hand enhances tactile acuity, relative to viewing an object at the same location. However, it remains unclear whether this Visual Enhancement of Touch (VET) involves a phasic enhancement of tactile processing circuits triggered by the visual event of seeing the hand, or more prolonged, tonic neuroplastic changes, such as recruitment of additional cortical areas for tactile processing. We recorded somatosensory evoked potentials (SEPs) evoked by electrical stimulation of the right middle finger, both before and shortly after viewing either the right hand, or a neutral object presented via a mirror. Crucially, and unlike prior studies, our visual exposures were unpredictable and brief, in addition to being non-informative about touch. Viewing the hand, as opposed to viewing an object, enhanced tactile spatial discrimination measured using grating orientation judgements, and also the P50 SEP component, which has been linked to early somatosensory cortical processing. This was a trial-specific, phasic effect, occurring within a few seconds of each visual onset, rather than an accumulating, tonic effect. Thus, somatosensory cortical modulation can be triggered even by a brief, non-informative glimpse of one’s hand. Such rapid multisensory modulation reveals novel aspects of the specialised brain systems for functionally representing the body
When far is near: ERP correlates of crossmodal spatial interactions between tactile and mirror-reflected visual stimuli
Visuo-tactile integration occurs in a privileged way in peripersonal space, namely when visual and tactile stimuli are in spatial proximity. Here, we investigated whether crossmodal spatial effects (i.e. stronger crossmodal interactions for spatially congruent compared to incongruent visual and tactile stimuli) are also present when visual stimuli presented near the body are indirectly viewed in a mirror, thus appearing in far space. Participants had to attend to one of their hands throughout a block of stimuli in order to detect infrequent tactile target stimuli at that hand while ignoring tactile targets at the unattended hand, all tactile non-target stimuli, and any visual stimuli. Visual stimuli were presented simultaneously with tactile stimuli, in the same (congruent) or opposite (incongruent) hemispace with respect to the tactile stimuli. In one group of participants the visual stimuli were delivered near the participants’ hands and were observed as indirect mirror reflections (‘mirror’ condition), while in the other group these were presented at a distance from the hands (‘far’ condition). The main finding was that crossmodal spatial modulations of ERPs recorded over and close to somatosensory cortex were present in the ‘mirror’ condition but not the ‘far’ condition. That is, ERPs were enhanced in response to tactile stimuli coupled with spatially congruent versus incongruent visual stimuli when the latter were viewed through a mirror. These effects emerged around 190 ms after stimulus onset, and were modulated by the focus of spatial attention. These results provide evidence that visual stimuli observed in far space via a mirror are coded as near-the-body stimuli according to their known rather than their perceived location. This suggests that crossmodal interactions between vision and touch may be modulated by prior knowledge of reflecting surfaces (i.e. top-down processing)
Representation of Neck Velocity and Neck–Vestibular Interactions in Pursuit Neurons in the Simian Frontal Eye Fields
The smooth pursuit system must interact with the vestibular system to maintain the accuracy of eye movements in space (i.e., gaze-movement) during head movement. Normally, the head moves on the stationary trunk. Vestibular signals cannot distinguish whether the head or whole body is moving. Neck proprioceptive inputs provide information about head movements relative to the trunk. Previous studies have shown that the majority of pursuit neurons in the frontal eye fields (FEF) carry visual information about target velocity, vestibular information about whole-body movements, and signal eye- or gaze-velocity. However, it is unknown whether FEF neurons carry neck proprioceptive signals. By passive trunk-on-head rotation, we tested neck inputs to FEF pursuit neurons in 2 monkeys. The majority of FEF pursuit neurons tested that had horizontal preferred directions (87%) responded to horizontal trunk-on-head rotation. The modulation consisted predominantly of velocity components. Discharge modulation during pursuit and trunk-on-head rotation added linearly. During passive head-on-trunk rotation, modulation to vestibular and neck inputs also added linearly in most neurons, although in half of gaze-velocity neurons neck responses were strongly influenced by the context of neck rotation. Our results suggest that neck inputs could contribute to representing eye- and gaze-velocity FEF signals in trunk coordinates
The spatial distance rule in the moving and classical rubber hand illusions
The rubber hand illusion (RHI) is a perceptual illusion in which participants perceive a model hand as part of their own body. Here, through the use of one questionnaire experiment and two proprioceptive drift experiments, we investigated the effect of distance (12, 27.5, and 43 cm) in the vertical plane on both the moving and classical RHI. In both versions of the illusion, we found an effect of distance on ownership of the rubber hand for both measures tested. Our results further suggested that the moving RHI might follow a narrower spatial rule. Finally, whereas ownership of the moving rubber hand was affected by distance, this was not the case for agency, which was present at all distances tested. In sum, the present results generalize the spatial distance rule for ownership to the vertical plane of space and demonstrate that the moving RHI also obeys this rule
Fronto-parietal brain responses to visuotactile congruence in an anatomical reference frame
Spatially and temporally congruent visuotactile stimulation of a fake hand together with one’s real hand may result in an illusory self-attribution of the fake hand. Although this illusion relies on a representation of the two touched body parts in external space, there is tentative evidence that, for the illusion to occur, the seen and felt touches also need to be congruent in an anatomical reference frame. We used functional magnetic resonance imaging and a somatotopical, virtual reality-based setup to isolate the neuronal basis of such a comparison. Participants’ index or little finger was synchronously touched with the index or little finger of a virtual hand, under congruent or incongruent orientations of the real and virtual hands. The left ventral premotor cortex responded significantly more strongly to visuotactile co-stimulation of the same versus different fingers of the virtual and real hand. Conversely, the left anterior intraparietal sulcus responded significantly more strongly to co-stimulation of different versus same fingers. Both responses were independent of hand orientation congruence and of spatial congruence of the visuotactile stimuli. Our results suggest that fronto-parietal areas previously associated with multisensory processing within peripersonal space and with tactile remapping evaluate the congruence of visuotactile stimulation on the body according to an anatomical reference frame
Multisensory Integration in Self Motion Perception
Self motion perception involves the integration of visual, vestibular, somatosensory and motor signals. This article reviews the findings from single unit electrophysiology, functional and structural magnetic resonance imaging and psychophysics to present an update on how the human and non-human primate brain integrates multisensory information to estimate one's position and motion in space. The results indicate that there is a network of regions in the non-human primate and human brain that processes self motion cues from the different sense modalities
Multiple foci of spatial attention in multimodal working memory
The maintenance of sensory information in working memory (WM) is mediated by the attentional activation of stimulus representations that are stored in perceptual brain regions. Using event-related potentials (ERPs), we measured tactile and visual contralateral delay activity (tCDA/CDA components) in a bimodal WM task to concurrently track the attention-based maintenance of information stored in anatomically segregated (somatosensory and visual) brain areas. Participants received tactile and visual sample stimuli on both sides and, in different blocks, memorized these samples on the same side or on opposite sides. After a retention delay, memory was unpredictably tested for touch or vision. In same-side blocks, tCDA and CDA components simultaneously emerged over the same hemisphere, contralateral to the memorized tactile/visual sample set. In opposite-side blocks, these two components emerged over different hemispheres, but had the same sizes and onset latencies as in the same-side condition. This finding indicates that distinct foci of tactile and visual spatial attention were concurrently maintained on task-relevant stimulus representations in WM. The independence of spatially specific biasing mechanisms for tactile and visual WM content suggests that multimodal information is stored in distributed perceptual brain areas that are subject to modality-specific control processes, which can operate simultaneously and largely independently of each other
If I Were You: Perceptual Illusion of Body Swapping
The concept of an individual swapping his or her body with that of another person has captured the imagination of writers and artists for decades. Although this topic has not been the subject of investigation in science, it exemplifies the fundamental question of why we have an ongoing experience of being located inside our bodies. Here we report a perceptual illusion of body-swapping that addresses directly this issue. Manipulation of the visual perspective, in combination with the receipt of correlated multisensory information from the body was sufficient to trigger the illusion that another person's body or an artificial body was one's own. This effect was so strong that people could experience being in another person's body when facing their own body and shaking hands with it. Our results are of fundamental importance because they identify the perceptual processes that produce the feeling of ownership of one's body
The free-energy self: A predictive coding account of self-recognition
Recognising and representing one's self as distinct from others is a fundamental component of self-awareness. However, current theories of self-recognition are not embedded within global theories of cortical function and therefore fail to provide a compelling explanation of how the self is processed. We present a theoretical account of the neural and computational basis of self-recognition that is embedded within the free-energy account of cortical function. In this account, one's body is represented in a Bayesian manner as the entity most likely to be "me". Such a probabilistic representation arises through the integration of information from hierarchically organised unimodal systems in higher-level multimodal areas. This information takes the form of bottom-up "surprise" signals from unimodal sensory systems that are explained away by top-down processes that minimise the level of surprise across the brain. We present evidence that this theoretical perspective may account for the findings of psychological and neuroimaging investigations into self-recognition, and particularly evidence that representations of the self are malleable, rather than fixed as previous accounts of self-recognition might suggest
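The "explaining away" of bottom-up surprise by top-down predictions can be made concrete with a minimal one-level sketch. This is an illustrative toy, not the model proposed in the paper, and all numbers are assumed: iteratively reducing precision-weighted prediction errors drives the top-down estimate to the Bayesian posterior.

```python
# Minimal one-level predictive coding sketch (illustrative toy, not the paper's model).
obs = 1.0                       # bottom-up sensory evidence (e.g. a self-related feature)
prior_mu = 0.0                  # top-down prior expectation
var_obs, var_prior = 1.0, 4.0   # assumed sensory and prior variances

mu = prior_mu                   # current top-down estimate
for _ in range(500):
    err_obs = (obs - mu) / var_obs           # precision-weighted bottom-up "surprise"
    err_prior = (prior_mu - mu) / var_prior  # deviation from the prior
    mu += 0.05 * (err_obs + err_prior)       # gradient step that minimises surprise

# The fixed point is the Bayesian (precision-weighted) posterior mean.
posterior = (obs / var_obs + prior_mu / var_prior) / (1 / var_obs + 1 / var_prior)
print(f"converged estimate {mu:.3f}, Bayesian posterior {posterior:.3f}")
```

The converged estimate coincides with the posterior (0.8 for these toy numbers), illustrating how surprise minimisation can implement the kind of Bayesian integration this account relies on.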
Does the Integration of Haptic and Visual Cues Reduce the Effect of a Biased Visual Reference Frame on the Subjective Head Orientation?
The selection of appropriate frames of reference (FOR) is a key factor in the elaboration of spatial perception and the production of robust interaction with our environment. The extent to which we perceive the head axis orientation (subjective head orientation, SHO) with both accuracy and precision likely contributes to the efficiency of these spatial interactions. A first goal of this study was to investigate the relative contribution of both the visual and egocentric FOR (centre of mass) to SHO processing. A second goal was to investigate humans' ability to report SHO in various sensory response modalities (visual, haptic and visuo-haptic), and the way these modalities modify reliance on either the visual or the egocentric FOR. A third goal was to question whether subjects combined visual and haptic cues optimally to increase SHO certainty and to decrease the disruptive effect of the FORs. Thirteen subjects were asked to indicate their SHO while the visual and/or egocentric FORs were deviated. Four results emerged from our study. First, visual rod settings to SHO were altered by the tilted visual frame but not by the egocentric FOR alteration, whereas haptic settings were unaffected by either the egocentric FOR alteration or the tilted visual frame. These results were modulated by individual analysis. Second, visual and egocentric FOR dependency appeared to be negatively correlated. Third, enriching the response modality appeared to improve SHO. Fourth, several combination rules for the visuo-haptic cues, such as Maximum Likelihood Estimation (MLE), Winner-Take-All (WTA) and the Unweighted Mean (UWM), could account for the SHO improvements. However, the UWM rule seemed to best account for the improvement of visuo-haptic estimates, especially in situations with high FOR incongruence. Finally, the data also indicated that FOR reliance resulted from the application of the UWM rule, observed particularly in visually dependent subjects.
Conclusions: Taken together, these findings emphasize the importance of identifying individual spatial FOR preferences to assess the efficiency of our interaction with the environment whilst performing spatial tasks
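The MLE and UWM combination rules compared in this study can be illustrated with a small numerical sketch. The single-cue settings and noise levels below are hypothetical assumptions, not values from the study: MLE weights each cue by its inverse variance, whereas the UWM simply averages the two cues regardless of their reliability.

```python
# Illustrative cue-combination sketch for subjective head orientation (SHO).
# All numbers are hypothetical, chosen only to show how the two rules differ.
sigma_v, sigma_h = 4.0, 6.0   # assumed visual / haptic noise (deg, SD)
est_v, est_h = 2.0, -1.0      # assumed single-cue SHO settings (deg)

# Maximum Likelihood Estimation (MLE): inverse-variance weighting.
w_v = sigma_h**2 / (sigma_v**2 + sigma_h**2)
mle_est = w_v * est_v + (1 - w_v) * est_h
mle_var = (sigma_v**2 * sigma_h**2) / (sigma_v**2 + sigma_h**2)

# Unweighted Mean (UWM): simple average, ignoring cue reliability.
uwm_est = 0.5 * (est_v + est_h)
uwm_var = 0.25 * (sigma_v**2 + sigma_h**2)

print(f"MLE: {mle_est:.2f} deg (var {mle_var:.2f})")
print(f"UWM: {uwm_est:.2f} deg (var {uwm_var:.2f})")
```

With these numbers the MLE variance (about 11.08 deg²) falls below that of the better single cue (16 deg²), whereas the UWM variance (13 deg²) can even exceed the better cue's variance when the two reliabilities differ strongly.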