
    Visual enhancement of touch and the bodily self

    We experience our own body through both touch and vision. We further see that others’ bodies are similar to our own body, but we have no direct experience of touch on others’ bodies. Therefore, relations between vision and touch are important for the sense of self and for the mental representation of one’s own body. For example, seeing the hand improves tactile acuity on the hand, compared to seeing a non-hand object. While several studies have demonstrated this visual enhancement of touch (VET) effect, its relation to the ‘bodily self’, or the mental representation of one’s own body, remains unclear. We examined whether VET is an effect of seeing a hand, or of seeing ‘my’ hand, using the rubber hand illusion. In this illusion, a prosthetic hand that is brushed synchronously—but not asynchronously—with one’s own hand is felt to actually be one’s hand. Thus, we manipulated whether or not participants felt they were looking directly at their own hand, while holding the actual visual stimulus constant. Tactile acuity was measured by having participants judge the orientation of square-wave gratings. Two characteristic effects of VET were observed: (1) cross-modal enhancement from seeing the hand was inversely related to overall tactile acuity, and (2) participants near sensory threshold showed significant improvement following synchronous stroking, compared to asynchronous stroking or no stroking at all. These results demonstrate a clear functional relation between the bodily self and basic tactile perception.

    Rapid enhancement of touch from non-informative vision of the hand

    Processing in one sensory modality may modulate processing in another. Here we investigate how simply viewing the hand can influence the sense of touch. Previous studies showed that non-informative vision of the hand enhances tactile acuity, relative to viewing an object at the same location. However, it remains unclear whether this Visual Enhancement of Touch (VET) involves a phasic enhancement of tactile processing circuits triggered by the visual event of seeing the hand, or more prolonged, tonic neuroplastic changes, such as recruitment of additional cortical areas for tactile processing. We recorded somatosensory evoked potentials (SEPs) evoked by electrical stimulation of the right middle finger, both before and shortly after viewing either the right hand or a neutral object presented via a mirror. Crucially, and unlike prior studies, our visual exposures were unpredictable and brief, in addition to being non-informative about touch. Viewing the hand, as opposed to viewing an object, enhanced tactile spatial discrimination measured using grating orientation judgements, and also the P50 SEP component, which has been linked to early somatosensory cortical processing. This was a trial-specific, phasic effect, occurring within a few seconds of each visual onset, rather than an accumulating, tonic effect. Thus, somatosensory cortical modulation can be triggered even by a brief, non-informative glimpse of one’s hand. Such rapid multisensory modulation reveals novel aspects of the specialised brain systems for functionally representing the body.

    Observing another in pain facilitates vicarious experiences and modulates somatosensory experiences

    Objective: This study investigated whether individuals reporting vicarious pain in daily life (i.e., the self-reported vicarious pain group) display vicarious experiences during an experimental paradigm, and also show improved detection of somatosensory stimuli while observing another in pain. Furthermore, this study investigated the stability of these phenomena. Finally, this study explored the putative modulating role of dispositional empathy and hypervigilance for pain. Methods: Vicarious pain responders (i.e., those reporting vicarious pain in daily life; N = 16) and controls (N = 19) were selected from a large sample, and viewed videos depicting pain-related (hands being pricked) and non-pain-related scenes, whilst occasionally experiencing vibrotactile stimuli themselves on the left, right or both hands. Participants reported the location at which they felt a somatosensory stimulus. We calculated the number of vicarious errors (i.e., the number of trials in which an illusory sensation was reported while observing pain-related scenes) and detection accuracy. Thirty-three participants (94.29%) took part in the same experiment 5 months later to investigate the temporal stability of the outcomes. Results: The vicarious pain group reported more vicarious errors compared with controls, and this effect proved stable over time. Detection was facilitated while observing pain-related scenes compared with non-pain-related scenes. Observers' characteristics, i.e., dispositional empathy and hypervigilance for pain, did not modulate the effects. Conclusion: Observing pain facilitates the detection of tactile stimuli, both in vicarious pain responders and controls. Interestingly, vicarious pain responders reported more vicarious errors during the experimental paradigm compared with controls, and this effect remained stable over time.

    Seeing pain and pleasure on self and others: behavioural and psychophysiological reactivity in immersive virtual reality

    Studies have explored behavioral and neural responses to the observation of pain in others. However, much less is known about how taking a physical perspective influences reactivity to the observation of others' pain and pleasure. To explore this issue we devised a novel paradigm in which 24 healthy participants immersed in a virtual reality scenario observed a virtual needle penetrating (pain), a virtual caress (pleasure), or a virtual ball touching (neutral) the hand of an avatar seen from a first-person (1PP) or a third-person (3PP) perspective. Subjective ratings and physiological responses [skin conductance responses (SCR) and heart rate (HR)] were collected in each trial. All participants reported strong feelings of ownership of the virtual hand only in 1PP. Subjective measures also showed that pain and pleasure were experienced as more salient than neutral. SCR analysis demonstrated higher reactivity in 1PP than in 3PP. Importantly, vicarious pain induced stronger responses with respect to the other conditions in both perspectives. HR analysis revealed equally lower activity during pain and pleasure with respect to neutral. SCR may reflect egocentric perspective, and HR may merely index general arousal. The results suggest that behavioral and physiological indexes of reactivity to seeing others' pain and pleasure were qualitatively similar in 1PP and 3PP. Our paradigm indicates that virtual reality can be used to study vicarious sensations of pain and pleasure without delivering any stimulus to participants' real body, and to explore behavioral and physiological reactivity when pain and pleasure are observed from ego- and allocentric perspectives.

    Cooling the Thermal Grill Illusion through Self-Touch

    Acute peripheral pain is reduced by multisensory interactions at the spinal level. Central pain is reduced by reorganization of cortical body representations. We show here that acute pain can also be reduced by multisensory integration through self-touch, which provides proprioceptive, thermal, and tactile input forming a coherent body representation. We combined self-touch with the thermal grill illusion (TGI). In the traditional TGI, participants press their fingers on two warm objects surrounding one cool object. The warm surround unmasks pain pathways, which paradoxically causes the cool object to feel painfully hot. Here, we warmed the index and ring fingers of each hand while cooling the middle fingers. Immediately afterwards, these three fingers of the right hand were touched against the same three fingers on the left hand. This self-touch caused a dramatic 64% reduction in perceived heat. We show that this paradoxical release from paradoxical heat cannot be explained by low-level touch-temperature interactions alone. To reduce pain, we often clutch a painful hand with the other hand. We show here that self-touch not only gates pain signals reaching the brain but also, via multisensory integration, increases the coherence of the cognitive body representations to which pain afferents project.

    Interpersonal representations of touch in somatosensory cortex are modulated by perspective

    Observing others being touched activates similar brain areas to those activated when one experiences touch oneself. Event-related potential (ERP) studies have revealed that modulation of somatosensory components by observed touch occurs within 100 ms after stimulus onset, and such vicarious effects have been taken as evidence for empathy for others' tactile experiences. In previous studies, body parts have been presented from a first-person perspective. This raises the question of the extent to which somatosensory activation by observed touch to body parts depends on the perspective from which the body part is observed. In this study (N = 18), we examined the modulation of somatosensory ERPs by observed touch delivered to another person's hand when viewed as if from a first-person versus a third-person perspective. We found that vicarious touch effects primarily consist of two separable components in the early stages of somatosensory processing: an anatomical mapping for touch in first-person perspective at P45, and a specular (mirror-like) mapping for touch in third-person perspective at P100. This is consistent with suggestions that vicarious representations exist to support predictions for one's own bodily events, but also to enable predictions of a social or interpersonal kind, at distinct temporal stages.

    Distinct contributions of Brodmann areas 1 and 2 to body ownership

    Although body ownership—i.e., the feeling that our bodies belong to us—modulates activity within the primary somatosensory cortex (S1), it is still unknown whether this modulation occurs within a somatotopically defined portion of S1. We induced an illusory feeling of ownership for another person's finger by asking participants to hold their palm against another person's palm and to stroke the two joined index fingers with the index and thumb of their other hand. This illusion (the numbness illusion) does not occur if the stroking is performed asynchronously or by the other person. We combined this somatosensory paradigm with ultra-high-field functional magnetic resonance imaging finger mapping to study whether illusory body ownership modulates activity within different finger-specific areas of S1. The results revealed that the numbness illusion is associated with activity in Brodmann area (BA) 1 within the representation of the finger stroking the other person's finger, and in BA 2 contralateral to the stroked finger. These results show that changes in bodily experience modulate activity within certain subregions of S1, with a different finger-topographical selectivity between the representations of the stroking and of the stroked hand, and reveal that the high degree of somatosensory specialization in S1 extends to bodily self-consciousness.

    Neural Substrates of Reliability-Weighted Visual-Tactile Multisensory Integration

    As sensory systems deteriorate in aging or disease, the brain must relearn the appropriate weights to assign each modality during multisensory integration. Using blood-oxygen-level-dependent functional magnetic resonance imaging of human subjects, we tested a model for the neural mechanisms of sensory weighting, termed “weighted connections.” This model holds that the connection weights between early and late areas vary depending on the reliability of the modality, independent of the level of early sensory cortex activity. When subjects detected viewed and felt touches to the hand, a network of brain areas was active, including visual areas in lateral occipital cortex, somatosensory areas in inferior parietal lobe, and multisensory areas in the intraparietal sulcus (IPS). In agreement with the weighted connection model, the connection weight measured with structural equation modeling between somatosensory cortex and IPS increased for somatosensory-reliable stimuli, and the connection weight between visual cortex and IPS increased for visual-reliable stimuli. This double dissociation of connection strengths was similar to the pattern of behavioral responses during incongruent multisensory stimulation, suggesting that weighted connections may be a neural mechanism for behavioral reliability weighting.
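    The behavioral reliability weighting referred to above is commonly formalized as maximum-likelihood cue combination, in which each modality's weight is proportional to its reliability (the inverse of its variance). The sketch below illustrates that standard formulation only; the function name and the numeric values are illustrative assumptions, not quantities from the study:

    ```python
    import numpy as np

    def fuse_estimates(mu_v, sigma_v, mu_t, sigma_t):
        """Inverse-variance (maximum-likelihood) fusion of a visual and a
        tactile estimate of the same quantity. Each cue's weight is its
        reliability (1/variance) normalized over both cues."""
        r_v, r_t = 1.0 / sigma_v**2, 1.0 / sigma_t**2  # reliabilities
        w_v = r_v / (r_v + r_t)                        # visual weight
        w_t = r_t / (r_v + r_t)                        # tactile weight
        mu = w_v * mu_v + w_t * mu_t                   # fused estimate
        sigma = np.sqrt(1.0 / (r_v + r_t))             # fused s.d. (never worse than either cue)
        return mu, sigma, w_v

    # When vision is degraded (large sigma_v), touch dominates the fused estimate:
    mu, sigma, w_v = fuse_estimates(mu_v=0.0, sigma_v=2.0, mu_t=1.0, sigma_t=1.0)
    # w_v = 0.2, so the fused estimate mu = 0.8 sits close to the tactile cue
    ```

    The "weighted connections" model in the abstract proposes a neural analogue of this scheme: the effective connection strength from each early sensory area to IPS grows with that modality's reliability, mirroring how w_v and w_t shift here.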

    Visual-somatosensory interactions in mental representations of the body and the face

    The body is represented in the brain at levels that incorporate multisensory information. This thesis focused on interactions between vision and cutaneous sensations (i.e., touch and pain). Experiment 1 revealed that there are partially dissociable pathways for visual enhancement of touch (VET) depending upon whether one sees one’s own body or the body of another person. This indicates that VET, a seeming low-level effect on spatial tactile acuity, is actually sensitive to body identity. Experiments 2-4 explored the effect of viewing one’s own body on pain perception. They demonstrated that viewing the body biases pain intensity judgments irrespective of actual stimulus intensity, and, more importantly, reduces the discriminative capacities of the nociceptive pathway encoding noxious stimulus intensity. The latter effect only occurs if the pain-inducing event itself is not visible, suggesting that viewing the body alone and viewing a stimulus event on the body have distinct effects on cutaneous sensations. Experiment 5 replicated an enhancement of visual remapping of touch (VRT) when viewing fearful human faces being touched, and further demonstrated that VRT does not occur for observed touch on non-human faces, even fearful ones. This suggests that the facial expressions of non-human animals may not be simulated within the somatosensory system of the human observer in the same way that the facial expressions of other humans are. Finally, Experiment 6 examined the enfacement illusion, in which synchronous visuo-tactile inputs cause another’s face to be assimilated into the mental self-face representation. The strength of enfacement was not affected by the other’s facial expression, supporting an asymmetric relationship between processing of facial identity and facial expressions. 
    Together, these studies indicate that multisensory representations of the body in the brain link low-level perceptual processes with the perception of emotional cues and body/face identity, and interact in complex ways depending upon contextual factors.