
    Linking pain and the body: neural correlates of visually induced analgesia

    The visual context of seeing the body can reduce the experience of acute pain, producing a multisensory analgesia. Here we investigated the neural correlates of this “visually induced analgesia” using fMRI. We induced acute pain with an infrared laser while human participants looked either at their stimulated right hand or at another object. Behavioral results confirmed the expected analgesic effect of seeing the body, while fMRI results revealed an associated reduction of laser-induced activity in ipsilateral primary somatosensory cortex (SI) and contralateral operculoinsular cortex during the visual context of seeing the body. We further identified two known cortical networks activated by sensory stimulation: (1) a set of brain areas consistently activated by painful stimuli (the so-called “pain matrix”), and (2) an extensive set of posterior brain areas activated by the visual perception of the body (“visual body network”). Connectivity analyses via psychophysiological interactions revealed that the visual context of seeing the body increased effective connectivity (i.e., functional coupling) between posterior parietal nodes of the visual body network and the purported pain matrix. Increased connectivity with these posterior parietal nodes was seen for several pain-related regions, including somatosensory area SII, anterior and posterior insula, and anterior cingulate cortex. These findings suggest that visually induced analgesia does not involve an overall reduction of the cortical response elicited by laser stimulation, but instead results from the interplay between the brain's pain network and a posterior network for body perception, resulting in modulation of the experience of pain.

    "Feeling" others' painful actions: the sensorimotor integration of pain and action information.

    Sensorimotor regions of the brain have been implicated in simulation processes such as action understanding and empathy, but their functional role in these processes remains unspecified. We used functional magnetic resonance imaging (fMRI) to demonstrate that postcentral sensorimotor cortex integrates action and object information to derive the sensory outcomes of observed hand-object interactions. When subjects viewed others' hands grasping or withdrawing from objects that were either painful or nonpainful, distinct sensorimotor subregions emerged as showing preferential responses to different aspects of the stimuli: object information (noxious vs. innocuous), action information (grasps vs. withdrawals), and painful action outcomes (painful grasps vs. all other conditions). Activation in the latter region correlated with subjects' ratings of how painful each object would be to touch and their previous experience with the object. Viewing others' painful grasps also biased behavioral responses to actual tactile stimulation, a novel effect not seen for auditory control stimuli. Somatosensory cortices, including primary somatosensory areas 1/3b and 2 and parietal area PF, may therefore subserve somatomotor simulation processes by integrating action and object information to anticipate the sensory consequences of observed hand-object interactions.

    Damage to the right insula disrupts the perception of affective touch

    Specific, peripheral C-tactile afferents contribute to the perception of tactile pleasure, but the brain areas involved in their processing remain debated. We report the first human lesion study on the perception of C-tactile touch in right hemisphere stroke patients (N = 59), revealing that right posterior and anterior insula lesions reduce contralateral and ipsilateral tactile pleasantness sensitivity, respectively. These findings corroborate previous imaging studies regarding the role of the posterior insula in the perception of affective touch. However, our findings about the crucial role of the anterior insula for ipsilateral affective touch perception open new avenues of enquiry regarding the cortical organization of this tactile system.

    Human brain activity related to the tactile perception of stickiness

    While the perception of stickiness is one of the fundamental dimensions of tactile sensation, little has been elucidated about the stickiness sensation and its neural correlates. The present study investigated how the human brain responds to perceived tactile sticky stimuli using functional magnetic resonance imaging (fMRI). To evoke tactile perception of stickiness with multiple intensities, we generated silicone stimuli with varying catalyst ratios. Also, an acrylic sham stimulus was prepared to present a condition with no sticky sensation. From the two psychophysics experiments (the method of constant stimuli and magnitude estimation), we could classify the silicone stimuli into two groups according to whether a sticky perception was evoked: the Supra-threshold group that evoked sticky perception and the Infra-threshold group that did not. In the Supra-threshold vs. Sham contrast analysis of the fMRI data using the general linear model (GLM), the contralateral primary somatosensory area (S1) and ipsilateral dorsolateral prefrontal cortex (DLPFC) showed significant activations in subjects, whereas no significant result was found in the Infra-threshold vs. Sham contrast. This result indicates that the perception of stickiness not only activates the somatosensory cortex, but also possibly induces higher cognitive processes. Also, the Supra- vs. Infra-threshold contrast analysis revealed significant activations in several subcortical regions, including the pallidum, putamen, caudate and thalamus, as well as in another region spanning the insula and temporal cortices. These brain regions, previously known to be related to tactile discrimination, may subserve the discrimination of different intensities of tactile stickiness. The present study unveils the human neural correlates of the tactile perception of stickiness and may contribute to broadening the understanding of neural mechanisms associated with tactile perception.

    Reading the mind in the touch: Neurophysiological specificity in the communication of emotions by touch

    Touch is central to interpersonal interactions. Touch conveys specific emotions about the touch provider, but it is not clear whether this is a purely socially learned function or whether it has neurophysiological specificity. In two experiments with healthy participants (N = 76 and 61) and one neuropsychological single case study, we investigated whether a type of touch characterised by peripheral and central neurophysiological specificity, namely the C tactile (CT) system, can communicate specific emotions and mental states. We examined the specificity of emotions elicited by touch delivered at CT-optimal (3 cm/s) and CT-suboptimal (18 cm/s) velocities (Experiment 1) and at body sites that contain (forearm) vs. do not contain (palm of the hand) CT fibres (Experiment 2). Blindfolded participants were touched without any contextual cues, and were asked to identify the touch provider's emotion and intention. Overall, CT-optimal touch (slow, gentle touch on the forearm) was significantly more likely than other types of touch to convey arousal, lust or desire. Affiliative emotions such as love and related intentions such as social support were instead reliably elicited by gentle touch, irrespective of CT-optimality, suggesting that other top-down factors contribute to these aspects of tactile social communication. To explore the neural basis of this communication, we also tested this paradigm in a stroke patient with right perisylvian damage, including the posterior insular cortex, which is considered the primary cortical target of CT afferents, but excluding temporal cortex involvement that has been linked to more affiliative aspects of CT-optimal touch. His performance suggested an impairment in ‘reading’ emotions based on CT-optimal touch. Taken together, our results suggest that the CT system can add specificity to emotional and social communication, particularly with regard to feelings of desire and arousal. On the basis of these findings, we speculate that its primary functional role may be to enhance the ‘sensual salience’ of tactile interactions.

    Interoceptive inference, emotion, and the embodied self

    The concept of the brain as a prediction machine has enjoyed a resurgence in the context of the Bayesian brain and predictive coding approaches within cognitive science. To date, this perspective has been applied primarily to exteroceptive perception (e.g., vision, audition) and action. Here, I describe a predictive, inferential perspective on interoception: ‘interoceptive inference’ conceives of subjective feeling states (emotions) as arising from actively inferred generative (predictive) models of the causes of interoceptive afferents. The model generalizes ‘appraisal’ theories that view emotions as emerging from cognitive evaluations of physiological changes, and it sheds new light on the neurocognitive mechanisms that underlie the experience of body ownership and conscious selfhood in health and in neuropsychiatric illness.

    Multisensory mechanisms of body ownership and self-location

    Having an accurate sense of the spatial boundaries of the body is a prerequisite for interacting with the environment and is thus essential for the survival of any organism with a central nervous system. Every second, our brain receives a staggering amount of information from the body across different sensory channels, each of which features a certain degree of noise. Despite the complexity of the incoming multisensory signals, the brain manages to construct and maintain a stable representation of our own body and its spatial relationships to the external environment. This natural “in-body” experience is such a fundamental subjective feeling that most of us take it for granted. However, patients with lesions in particular brain areas can experience profound disturbances in their normal sense of ownership over their body (somatoparaphrenia) or lose the feeling of being located inside their physical body (out-of-body experiences), suggesting that our “in-body” experience depends on intact neural circuitry in the temporal, frontal, and parietal brain regions. The question at the heart of this thesis relates to how the brain combines visual, tactile, and proprioceptive signals to build an internal representation of the bodily self in space. Over the past two decades, perceptual body illusions have become an important tool for studying the mechanisms underlying our sense of body ownership and self-location. The most influential of these illusions is the rubber hand illusion, in which ownership of an artificial limb is induced via the synchronous stroking of a rubber hand and an individual’s hidden real hand. Studies of this illusion have shown that multisensory integration within the peripersonal space is a key mechanism for bodily self-attribution. In Study I, we showed that the default sense of ownership of one’s real hand, not just the sense of rubber hand ownership, also depends on spatial and temporal multisensory congruence principles implemented in fronto-parietal brain regions. In Studies II and III, we characterized two novel perceptual illusions that provide strong support for the notion that multisensory integration within the peripersonal space is intimately related to the sense of limb ownership, and we examined the role of vision in this process. In Study IV, we investigated a full-body version of the rubber hand illusion, the “out-of-body illusion”, and showed that it can be used to induce predictable changes in one’s sense of self-location and body ownership. Finally, in Study V, we used the out-of-body illusion to “perceptually teleport” participants during brain imaging and to identify activity patterns specific to the sense of self-location in a given position in space. Together, these findings shed light on the role of multisensory integration in building the experience of the bodily self in space and provide initial evidence for how representations of body ownership and self-location interact in the brain.