    Illusory resizing of the painful knee is analgesic in symptomatic knee osteoarthritis

    Background. Experimental and clinical evidence support a link between body representations and pain. This proof-of-concept study in people with painful knee osteoarthritis (OA) aimed to determine whether: (i) visuotactile illusions that manipulate perceived knee size are analgesic; (ii) cumulative analgesic effects occur with sustained or repeated illusions. Methods. Participants with knee OA underwent eight conditions (order randomised): stretch and shrink visuotactile (congruent) illusions and corresponding visual, tactile and incongruent control conditions. Knee pain intensity (0-100 numerical rating scale; 0 = no pain at all and 100 = worst pain imaginable) was assessed pre- and post-condition. Condition (visuotactile illusion vs control) × Time (pre-/post-condition) repeated-measures ANOVAs evaluated the effect on pain. In each participant, the most beneficial illusion was sustained for 3 min and was repeated 10 times (each during two sessions); paired t-tests compared pain at time 0 and 180 s (sustained) and between illusion 1 and illusion 10 (repeated). Results. Visuotactile illusions decreased pain by an average of 7.8 points (95% CI [2.0-13.5]), which corresponds to a 25% reduction in pain, but the tactile-only and visual-only control conditions did not (Condition × Time interaction: p = 0.028). Visuotactile illusions did not differ from incongruent control conditions in which the same visual manipulation occurred, but did differ from conditions in which only the same tactile input was applied. Sustained illusions prolonged the analgesia but did not increase it. Repeated illusions increased the analgesic effect, with an average pain decrease of 20 points (95% CI [6.9-33.1]), corresponding to a 40% pain reduction. Discussion. Visuotactile illusions are analgesic in people with knee OA. Our results suggest that visual input plays a critical role in pain relief, but that analgesia requires multisensory input. That both visual and tactile input are needed for analgesia supports multisensory modulation processes as a possible explanatory mechanism. Further research exploring the neural underpinnings of these visuotactile illusions is needed. For potential clinical applications, future research using a greater dosage in larger samples is warranted.
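
    The analysis design described above (a Condition × Time repeated-measures ANOVA on 0-100 pain ratings, followed by paired t-tests for the sustained and repeated blocks) can be sketched in Python using statsmodels' AnovaRM and scipy's ttest_rel. The sample size, column names and data below are hypothetical and only illustrate the structure of the comparisons; this is not the study's data or code.

        import numpy as np
        import pandas as pd
        from scipy import stats
        from statsmodels.stats.anova import AnovaRM

        rng = np.random.default_rng(0)
        n = 12  # hypothetical sample size

        # Long-format table: one 0-100 pain rating per participant, condition and time
        rows = []
        for subject in range(n):
            for condition in ("visuotactile", "control"):
                for time in ("pre", "post"):
                    drop = 8 if (condition == "visuotactile" and time == "post") else 0
                    pain = float(np.clip(rng.normal(50 - drop, 10), 0, 100))
                    rows.append({"subject": subject, "condition": condition,
                                 "time": time, "pain": pain})
        df = pd.DataFrame(rows)

        # Condition x Time repeated-measures ANOVA on pain ratings
        print(AnovaRM(df, depvar="pain", subject="subject",
                      within=["condition", "time"]).fit())

        # Paired t-test, e.g. pain at 0 s versus 180 s of a sustained illusion
        pain_0s = rng.normal(50, 10, size=n)
        pain_180s = pain_0s - rng.normal(8, 5, size=n)
        t, p = stats.ttest_rel(pain_0s, pain_180s)
        print(f"sustained illusion: t = {t:.2f}, p = {p:.3f}")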

    Drifting perceptual patterns suggest prediction errors fusion rather than hypothesis selection: replicating the rubber-hand illusion on a robot

    Humans can experience fake body parts as theirs merely through simple synchronous visuo-tactile stimulation. This body illusion is accompanied by a drift in the perception of the real limb towards the fake limb, suggesting an update of the body estimate resulting from the stimulation. This work compares the limb-drift patterns of human participants in a rubber hand illusion experiment with the end-effector estimation displacement of a multisensory robotic arm equipped with predictive processing perception. Results show similar drifting patterns in both the human and robot experiments, and they suggest that the perceptual drift is due to prediction error fusion rather than hypothesis selection. We present body inference through prediction error minimization as a single process that unites predictive coding and causal inference and that is responsible for the effects on perception when we are subjected to intermodal sensory perturbations. (Comment: Proceedings of the 2018 IEEE International Conference on Development and Learning and Epigenetic Robotics.)
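
    The "prediction error fusion" account described above can be sketched as a precision-weighted update of the body (end-effector) estimate by the prediction errors from each modality, rather than a choice between competing hypotheses. The minimal Python sketch below only illustrates that idea under assumed parameter values; it is not the authors' robot implementation.

        import numpy as np

        def fuse_prediction_errors(x_est, s_proprio, s_visual,
                                   pi_p=1.0, pi_v=4.0, lr=0.05, steps=200):
            """Drift of the estimated hand position under precision-weighted
            fusion of proprioceptive and visual prediction errors."""
            trajectory = [x_est]
            for _ in range(steps):
                err_p = s_proprio - x_est   # proprioceptive prediction error
                err_v = s_visual - x_est    # visual prediction error
                x_est = x_est + lr * (pi_p * err_p + pi_v * err_v)
                trajectory.append(x_est)
            return np.array(trajectory)

        # Real hand at 0 cm, fake/virtual hand seen at 15 cm: the estimate drifts
        # toward the visual source instead of snapping to either hypothesis.
        drift = fuse_prediction_errors(x_est=0.0, s_proprio=0.0, s_visual=15.0)
        print(f"final hand-position estimate: {drift[-1]:.1f} cm")

    Because both error terms contribute at every step, the estimate settles between the two sensory sources in proportion to their assumed precisions, which is the gradual drifting pattern the abstract contrasts with winner-take-all hypothesis selection.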

    Bodily Illusions Modulate Tactile Perception

    Touch differs from other exteroceptive senses in that the body itself forms part of the tactile percept. Interactions between proprioception and touch provide a powerful way to investigate the implicit body representation underlying touch. Here, we demonstrate that an intrinsic primary quality of a tactile object, for example its size, is directly affected by the perceived size of the body part touching it. We elicited proprioceptive illusions that the left index finger was either elongating or shrinking by vibrating the biceps or triceps tendon of the right arm while subjects grasped the tip of their left index finger. Subjects estimated the distance between two simultaneous tactile contacts on the left finger during tendon vibration. We found that tactile distances feel bigger when the touched body part feels elongated. Control tests showed that the modulation of touch was linked to the perceived index-finger size induced by tendon vibration. Vibrations that did not produce a proprioceptive illusion had no effect on touch. Our results show that the perception of tactile objects is referenced to an implicit body representation and that proprioception contributes to this body representation. We also provide, for the first time, a quantitative, implicit measure of distortions of body size.

    Investigating the Neural Mechanisms of Unconscious and Illusory Touch Perception

    Despite our everyday reliance on touch, from manipulating tools to dressing ourselves, relatively little is known about the neural correlates of tactile perception. As with other modalities, our conscious, reportable experiences of touch can dissociate from the physical tactile stimulation processed by the skin. In unconscious touch perception, tactile stimuli can be processed and guide our behavior without an accompanying conscious percept. For example, we may swat away a mosquito without having consciously registered its presence on our skin. In tactile illusions, conscious tactile experiences occur without a corresponding tactile stimulus. Multisensory tactile illusions arise when stimulation of a different modality influences conscious tactile perception. Using converging neuroscientific methods, we characterized the neural mechanisms underlying these two types of dissociations in touch perception. In a first experiment, we assessed the role of primary somatosensory cortex (S1) in conscious and unconscious touch perception using transcranial magnetic stimulation (TMS). We demonstrated for the first time the existence of TMS-induced numbsense, whereby disruption of S1 suppressed tactile awareness but left unconscious localization of touch above chance. In a second experiment, we assessed the role of early somatosensory activity in a visually induced tactile illusion. We recorded electroencephalographic (EEG) activity and fast-signal optical imaging over the somatosensory cortex and found activations in S1 and S2, starting 128 ms after visual stimulus presentation, associated with this illusion. These findings imply the involvement of early somatosensory representations in a multisensory illusion of touch. In a follow-up experiment, we explored the roles of S1 and of the posterior parietal cortex (PPC), a multimodal structure known to participate in the integration of visual and tactile signals, in this visually induced tactile illusion. Unexpectedly, stimulating S1 did not reduce visually induced tactile illusions, suggesting that these may rely on other somatosensory processes that can compensate for S1 suppression. Stimulating the PPC 140 ms after visual stimulus presentation caused a significant decrease in the visual facilitation of tactile sensitivity, likely due to an increase in visually induced tactile illusions. This demonstrated the role of the PPC in improving tactile sensitivity during visual-tactile multisensory integration and suggested that it may also influence visually induced tactile illusions. In our last experiment, we used a psychophysical approach to understand the behavioral mappings between auditory and tactile perception in sound-touch synesthesia, in which individuals experience consistent and reliable sound-induced tactile illusions. We found that sound frequency was strongly correlated with the location of synesthetic tactile illusions on the body, suggesting the involvement of early, somatotopically organized somatosensory areas. Together, our results support the critical importance of early somatosensory brain areas in both unconscious and illusory touch perception, as well as in multisensory integration. These results provide important insight into the neural mechanisms supporting the subjective aspects of touch perception.

    More than skin deep: body representation beyond primary somatosensory cortex

    The neural circuits underlying initial sensory processing of somatic information are relatively well understood. In contrast, the processes that go beyond primary somatosensation to create more abstract representations related to the body are less clear. In this review, we focus on two classes of higher-order processing beyond somatosensation. Somatoperception refers to the process of perceiving the body itself, and particularly of ensuring somatic perceptual constancy. We review three key elements of somatoperception: (a) remapping information from the body surface into an egocentric reference frame, (b) exteroceptive perception of objects in the external world through their contact with the body, and (c) interoceptive percepts about the nature and state of the body itself. Somatorepresentation, in contrast, refers to the essentially cognitive process of constructing semantic knowledge and attitudes about the body, including: (d) lexical-semantic knowledge about bodies generally and one’s own body specifically, (e) configural knowledge about the structure of bodies, (f) emotions and attitudes directed towards one’s own body, and (g) the link between the physical body and the psychological self. We review a wide range of neuropsychological, neuroimaging and neurophysiological data to explore the dissociation between these different aspects of higher somatosensory function.

    Specificity and coherence of body representations

    Bodily illusions differently affect the body representations underlying perception and action. We investigated whether this task dependence reflects two distinct dimensions of embodiment: the sense of agency and the sense of the body as a coherent whole. In experiment 1, the sense of agency was manipulated by comparing active versus passive movements during the induction phase in a video rubber hand illusion (vRHI) setup. After induction, proprioceptive biases were measured both by perceptual judgments of hand position and by the end-point accuracy of subjects' active pointing movements to an external object with the affected hand. The results showed, first, that the vRHI is largely perceptual: passive perceptual localisation judgments were altered, but the end-point accuracy of active pointing responses with the affected hand to an external object was unaffected. Second, within the perceptual judgments, there was a novel congruence effect, such that perceptual biases were larger following passive induction of the vRHI than following active induction. There was a trend for the converse effect for pointing responses, with larger pointing bias following active induction. In experiment 2, we used the traditional RHI to investigate the coherence of body representation by synchronous stimulation of either matching or mismatching fingers on the rubber hand and the participant's own hand. Stimulation of matching fingers induced a local proprioceptive bias for only the stimulated finger, but did not affect the perceived shape of the hand as a whole. In contrast, stimulation of spatially mismatching fingers eliminated the RHI entirely. The present results show that (i) the sense of agency during illusion induction has specific effects, depending on whether we represent our body for perception or to guide action, and (ii) representations of specific body parts can be altered without affecting perception of the spatial configuration of the body as a whole.

    Virtual Hand Illusion Induced by Visuomotor Correlations

    Our body schema gives the subjective impression of being highly stable. However, a number of easily evoked illusions illustrate its remarkable malleability. In the rubber-hand illusion, illusory ownership of a rubber hand is evoked by synchronous visual and tactile stimulation on a visible rubber arm and on the hidden real arm. Ownership is concurrent with a proprioceptive illusion of displacement of the arm position towards the fake arm. We have previously shown that this illusion of ownership, plus the proprioceptive displacement, also occurs towards a virtual 3D projection of an arm when the appropriate synchronous visuotactile stimulation is provided. Our objective here was to explore whether these illusions (ownership and proprioceptive displacement) can be induced by synchronous visuomotor stimulation alone, in the absence of tactile stimulation.

    Bodily awareness and novel multisensory features

    According to the decomposition thesis, perceptual experiences resolve without remainder into their different modality-specific components. Contrary to this view, I argue that certain cases of multisensory integration give rise to experiences representing features of a novel type. Through the coordinated use of bodily awareness (understood here as encompassing both proprioception and kinaesthesis) and the exteroceptive sensory modalities, one becomes perceptually responsive to spatial features whose instances couldn’t be represented by any of the contributing modalities functioning in isolation. I develop an argument for this conclusion focusing on two cases: 3D shape perception in haptic touch and experiencing an object’s egocentric location in crossmodally accessible, environmental space.