
    If I Were You: Perceptual Illusion of Body Swapping

    The concept of an individual swapping his or her body with that of another person has captured the imagination of writers and artists for decades. Although this topic has not been the subject of investigation in science, it exemplifies the fundamental question of why we have an ongoing experience of being located inside our bodies. Here we report a perceptual illusion of body-swapping that directly addresses this issue. Manipulation of the visual perspective, in combination with the receipt of correlated multisensory information from the body, was sufficient to trigger the illusion that another person's body or an artificial body was one's own. This effect was so strong that people could experience being in another person's body when facing their own body and shaking hands with it. Our results are of fundamental importance because they identify the perceptual processes that produce the feeling of ownership of one's body.

    The Role of Motor Learning in Spatial Adaptation near a Tool

    Some visual-tactile (bimodal) cells have visual receptive fields (vRFs) that overlap and extend moderately beyond the skin of the hand. Neurophysiological evidence suggests, however, that a vRF will grow to encompass a hand-held tool following active tool use but not after passive holding. Why does active tool use, and not passive holding, lead to spatial adaptation near a tool? We asked whether spatial adaptation could be the result of motor or visual experience with the tool, and we distinguished between these alternatives by isolating motor from visual experience with the tool. Participants learned to use a novel, weighted tool. The active training group received both motor and visual experience with the tool; the passive training group received visual experience with the tool but no motor experience; and a no-training control group received neither visual nor motor experience with the tool. After training, we used a cueing paradigm to measure how quickly participants detected targets, varying whether the tool was placed near to or far from the target display. Only the active training group detected targets more quickly when the tool was placed near to, rather than far from, the target display. This effect of tool location was not present for either the passive-training or control groups. These results suggest that motor learning influences how visual space around the tool is represented.

    When Right Feels Left: Referral of Touch and Ownership between the Hands

    Feeling touch on a body part is paradigmatically considered to require stimulation of tactile afferents from the body part in question, at least in healthy non-synaesthetic individuals. In contrast to this view, we report a perceptual illusion where people experience “phantom touches” on a right rubber hand when they see it brushed simultaneously with brushes applied to their left hand. Such illusory duplication and transfer of touch from the left to the right hand was only elicited when a homologous (i.e., left and right) pair of hands was brushed in synchrony for an extended period of time. This stimulation caused the majority of our participants to perceive the right rubber hand as their own and to sense two distinct touches – one located on the right rubber hand and the other on their left (stimulated) hand. This effect was supported by quantitative subjective reports in the form of questionnaires, behavioral data from a task in which participants pointed to the felt location of their right hand, and physiological evidence obtained from skin conductance responses when threatening the model hand. Our findings suggest that visual information augments subthreshold somatosensory responses in the ipsilateral hemisphere, thus producing a tactile experience from the non-stimulated body part. This finding is important because it reveals a new bilateral multisensory mechanism for tactile perception and limb ownership.

    Movement of environmental threats modifies the relevance of the defensive eye-blink in a spatially-tuned manner.

    Subcortical reflexive motor responses are under continuous cortical control to produce the most effective behaviour. For example, the excitability of brainstem circuitry subserving the defensive hand-blink reflex (HBR), a response elicited by intense somatosensory stimuli to the wrist, depends on a number of properties of the eliciting stimulus. These include face-hand proximity, which has allowed the description of an HBR response field around the face (commonly referred to as a defensive peripersonal space, DPPS), as well as stimulus movement and probability of stimulus occurrence. However, the effect of stimulus-independent movements of objects in the environment has not been explored. Here we used virtual reality to test whether and how the HBR-derived DPPS is affected by the presence and movement of threatening objects in the environment. In two experiments conducted on 40 healthy volunteers, we observed that threatening arrows flying towards the participant result in DPPS expansion, an effect directionally-tuned towards the source of the arrows. These results indicate that the excitability of brainstem circuitry subserving the HBR is continuously adjusted, taking into account the movement of environmental objects. Such adjustments fit within a framework where the relevance of defensive actions is continually evaluated to maximise their survival value.

    The Proprioceptive Map of the Arm Is Systematic and Stable, but Idiosyncratic

    Visual and somatosensory signals participate together in providing an estimate of the hand's spatial location. While the ability of subjects to identify the spatial location of their hand based on visual and proprioceptive signals has previously been characterized, relatively few studies have examined in detail the spatial structure of the proprioceptive map of the arm. Here, we reconstructed and analyzed the spatial structure of the estimation errors that resulted when subjects reported the location of their unseen hand across a 2D horizontal workspace. Hand position estimation was mapped under four conditions: with and without tactile feedback, and with the right and left hands. In the task, we moved each subject's hand to one of 100 targets in the workspace while their eyes were closed. Then, we either a) applied tactile stimulation to the fingertip by allowing the index finger to touch the target, or b) as a control, hovered the fingertip 2 cm above the target. After returning the hand to a neutral position, subjects opened their eyes to verbally report where their fingertip had been. We measured and analyzed both the direction and magnitude of the resulting estimation errors. Tactile feedback reduced the magnitude of these estimation errors, but did not change their overall structure. In addition, the spatial structure of these errors was idiosyncratic: each subject had a unique pattern of errors that was stable between hands and over time. Finally, we found that at the population level the magnitude of the estimation errors had a characteristic distribution over the workspace: errors were smallest closer to the body. The stability of estimation errors across conditions and time suggests the brain constructs a proprioceptive map that is reliable, even if it is not necessarily accurate. The idiosyncrasy across subjects emphasizes that each individual constructs a map that is unique to their own experiences.
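    The per-target error measures described above reduce to simple vector arithmetic; a minimal sketch, assuming a 2D workspace coordinate frame (the function name and coordinate convention are illustrative, not taken from the study):

```python
import math

def estimation_error(actual_xy, reported_xy):
    """Return (magnitude, direction in degrees) of the estimation error
    vector pointing from the fingertip's actual position to the reported one."""
    dx = reported_xy[0] - actual_xy[0]
    dy = reported_xy[1] - actual_xy[1]
    return math.hypot(dx, dy), math.degrees(math.atan2(dy, dx))
```

    Computing both components separately is what lets the structure of the map (direction field) be analysed independently of its overall accuracy (magnitude).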

    The Illusion of Owning a Third Arm

    Could it be possible that, in the not-so-distant future, we will be able to reshape the human body so as to have extra limbs? A third arm helping us out with the weekly shopping in the local grocery store, or an extra artificial limb assisting a paralysed person? Here we report a perceptual illusion in which a rubber right hand, placed beside the real hand in full view of the participant, is perceived as a supernumerary limb belonging to the participant's own body. This effect was supported by questionnaire data in conjunction with physiological evidence obtained from skin conductance responses when physically threatening either the rubber hand or the real one. In four well-controlled experiments, we demonstrate the minimal required conditions for the elicitation of this “supernumerary hand illusion”. In the fifth and final experiment, we show that the illusion reported here is qualitatively different from the traditional rubber hand illusion as it is characterised by less disownership of the real hand and a stronger feeling of having two right hands. These results suggest that the artificial hand ‘borrows’ some of the multisensory processes that represent the real hand, leading to duplication of touch and ownership of two right arms. This work represents a major advance because it challenges the traditional view of the gross morphology of the human body as a fundamental constraint on what we can come to experience as our physical self, by showing that the body representation can easily be updated to incorporate an additional limb.

    Peripersonal Space and Margin of Safety around the Body: Learning Visuo-Tactile Associations in a Humanoid Robot with Artificial Skin

    This paper investigates a biologically motivated model of peripersonal space through its implementation on a humanoid robot. Guided by the present understanding of the neurophysiology of the fronto-parietal system, we developed a computational model inspired by the receptive fields of polymodal neurons identified, for example, in brain areas F4 and VIP. The experiments on the iCub humanoid robot show that the peripersonal space representation i) can be learned efficiently and in real-time via a simple interaction with the robot, ii) can lead to the generation of behaviors like avoidance and reaching, and iii) can contribute to the understanding of the biological principle of motor equivalence. More specifically, with respect to i) the present model contributes to hypothesizing a learning mechanism for peripersonal space. In relation to point ii) we show how a relatively simple controller can exploit the learned receptive fields to generate either avoidance or reaching of an incoming stimulus, and for iii) we show how the robot can select arbitrary body parts as the controlled end-point of an avoidance or reaching movement.
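    The visuo-tactile association the robot learns can be sketched as simple co-occurrence counting: for each skin patch, estimate the probability that a visual stimulus at a given distance is followed by contact. The toy class below is an illustration under assumed bin sizes and distances, not the iCub implementation:

```python
import numpy as np

class PeripersonalRF:
    """Toy visuo-tactile receptive field for one skin patch.

    Learns P(contact | stimulus distance) by counting how often an
    approaching visual stimulus at a given distance ends in touch.
    Bin count and distance range are illustrative assumptions.
    """

    def __init__(self, max_dist_cm: float = 45.0, n_bins: int = 9):
        self.edges = np.linspace(0.0, max_dist_cm, n_bins + 1)
        self.seen = np.zeros(n_bins)     # stimulus occurrences per distance bin
        self.touched = np.zeros(n_bins)  # occurrences that ended in contact

    def _bin(self, dist_cm: float) -> int:
        i = int(np.searchsorted(self.edges, dist_cm) - 1)
        return min(max(i, 0), len(self.seen) - 1)

    def update(self, dist_cm: float, contact: bool) -> None:
        i = self._bin(dist_cm)
        self.seen[i] += 1
        if contact:
            self.touched[i] += 1

    def activation(self, dist_cm: float) -> float:
        """Learned contact probability for a stimulus at this distance."""
        i = self._bin(dist_cm)
        return float(self.touched[i] / self.seen[i]) if self.seen[i] else 0.0
```

    A simple controller in the spirit of point ii) could then treat activation above a threshold as a trigger to move the associated body part away from (avoidance) or toward (reaching) the stimulus.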

    The weight of representing the body: addressing the potentially indefinite number of body representations in healthy individuals

    There is little consensus about the characteristics and number of body representations in the brain. In the present paper, we examine the main problems that are encountered when trying to dissociate multiple body representations in healthy individuals with the use of bodily illusions. Traditionally, task-dependent bodily illusion effects have been taken as evidence for dissociable underlying body representations. Although this reasoning holds well when the dissociation is made between different types of tasks that are closely linked to different body representations, it becomes problematic when found within the same response task (i.e., within the same type of representation). Hence, this experimental approach to investigating body representations runs the risk of identifying as many different body representations as there are significantly different experimental outputs. Here, we discuss and illustrate a different approach to this pluralism by shifting the focus towards investigating task-dependency of illusion outputs in combination with the type of multisensory input. Finally, we present two examples of behavioural bodily illusion experiments and apply Bayesian model selection to illustrate how this alternative approach to dissociating and classifying multiple body representations can be applied.
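    Bayesian model selection of the kind used here is often approximated with the Bayesian Information Criterion, which trades goodness of fit against model complexity. A minimal sketch (the likelihoods and parameter counts below are placeholders, not values from the paper):

```python
import math

def bic(log_likelihood: float, n_params: int, n_obs: int) -> float:
    """Bayesian Information Criterion; lower values favour the model."""
    return n_params * math.log(n_obs) - 2.0 * log_likelihood

def bic_weights(bics):
    """Approximate posterior model probabilities from a list of BIC values."""
    best = min(bics)
    raw = [math.exp(-0.5 * (b - best)) for b in bics]
    total = sum(raw)
    return [r / total for r in raw]

# Hypothetical comparison: a one-representation model vs. a richer one
b_single = bic(log_likelihood=-100.0, n_params=2, n_obs=50)
b_multi = bic(log_likelihood=-99.5, n_params=6, n_obs=50)
```

    In this framing, a model positing fewer body representations is only rejected if the extra representations buy enough likelihood to pay their complexity penalty, which is exactly the guard against the "one representation per significant effect" pluralism discussed above.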

    Fake hands in action: embodiment and control of supernumerary limbs

    Demonstrations that the brain can incorporate a fake limb into our bodily representations when stroked in synchrony with our unseen real hand (the rubber hand illusion, RHI) are now commonplace. Such demonstrations highlight the dynamic flexibility of the perceptual body image, but evidence for comparable RHI-sensitive changes in the body schema used for action is less common. Recent evidence from the RHI supports a distinction between bodily representations for perception (body image) and for action (body schema) (Kammers et al. in Neuropsychologia 44:2430–2436, 2006). The current study challenges and extends these findings by demonstrating that active synchronous stroking with a brush not only elicits perceptual embodiment of a fake limb (body image) but also affects subsequent reaching error (body schema). Participants were presented with two moving fake left hands. When only one was synchronous during active touch, ownership was claimed for the synchronous hand only and the accuracy of reaching was consistent with control of the synchronous hand. When both fake hands were synchronous, ownership was claimed over both, but only one was controlled. Thus, it would appear that fake limbs can be incorporated into the body schema as well as the body image, but while multiple limbs can be incorporated into the body image, the body schema can accommodate only one.

    Spatially uninformative sounds increase sensitivity for visual motion change

    It has recently been shown that spatially uninformative sounds can cause a visual stimulus to pop out from an array of similar distractor stimuli when that sound is presented in temporal proximity to a feature change in the visual stimulus. Until now, this effect has predominantly been demonstrated by using stationary stimuli. Here, we extended these results by showing that auditory stimuli can also improve the sensitivity of visual motion change detection. To accomplish this, we presented moving visual stimuli (small dots) on a computer screen. At a random moment during a trial, one of these stimuli could abruptly move in an orthogonal direction. Participants’ task was to indicate whether such an abrupt motion change occurred or not by making a corresponding button press. If a sound (a short 1,000 Hz tone pip) co-occurred with the abrupt motion change, participants were able to detect this motion change more frequently than when the sound was not present. Using measures derived from signal detection theory, we were able to demonstrate that the effect on accuracy was due to increased sensitivity rather than to changes in response bias.
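    The signal-detection measures that separate sensitivity from response bias follow directly from z-transformed hit and false-alarm rates. A minimal sketch of the standard computations (the rates below are hypothetical, not data from the study):

```python
from statistics import NormalDist

def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Sensitivity index d' = z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

def criterion(hit_rate: float, fa_rate: float) -> float:
    """Response bias c; negative values indicate a liberal ('change') bias."""
    z = NormalDist().inv_cdf
    return -0.5 * (z(hit_rate) + z(fa_rate))

# Hypothetical comparison of tone-present vs. tone-absent trials
dp_tone = d_prime(0.80, 0.20)
dp_no_tone = d_prime(0.65, 0.20)
```

    A sound that raises d′ while leaving c unchanged is evidence for a genuine sensitivity gain rather than participants simply saying "change" more often, which is the logic of the abstract's final claim.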