Peripersonal space: a multisensory interface for body-object interactions
Research in the last four decades has brought a considerable advance in our understanding of how the brain synthesizes information arising from different sensory modalities. Indeed, many cortical and subcortical areas, beyond those traditionally considered to be ‘associative’, have been shown to be involved in multisensory interaction and integration (Ghazanfar and Schroeder 2006). Visuo-tactile interaction is of particular interest because of the prominent role played by vision in guiding our actions and anticipating their tactile consequences in everyday life. In this chapter, we focus on the functional role that visuo-tactile processing may play in driving two types of body-object interactions: avoidance and approach. We will first review some basic features of visuo-tactile interactions, as revealed by electrophysiological studies in monkeys. These will prove to be relevant for interpreting the subsequent evidence arising from human studies. A crucial point that will be stressed is that these visuo-tactile mechanisms have not only sensory but also motor-related activity, which qualifies them as multisensory-motor interfaces. Evidence will then be presented for the existence of functionally homologous processing in the human brain, both from neuropsychological research in brain-damaged patients and from studies in healthy participants. The final part of the chapter will focus on some recent studies in humans showing that the human motor system is provided with a multisensory interface that allows for continuous monitoring of the space near the body (i.e., peripersonal space). We further demonstrate that multisensory processing can be modulated on-line as a consequence of interacting with objects. This indicates that, far from being passive, the monitoring of peripersonal space is an active process subserving actions between our body and objects located in the space around us.
The spatial distance rule in the moving and classical rubber hand illusions
The rubber hand illusion (RHI) is a perceptual illusion in which participants perceive a model hand as part of their own body. Here, through the use of one questionnaire experiment and two proprioceptive drift experiments, we investigated the effect of distance (12, 27.5, and 43 cm) in the vertical plane on both the moving and classical RHI. In both versions of the illusion, we found an effect of distance on ownership of the rubber hand for both measures tested. Our results further suggested that the moving RHI might follow a narrower spatial rule. Finally, whereas ownership of the moving rubber hand was affected by distance, this was not the case for agency, which was present at all distances tested. In sum, the present results generalize the spatial distance rule in terms of ownership to the vertical plane of space and demonstrate that the moving RHI also obeys this rule.
The effects of visual control and distance in modulating peripersonal spatial representation
In the presence of vision, goal-directed motor acts can trigger spatial remapping, i.e., reference frame transformations that allow for better interaction with targets. However, it remains unclear how peripersonal space is encoded and remapped depending on the availability of visual feedback and on the position of the target within the individual’s reachable space, and which cerebral areas subserve such processes. Here, functional magnetic resonance imaging (fMRI) was used to examine neural activity while healthy young participants performed reach-to-grasp movements with and without visual feedback and at different distances of the target from the effector (near to the hand, about 15 cm from the starting position, vs. far from the hand, about 30 cm from the starting position). Brain response in the superior parietal lobule bilaterally, in the right dorsal premotor cortex, and in the anterior part of the right inferior parietal lobule was significantly greater during visually guided grasping of targets located at the far distance compared to grasping of targets located near to the hand. In the absence of visual feedback, the inferior parietal lobule exhibited greater activity during grasping of targets at the near compared to the far distance. The results suggest that, in the presence of visual feedback, a visuo-motor circuit integrates visuo-motor information when targets are located farther away. Conversely, in the absence of visual feedback, the encoding of space may demand multisensory remapping processes, even in the case of more proximal targets.
Number magnitude to finger mapping is disembodied and topological
It has been shown that humans associate fingers with numbers, because finger counting strategies interact with numerical judgements. At the same time, there is evidence for a relation between number magnitude and space, as small to large numbers seem to be represented from left to right. In the present study, we investigated whether the mapping from number magnitude to fingers is embodied (related to the order of fingers on the hand) or disembodied (spatial). We had healthy human volunteers name random numbers between 1 and 30 while simultaneously tapping a random finger. The hands were placed either directly next to each other, 30 cm apart, or crossed such that the left hand was on the right side of the body midline. The results show that naming a smaller number than the previous one was associated with tapping a finger to the left of the previously tapped finger. This shows that there is a spatial (disembodied) mapping between number magnitude and fingers. Furthermore, we show that this mapping is topological rather than metrically scaled.
Tool use imagery triggers tool incorporation in the body schema
Baccarini M, Martel M, Cardinali L, Sillan O, Farnè A, Roy AC. Tool use imagery triggers tool incorporation in the body schema. Frontiers in Psychology. 2014;5:492.
Tool-use has been shown to modify the way the brain represents the metrical characteristics of the effector controlling the tool. For example, the use of tools that elongate the physical length of the arm induces kinematic changes that selectively affect the transport component of subsequent free-hand movements. Although mental simulation of an action is known to involve, to a large extent, the same processes as those at play in overt motor execution, whether tool-use imagery can yield similar effects on the body representation remains unknown. Mentally simulated actions indeed elicit autonomic physiological responses and follow motor execution rules that are comparable to those associated with the corresponding overt performance. Therefore, here we investigated the effects of the mental simulation of actions performed with a tool on the body representation by studying subsequent free-hand movements. Subjects executed reach-to-grasp movements with their hand before and after an imagery task performed either with a tool elongating their arm length or, as a control, with their hand alone. Two main results were found: first, in agreement with previous studies, durations of imagined movements performed with the tool and the hand were similarly affected by task difficulty; second, the kinematics of free-hand movements were affected after tool-use imagery, but not after hand-use imagery, in a way similar to that previously documented after actual tool-use. These findings constitute the first evidence that tool-use imagery is sufficient to affect the representation of the user's arm.
Fronto-parietal brain responses to visuotactile congruence in an anatomical reference frame
Spatially and temporally congruent visuotactile stimulation of a fake hand together with one’s real hand may result in an illusory self-attribution of the fake hand. Although this illusion relies on a representation of the two touched body parts in external space, there is tentative evidence that, for the illusion to occur, the seen and felt touches also need to be congruent in an anatomical reference frame. We used functional magnetic resonance imaging and a somatotopical, virtual reality-based setup to isolate the neuronal basis of such a comparison. Participants’ index or little finger was synchronously touched with the index or little finger of a virtual hand, under congruent or incongruent orientations of the real and virtual hands. The left ventral premotor cortex responded significantly more strongly to visuotactile co-stimulation of the same versus different fingers of the virtual and real hand. Conversely, the left anterior intraparietal sulcus responded significantly more strongly to co-stimulation of different versus same fingers. Both responses were independent of hand orientation congruence and of spatial congruence of the visuotactile stimuli. Our results suggest that fronto-parietal areas previously associated with multisensory processing within peripersonal space and with tactile remapping evaluate the congruence of visuotactile stimulation on the body according to an anatomical reference frame.
Manipulable Objects Facilitate Cross-Modal Integration in Peripersonal Space
Previous studies have shown that tool use often modifies one's peripersonal space, i.e., the space directly surrounding our body. Given our profound experience with manipulable objects (e.g., a toothbrush, a comb, or a teapot), in the present study we hypothesized that the observation of pictures representing manipulable objects would result in a remapping of peripersonal space as well. Subjects were required to report the location of vibrotactile stimuli delivered to the right hand while ignoring visual distractors superimposed on pictures representing everyday objects. The pictures could represent objects of high manipulability (e.g., a cell phone), medium manipulability (e.g., a soap dispenser), or low manipulability (e.g., a computer screen). In the first experiment, when subjects attended to the action associated with the objects, a strong cross-modal congruency effect (CCE) was observed for pictures representing medium and high manipulability objects, reflected in faster reaction times when the vibrotactile stimulus and the visual distractor were in the same location, whereas no CCE was observed for low manipulability objects. This finding was replicated in a second experiment in which subjects attended to the visual properties of the objects. These findings suggest that the observation of manipulable objects facilitates cross-modal integration in peripersonal space.
The moving rubber hand illusion revisited: comparing movements and visuotactile stimulation to induce illusory ownership
The rubber hand illusion is a perceptual illusion in which a model hand is experienced as part of one’s own body. In the present study we directly compared the classical illusion, based on visuotactile stimulation, with a rubber hand illusion based on active and passive movements. We examined the question of which combinations of sensory and motor cues are the most potent in inducing the illusion, using subjective ratings and an objective measure (proprioceptive drift). In particular, we were interested in whether the combination of afferent and efferent signals in active movements results in the same illusion as in the purely passive modes. Our results show that the illusion is equally strong in all three cases. This demonstrates that different combinations of sensory input can lead to a very similar phenomenological experience and indicates that the illusion can be induced by any combination of multisensory information.