Peripersonal space: a multisensory interface for body-object interactions
Research in the last four decades has brought a considerable advance in our understanding of how the brain synthesizes information arising from different sensory modalities. Indeed, many cortical and subcortical areas, beyond those traditionally considered to be ‘associative,’ have been shown to be involved in multisensory interaction and integration (Ghazanfar and Schroeder 2006). Visuo-tactile interaction is of particular interest, because of the prominent role played by vision in guiding our actions and anticipating their tactile consequences in everyday life. In this chapter, we focus on the functional role that visuo-tactile processing may play in driving two types of body-object interactions: avoidance and approach. We will first review some basic features of visuo-tactile interactions, as revealed by electrophysiological studies in monkeys. These will prove to be relevant for interpreting the subsequent evidence arising from human studies. A crucial point that will be stressed is that these visuo-tactile mechanisms have not only sensory, but also motor-related activity that qualifies them as multisensory-motor interfaces. Evidence will then be presented for the existence of functionally homologous processing in the human brain, both from neuropsychological research in brain-damaged patients and in healthy participants. The final part of the chapter will focus on some recent studies in humans showing that the human motor system is provided with a multisensory interface that allows for continuous monitoring of the space near the body (i.e., peripersonal space). We further demonstrate that multisensory processing can be modulated on-line as a consequence of interacting with objects. This indicates that, far from being passive, the monitoring of peripersonal space is an active process subserving actions between our body and objects located in the space around us.
Peripersonal space: a multisensory interface for interactions between the body and objects
Our ability to interact with the environment requires the integration of multisensory information for the construction of spatial representations. Peripersonal space (i.e., the sector of space closely surrounding one’s body) and the integrative processes between visual and tactile inputs originating from this sector of space have been at the centre of investigations in recent years. Neurophysiological studies provided evidence for the presence in the monkey brain of bimodal neurons, which are activated by tactile as well as visual information delivered near a specific body part (e.g., the hand). Neuropsychological studies on right brain-damaged patients who present extinction, together with functional neuroimaging findings, suggest the presence of similar bimodal systems in the human brain. Studies on the effects of tool-use on visual-tactile interaction revealed similar dynamic properties of the peripersonal space in monkeys and humans. The functional role of the multisensory coding of peripersonal space is, in our hypothesis, that of providing the brain with a sensori-motor interface for body-object interactions. Thus, not only could it be involved in driving involuntary defensive movements in response to objects approaching the body, but it could also be dynamically maintained and updated as a function of voluntary manual actions performed towards objects in the reaching space. We tested the hypothesis of an involvement of peripersonal space in executing both voluntary and defensive actions. To these aims, we combined a well-known cross-modal congruency effect between visual and tactile information with a kinematic approach to demonstrate that voluntary grasping actions induce an on-line re-weighting of multisensory interactions in the peripersonal space. We additionally show that this modulation is hand-centred.
We also used a motor evoked potentials approach to investigate which coordinate system is used to code the peripersonal space during motor preparation when real objects rapidly approach the body. Our findings provide direct evidence for automatic hand-centred coding of visual space and suggest that peripersonal space may also serve to represent rapidly approaching and potentially noxious objects, thus enabling the rapid selection of appropriate motor responses. These results clearly show that peripersonal space is a multisensory-motor interface that might have been selected through evolution for optimising the interactions between the body and objects in the external world.

Our ability to interact with objects in the world requires the integration of information from different sensory channels, in particular visual and tactile information, within the construction of a representation of space. Peripersonal space and visuo-tactile integration have been the focus of substantial recent research. Neurophysiological studies in non-human primates have shown the existence of bimodal neurons that are activated both by tactile stimulation and by visual stimulation when the latter is presented near a body part (for example, the hand). It has been proposed that these bimodal neurons constitute the neural substrate of the representation of peripersonal space. Neuropsychological studies of patients presenting cross-modal extinction following a right parietal lesion have suggested the existence of the same type of representation of peripersonal space in humans. Data from functional neuroimaging studies have subsequently supported this idea. More recently, through the use of tools, data acquired in human and non-human primates have revealed the dynamic properties of this spatial representation.
According to our hypothesis, the representation of peripersonal space is an interface governing the interactions of the body with objects in the external world. We therefore assessed the role and state of peripersonal space during the execution of voluntary movements towards objects (such as a simple grasp) and during involuntary avoidance movements. In a first series of experiments, we studied the spatial coordinates used to code objects that suddenly approach the body, by measuring motor evoked potentials. This study revealed that peripersonal space plays a role in representing objects approaching the body and in selecting the appropriate movements in response. In a second series of experiments, we used a visuo-tactile interference paradigm coupled with kinematic recording of grasping movements in order to examine the representation of peripersonal space during the execution of voluntary actions. This novel approach allowed us to show that voluntary action induces an on-line recoding of visuo-tactile interaction in grasping space. This recoding of the action is carried out in coordinates centred on the body part that executes the action. In conclusion, our experimental studies demonstrate that peripersonal space is a multisensory interface that has been selected through evolution not only for the management of avoidance and defensive movements but also for the execution of voluntary actions.
In order for us to successfully interact with our environment, it is crucial that our brain be able to represent spatial information with respect to our effectors.
Abstract: The existence of hand-centred visual processing has long been established in the macaque premotor cortex. These hand-centred mechanisms have been thought to play a general role in the sensory guidance of movements towards objects or, more recently, in the sensory guidance of object-avoidance movements. We suggest that these hand-centred mechanisms play a specific and prominent role in the rapid selection and control of manual actions following sudden changes in the properties of objects relevant for hand-object interactions. We discuss recent anatomical and physiological evidence from human and non-human primates, which indicates the existence of rapid processing of visual information for hand-object interactions. This new evidence indicates how several stages of the hierarchical visual processing system may be bypassed, feeding the motor system with hand-related visual inputs within just 70 ms of a sudden event. This time-window is early enough, and this processing rapid enough, to allow the generation and control of rapid hand-centred avoidance and acquisitive actions, for aversive and desired objects, respectively.
Associative learning in peripersonal space: fear responses are acquired in hand-centered coordinates
Associative fear learning takes place in hand-centered coordinates. Using a Pavlovian fear-learning paradigm, we show that the anticipatory skin conductance response indicating the association between a negative value and an initially neutral stimulus is acquired and then remapped in space when the stimulated body part moves to a different position. These results demonstrate the relationship between the representation of peripersonal space and the encoding of threatening stimuli. Hypotheses concerning the underlying neural network are discussed.
Aim and Plausibility of Action Chains Remap Peripersonal Space
Successful interaction with objects in the peripersonal space requires that the information relative to current and upcoming positions of our body is continuously monitored and updated with respect to the location of target objects. Voluntary actions, for example, are known to induce an anticipatory remapping of the peri-hand space (PHS, i.e., the space near the acting hand) during the very early stages of the action chain: planning and initiating an object grasp increase the interference exerted by visual stimuli coming from the object on touches delivered to the grasping hand, thus allowing for hand-object position monitoring and guidance. Voluntarily grasping an object, though, is rarely performed in isolation. Grasping a candy, for example, is most typically followed by concatenated secondary action steps (bringing the candy to the mouth and swallowing it) that represent the agent's ultimate intention (to eat the candy). However, whether and when complex action chains remap the PHS remains unknown, just as whether remapping is conditional on goal achievability (e.g., candy-mouth fit). Here we asked these questions by assessing changes in visuo-tactile interference on the acting hand while participants had to grasp an object serving as a support for an elongated candy and bring it toward their mouth. Depending on its orientation, the candy could potentially enter the participants' mouth (plausible goal), or not (implausible goal). We observed increased visuo-tactile interference at relatively late stages of the action chain, after the object had been grasped, and only when the action goal was plausible. These findings suggest that multisensory interactions during action execution depend upon the final aim and plausibility of complex goal-directed actions, and extend our knowledge about the role of peripersonal space in guiding goal-directed voluntary actions.
Neural resources shared by language and tool-use: a basis for tool-use benefits over syntactic abilities