
    Physical Analysis of Handshaking Between Humans: Mutual Synchronisation and Social Context

    The handshake (HS) is a very common form of interpersonal interaction used in many situations, and it is an act that is both physical and social. This article aims to demonstrate that the paradigm of synchrony, which in psychology refers to the temporal coordination of individuals' movements, also applies to handshaking. For this purpose, the physical features of the human HS are investigated in two different social situations: greeting and consolation. The duration and frequency of the HS and the force of the grip were measured and compared using a prototype wearable system equipped with several sensors. The results show that an HS can be decomposed into four phases and that, after a short period of physical contact, a synchrony emerges between the two persons shaking hands. A statistical analysis conducted on 31 persons showed a significant difference in HS duration between the two contexts, whereas the frequency of motion and the time needed to synchronize were not affected by the context of the interaction.
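    As a reading aid, the synchrony and frequency measures mentioned above could be computed from the wearable's motion signals roughly as in the sketch below. This is a minimal illustration, assuming two accelerometer traces (one per participant) at a hypothetical 100 Hz sampling rate; the band limits, filter order, and function names are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, correlate

FS = 100.0  # assumed IMU sampling rate (Hz); not specified in the abstract

def bandpass(x, lo=1.0, hi=10.0, fs=FS):
    """Keep a band where voluntary shaking motion plausibly lies (assumed 1-10 Hz)."""
    b, a = butter(2, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def dominant_frequency(acc, fs=FS):
    """Estimate the oscillation frequency (Hz) of one participant's shake."""
    x = bandpass(acc - acc.mean())
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return freqs[np.argmax(spectrum)]

def synchrony_lag(acc_a, acc_b, fs=FS):
    """Lag (s) maximising the cross-correlation of the two filtered traces;
    a lag near zero after the initial contact is one signature of synchrony."""
    a = bandpass(acc_a - acc_a.mean())
    b = bandpass(acc_b - acc_b.mean())
    xc = correlate(a, b, mode="full")
    return (np.argmax(xc) - (len(b) - 1)) / fs
```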

    Brain-Inspired Coding of Robot Body Schema Through Visuo-Motor Integration of Touched Events

    Representing objects in space is difficult because sensorimotor events are anchored in different reference frames, which can be eye-, arm- or target-centered. In the brain, gain-field (GF) neurons in the parietal cortex are involved in computing the spatial transformations needed to align tactile, visual and proprioceptive signals. In reaching tasks, these GF neurons exploit a mechanism based on multiplicative interaction to bind touch events on the hand with visual and proprioceptive information. By doing so, they can infer new reference frames to represent dynamically the location of the body parts in visual space (i.e., the body schema) and of nearby targets (i.e., the peripersonal space). Along these lines, we propose a neural model based on GF neurons that integrates tactile events with arm postures and visual locations to construct hand- and target-centered receptive fields in visual space. In robotic experiments using an artificial skin, we show how our neural architecture reproduces the behavior of parietal neurons (1) by dynamically encoding the body schema of our robotic arm without any visual tags on it and (2) by estimating the relative orientation and distance of targets to it. We demonstrate how tactile information facilitates the integration of visual and proprioceptive signals in order to construct the body space.
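    To make the multiplicative (gain-field) binding concrete, here is a minimal 1-D numerical sketch: a visual location and an arm posture, each encoded as a Gaussian population, are combined by an outer product, and a simple vote-based readout recovers a hand-centered coordinate. All names, population sizes and tuning widths are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

def population_code(value, centers, sigma=0.1):
    """Gaussian population code: activity of units tuned to `centers`."""
    return np.exp(-((value - centers) ** 2) / (2 * sigma ** 2))

centers = np.linspace(0.0, 1.0, 50)        # hypothetical 1-D workspace
visual = population_code(0.7, centers)     # eye-centered location of the touched point
proprio = population_code(0.3, centers)    # arm posture (proprioception)

# Gain-field stage: a multiplicative (outer-product) interaction binds
# the two modalities, in the spirit of the GF neurons discussed above.
gain_field = np.outer(visual, proprio)     # shape (50, 50)

# Each gain-field unit (i, j) votes for the hand-centered coordinate
# centers[i] - centers[j]; the population vote recovers it (~0.4 here).
diffs = centers[:, None] - centers[None, :]
hand_centered = (gain_field * diffs).sum() / gain_field.sum()
print(f"hand-centered estimate: {hand_centered:.2f}")  # approx. 0.40
```

    A learned readout over the gain field (e.g., weights trained from touch events, as the abstract suggests) would play the role of the fixed subtraction used here.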