51 research outputs found

    Fearful faces modulate spatial processing in peripersonal space: An ERP study

    Peripersonal space (PPS) represents the region of space surrounding the body. A pivotal function of PPS is to coordinate defensive responses to threat. We have previously shown that a centrally-presented, looming fearful face, signalling a potential threat in one's surroundings, modulates spatial processing by promoting a redirection of sensory resources away from the face towards the periphery, where the threat may be expected – but only when the face is presented in near, rather than far, space. Here, we use electrophysiological measures to investigate the neural mechanism underlying this effect. Participants made simple responses to tactile stimuli delivered on the cheeks while watching task-irrelevant neutral or fearful avatar faces looming towards them in either near or far space. Simultaneously with the tactile stimulation, a ball with a checkerboard pattern (the probe) appeared to the left or right of the avatar face. Crucially, this probe could be either close to the avatar face, and thus more central in the participant's vision, or further away from the avatar face, and thus more peripheral in the participant's vision. Electroencephalography was recorded continuously. Behavioural results confirmed that in near space only, and for fearful relative to neutral faces, tactile processing was facilitated by the peripheral compared to the central probe. This behavioural effect was accompanied by a reduction of the N1 mean amplitude elicited by the peripheral probe for fearful relative to neutral faces. Moreover, the faster participants responded to tactile stimuli paired with the peripheral probe, relative to the central one, the smaller their N1. Together, these results suggest that fearful faces intruding into PPS may increase the expectation of a visual event occurring in the periphery. This fear-induced effect would enhance the defensive function of PPS when it is most needed, i.e., when the source of threat is nearby but its location remains unknown.

    Toward Robots with Peripersonal Space Representation for Adaptive Behaviors

    The abilities to adapt and act autonomously in an unstructured, human-oriented environment are vital for the next generation of robots, which aim to cooperate safely with humans. While this adaptability is natural and feasible for humans, it remains complex and challenging for robots. Observations and findings from psychology and neuroscience regarding the development of the human sensorimotor system can inform novel approaches to adaptive robotics. Among these is the formation of the representation of the space closely surrounding the body, the Peripersonal Space (PPS), from multisensory sources such as vision, hearing, touch and proprioception, which helps facilitate human activities within their surroundings. Taking inspiration from the virtual safety margin formed by the PPS representation in humans, this thesis first constructs an equivalent model of the safety zone for each body part of the iCub humanoid robot. This PPS layer serves as a distributed collision predictor, which translates visually detected objects approaching the robot's body parts (e.g., arm, hand) into probabilities of a collision between those objects and body parts. This leads to adaptive avoidance behaviors in the robot via an optimization-based reactive controller. Notably, this visual reactive control pipeline can also seamlessly incorporate tactile input to guarantee safety in both the pre- and post-collision phases of physical Human-Robot Interaction (pHRI). Concurrently, the controller can also take into account multiple targets (of manipulation reaching tasks) generated by a multiple Cartesian point planner. All components, namely the PPS, the multi-target motion planner (for manipulation reaching tasks), the reaching-with-avoidance controller and the human-centred visual perception, are combined harmoniously into a hybrid control framework designed to provide safety for the robot's interactions in a cluttered environment shared with human partners. Later, motivated by the development of manipulation skills in infants, in which multisensory integration is thought to play an important role, a learning framework is proposed that allows a robot to learn the processes of forming sensory representations, namely visuomotor and visuotactile, from its own motor activities in the environment. Both multisensory integration models are constructed with Deep Neural Networks (DNNs) in such a way that their outputs are represented in motor space to facilitate the robot's subsequent actions.
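
    As an illustration only (not the thesis's actual iCub implementation, which learns the PPS representation from multisensory data and couples it to an optimization-based reactive controller), the sketch below shows one minimal way such a safety-margin layer could be approximated: a heuristic that turns an approaching object's distance and closing speed into a collision probability, and a simple repulsive velocity command scaled by that probability. All function names, parameters and the time-to-contact heuristic are assumptions made for this sketch.

        import numpy as np

        def collision_probability(obj_pos, obj_vel, part_pos, margin=0.3, horizon=1.0):
            """Heuristic collision probability for one monitored body part.

            obj_pos, obj_vel : 3D position (m) and velocity (m/s) of the detected object
            part_pos         : 3D position of the body part (e.g., hand)
            margin           : radius of the protective safety zone around the part
            horizon          : look-ahead time window in seconds
            """
            rel = part_pos - obj_pos
            dist = np.linalg.norm(rel)
            closing_speed = np.dot(obj_vel, rel) / (dist + 1e-9)  # > 0 if approaching
            if closing_speed <= 0:
                return 0.0  # receding or tangential object: no predicted collision
            time_to_contact = max(dist - margin, 0.0) / closing_speed
            # probability rises as predicted contact falls inside the look-ahead horizon
            return float(np.clip(1.0 - time_to_contact / horizon, 0.0, 1.0))

        def avoidance_velocity(obj_pos, part_pos, prob, gain=0.5):
            """Repulsive velocity command, scaled by the collision probability."""
            away = part_pos - obj_pos
            away /= (np.linalg.norm(away) + 1e-9)
            return gain * prob * away

        # Example: an object 0.4 m from the hand, closing at 0.5 m/s
        obj_pos = np.array([0.4, 0.0, 0.0])
        obj_vel = np.array([-0.5, 0.0, 0.0])
        hand = np.zeros(3)
        p = collision_probability(obj_pos, obj_vel, hand)      # ~0.8
        v_cmd = avoidance_velocity(obj_pos, hand, p)            # pushes hand away

    In this toy example, the nearby closing object yields a collision probability of about 0.8 and a velocity command that moves the hand directly away from it; a real reactive controller would instead fold such probabilities into its optimization constraints alongside the reaching targets.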

    Approaching human-like spatial awareness in social robotics: an investigation of spatial interaction strategies with a receptionist robot

    Holthaus P. Approaching human-like spatial awareness in social robotics: an investigation of spatial interaction strategies with a receptionist robot. Bielefeld: Universität Bielefeld; 2014. This doctoral thesis investigates the influence of social signals in the spatial domain that aim to raise a robot's awareness of its human interlocutor. A concept of spatial awareness thereby extends the robot's possibilities for expressing its knowledge about the situation as well as its own capabilities. As a result, especially untrained users can build up more appropriate expectations about the current situation, which is expected to minimize misunderstandings and thereby enhance user experience. Drawing on research into communication among humans, insights are transferred in order to develop a robot that can act in a socially intelligent way with regard to the human-like treatment of spatial configurations and signals. In a study-driven approach, an integrated concept of spatial awareness is therefore proposed. An important aspect of the concept, founded in its spatial extent, is its aspiration to cover a complete encounter between human and robot, with the goal of improving user experience from first sight until the end of reciprocal awareness. The concept describes how spatial configurations and signals can be perceived and interpreted by a social robot. Furthermore, it also presents signals and behavioural properties for such a robot that aim at influencing those configurations and enhancing the robot's verbosity. To validate the concept in realistic settings, it is applied to an interactive scenario in the form of a receptionist robot. In the context of this setup, a comprehensive user study is conducted which verifies that the implementation of spatial awareness is beneficial for interactions with humans who are naive to the subject. Furthermore, the importance of addressing an entire encounter in human-robot interaction is confirmed, as well as a strong interdependency among the robot's social signals.

    What the study of spinal cord injured patients can tell us about the significance of the body in cognition

    Although in the last three decades philosophers, psychologists and neuroscientists have produced numerous studies on human cognition, the debate concerning its nature is still heated, and current views on the subject are somewhat antithetical. On the one hand, there are those who adhere to a 'disembodied' view, which suggests that cognition is based entirely on symbolic processes. On the other hand, a family of theories referred to as Embodied Cognition Theories (ECT) postulates that creating and maintaining cognition depends, to varying degrees, on somatosensory and motor representations. Spinal cord injury (SCI) induces a massive body-brain disconnection, with the loss of sensory and motor bodily functions below the lesion level but without directly affecting the brain. Thus, SCI may represent an optimal model for testing the role of the body in cognition. In this review, we describe post-lesional cognitive modifications in relation to body, space and action representations and various instances of ECT. We discuss the interaction between body-grounded and symbolic processes in adulthood, along with the relevant modifications after body-brain disconnection.

    The spatial logic of fear

    Peripersonal space (PPS) is the multimodal sensorimotor representation of the space surrounding the body. This thesis investigates how PPS is modulated by emotional faces, which represent particularly salient cues in our environment. Study 1 shows that looming neutral, joyful, and angry faces gradually facilitate motor responses to tactile stimuli, whereas looming fearful faces show no such effect. Also, at the closest position in PPS, multisensory response facilitation is lower for fearful than for neutral faces. Study 2a addresses the hypothesis that fearful faces promote a redirection of attention towards peripheral space. In line with this, it shows that motor responses to tactile stimuli are facilitated when a looming fearful face is accompanied by a visual element presented in the periphery, rather than close to the face. This effect is found in near space but not in far space, suggesting that a nearby looming fearful face elicits a redirection of attention to peripheral space. No such effect is found for neutral, joyful, or angry faces (Study 2b). Study 3 shows that the redirection of attention in PPS by fearful faces is accompanied by a modulation of the electrophysiological signal associated with face processing (the N170). Finally, Study 4 shows that the skin conductance response to looming fearful, but not joyful or neutral, faces is modulated by the distance of the face from the participant's body, being maximal in near space. Together, these studies show that, unlike other emotions, fearful faces shift attention away from the face towards other portions of space where the threat may be located. It is argued that this fear-evoked redirection of attention may enhance the defensive function of PPS when it is most needed, i.e., when the source of threat is nearby but its location remains unknown.

    Human Machine Interaction

    In this book, the reader will find a set of papers divided into two sections. The first section presents different proposals focused on the human-machine interaction development process. The second section is devoted to different aspects of interaction, with a special emphasis on physical interaction.

    Measuring and Modulating Mimicry: Insights from Virtual Reality and Autism

    Mimicry involves the unconscious imitation of other people's behaviour. The social top-down response modulation (STORM) model suggests that mimicry is a socially strategic behaviour which is modulated according to the social context; for example, we mimic more when someone is looking at us or when we want to affiliate with them. There has been a long debate over whether mimicry is different in autism, a condition characterised by differences in social interaction. STORM predicts that autistic people can and do mimic, but do not change their mimicry behaviour according to the social context. Using a range of mimicry measures, this thesis aimed to test STORM's predictions. The first study employed a traditional reaction-time measure of mimicry and demonstrated that direct gaze socially modulated mimicry responses in non-autistic adults but not in autistic participants, in line with STORM's predictions. In the next two studies, I found that non-autistic participants mimicked the movement trajectory of both virtual characters and human actors during an imitation game. Autistic participants also mimicked, but to a lesser extent. However, this type of mimicry was resistant to the effects of social cues, such as eye gaze and animacy, contrary to the predictions of STORM. In a fourth study, I manipulated the rationality of an actor's movement trajectory and found that participants mimicked the trajectory even when it was rated as irrational. In a fifth study, I showed that people's tendency to mimic the movements of others could change choices that participants had previously made in private, and that this tendency was modulated by the kinematics of the character's pointing movements. This thesis provides mixed support for STORM's predictions, and I discuss the reasons why this might be. I also make suggestions for how to better measure and modulate mimicry.

    Haptic and Audio-visual Stimuli: Enhancing Experiences and Interaction


    Haptics: Science, Technology, Applications

    This open access book constitutes the proceedings of the 13th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2022, held in Hamburg, Germany, in May 2022. The 36 regular papers included in this book were carefully reviewed and selected from 129 submissions. They are organized in topical sections as follows: haptic science; haptic technology; and haptic applications.