407 research outputs found

    The perception of emotion in artificial agents

    Given recent technological developments in robotics, artificial intelligence and virtual reality, it is perhaps unsurprising that the arrival of emotionally expressive and reactive artificial agents is imminent. However, if such agents are to become integrated into our social milieu, it is imperative to establish an understanding of whether and how humans perceive emotion in artificial agents. In this review, we incorporate recent findings from social robotics, virtual reality, psychology, and neuroscience to examine how people recognize and respond to emotions displayed by artificial agents. First, we review how people perceive emotions expressed by an artificial agent through channels such as facial and bodily expressions and vocal tone. Second, we evaluate the similarities and differences in the consequences of perceived emotions in artificial compared to human agents. Beyond accurately recognizing the emotional state of an artificial agent, it is critical to understand how humans respond to those emotions. Does interacting with an angry robot induce the same responses in people as interacting with an angry person? Similarly, does watching a robot rejoice when it wins a game elicit similar feelings of elation in the human observer? Here we provide an overview of the current state of emotion expression and perception in social robotics, as well as a clear articulation of the challenges and guiding principles to be addressed as we move ever closer to truly emotional artificial agents.

    A New Virtual Reality Interface for Underwater Intervention Missions

    Paper presented at IFAC-PapersOnLine, Volume 53, Issue 2, 2020, Pages 14600-14607. Nowadays, most underwater intervention missions are carried out by the well-known work-class ROVs (Remotely Operated Vehicles), equipped with teleoperated arms under human supervision. Thus, despite the appearance on the market of the first prototypes of the so-called I-AUVs (Autonomous Underwater Vehicles for Intervention), the more mature technology associated with ROVs continues to be trusted. In order to fill the gap between ROVs and the incipient I-AUV technology, new research is in progress in our laboratory. In particular, new HRI (Human-Robot Interaction) capabilities are being tested within a three-year Spanish coordinated project focused on cooperative underwater intervention missions. In this work, new results are presented concerning a new user interface that includes immersion capabilities through Virtual Reality (VR) technology. The new HRI module has been demonstrated through a pilot study in which users had to solve specific tasks, with minimal guidance and instructions, following a simple Problem-Based Learning (PBL) scheme. Finally, although this is only work in progress, the results obtained are promising with regard to the friendly and intuitive character of the developed HRI module: critical aspects such as complexity, training time and the cognitive fatigue of the ROV pilot now seem more manageable.

    A Systematic Review on Reproducibility in Child-Robot Interaction

    Research reproducibility (i.e., rerunning analyses on the original data to replicate the results) is paramount for guaranteeing scientific validity. However, reproducibility is often very challenging, especially in research fields where multi-disciplinary teams are involved, such as child-robot interaction (CRI). This paper presents a systematic review of the last three years (2020-2022) of research in CRI under the lens of reproducibility, analysing the field for transparency in reporting. Across a total of 325 studies, we found deficiencies in the reporting of demographics (e.g., the age of participants), study design and implementation (e.g., the length of interactions), and open data (e.g., maintaining an active code repository). From this analysis, we distill a set of guidelines and a checklist for systematically reporting CRI studies, to help guide research toward improved reproducibility in CRI and beyond.
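    The paper's checklist itself is not reproduced in the abstract, but the idea of a machine-checkable reporting checklist can be sketched briefly. In the sketch below, the three categories mirror the deficiency areas the review names (demographics, study design, open data); the individual field names are illustrative assumptions, not the published checklist items.

```python
# A minimal sketch of a machine-checkable reporting checklist for a CRI study.
# The categories mirror the deficiency areas named in the review; the field
# names are illustrative placeholders, not the paper's actual checklist.

REQUIRED_FIELDS = {
    "demographics": ["participant_age_range", "sample_size"],
    "study_design": ["interaction_length_minutes", "robot_platform"],
    "open_data": ["code_repository_url", "dataset_url"],
}

def missing_items(report: dict) -> list[str]:
    """Return the checklist items a study report leaves unfilled."""
    missing = []
    for category, fields in REQUIRED_FIELDS.items():
        for field in fields:
            if not report.get(category, {}).get(field):
                missing.append(f"{category}.{field}")
    return missing

study = {
    "demographics": {"participant_age_range": "7-9", "sample_size": 24},
    "study_design": {"interaction_length_minutes": 15, "robot_platform": "NAO"},
    "open_data": {},  # no repository reported -- a common gap found in the review
}
print(missing_items(study))  # ['open_data.code_repository_url', 'open_data.dataset_url']
```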

    Assessment of the patient’s emotional response with the RobHand rehabilitation platform: A case series study

    Cerebrovascular accidents have physical, cognitive and emotional effects. During rehabilitation, the main focus is placed on motor recovery, yet the patient's emotional state should also be considered. For this reason, the validation of robotic rehabilitation systems should address not only their effectiveness for physical recovery but also the patient's emotional response. A case series study was conducted with five stroke patients to assess their emotional response to therapies using RobHand, a robotic hand rehabilitation platform. Emotional state was evaluated in three dimensions (arousal, valence and dominance) using a computer-based Self-Assessment Manikin (SAM) test. It was verified that the emotions induced by the RobHand platform were distributed across the three-dimensional emotional space. The increase in dominance and the decrease in arousal over the sessions reflect that patients became familiar with the rehabilitation platform, resulting in an increased feeling of control while the platform lost some of its initial appeal. The results also show that patients found a therapy based on a virtual environment with a realistic scenario more pleasant and attractive. Funded by the Ministerio de Ciencia e Innovación (projects PID2019-111023RB-C33 and RTC2019-007350-1).
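    As an aside on the method: SAM ratings place each response on three scales (valence, arousal, dominance), typically 9-point, so per-session trends like the reported rise in dominance and drop in arousal fall out of simple aggregation. A minimal sketch, with fabricated numbers rather than the study's data:

```python
# A minimal sketch of aggregating SAM ratings per session. SAM items are
# typically scored on a 9-point scale along three dimensions; the numbers
# below are made up for illustration, not the study's data.
from statistics import mean

# ratings[session] = list of (valence, arousal, dominance) tuples, one per patient
ratings = {
    1: [(6, 7, 4), (7, 6, 5), (5, 7, 4)],
    2: [(7, 5, 6), (7, 5, 7), (6, 4, 6)],
}

for session, sam in sorted(ratings.items()):
    v, a, d = (mean(dim) for dim in zip(*sam))
    print(f"session {session}: valence={v:.1f} arousal={a:.1f} dominance={d:.1f}")
```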

    Virtual reality for safe testing and development in collaborative robotics: challenges and perspectives

    Collaborative robots (cobots) could help humans with tasks that are mundane, dangerous or where direct human contact carries risk. Yet collaboration between humans and robots is severely limited by concerns about the safety and comfort of human operators. In this paper, we outline the use of extended reality (XR) as a way to test and develop collaboration with robots. We focus on virtual reality (VR) for simulating collaboration scenarios and on the use of cobot digital twins. This is specifically useful in situations that are difficult or even impossible to test safely in real life, such as dangerous scenarios. We describe using XR simulations as a means to evaluate collaboration with robots without putting humans in harm's way. We show how an XR setting enables combining human behavioral data, subjective self-reports, and biosignals signifying human comfort, stress and cognitive load during collaboration. Several works demonstrate that XR can be used to train human operators and provide them with augmented reality (AR) interfaces to enhance their performance with robots. We also provide a first attempt at what could become the basis for a human-robot collaboration testing framework, specifically for designing and testing factors affecting human-robot collaboration. The use of XR has the potential to change the way we design and test cobots, and train cobot operators, in a range of applications: from industry, through healthcare, to space operations.
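    The framework idea of combining the three data streams the abstract names (behavioral measures, subjective self-reports, and biosignals) can be illustrated with a per-trial record. The sketch below is one plausible shape for such a log, with hypothetical field names, not the authors' implementation:

```python
# A sketch of a per-trial record for an XR human-robot collaboration test,
# combining the three data streams the paper names: behavior, self-report,
# and biosignals. All field names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class CollaborationTrial:
    scenario: str                      # e.g. a hazardous task unsafe to run live
    completion_time_s: float           # behavioral measure
    min_human_cobot_distance_m: float  # behavioral measure
    nasa_tlx_workload: float           # subjective self-report (0-100)
    comfort_rating: int                # subjective self-report (1-7 Likert)
    mean_heart_rate_bpm: float         # biosignal proxy for stress
    skin_conductance_uS: list[float] = field(default_factory=list)

trial = CollaborationTrial("overhead_welding_digital_twin",
                           completion_time_s=94.2,
                           min_human_cobot_distance_m=0.41,
                           nasa_tlx_workload=38.0,
                           comfort_rating=5,
                           mean_heart_rate_bpm=78.5)
```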

    Human aware robot navigation

    Abstract. Human-aware robot navigation refers to the navigation of a robot in an environment shared with humans in such a way that the humans feel comfortable and natural in the robot's presence. In addition, the robot's navigation should comply with the social norms of the environment. The robot can interact with humans in the environment, for example by avoiding them, approaching them, or following them. In this thesis, we focus specifically on the approach behavior of the robot, while keeping the other use cases in mind. Studying and analyzing how humans move around other humans gives us an idea of the kind of navigation behaviors we expect robots to exhibit. Most previous research does not focus on understanding such behavioral aspects of approaching people, and a straightforward mathematical modeling of complex human behaviors is very difficult. In this thesis, we therefore propose an Inverse Reinforcement Learning (IRL) framework based on Guided Cost Learning (GCL) to learn these behaviors from demonstration. After analyzing the CongreG8 dataset, we found that the incoming human tends to form an O-space (circle) with the rest of the group, and that the approaching velocity slows down as the approaching human gets closer to the group. We utilized these findings in our framework, which learns the optimal reward and policy from the example demonstrations and imitates similar human motion.
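    To make the method concrete: Guided Cost Learning trains a cost network so that demonstrated trajectories receive low cost while sampled trajectories estimate the partition function of the maximum-entropy trajectory distribution. Below is a minimal PyTorch sketch of that loss, assuming uniform proposal samples and four-dimensional state features; it is a generic illustration of the technique, not the thesis's code:

```python
# A minimal, illustrative sketch of a guided-cost-learning style IRL update:
# a small network scores state features, and its loss pushes demonstration
# trajectories toward low cost while sampled trajectories estimate the
# partition function. Generic sketch; not the thesis's implementation.
import torch
import torch.nn as nn

cost_net = nn.Sequential(nn.Linear(4, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.Adam(cost_net.parameters(), lr=1e-3)

def trajectory_cost(traj):            # traj: (T, 4) tensor of state features,
    return cost_net(traj).sum()       # e.g. position/velocity relative to group

def gcl_loss(demo_trajs, sampled_trajs):
    # MaxEnt IOC objective with a sample-based partition estimate; sampler
    # importance weights are omitted here (a uniform proposal is assumed).
    demo_cost = torch.stack([trajectory_cost(t) for t in demo_trajs]).mean()
    sample_costs = torch.stack([trajectory_cost(t) for t in sampled_trajs])
    log_partition = torch.logsumexp(-sample_costs, dim=0)
    return demo_cost + log_partition

demos = [torch.randn(50, 4) for _ in range(8)]     # stand-ins for demonstrations
samples = [torch.randn(50, 4) for _ in range(16)]  # policy/background samples
loss = gcl_loss(demos, samples)
opt.zero_grad(); loss.backward(); opt.step()
```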

    A motion control method for a differential drive robot based on human walking for immersive telepresence

    Abstract. This thesis introduces an interface for controlling Differential Drive Robots (DDRs) in telepresence applications. Our goal is to enhance the immersive experience while reducing user discomfort when using Head-Mounted Displays (HMDs) and body trackers. The robot is equipped with a 360° camera that captures the Robot Environment (RE). Users wear an HMD and use body trackers to navigate within a Local Environment (LE). Through a live video stream from the robot-mounted camera, users perceive the RE within a virtual sphere known as the Virtual Environment (VE). A proportional controller was employed to control the robot, enabling it to replicate the movements of the user. The proposed method uses a chest tracker to control the telepresence robot and focuses on minimizing the vection and rotations induced by the robot's motion by modifying the VE, for example by rotating and translating it. Experimental results demonstrate the accuracy of the robot in reaching target positions when controlled through the body-tracker interface. They also reveal an optimal VE size that effectively reduces VR sickness and enhances the sense of presence.
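    A proportional controller of the kind described maps the pose error between the robot and a tracked target to linear and angular velocity commands, which differential drive kinematics then converts to wheel speeds. The sketch below illustrates this, with hypothetical gains and wheel geometry rather than the thesis's parameters:

```python
# A minimal sketch of a proportional controller for a differential drive robot
# tracking a target pose from a body tracker. Gains and wheel geometry are
# illustrative; the thesis's actual controller and parameters may differ.
import math

K_LIN, K_ANG = 0.8, 2.0      # proportional gains (hypothetical)
WHEEL_BASE = 0.35            # distance between wheels in metres (hypothetical)

def p_control(x, y, theta, x_goal, y_goal):
    """Map the pose error to linear/angular velocity commands (v, w)."""
    dx, dy = x_goal - x, y_goal - y
    distance = math.hypot(dx, dy)
    heading_error = math.atan2(dy, dx) - theta
    # wrap the heading error into (-pi, pi] before applying the gain
    heading_error = math.atan2(math.sin(heading_error), math.cos(heading_error))
    return K_LIN * distance, K_ANG * heading_error

def wheel_speeds(v, w):
    """Differential drive kinematics: convert (v, w) to left/right wheel speeds."""
    return v - w * WHEEL_BASE / 2, v + w * WHEEL_BASE / 2

v, w = p_control(0.0, 0.0, 0.0, 1.0, 0.5)   # robot at origin, target ahead-left
print(wheel_speeds(v, w))
```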

    Virtual reality check: a comparison of virtual reality, screen-based, and real world settings as research methods for HRI

    To reduce costs and effort, experiments in human-robot interaction can be carried out in Virtual Reality (VR) or in screen-based (SB) formats. However, it is not well examined whether robots are perceived and experienced in VR and SB in the same way as they are in the physical world. This study addresses the topic in a between-subjects experiment measuring trust in, and engagement with, a mobile service robot in a museum scenario. Measurements were taken in three different settings (the real world, VR, or a game-like SB format) and compared with an ANOVA. The results indicate that neither trust nor engagement differs depending on the experimental setting. This implies that both VR and SB are eligible ways to explore interaction with a mobile service robot, provided that some peculiarities of each medium are taken into account.
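    For concreteness, a one-way between-subjects ANOVA on a questionnaire score across the three settings can be run in a few lines. The data below are fabricated for illustration; only the shape of the analysis matches the study:

```python
# A sketch of the study's style of analysis: a one-way between-subjects ANOVA
# comparing a questionnaire score (e.g. trust) across the three settings.
# The numbers are fabricated for illustration only.
from scipy.stats import f_oneway

trust_real = [4.2, 3.9, 4.5, 4.1, 3.8]
trust_vr   = [4.0, 4.3, 3.7, 4.2, 4.1]
trust_sb   = [3.9, 4.1, 4.0, 3.8, 4.4]

f_stat, p_value = f_oneway(trust_real, trust_vr, trust_sb)
print(f"F={f_stat:.2f}, p={p_value:.3f}")  # p > .05 would mirror the reported
                                           # absence of setting effects
```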

    Look me in the eyes: A survey of eye and gaze animation for virtual agents and artificial systems

    A person's emotions and state of mind are apparent in their face and eyes. As a Latin proverb states: "The face is the portrait of the mind; the eyes, its informers." This presents a huge challenge for computer graphics researchers in the generation of artificial entities that aim to replicate the movement and appearance of the human eye, which is so important in human-human interactions. This State of the Art Report provides an overview of the efforts made to tackle this challenging task. As with many topics in Computer Graphics, a cross-disciplinary approach is required to fully understand the workings of the eye in the transmission of information to the user. We discuss the movement of the eyeballs, eyelids, and the head from a physiological perspective and how these movements can be modelled, rendered and animated in computer graphics applications. Further, we present recent research from psychology and sociology that seeks to understand higher-level behaviours, such as attention and eye gaze, during the expression of emotion or during conversation, and how they are synthesised in Computer Graphics and Robotics.
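    One building block such eye-animation systems commonly use is the empirical "main sequence", which ties a saccade's amplitude to its duration. The sketch below uses a commonly cited linear fit (roughly 2.2 ms per degree plus about 21 ms) with an eased rotation for animation; individual systems tune their own constants:

```python
# A sketch of saccade timing for eye animation, using a commonly cited linear
# "main sequence" approximation (about 2.2 ms per degree plus a 21 ms offset).
# The constants and the easing curve are illustrative choices.

def saccade_duration_ms(amplitude_deg: float) -> float:
    """Approximate saccade duration from its amplitude in degrees."""
    return 2.2 * amplitude_deg + 21.0

def eye_angle(t_ms, start_deg, end_deg):
    """Ease the eyeball rotation over the saccade with smoothstep timing."""
    d = saccade_duration_ms(abs(end_deg - start_deg))
    s = min(max(t_ms / d, 0.0), 1.0)
    s = s * s * (3 - 2 * s)            # smoothstep: fast middle, gentle ends
    return start_deg + (end_deg - start_deg) * s

print(eye_angle(20.0, 0.0, 15.0))      # gaze angle 20 ms into a 15-degree saccade
```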

    Creepy Technology: What Is It and How Do You Measure It?

    Interactive technologies are getting closer to our bodies and permeate the infrastructure of our homes. While such technologies offer many benefits, they can also cause an initial feeling of unease in users. It is important for Human-Computer Interaction to manage first impressions and avoid designing technologies that appear creepy. To that end, we developed the Perceived Creepiness of Technology Scale (PCTS), which measures how creepy a technology appears to a user in an initial encounter with a new artefact. The scale was developed based on past work on creepiness and a set of ten focus groups conducted with users from diverse backgrounds. We followed a structured process of analytically developing and validating the scale. The PCTS is designed to enable designers and researchers to quickly compare interactive technologies and ensure that they do not design technologies that produce initial feelings of creepiness in users.
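    Questionnaire scales of this kind are typically scored by averaging Likert items after flipping any reverse-coded ones. The sketch below shows that generic procedure; the scale length and reverse-coding pattern are placeholders, not the published PCTS items:

```python
# A sketch of generic Likert-scale scoring: average the items, reverse-coding
# where needed. Item count, scale length and reverse-coding pattern here are
# placeholders, not the published PCTS.

SCALE_MAX = 7  # a 7-point Likert scale is assumed for illustration

def score(responses: list[int], reverse: set[int] = frozenset()) -> float:
    """Mean item score, with reverse-coded items flipped first."""
    adjusted = [(SCALE_MAX + 1 - r) if i in reverse else r
                for i, r in enumerate(responses)]
    return sum(adjusted) / len(adjusted)

# One participant's ratings of a new device on hypothetical items:
print(score([2, 3, 6, 2, 1], reverse={2}))   # item 2 is reverse-coded
```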