723 research outputs found

    The perception of emotion in artificial agents

    Given recent technological developments in robotics, artificial intelligence and virtual reality, it is perhaps unsurprising that the arrival of emotionally expressive and reactive artificial agents is imminent. However, if such agents are to become integrated into our social milieu, it is imperative to establish an understanding of whether and how humans perceive emotion in artificial agents. In this review, we incorporate recent findings from social robotics, virtual reality, psychology, and neuroscience to examine how people recognize and respond to emotions displayed by artificial agents. First, we review how people perceive emotions expressed by an artificial agent through channels such as facial and bodily expressions and vocal tone. Second, we evaluate the similarities and differences in the consequences of perceived emotions in artificial compared to human agents. Besides accurately recognizing the emotional state of an artificial agent, it is critical to understand how humans respond to those emotions. Does interacting with an angry robot induce the same responses in people as interacting with an angry person? Similarly, does watching a robot rejoice when it wins a game elicit similar feelings of elation in the human observer? Here we provide an overview of the current state of emotion expression and perception in social robotics, as well as a clear articulation of the challenges and guiding principles to be addressed as we move ever closer to truly emotional artificial agents.

    How Do You Like Me in This: User Embodiment Preferences for Companion Agents

    We investigate the relationship between the embodiment of an artificial companion and user perception of and interaction with it. In a Wizard of Oz study, 42 users interacted with one of two embodiments, a physical robot or a virtual agent on a screen, through a role-play of secretarial tasks in an office, with the companion providing essential assistance. Findings showed that participants in both conditions, when given the choice, would prefer to interact with the robot companion, mainly for its greater physical or social presence. Subjects also found the robot less annoying and talked to it more naturally. However, this preference for the robotic embodiment was not reflected in the users’ actual ratings of the companion or their interaction with it. We reflect on this contradiction and conclude that, in a task-based context, a user focuses much more on a companion’s behaviour than on its embodiment. This underlines the feasibility of our efforts in creating companions that migrate between embodiments while maintaining a consistent identity from the user’s point of view.

    A Novel Reinforcement-Based Paradigm for Children to Teach the Humanoid Kaspar Robot

    © The Author(s) 2019. This is the final published version of an article published in Psychological Research, licensed under a Creative Commons Attribution 4.0 International License. Available online at: https://doi.org/10.1007/s12369-019-00607-x. This paper presents a contribution to the active field of robotics research aimed at supporting the development of social and collaborative skills of children with Autism Spectrum Disorders (ASD). We present a novel experiment in which the classical roles are reversed: the children are the teachers, providing positive or negative reinforcement to the Kaspar robot so that the robot learns arbitrary associations between different toy names and the locations where they are positioned. The objective of this work is to develop games that help children with ASD develop collaborative skills and also give them a tangible example showing that learning sometimes requires several repetitions. To facilitate this game we developed a reinforcement learning algorithm enabling Kaspar to verbally convey its level of uncertainty during the learning process, so as to better inform the children interacting with Kaspar about the reasons behind the robot's successes and failures. Overall, 30 Typically Developing (TD) children aged between 7 and 8 (19 girls, 11 boys) and 6 children with ASD performed 22 sessions (16 for TD; 6 for ASD) of the experiment in groups, and managed to teach Kaspar all associations in 2 to 7 trials. During the course of the study Kaspar made only rare unexpected associations (2 perseverative errors and 1 win-shift, within a total of 272 trials), primarily due to exploratory choices, and eventually reached minimal uncertainty. Thus the robot's behavior was clear and consistent for the children, who all expressed enthusiasm about the experiment. Peer reviewed.
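    The abstract describes, at a high level, a reinforcement-based learner that acquires toy–location associations from the children's feedback and verbalizes its own uncertainty, but it does not spell out the algorithm. The sketch below is only a plausible illustration under assumptions: the class name AssociationLearner, the softmax exploration, the learning rate, and the entropy-based uncertainty measure are illustrative choices, not the authors' implementation.

```python
import math
import random


class AssociationLearner:
    """Illustrative learner: acquires toy-to-location associations from
    positive/negative reinforcement and reports its own uncertainty."""

    def __init__(self, toys, locations, lr=0.5, temperature=0.3):
        self.locations = list(locations)
        self.lr = lr                    # how strongly one reinforcement shifts a value
        self.temperature = temperature  # softmax temperature; >0 keeps some exploration
        # One value per (toy, location) pair, initially uniform (fully uncertain).
        self.values = {toy: {loc: 1.0 / len(self.locations) for loc in self.locations}
                       for toy in toys}

    def choose_location(self, toy):
        """Softmax choice over the toy's location values (exploratory errors possible)."""
        vals = self.values[toy]
        weights = [math.exp(vals[loc] / self.temperature) for loc in self.locations]
        return random.choices(self.locations, weights=weights)[0]

    def reinforce(self, toy, location, positive):
        """Pull the chosen value toward 1 (praise) or 0 (correction);
        praise also pushes the competing locations toward 0."""
        target = 1.0 if positive else 0.0
        v = self.values[toy][location]
        self.values[toy][location] = v + self.lr * (target - v)
        if positive:
            for loc in self.locations:
                if loc != location:
                    self.values[toy][loc] *= (1.0 - self.lr)

    def uncertainty(self, toy):
        """Normalized entropy of the toy's value distribution: 1 = no idea, 0 = certain."""
        vals = [v for v in self.values[toy].values() if v > 0]
        total = sum(vals)
        entropy = -sum((v / total) * math.log(v / total) for v in vals)
        return entropy / math.log(len(self.locations))

    def verbalize(self, toy):
        """Turn the uncertainty level into an utterance a child can follow."""
        u = self.uncertainty(toy)
        if u > 0.8:
            return f"I really don't know where the {toy} goes yet. Please help me!"
        if u > 0.4:
            return f"I think I know where the {toy} goes, but I might be wrong."
        return f"I am sure I know where the {toy} goes."
```

    In a sketch like this, one teaching trial would consist of the robot choosing a location for a named toy, the child praising or correcting the choice, and the robot reporting its confidence before the next trial.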

    Design Research on Robotic Products for School Environments

    Advancements in robotics research have led to the design of a number of robotic products that can interact with people. In this research, a school environment was selected for a practical test of robotic products. For this, the robot “Tiro” was built, with the aim of supporting the learning activities of children. The possibility of applying robotic products was then tested through example lessons using Tiro. To do this, the robot design process and a user-centred HRI evaluation framework were studied, and observations of robotic products were made via field studies on the basis of this understanding. Three different field studies were conducted, and interactions between children and robotic products were investigated. As a result, it was possible to understand how emotional interaction and verbal interaction affect the development of social relationships. Early results on this question, together with coding schemes for video protocol analysis, were obtained. In this preliminary study, the findings are summarized and several design implications derived from insight grouping are suggested. These will help robot designers grasp how various factors of robotic products may be adopted in the everyday lives of people. Keywords: Robotic Products Design, HRI Evaluation, User-Centered HRI.

    Social Robots in Hospitals: A Systematic Review

    Hospital environments are facing new challenges this century. One of the most important is the quality of services provided to patients. Social robots are gaining prominence due to the advantages they offer; in particular, several of their main uses have proven beneficial during the pandemic. This study aims to shed light on the current status of the design of social robots and their interaction with patients. To this end, a systematic review was conducted using WoS and MEDLINE, and the results were exhaustively analyzed. The authors found that most of the initiatives and projects serve the elderly and children, and specifically, that they helped these groups fight diseases such as dementia, autism spectrum disorder (ASD), cancer, and diabetes.

    A First Step toward the Automatic Understanding of Social Touch for Naturalistic Human–Robot Interaction

    Social robots should be able to automatically understand and respond to human touch. The meaning of touch depends not only on the form of touch but also on the context in which the touch takes place. To gain more insight into the factors that are relevant to interpreting the meaning of touch within a social context, we elicited touch behaviors by letting participants interact with a robot pet companion in the context of different affective scenarios. In a contextualized lab setting, participants (n = 31) acted as if they were coming home in different emotional states (i.e., stressed, depressed, relaxed, and excited) without being given specific instructions on the kinds of behaviors that they should display. Based on video footage of the interactions and on interviews, we explored the use of touch behaviors, the expressed social messages, and the expected robot pet responses. Results show that emotional state influenced the social messages that were communicated to the robot pet as well as the expected responses. Furthermore, it was found that multimodal cues were used to communicate with the robot pet; that is, participants often talked to the robot pet while touching it and making eye contact. Additionally, the findings of this study indicate that categorizing touch behaviors into discrete touch gesture categories based on dictionary definitions is not a suitable approach for capturing the complex nature of touch behaviors in less controlled settings. These findings can inform the design of a behavioral model for robot pet companions, and future directions for interpreting touch behaviors in less controlled settings are discussed.