
    Development of artificial empathy

    Abstract: We have been advocating cognitive developmental robotics to gain new insight into the development of human cognitive functions through synthetic and constructive approaches. Among the emotional functions, empathy is difficult to model, but essential for robots to act as social agents in our society. In my previous review on artificial empathy (Asada, 2014b), I proposed a conceptual model of empathy development, beginning with emotional contagion and progressing to envy/schadenfreude alongside self/other differentiation. This article focuses on two aspects of that developmental process: emotional contagion in relation to motor mimicry, and the cognitive and affective aspects of empathy. It begins with a summary of the previous review (Asada, 2014b) and an introduction to affective developmental robotics, the part of cognitive developmental robotics concerned with affective aspects. This is followed by a review and discussion of several approaches to the two focal aspects. Finally, future issues in the development of a more authentic form of artificial empathy are discussed.

    Robotic motion learning framework to promote social engagement

    Abstract: Imitation is a powerful component of communication between people, and it has important implications for improving the quality of interaction in the field of human–robot interaction (HRI). This paper discusses a novel framework designed to improve human–robot interaction through robotic imitation of a participant’s gestures. In our experiment, a humanoid robotic agent socializes and plays games with a participant. For the experimental group, the robot additionally imitates one of the participant’s novel gestures during a play session. We hypothesize that the robot’s use of imitation will increase the participant’s openness towards engaging with it. Results from a user study of 12 subjects show that, post-imitation, experimental subjects displayed a more positive emotional state, showed more instances of mood contagion towards the robot, and attributed a higher level of autonomy to the robot than their control-group counterparts did. These results point to increased participant interest in engagement fueled by personalized imitation during interaction.

    Modeling and Design Analysis of Facial Expressions of Humanoid Social Robots Using Deep Learning Techniques

    Abstract: A lot of research in the field of social robotics concentrates on various aspects of social robots, including the design of mechanical parts and their movement, cognitive speech, and face recognition capabilities. Several robots have been developed with the intention of being social, like humans, without much emphasis on how human-like they actually look in terms of expressions and behavior. Furthermore, a substantial disparity can be seen between the success of research on “humanizing” robots’ behavior, i.e., making them behave more human-like, and research on biped movement, movement of individual body parts such as arms, fingers, or eyeballs, or human-like appearance itself. The research in this paper examines why research on the facial expressions of social humanoid robots fails to be fully accepted in current society, owing to the uncanny valley theory. This paper frames the problem with current facial expression research as an information retrieval problem. It identifies the current research method in the design of facial expressions of social robots, uses deep learning as a similarity evaluation technique to measure the humanness of the facial expressions developed with the current technique, and further suggests a novel solution to the facial expression design of humanoids using deep learning. Dissertation/Thesis: Masters Thesis, Computer Science, 201

    Robot NAO used in therapy: Advanced design and evaluation

    Master's thesis in Intelligent Systems (Treball de Final de Màster Universitari en Sistemes Intel·ligents). Code: SIE043. Academic year 2013-2014. Following on from the work done in our Final Research Project, we introduce a therapeutic application with social robotics to improve positive mood in patients with fibromyalgia. Works on therapeutic robotics, positive psychology, emotional intelligence, social learning, and mood induction procedures (MIPs) are reviewed. Hardware and software requirements and system development are explained in detail. Conclusions about the clinical utility of these robots are discussed. Experiments with real fibromyalgia patients are currently under way; the methodology and procedures involved are described in the future work section of this work.

    Becoming Human with Humanoid

    Nowadays, our expectations of robots have increased significantly. Robots, which initially did only simple jobs, are now expected to be smarter and more dynamic. People want a robot that resembles a human (a humanoid) and has the emotional intelligence to perform action-reaction interactions. This book consists of two sections. The first section focuses on emotional intelligence, while the second discusses the control of robots. The contents of the book reveal the outcomes of research conducted by scholars in robotics to accommodate the needs of society and industry.

    Emotional design and human-robot interaction

    Recent years have seen an increase in the importance of emotions applied to the field of design: Emotional Design. Emotional design aims to elicit certain emotions (e.g., pleasure) or prevent others (e.g., displeasure) during human-product interaction; that is, it regulates the emotional interaction between the individual and the product (e.g., a robot). Robot design is a growing area in which robots interact directly with humans and emotions are essential to the interaction. This paper therefore explores, through a non-systematic literature review, the application of emotional design, particularly in Human-Robot Interaction. Robot design features that affect emotional design (e.g., appearance, expression of emotions, and spatial distance) are introduced. The chapter ends with a discussion and a conclusion.

    The Effects of Robot Voices and Appearances on Users' Emotion Recognition and Subjective Perception

    As the influence of social robots in people's daily lives grows, research on understanding people's perception of robots, including sociability, trust, acceptance, and preference, becomes more pervasive. Research has considered visual, vocal, or tactile cues to express robots' emotions, whereas little research has provided a holistic view of the interactions among the different factors influencing emotion perception. We investigated multiple facets of user perception of robots during a conversational task by varying the robots' voice types, appearances, and emotions. In our experiment, 20 participants interacted with two robots having four different voice types. While participants read fairy tales to the robot, the robot gave vocal feedback with seven emotions, and the participants evaluated the robot's profile through post-surveys. The results indicate that (1) the accuracy of emotion perception differed depending on the presented emotion, (2) a regular human voice was rated higher for preference and naturalness, (3) a characterized voice was nevertheless more appropriate for expressing emotions, with significantly higher accuracy in emotion perception, and (4) participants showed significantly higher emotion recognition accuracy with the animal robot than with the humanoid robot. A follow-up study (N=10) with voice-only conditions confirmed the importance of embodiment. The results of this study could provide guidelines for designing social robots that consider the emotional aspects of conversations between robots and users.

    Design of Robot Head for Expression of Human Emotion

    Abstract: A humanoid robot is a type of robot designed in human form with the purpose of improving the quality of human life. The key features of a humanoid robot are performing human-like behaviours and interacting effectively with a human operator. Facial expressions play an important role in natural human-robot communication, as human communication in daily life relies on face-to-face interaction. The purpose of this study was to develop an interactive robot head able to express the six basic human emotions of Ekman's model: joy, sadness, anger, disgust, surprise, and fear. A combination of action units based on different control points on the robot head is proposed. The new robot head has 11 DoFs to perform different expressions in a human-like way. A survey was conducted on twelve sets of emotion designs drawn in SolidWorks; each design was evaluated for its expressive ability, and the best design for each emotion was selected for implementation on the robot head. A hardware experiment used an Arduino Leonardo as the controller of the robot head system to drive the LCD display and the positions of the servo motors. Additionally, a keypad controller was designed so that the user can select the robot head's expression; it is connected to the LCD display, which shows the name of the facial expression for the learning purposes of children with autism. This project focuses on a performance test of the robot head in terms of position accuracy for the 11 actuators used in its construction. The results show that the relative position error for each robot head part was less than 10%, so the robot head is able to perform the emotions effectively. A recognition survey for each emotional expression was conducted individually with 100 respondents: each of the six emotions expressed by the robot head achieved a recognition rate above 70%, meaning that more than 70 respondents correctly identified each expression.
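    The accuracy criterion this abstract describes (11 actuators, each within a 10% relative position error) can be sketched in a few lines of C++. All angle values, the 180-degree range, and every name below are illustrative assumptions for the sketch, not figures taken from the thesis:

    ```cpp
    #include <array>
    #include <cmath>
    #include <map>
    #include <string>

    // Illustrative only: 11 actuators (DoFs) on the robot head, each driven to
    // a target angle per emotion. Angle values are made-up placeholders.
    constexpr int kNumActuators = 11;
    using Pose = std::array<double, kNumActuators>;

    // Hypothetical target poses (degrees) for Ekman's six basic emotions.
    const std::map<std::string, Pose> kEmotionPoses = {
        {"joy",      {30, 30, 10, 10, 45, 45, 20, 20, 5, 5, 60}},
        {"sadness",  {-20, -20, -10, -10, 10, 10, -15, -15, 0, 0, -30}},
        {"anger",    {-30, -30, 20, 20, -10, -10, 25, 25, 0, 0, -45}},
        {"disgust",  {-10, -10, 15, 15, -20, -20, 10, 10, 5, 5, -15}},
        {"surprise", {40, 40, 30, 30, 50, 50, 0, 0, 10, 10, 70}},
        {"fear",     {20, 20, 25, 25, -30, -30, -10, -10, 15, 15, 35}},
    };

    // Relative position error of one actuator: |measured - target| / range.
    double RelativeError(double target, double measured, double range) {
        return std::fabs(measured - target) / range;
    }

    // A pose counts as performed effectively if every actuator's relative
    // error stays under the 10% threshold the abstract reports.
    bool PoseWithinTolerance(const Pose& target, const Pose& measured,
                             double range = 180.0, double threshold = 0.10) {
        for (int i = 0; i < kNumActuators; ++i) {
            if (RelativeError(target[i], measured[i], range) >= threshold) {
                return false;
            }
        }
        return true;
    }
    ```

    On the actual hardware, `measured` would come from servo feedback (or commanded positions read back through the Arduino), and a pose failing the check would flag the corresponding expression as unreliable.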

    Shaping Robot Gestures to Shape Users' Perception: the Effect of Amplitude and Speed on Godspeed Ratings

    This work analyses the relationship between the way robots gesture and the way those gestures are perceived by human users. In particular, it shows how modifying the amplitude and speed of a gesture affects the Godspeed scores given to that gesture, by means of an experiment involving 45 stimuli and 30 observers. The results suggest that shaping gestures aimed at manifesting the inner state of the robot (e.g., cheering or showing disappointment) tends to change the perception of Animacy (the dimension that accounts for how driven by endogenous factors the robot is perceived to be), while shaping gestures aimed at achieving an interaction effect (e.g., engaging and disengaging) tends to change the perception of Anthropomorphism, Likeability, and Perceived Safety (the dimensions that account for the social aspects of the perception).

    Design of a Huggable Social Robot with Affective Expressions Using Projected Images

    We introduce Pepita, a caricatured huggable robot capable of sensing and conveying affective expressions by means of tangible gesture recognition and projected avatars. This study covers the design criteria, implementation, and performance evaluation of the different characteristics of the form and function of this robot. The evaluation involves: (1) an exploratory study of the different features of the device, (2) design and performance evaluation of sensors for affective interaction employing touch, and (3) design and implementation of affective feedback using projected avatars. Results showed that hug detection worked well for the intended application and that the affective expressions made with projected avatars were appropriate for this robot. The questionnaires analyzing users’ perception provide insights to guide future designs of similar interfaces.