
    Dialogue Design for a Robot-Based Face-Mirroring Game to Engage Autistic Children with Emotional Expressions

    We present design strategies for human-robot interaction with school-aged autistic children who have limited receptive language. Applying these strategies within the DE-ENIGMA project (a large EU project addressing emotion recognition in autistic children) supported the development of a new facial-expression-imitation activity, in which the robot imitates the child's face to encourage the child to notice facial expressions in a play-based game. A usability case study with 15 typically developing children aged 4–6 at an English-language school in the Netherlands was conducted to assess the feasibility of the setup and to make design revisions before exposing the robot to autistic children.
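    The abstract does not describe the implementation, but the core of such a face-mirroring activity is a perception-actuation loop. The Python sketch below shows that idea under assumed interfaces: detect_action_units() and RobotFace are hypothetical placeholders, not part of the DE-ENIGMA software, and the Action Unit subset is an illustrative choice.

        import time

        MIRROR_AUS = [1, 2, 4, 6, 12, 15, 25]  # illustrative subset of FACS Action Units

        def detect_action_units(frame):
            """Placeholder: return {AU number: intensity in [0, 1]} for one camera frame."""
            raise NotImplementedError

        class RobotFace:
            """Placeholder for the robot's facial actuation interface."""
            def set_au(self, au, intensity):
                pass

        def mirror_loop(camera, robot, period_s=0.2):
            # Estimate the child's Action Units from the camera and replay the
            # subset the robot can express, slightly exaggerated so the child
            # is more likely to notice the expression.
            while True:
                aus = detect_action_units(camera.read())
                for au in MIRROR_AUS:
                    robot.set_au(au, min(1.0, 1.3 * aus.get(au, 0.0)))
                time.sleep(period_s)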

    Equipping Social Robots with Culturally-Sensitive Facial Expressions of Emotion Using Data-Driven Methods

    Social robots must be able to generate realistic and recognizable facial expressions to engage their human users. Many social robots are equipped with standardized facial expressions of emotion that are widely considered to be universally recognized across all cultures. However, mounting evidence shows that these facial expressions are not universally recognized: for example, they elicit significantly lower recognition accuracy in East Asian cultures than in Western cultures. Without culturally sensitive facial expressions, therefore, state-of-the-art social robots are restricted in their ability to engage a culturally diverse range of human users, which in turn limits their global marketability. To develop culturally sensitive facial expressions, we use novel data-driven methods that model, from the cultural perception of human observers, the dynamic face movement patterns that convey basic emotions (e.g., happiness, sadness, anger) in a given culture. Here, we tested whether such dynamic facial expression models, derived in an East Asian culture and transferred to a popular social robot, improved the robot's social signalling capabilities with East Asian participants. Results showed that, compared to the robot's existing set of 'universal' facial expressions, the culturally sensitive facial expression models were recognized with generally higher accuracy and judged as more human-like by East Asian participants. We also detail the specific dynamic face movements (Action Units) associated with high recognition accuracy and judgments of human-likeness, including those that further boost performance. Our results therefore demonstrate the utility of data-driven methods that employ human cultural perception to derive culturally sensitive facial expressions, improving the social face signal generation capabilities of social robots. We anticipate that these methods will continue to inform the design of social robots and broaden their usability and global marketability.
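    The authors' models are not given in the abstract; as a rough illustration, a dynamic facial expression model can be represented as per-Action-Unit activation time courses, and recognition accuracy as the fraction of participants whose label matches the intended emotion. The AU choices and curve shape in this Python sketch are assumptions for illustration, not the authors' data.

        import numpy as np

        def au_time_course(peak, onset_frac, peak_frac, n_frames=30):
            """Ramp-hold-release activation curve for one Action Unit."""
            t = np.linspace(0.0, 1.0, n_frames)
            return np.interp(t, [0.0, onset_frac, peak_frac, 1.0],
                             [0.0, peak, peak, 0.0])

        # Hypothetical 'happy' model: AU6 (cheek raiser) + AU12 (lip corner puller).
        happy_model = {
            6: au_time_course(peak=0.8, onset_frac=0.3, peak_frac=0.7),
            12: au_time_course(peak=1.0, onset_frac=0.2, peak_frac=0.8),
        }

        def recognition_accuracy(responses, intended):
            """Fraction of participant labels that match the intended emotion."""
            return sum(r == intended for r in responses) / len(responses)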

    Design of Robot Head for Expression of Human Emotion

    A humanoid robot is a robot designed in human form with the purpose of improving the quality of human life. The key features of a humanoid robot are performing human-like behaviours and interacting effectively with a human operator. Facial expressions play an important role in natural human-robot communication, since human communication in daily life relies on face-to-face interaction. The purpose of this study was to develop an interactive robot head able to express the six basic human emotions of Ekman's model: joy, sadness, anger, disgust, surprise, and fear. We propose combinations of action units based on different control points on the robot head. The new robot head provides 11 DoFs to perform different expressions in a human-like way. A survey was conducted on twelve sets of emotion designs drawn in SolidWorks; each design was evaluated for its expressive ability, and the best design for each emotion was selected for implementation on the robot head. A hardware experiment was conducted to control the LCD display and the servo motor positions using an Arduino Leonardo as the controller of the robot head system. Additionally, a keypad controller was designed to let the user select the robot head's expression; it is connected to the LCD display, which shows the name of the current facial expression for the learning purposes of autistic children. This project focuses on a performance test of the robot head in terms of position accuracy for the 11 actuators used in its construction. The results show that the relative position error of each robot head part was less than 10%, so the robot head is able to perform the emotions effectively. A recognition-rate survey for each emotion expression was conducted individually with 100 respondents; each of the six emotions expressed by the robot head achieved a recognition rate above 70%, i.e., more than 70 respondents correctly identified each expression.
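    The abstract reports a relative position error below 10% for each of the 11 actuators but does not give the computation; a straightforward reading of that check is sketched below in Python. The actuator names and angle readings are hypothetical, for illustration only.

        def relative_position_error(commanded_deg, measured_deg):
            """Relative error of one actuator as a fraction of the commanded angle."""
            return abs(measured_deg - commanded_deg) / abs(commanded_deg)

        # Hypothetical commanded/measured servo angles for a few of the 11 actuators.
        actuators = {
            "left_brow": (45.0, 43.2),
            "right_brow": (45.0, 46.8),
            "jaw": (30.0, 31.5),
        }

        for name, (cmd, meas) in actuators.items():
            err = relative_position_error(cmd, meas)
            print(f"{name}: {err:.1%} ({'OK' if err < 0.10 else 'FAIL'})")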

    The perception of emotion in artificial agents

    Given recent technological developments in robotics, artificial intelligence, and virtual reality, it is perhaps unsurprising that the arrival of emotionally expressive and reactive artificial agents is imminent. However, if such agents are to become integrated into our social milieu, it is imperative to establish an understanding of whether and how humans perceive emotion in artificial agents. In this review, we incorporate recent findings from social robotics, virtual reality, psychology, and neuroscience to examine how people recognize and respond to emotions displayed by artificial agents. First, we review how people perceive emotions expressed by an artificial agent through channels such as facial and bodily expressions and vocal tone. Second, we evaluate the similarities and differences in the consequences of perceived emotions in artificial compared to human agents. Beyond accurately recognizing the emotional state of an artificial agent, it is critical to understand how humans respond to those emotions. Does interacting with an angry robot induce the same responses in people as interacting with an angry person? Similarly, does watching a robot rejoice when it wins a game elicit similar feelings of elation in the human observer? Here we provide an overview of the current state of emotion expression and perception in social robotics, as well as a clear articulation of the challenges and guiding principles to be addressed as we move ever closer to truly emotional artificial agents.

    Reverse Engineering Psychologically Valid Facial Expressions of Emotion into Social Robots

    Social robots are now part of human society, destined for schools, hospitals, and homes to perform a variety of tasks. To engage their human users, social robots must be equipped with the essential social skill of facial expression communication. Yet even state-of-the-art social robots are limited in this ability because they often rely on a restricted set of facial expressions derived from theory, with well-known limitations such as a lack of naturalistic dynamics. With no agreed methodology to objectively engineer a broader range of more psychologically impactful facial expressions into social robots' repertoires, human-robot interactions remain restricted. Here, we address this generic challenge with new methodologies that can reverse-engineer dynamic facial expressions into a social robot head. Our data-driven, user-centered approach, which combines human perception with psychophysical methods, produced highly recognizable and human-like dynamic facial expressions of the six classic emotions that generally outperformed the facial expressions of state-of-the-art social robots. Our data demonstrate the feasibility of applying our method to social robotics and highlight the benefits of a data-driven approach that puts human users at the center of deriving facial expressions for social robots. We also discuss future work to reverse-engineer a wider range of socially relevant facial expressions, including conversational messages (e.g., interest, confusion) and personality traits (e.g., trustworthiness, attractiveness). Together, our results highlight the key role that psychology must continue to play in the design of social robots.
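    The psychophysical method is described only at a high level in the abstract; the underlying reverse-correlation idea can be sketched as follows. This is a deliberately simplified assumption about the workflow, not the authors' actual generative platform: show observers random dynamic AU patterns, record which emotion each pattern is perceived as, and aggregate the patterns per emotion label.

        import random

        ALL_AUS = list(range(1, 29))  # Action Units 1-28, an illustrative inventory
        EMOTIONS = ["happy", "surprise", "fear", "disgust", "anger", "sad"]

        def random_stimulus(n_aus=5):
            """One trial: a random subset of AUs with random peak intensities."""
            return {au: random.random() for au in random.sample(ALL_AUS, n_aus)}

        def derive_models(trials):
            """Average AU intensities over all stimuli an observer labeled with
            each emotion; trials is a list of (stimulus, label) pairs."""
            sums, counts = {}, {}
            for stimulus, label in trials:
                counts[label] = counts.get(label, 0) + 1
                acc = sums.setdefault(label, {})
                for au, intensity in stimulus.items():
                    acc[au] = acc.get(au, 0.0) + intensity
            return {label: {au: s / counts[label] for au, s in acc.items()}
                    for label, acc in sums.items()}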

    Systems overview of Ono: a DIY reproducible open source social robot

    One of the major obstacles in the study of HRI (human-robot interaction) with social robots is the lack of multiple identical robots that allow testing with large user groups; often, the price of these robots prohibits using more than a handful. Many commercial robots do not possess all the features necessary for specific HRI experiments, and because the platforms are closed, large modifications are nearly impossible. While open source social robots do exist, they often use high-end components and expensive manufacturing techniques, making them unsuitable for easy reproduction. To address this problem, a new social robotics platform, named Ono, was developed. The design is based on the DIY mindset of the maker movement, using off-the-shelf components and more accessible rapid prototyping and manufacturing techniques. The modular structure of the robot makes it easy to adapt to the needs of an experiment, and by embracing the open source mentality, the robot can easily be reproduced or further developed by a community of users. The low cost, open nature, and DIY friendliness of the robot make it an ideal candidate for HRI studies that require a large user group.