
    Reverse Engineering Psychologically Valid Facial Expressions of Emotion into Social Robots

    Social robots are now part of human society, destined for schools, hospitals, and homes to perform a variety of tasks. To engage their human users, social robots must be equipped with the essential social skill of facial expression communication. Yet, even state-of-the-art social robots are limited in this ability because they often rely on a restricted set of facial expressions derived from theory with well-known limitations such as lacking naturalistic dynamics. With no agreed methodology to objectively engineer a broader variance of more psychologically impactful facial expressions into the social robots' repertoire, human-robot interactions remain restricted. Here, we address this generic challenge with new methodologies that can reverse-engineer dynamic facial expressions into a social robot head. Our data-driven, user-centered approach, which combines human perception with psychophysical methods, produced highly recognizable and human-like dynamic facial expressions of the six classic emotions that generally outperformed state-of-the-art social robot facial expressions. Our data demonstrate the feasibility of our method applied to social robotics and highlight the benefits of using a data-driven approach that places human users at the center of deriving facial expressions for social robots. We also discuss future work to reverse-engineer a wider range of socially relevant facial expressions including conversational messages (e.g., interest, confusion) and personality traits (e.g., trustworthiness, attractiveness). Together, our results highlight the key role that psychology must continue to play in the design of social robots.
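The data-driven, psychophysical approach described above is closely related to reverse correlation: present many random face-movement (Action Unit) combinations, record which ones observers perceive as a target emotion, and average the accepted stimuli to recover the perceptual template. A minimal toy sketch of that idea (the AU count, trial count, and the simulated observer are illustrative assumptions, not details from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
N_AUS, N_TRIALS = 5, 2000

# Hypothetical ground truth: the Action Units an observer associates
# with the target emotion (e.g. a cheek raiser plus a lip-corner puller).
true_pattern = np.array([0, 1, 0, 0, 1])

def observer_says_target(stimulus):
    # Toy perceptual model: the observer accepts a stimulus when it
    # overlaps enough with their internal template.
    return stimulus @ true_pattern >= 2

# Reverse correlation: show random AU on/off combinations, then average
# only the stimuli the observer classified as the target emotion.
stimuli = rng.integers(0, 2, size=(N_TRIALS, N_AUS))
accepted = np.array([observer_says_target(s) for s in stimuli])
template = stimuli[accepted].mean(axis=0)
print(np.round(template, 2))  # the AUs at indices 1 and 4 approach 1.0
```

The recovered template peaks exactly on the diagnostic AUs, while task-irrelevant AUs average toward 0.5; the same logic scales to dynamic parameters such as onset and peak timing.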

    Equipping Social Robots with Culturally-Sensitive Facial Expressions of Emotion Using Data-Driven Methods

    Social robots must be able to generate realistic and recognizable facial expressions to engage their human users. Many social robots are equipped with standardized facial expressions of emotion that are widely considered to be universally recognized across all cultures. However, mounting evidence shows that these facial expressions are not universally recognized - for example, they elicit significantly lower recognition accuracy in East Asian cultures than they do in Western cultures. Therefore, without culturally sensitive facial expressions, state-of-the-art social robots are restricted in their ability to engage a culturally diverse range of human users, which in turn limits their global marketability. To develop culturally sensitive facial expressions, novel data-driven methods are used to model the dynamic face movement patterns that convey basic emotions (e.g., happy, sad, anger) in a given culture using cultural perception. Here, we tested whether such dynamic facial expression models, derived in an East Asian culture and transferred to a popular social robot, improved the social signalling generation capabilities of the social robot with East Asian participants. Results showed that, compared to the social robot's existing set of 'universal' facial expressions, the culturally-sensitive facial expression models are recognized with generally higher accuracy and judged as more human-like by East Asian participants. We also detail the specific dynamic face movements (Action Units) that are associated with high recognition accuracy and judgments of human-likeness, including those that further boost performance. Our results therefore demonstrate the utility of using data-driven methods that employ human cultural perception to derive culturally-sensitive facial expressions that improve the social face signal generation capabilities of social robots. We anticipate that these methods will continue to inform the design of social robots and broaden their usability and global marketability.

    Social Robots in Hospitals: A Systematic Review

    Hospital environments are facing new challenges this century. One of the most important is the quality of services to patients. Social robots are gaining prominence due to the advantages they offer; in particular, several of their main uses have proven beneficial during the pandemic. This study aims to shed light on the current status of the design of social robots and their interaction with patients. To this end, a systematic review was conducted using WoS and MEDLINE, and the results were exhaustively analyzed. The authors found that most of the initiatives and projects serve the elderly and children, and specifically, that they helped these groups fight diseases such as dementia, autism spectrum disorder (ASD), cancer, and diabetes.

    A Mobile Robot Generating Video Summaries of Seniors' Indoor Activities

    We develop a system that generates summaries from seniors' indoor-activity videos captured by a social robot, to help remote family members know their seniors' daily activities at home. Unlike traditional video summarization datasets, indoor videos captured from a moving robot pose additional challenges, namely: (i) the video sequences are very long; (ii) a significant number of video frames contain no subject, or contain subjects at ill-posed locations and scales; and (iii) most of the well-posed frames contain highly redundant information. To address these problems, we propose to exploit pose estimation for detecting people in frames; this guides the robot to follow the user and capture effective videos. We use person identification to distinguish a target senior from other people. We also make use of action recognition to analyze seniors' major activities at different moments, and develop a video summarization method to select diverse and representative keyframes as summaries. Comment: accepted by MobileHCI'1
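The "diverse and representative keyframes" step could be sketched, under simple assumptions, as a greedy farthest-point search over per-frame feature vectors, restricted to frames where a person was detected. The feature array and detection mask below are placeholders standing in for the paper's pose-estimation and recognition outputs:

```python
import numpy as np

def select_keyframes(features, person_present, k=3):
    """Pick up to k diverse, representative keyframes.

    features:       (n_frames, d) array of per-frame feature vectors
                    (placeholder for real appearance/pose features).
    person_present: (n_frames,) boolean mask from a hypothetical
                    person detector; frames without a subject are skipped.
    """
    candidates = np.flatnonzero(person_present)
    if len(candidates) == 0:
        return []
    feats = features[candidates]
    # Seed with the most "representative" frame: closest to the mean.
    start = int(np.argmin(np.linalg.norm(feats - feats.mean(axis=0), axis=1)))
    chosen = [start]
    while len(chosen) < min(k, len(candidates)):
        # Distance from each candidate to its nearest already-chosen frame.
        d = np.min(
            np.linalg.norm(feats[:, None, :] - feats[chosen][None, :, :], axis=2),
            axis=1,
        )
        chosen.append(int(np.argmax(d)))  # add the farthest (most novel) frame
    return [int(candidates[i]) for i in chosen]
```

Greedy farthest-point selection directly targets the redundancy problem the abstract raises: each added keyframe is maximally dissimilar to the summary so far.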

    Emotion-Oriented Behavior Model Using Deep Learning

    Emotions, as a fundamental ingredient of any social interaction, lead to behaviors that represent the effectiveness of the interaction through facial expressions and gestures in humans. Hence, an agent must possess the social and cognitive abilities to understand human social parameters and behave accordingly. However, no such emotion-oriented behavior model has yet been presented in the existing research. Emotion prediction can generate appropriate agent behaviors for effective interaction through the conversation modality. Considering the importance of emotions and behaviors for an agent's social interaction, an emotion-based behavior model for socio-cognitive artificial agents is presented in this paper. The proposed model is implemented using tweet data trained on multiple models, namely Long Short-Term Memory (LSTM), Convolutional Neural Network (CNN), and Bidirectional Encoder Representations from Transformers (BERT), for emotion prediction, with average accuracies of 92% and 55%, respectively. Further, using emotion predictions from CNN-LSTM, the behavior module responds with facial expressions and gestures using Behavior Markup Language (BML). The accuracy of emotion-based behavior predictions is statistically validated using the 2-tailed Pearson correlation on data collected from human users through questionnaires. Analysis shows that all emotion-based behaviors accurately depict human-like gestures and facial expressions, based on significant correlations at the 0.01 and 0.05 levels. This study is a stepping stone toward multi-faceted artificial agent interaction based on emotion-oriented behaviors, given the significance of cognition for social interaction among humans.
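The final step of such a pipeline, turning a predicted emotion label into BML-driven behavior, might look roughly like the sketch below. The emotion-to-behavior table and the lexeme names are invented for illustration; real BML lexemes and element support depend on the behavior realizer used:

```python
# Hypothetical mapping from predicted emotion labels to face/gesture
# behaviors, rendered as a BML (Behavior Markup Language) snippet.
EMOTION_BEHAVIORS = {
    "joy":     {"face": "smile", "gesture": "open_arms"},
    "anger":   {"face": "frown", "gesture": "crossed_arms"},
    "sadness": {"face": "sad",   "gesture": "head_down"},
}

def emotion_to_bml(emotion: str) -> str:
    """Render a minimal BML block for a predicted emotion label."""
    spec = EMOTION_BEHAVIORS.get(emotion)
    if spec is None:
        raise ValueError(f"no behavior defined for emotion: {emotion!r}")
    return (
        f'<bml id="bml1">\n'
        f'  <face id="f1" lexeme="{spec["face"]}"/>\n'
        f'  <gesture id="g1" lexeme="{spec["gesture"]}"/>\n'
        f'</bml>'
    )
```

Keeping the classifier and the behavior table decoupled like this lets the same BML realizer serve whichever of the LSTM, CNN, or BERT predictors is active.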

    The influence of facial blushing and paling on emotion perception and memory

    Emotion expressions facilitate interpersonal communication by conveying information about a person’s affective state. The current work investigates how facial coloration (i.e., subtle changes in chromaticity from baseline facial color) impacts the perception of, and memory for, emotion expressions, and whether these depend on dynamic (vs. static) representations of emotional behavior. Emotion-expressive stimuli that either did or did not vary in facial coloration were shown to participants, who were asked to categorize and rate the stimuli’s intensity (Exps. 1 & 2), as well as recall their degree of facial coloration (Exps. 3 & 4). Results showed that changes in facial coloration facilitated emotion categorization accuracy in dynamic (Exp. 1) but not static expressions (Exp. 2). Facial coloration further increased perceived emotion intensity, with participants misremembering the coloration of both dynamic and static expressions differently depending on emotion category prototype (Exps. 3 & 4). Together, these findings indicate that facial coloration conveys affective information to observers and contributes to biases in how emotion expressions are perceived and remembered.


    ON THE INFLUENCE OF SOCIAL ROBOTS IN COGNITIVE MULTITASKING AND ITS APPLICATION

    [Objective] I clarify the impact of social robots on cognitive tasks, such as driving a car or flying an airplane, and show the possibility of industrial applications based on the principles of social robotics. [Approach] I adopted the MATB, a generalized version of automobile- and airplane-operation tasks, as the cognitive task battery, evaluating participants on widely applicable reaction-speed, tracking, and short-term memory tasks rather than on tasks specific to a particular situation. As the source of social stimuli, I used the iCub robot, which has been widely used in social communication research. Beyond task performance, I analyzed participants' mental workload using skin conductance, and their emotions on the arousal-valence dimensions using facial expression analysis. In the first experiment, I compared a social robot that uses social signals with a nonsocial robot that does not, and evaluated whether social robots affect cognitive task performance. In the second experiment, I focused on vitality forms and compared a calm social robot with an assertive one. For analysis, I adopted the Mann-Whitney U test for one-pair comparisons, and ART-ANOVA for analysis of variance in repeated task comparisons. [Main results] In cognitive tasks such as those of car drivers and airplane pilots, I clarified the effects of social robots' social behaviors on task performance, mental workload, and emotions, and showed that the presence of social robots can be effective in cognitive tasks. Focusing on vitality forms, one of the parameters of social behavior, I clarified the effects of different vitality forms of robot behavior on cognitive tasks, and found that social robots with calm behaviors positively affected participants' facial expressions and improved their performance in a short-term memory task. Based on these results, I adopted a robot-head configuration, eliminating the torso from the social humanoid robot iCub, considering placement in limited spaces such as car or airplane cockpits. In designing the robot head, I developed a novel soft-material eyebrow, mountable on the iCub robot head, that achieves the continuous position and velocity changes important for expressing vitality forms; I used a wire-driven technique, widely employed in surgical robots to control soft materials. The novel eyebrows can express different vitality forms by changing their shape and velocity, which was conventionally represented by the iCub's torso and arms. [Significance] The results of my research open up the possibility of applying social robots to non-robotic industries such as automotive and aircraft. In addition, the precise shape and velocity changes of the newly developed soft-material eyebrows open new research possibilities in social robotics and social communication research, enabling experiments with complex facial expressions that move beyond Ekman's simple facial expression categories, such as joy, anger, sadness, and pleasure. Thus, the results of this research are an important step in both scientific and industrial applications. [Key-words] social robot, cognitive task, vitality form, robot head, facial expression, eyebrow
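The one-pair comparisons mentioned in the approach can be run with a standard two-sided Mann-Whitney U test. A sketch using SciPy, with made-up short-term-memory scores for the calm and assertive robot conditions (illustrative numbers, not the thesis data):

```python
from scipy.stats import mannwhitneyu

# Illustrative short-term-memory scores per participant under a "calm"
# vs. an "assertive" robot condition (fabricated for the example).
calm      = [8, 9, 7, 9, 8, 10, 9, 8]
assertive = [6, 7, 6, 8, 5, 7, 6, 7]

# Two-sided test of whether the two conditions differ.
stat, p = mannwhitneyu(calm, assertive, alternative="two-sided")
print(f"U = {stat}, p = {p:.4f}")
```

For the repeated-measures comparisons, ART-ANOVA (aligned rank transform) has no direct SciPy equivalent and is typically run with the ARTool package in R.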