2,490 research outputs found

    Artificial Emotion Generation Based on Personality, Mood, and Emotion for Life-Like Facial Expressions of Robots

    We cannot overemphasize the importance of robots' emotional expressions as robots step into humans' daily lives, so believable and socially acceptable emotional expressions of robots are essential. For such human-like emotional expression, we have proposed an emotion generation model that considers personality, mood, and the history of the robot's emotions. The personality module is based on the Big Five Model (also known as the OCEAN or Five Factor Model); the mood module has a single dimension (good or bad); and the emotion module uses the six basic emotions defined by Ekman. Unlike most previous studies, the proposed emotion generation model was integrated with the Linear Dynamic Affect Expression Model (LDAEM), an emotional expression model that can produce facial expressions similar to those of humans. As a result, both the emotional state and the expression of the robot can change dynamically.
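The three modules this abstract describes (Big Five personality, one-dimensional mood, six Ekman emotions with history) can be sketched as a single update step. This is an illustrative assumption of how such modules might interact, not the paper's actual model; the trait weights and mappings are invented for demonstration.

```python
# Hypothetical sketch: combine OCEAN personality traits, a 1-D mood
# (-1 bad .. +1 good), and emotion history into intensities for
# Ekman's six basic emotions. All weights are assumed, not from the paper.

OCEAN = {"openness": 0.6, "conscientiousness": 0.5, "extraversion": 0.7,
         "agreeableness": 0.8, "neuroticism": 0.3}
EMOTIONS = ["joy", "sadness", "anger", "fear", "disgust", "surprise"]

def generate_emotion(stimulus: dict, mood: float, history: list) -> dict:
    """Return an intensity in [0, 1] for each basic emotion."""
    intensities = {}
    for e in EMOTIONS:
        base = stimulus.get(e, 0.0)
        # Personality bias (assumed mapping): extraversion amplifies joy,
        # neuroticism amplifies fear and sadness.
        if e == "joy":
            base *= 0.5 + OCEAN["extraversion"]
        elif e in ("fear", "sadness"):
            base *= 0.5 + OCEAN["neuroticism"]
        # Good mood dampens negative emotions and boosts joy.
        if e == "joy":
            base *= 1.0 + 0.3 * mood
        else:
            base *= 1.0 - 0.3 * mood
        # Emotion history smooths sudden jumps (simple weighted average).
        prev = history[-1].get(e, 0.0) if history else 0.0
        intensities[e] = max(0.0, min(1.0, 0.7 * base + 0.3 * prev))
    return intensities
```

The history term gives the dynamics a memory, which is one way the "history of the robot's emotions" mentioned in the abstract could enter the computation.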

    Affect and believability in game characters:a review of the use of affective computing in games

    Virtual agents are important in many digital environments. Designing a character that highly engages users in terms of interaction is an intricate task constrained by many requirements. One aspect that has gained more attention recently is the affective dimension of the agent. Several studies have addressed the possibility of developing an affect-aware system for a better user experience. Particularly in games, including emotional and social features in NPCs adds depth to the characters, enriches interaction possibilities, and, combined with a basic level of competence, creates a more appealing game. Design requirements for emotionally intelligent NPCs differ from those for general autonomous agents, the main goal being a stronger player-agent relationship rather than problem solving and goal assessment. Nevertheless, deploying an affective module in NPCs adds to the complexity of the architecture and its constraints. In addition, using such composite NPCs in games seems beyond current technology, despite some brave attempts. However, a MARPO-type modular architecture would seem a useful starting point for adding emotions.
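The modular design this review points to, where an affective component is layered on top of a basic competence module, can be sketched as follows. The class names, events, and dialogue lines are hypothetical illustrations, not MARPO's actual interface.

```python
# Hypothetical sketch of a modular NPC: the affect module is a pluggable
# component on top of the base behavior, in the spirit of the
# "MARPO-type modular architecture" mentioned above.

from dataclasses import dataclass, field

@dataclass
class AffectModule:
    """Tracks a simple valence toward the player, updated by game events."""
    valence: float = 0.0  # -1 (hostile) .. +1 (friendly)

    def on_event(self, event: str) -> None:
        deltas = {"player_helped": 0.2, "player_attacked": -0.4}
        self.valence = max(-1.0, min(1.0, self.valence + deltas.get(event, 0.0)))

@dataclass
class NPC:
    name: str
    affect: AffectModule = field(default_factory=AffectModule)

    def choose_line(self) -> str:
        # Dialogue selection modulated by the affect module; the base
        # behavior (a neutral greeting) still works without emotional state.
        if self.affect.valence > 0.1:
            return "Good to see you again, friend!"
        if self.affect.valence < -0.1:
            return "You again. Keep your distance."
        return "Hello, traveler."
```

Because the affect module is optional state rather than part of the core behavior loop, the character degrades gracefully to a competent but emotionless agent, which matches the review's point that affect adds depth on top of basic competence.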

    A systematic comparison of affective robot expression modalities


    Emotion Transfer from Frontline Social Robots to Human Customers During Service Encounters: Testing an Artificial Emotional Contagion Model

    This research examines mood transitions during human-robot interactions (HRI) compared with human-human interactions (HHI) during service encounters. Based on emotional contagion and social identity theory, we argue that emotion transmission within HRI (e.g., between a frontline service robot and a human customer) may occur through the customer's imitation of the robot's verbal and bodily expressions, and may be stronger for negative than for positive emotions. The customer's positive attitude toward robots and anxiety about them are further examined as contingencies that strengthen or weaken emotion transmission during HRI. We have already identified the five most important emotions during service encounters (a critical-incident study with 131 frontline employees). The corresponding output behavior was programmed into a Nao robot and validated (ratings from 234 students). In the next step, we manipulate the emotional expressions of a frontline social robot and a customer within an experimental study.

    Emotion and mood blending in embodied artificial agents: expressing affective states in the mini social robot

    Robots that are devised for assisting and interacting with humans are becoming fundamental in many applications, including healthcare, education, and entertainment. For these robots, the capacity to exhibit affective states plays a crucial role in creating emotional bonding with the user. In this work, we present an affective architecture grounded in biological foundations that shapes the affective state of the Mini social robot in terms of mood and emotion blending. The affective state depends upon the perception of stimuli in the environment, which influences how the robot behaves and affectively communicates with other peers. According to research in neuroscience, mood typically rules our affective state in the long run, while emotions do so in the short term, although both processes can overlap. Consequently, the model presented in this manuscript deals with emotion and mood blending toward expressing the robot's internal state to users. Thus, the primary novelty of our affective model is the expression of: (i) mood, (ii) punctual emotional reactions to stimuli, and (iii) the decay that mood and emotion undergo with time. The system evaluation explored whether users can correctly perceive the mood and emotions that the robot is expressing. In an online survey, users evaluated the robot's expressions showing different moods and emotions. The results reveal that users could correctly perceive the robot's mood and emotion. However, emotions were more easily recognized, probably because they are more intense affective states and mainly arise as reactions to stimuli. To conclude the manuscript, a case study shows how our model modulates Mini's expressiveness depending on its affective state during a human-robot interaction scenario.
    The research leading to these results has received funding from the projects Robots sociales para estimulación física, cognitiva y afectiva de mayores (ROSES) RTI2018-096338-B-I00, funded by Agencia Estatal de Investigación (AEI), Ministerio de Ciencia, Innovación y Universidades, and RoboCity2030-DIH-CM, Madrid Robotics Digital Innovation Hub, S2018/NMT-4331, funded by "Programas de Actividades I+D en la Comunidad de Madrid" and cofunded by Structural Funds of the EU. Open Access funding provided thanks to the CRUE-CSIC agreement with Springer Nature.
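The dynamics described in the abstract above, short-lived intense emotions layered over a slowly decaying mood, can be sketched with two exponential decays. The half-lives and blending rule are assumed values for illustration only, not Mini's actual parameters.

```python
import math

# Illustrative sketch: emotions decay quickly, mood decays slowly, and
# the expressed affective state blends the two. All constants are assumptions.

EMOTION_HALF_LIFE = 5.0    # seconds (assumed: emotions fade fast)
MOOD_HALF_LIFE = 300.0     # seconds (assumed: mood fades slowly)

def decay(value: float, half_life: float, dt: float) -> float:
    """Exponential decay of an affective intensity toward neutral (0)."""
    return value * math.exp(-math.log(2) * dt / half_life)

def blended_state(emotion: float, mood: float, dt: float) -> float:
    """A fresh emotion dominates expression; as it decays, the
    underlying mood shows through (blending weight is an assumption)."""
    e = decay(emotion, EMOTION_HALF_LIFE, dt)
    m = decay(mood, MOOD_HALF_LIFE, dt)
    w = e  # the emotion's own intensity acts as the blending weight
    return w * e + (1.0 - w) * m
```

With these assumed half-lives, an emotion at full intensity halves after five seconds while the mood is nearly unchanged, reproducing the short-term versus long-term behavior the abstract attributes to emotions and mood.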

    Affective Computing

    This book provides an overview of state-of-the-art research in Affective Computing. It presents new ideas, original results, and practical experiences in this increasingly important research field. The book consists of 23 chapters categorized into four sections. Since one of the most important means of human communication is facial expression, the first section of this book (Chapters 1 to 7) presents research on the synthesis and recognition of facial expressions. Given that we use not only the face but also body movements to express ourselves, the second section (Chapters 8 to 11) presents research on the perception and generation of emotional expressions using full-body motions. The third section of the book (Chapters 12 to 16) presents computational models of emotion, as well as findings from neuroscience research. In the last section of the book (Chapters 17 to 22) we present applications related to affective computing.

    Factors of Emotion and Affect in Designing Interactive Virtual Characters

    The Arts: 1st Place (The Ohio State University Edward F. Hayes Graduate Research Forum). This paper presents a review of literature concerning factors of affective interactive virtual character design. Affect and its related concepts are defined, followed by a detailed survey of work being conducted in relevant areas such as design, animation, and robotics. The intent of this review was to inform the author on overlapping concepts in related fields of affective design in order to apply these concepts to interactive character development. A three-year embargo was granted for this item.

    Affective Communication for Socially Assistive Robots (SARs) for Children with Autism Spectrum Disorder: A Systematic Review

    Research on affective communication for socially assistive robots has been conducted to enable physical robots to perceive, express, and respond emotionally. However, the use of affective computing in social robots has been limited, especially when social robots are designed for children, and particularly for those with autism spectrum disorder (ASD). Social robots are based on cognitive-affective models, which allow them to communicate with people following social behaviors and rules. However, interactions between a child and a robot may change or differ compared to those with an adult, or when the child has an emotional deficit. In this study, we systematically reviewed studies related to computational models of emotions for children with ASD. We used the Scopus, WoS, Springer, and IEEE-Xplore databases to answer different research questions related to the definition, interaction, and design of computational models supported by theoretical psychology approaches from 1997 to 2021. Our review found 46 articles; not all the studies considered children or those with ASD. This research was funded by VRIEA-PUCV, grant number 039.358/202