
    The Intensity of Perceived Emotions in 3D Virtual Humans

    Synthetically generated 3D humans often fail to express a full range of emotions or to present different levels of the same emotion. Beyond the facial expression, what should a happy synthetically generated human look like? What about a slightly happy or an ecstatically happy one? This paper reports a study aimed at identifying the appropriate bodily expressions for various emotions in 3D human-like figures at varying emotional strength. Thirty-six volunteers were asked to discriminate and categorize thirty cards with static poses of 3D human-like characters into Ekman’s six basic categories of emotions, judging the compatibility of each posture with each category and rating its level of emotion within that group.

    Dynamic Facial Expression of Emotion Made Easy

    Facial emotion expression for virtual characters is used in a wide variety of areas. Often, the primary reason to use emotion expression is not to study emotion expression generation per se, but to use emotion expression in an application or research project. What is then needed is an easy-to-use, flexible, and validated mechanism to do so. In this report we present such a mechanism. It enables developers to build virtual characters with dynamic affective facial expressions. The mechanism is based on the Facial Action Coding System (FACS). It is easy to implement, and code is available for download. To show the validity of the expressions generated with the mechanism, we tested the recognition accuracy for six basic emotions (joy, anger, sadness, surprise, disgust, fear) and four blend emotions (enthusiastic, furious, frustrated, and evil). Additionally, we investigated the effect of virtual character (VC) distance (z-coordinate), the VC's face morphology (male vs. female), a lateral versus a frontal presentation of the expression, and the intensity of the expression. Participants (n=19, Western and Asian subjects) rated the intensity of each expression for each condition (within-subject setup) in a non-forced-choice manner. All of the basic emotions were uniquely perceived as such. Further, the blends and confusion details of basic emotions are compatible with findings in psychology.
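    A FACS-based mechanism like the one described maps emotions to combinations of Action Units (AUs). As an illustrative sketch only (the AU weights, function names, and the linear blending scheme here are assumptions for demonstration, not the report's actual code), basic emotions can be represented as weighted AU activations that are scaled for intensity and blended to approximate blend emotions:

    ```python
    # Illustrative sketch: AU weights and the blend scheme are assumptions,
    # not the published mechanism's code.
    JOY = {6: 1.0, 12: 1.0}            # AU6 cheek raiser, AU12 lip corner puller
    ANGER = {4: 1.0, 5: 0.7, 23: 0.8}  # AU4 brow lowerer, AU5 upper lid raiser, AU23 lip tightener

    def scale(expression, intensity):
        """Scale every AU activation by a global intensity in [0, 1]."""
        return {au: w * intensity for au, w in expression.items()}

    def blend(a, b, alpha=0.5):
        """Linearly blend two expressions over the union of their AUs."""
        aus = set(a) | set(b)
        return {au: alpha * a.get(au, 0.0) + (1 - alpha) * b.get(au, 0.0)
                for au in aus}

    half_joy = scale(JOY, 0.5)   # {6: 0.5, 12: 0.5}
    blended = blend(JOY, ANGER)  # a joy-anger mixture, for illustration
    ```

    Expression intensity then becomes a single continuous parameter, which matches the intensity manipulation the study evaluates.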

    The perception of emotion in artificial agents

    Given recent technological developments in robotics, artificial intelligence and virtual reality, it is perhaps unsurprising that the arrival of emotionally expressive and reactive artificial agents is imminent. However, if such agents are to become integrated into our social milieu, it is imperative to establish an understanding of whether and how humans perceive emotion in artificial agents. In this review, we incorporate recent findings from social robotics, virtual reality, psychology, and neuroscience to examine how people recognize and respond to emotions displayed by artificial agents. First, we review how people perceive emotions expressed by an artificial agent through channels such as facial and bodily expressions and vocal tone. Second, we evaluate the similarities and differences in the consequences of perceived emotions in artificial compared to human agents. Besides accurately recognizing the emotional state of an artificial agent, it is critical to understand how humans respond to those emotions. Does interacting with an angry robot induce the same responses in people as interacting with an angry person? Similarly, does watching a robot rejoice when it wins a game elicit similar feelings of elation in the human observer? Here we provide an overview of the current state of emotion expression and perception in social robotics, as well as a clear articulation of the challenges and guiding principles to be addressed as we move ever closer to truly emotional artificial agents.

    Drivers’ behaviour modelling for virtual worlds

    In this paper we present a study that looks at modelling drivers’ behaviour with a view to contributing to the problem of road rage. The approach we adopt is based on agent technology, particularly multi-agent systems. Each driver is represented by a software agent. A virtual environment is used to simulate drivers’ behaviour, enabling us to observe the conditions leading to road rage. The simulated model is then used to suggest possible ways of alleviating this societal problem. Our agents are equipped with an emotional module that makes their behaviours more human-like. For this, we propose a computational emotion model based on the OCC model and probabilistic cognitive maps. The key influencing factors included in the model are personality, emotions, and some social/personal attributes.
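    As a hedged illustration of how an OCC-style appraisal step in such an emotion module might look (the `appraise` function, the desirability scale, and the use of a single neuroticism trait are assumptions here; the paper's probabilistic cognitive maps are omitted), an event's desirability can be mapped to an emotion and intensity, modulated by personality:

    ```python
    def appraise(desirability, neuroticism=0.5):
        """Map an event's desirability in [-1, 1] to an (emotion, intensity)
        pair, OCC-style: desirable events elicit joy, undesirable ones
        distress, with negative events amplified for more neurotic drivers."""
        if desirability >= 0:
            return ("joy", desirability)
        return ("distress", min(1.0, -desirability * (1.0 + neuroticism)))

    # A driver being cut off (a very undesirable event), high neuroticism:
    emotion, intensity = appraise(-0.8, neuroticism=0.6)  # ("distress", 1.0)
    ```

    The returned intensity could then drive behaviour selection, so that the same event produces different reactions across personality profiles.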

    Towards the improvement of self-service systems via emotional virtual agents

    Affective computing and emotional agents have been found to have a positive effect on human-computer interactions. To develop an acceptable emotional agent for use in a self-service interaction, two stages of research were identified and carried out: the first to determine which facial expressions are present in such an interaction, and the second to determine which emotional agent behaviours are perceived as appropriate during a problematic self-service shopping task. In the first stage, facial expressions associated with negative affect were found to occur during self-service shopping interactions, indicating that facial expression detection is suitable for detecting negative affective states in such interactions. In the second stage, user perceptions of the emotional facial expressions displayed by an emotional agent during a problematic self-service interaction were gathered. Overall, the expression of disgust was perceived as inappropriate while emotionally neutral behaviour was perceived as appropriate; however, gender differences suggested that females perceived surprise as inappropriate. Results suggest that agents should change their behaviour and appearance based on user characteristics such as gender.

    Toward a social psychophysics of face communication

    As a highly social species, humans are equipped with a powerful tool for social communication—the face, which can elicit multiple social perceptions in others due to the rich and complex variations of its movements, morphology, and complexion. Consequently, identifying precisely what face information elicits different social perceptions is a complex empirical challenge that has largely remained beyond the reach of traditional research methods. More recently, the emerging field of social psychophysics has developed new methods designed to address this challenge. Here, we introduce and review the foundational methodological developments of social psychophysics, present recent work that has advanced our understanding of the face as a tool for social communication, and discuss the main challenges that lie ahead.

    Toward dynamic pain expressions in avatars: perceived realism and pain level of different action unit orders

    The facial expression of pain can be decomposed into three sets of Action Units (AUs), the smallest discriminating facial movements: Brow lowering (B), Nose wrinkling + Upper lip raising (N), and Orbit tightening + Eyelid closure (O). This study compared the perceived realism and pain level of different onset orders of AUs in avatars. Seven videos of facial expressions of pain were created with four different avatars (2 women): six sequential onsets combining the three sets of AUs and one synchronized onset. Forty-five healthy adults (22 women; aged 23.6 ± 5.2 years) rated the realism of the facial movements and the intensity and unpleasantness of the perceived pain. A more realistic expression was associated with the onset of O before or at the same time as N, a more intense expression with B occurring last, and a higher level of unpleasantness with the onset of N before B. Therefore, the sequence ONB yielded the highest ratings on both measures of realism and pain level. These findings describe the perceived content of different orders of facial movements and could contribute to the creation of realistic pain-expressing virtual agents designed to study human-computer interactions.
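    The stimulus design above — six sequential onset orders of the three AU sets plus one synchronized onset — can be enumerated directly. This is a sketch of that combinatorics (the names `AU_SETS` and `onset_conditions` are illustrative, not the study's stimulus-generation code):

    ```python
    from itertools import permutations

    # The three AU sets from the study: Brow lowering (B), Nose wrinkling +
    # Upper lip raising (N), Orbit tightening + Eyelid closure (O).
    AU_SETS = ["B", "N", "O"]

    def onset_conditions():
        """Return the seven onset conditions: all six sequential orders of
        the three AU sets, plus one fully synchronized onset."""
        sequential = ["".join(p) for p in permutations(AU_SETS)]
        return sequential + ["sync"]

    conditions = onset_conditions()  # includes "ONB", the order rated highest
    ```

    Enumerating all 3! orders plus the synchronized condition yields exactly the seven videos per avatar described in the abstract.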