
    On combining the facial movements of a talking head

    We present work on Obie, an embodied conversational agent framework. An embodied conversational agent, or talking head, consists of three main components. The graphical part consists of a face model and a facial muscle model. Besides the graphical part, we have implemented an emotion model and a mapping from emotions to facial expressions. The animation part of the framework focuses on the temporal combination of different facial movements. In this paper we propose a scheme for combining facial movements on a 3D talking head.
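    The kind of temporal combination described above can be sketched in a few lines. This is an illustrative interpretation, not the paper's actual scheme: each facial movement is modeled as a trapezoidal intensity envelope over time, and overlapping movements targeting the same muscle are combined by a clipped sum. All names (`envelope`, `combine`, the muscle label) are hypothetical.

    ```python
    # Illustrative sketch: combining overlapping facial movements per muscle
    # by summing their intensity envelopes and clipping to [0, 1].

    def envelope(t, start, attack_end, sustain_end, end):
        """Trapezoidal intensity envelope: ramp up, hold, ramp down."""
        if t < start or t > end:
            return 0.0
        if t < attack_end:
            return (t - start) / (attack_end - start)
        if t <= sustain_end:
            return 1.0
        return (end - t) / (end - sustain_end)

    def combine(movements, muscle, t):
        """Clipped sum of all movement contributions to one muscle at time t."""
        total = sum(weight * envelope(t, *timing)
                    for target, weight, timing in movements if target == muscle)
        return min(total, 1.0)

    # A smile and a co-occurring speech movement drive the same muscle.
    smile = ("zygomatic_major", 0.8, (0.0, 0.2, 1.0, 1.5))
    speech = ("zygomatic_major", 0.4, (0.5, 0.6, 0.9, 1.2))

    print(combine([smile, speech], "zygomatic_major", 0.7))  # overlap region
    ```

    Clipping is only one possible combination operator; a real system might instead take the maximum per muscle or blend with priorities.
    
    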

    Cultural dialects of real and synthetic emotional facial expressions

    In this article we discuss aspects of designing facial expressions for virtual humans (VHs) with a specific culture. First we explore the notion of culture and its relevance for applications with a VH. Then we give a general scheme for designing emotional facial expressions, and identify the stages where a human is involved, either as a real person with some specific role or as a VH displaying facial expressions. We discuss how the display and the emotional meaning of facial expressions may be measured in objective ways, and how the culture of the displayers and of the judges may influence the process of analyzing human facial expressions and evaluating synthesized ones. We review psychological experiments on the cross-cultural perception of emotional facial expressions. By identifying the culturally critical issues of data collection and interpretation with both real humans and VHs, we aim to provide a methodological reference and inspiration for further research.

    EMPATH: A Neural Network that Categorizes Facial Expressions

    There are two competing theories of facial expression recognition. Some researchers have suggested that it is an example of "categorical perception." In this view, expression categories are considered to be discrete entities with sharp boundaries, and discrimination of nearby pairs of expressive faces is enhanced near those boundaries. Other researchers, however, suggest that facial expression perception is more graded and that facial expressions are best thought of as points in a continuous, low-dimensional space, where, for instance, "surprise" expressions lie between "happiness" and "fear" expressions due to their perceptual similarity. In this article, we show that a simple yet biologically plausible neural network model, trained to classify facial expressions into six basic emotions, predicts data used to support both of these theories. Without any parameter tuning, the model matches a variety of psychological data on categorization, similarity, reaction times, discrimination, and recognition difficulty, both qualitatively and quantitatively. We thus explain many of the seemingly complex psychological phenomena related to facial expression perception as natural consequences of the tasks' implementations in the brain.
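    The dual reading of such a classifier can be illustrated with a toy sketch, not the EMPATH network itself: a softmax read-out over six emotion units produces graded probabilities (the continuous-space view), while taking the argmax yields a single discrete category (the categorical view). The unit activations below are hypothetical numbers for a face morphed between happiness and surprise.

    ```python
    import math

    # Toy sketch: one set of unit activations supports both a graded and a
    # categorical reading of facial expression perception.

    EMOTIONS = ["happiness", "sadness", "fear", "anger", "surprise", "disgust"]

    def softmax(scores):
        """Numerically stable softmax over a list of activations."""
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        return [e / z for e in exps]

    # Hypothetical activations for a happiness/surprise morph.
    scores = [2.0, 0.1, 1.2, 0.2, 1.8, 0.1]
    probs = softmax(scores)

    label = EMOTIONS[probs.index(max(probs))]  # sharp categorical decision
    print(label)
    print(probs)  # graded support: "surprise" also receives substantial mass
    ```
    
    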

    Multimodal Intelligent Tutoring Systems


    Facial Emotional Expressions Of Life-Like Character Based On Text Classifier And Fuzzy Logic

    We propose a system consisting of a text classifier and a Fuzzy Inference System (FIS) to build a life-like virtual character capable of expressing emotion from text input. The system classifies the emotional content of sentences from the text input and expresses the corresponding emotion through a facial expression. The text input is classified by the text classifier, while the facial expressions of the life-like character are controlled by the FIS, utilizing results from the text classifier. A number of text-classification methods are employed and their performances evaluated using Leave-One-Out Cross-Validation. In real-world applications such as animated films, the life-like virtual character of the proposed system needs to be animated. As a demonstration, we show examples of facial expressions with their corresponding text input, produced by an implementation of our system. The system is able to show facial expressions that blend admixtures of emotions. This paper also describes the animation characteristics of the system, which uses the neutral expression as the center of the transition from one facial expression to another. An emotion transition can be viewed as a gradual decrease of one emotion's intensity followed by a gradual increase of the other's. Experimental results show that animations of a life-like character expressing emotion transitions can be generated automatically using the proposed system.
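    The transition behavior described above can be sketched minimally, assuming (as the abstract states) that a transition between two emotions passes through the neutral expression: the source intensity ramps down to neutral, then the target intensity ramps up. The linear ramps and the function name are hypothetical choices for illustration, not the paper's fuzzy rules.

    ```python
    # Minimal sketch of an emotion transition through neutral.

    def transition(t):
        """Return (source_intensity, target_intensity) for t in [0, 1].

        First half: the source emotion decays to the neutral expression.
        Second half: the target emotion grows from neutral to full intensity.
        """
        if t < 0.5:
            return (1.0 - 2.0 * t, 0.0)
        return (0.0, 2.0 * (t - 0.5))

    for t in (0.0, 0.25, 0.5, 0.75, 1.0):
        print(t, transition(t))
    ```

    A fuzzy inference system would replace the linear ramps with membership functions and rules, but the overall shape of the trajectory is the same.
    
    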

    Affect and believability in game characters: a review of the use of affective computing in games

    Virtual agents are important in many digital environments. Designing a character that highly engages users in terms of interaction is an intricate task constrained by many requirements. One aspect that has gained more attention recently is the affective dimension of the agent. Several studies have addressed the possibility of developing an affect-aware system for a better user experience. Particularly in games, including emotional and social features in NPCs adds depth to the characters, enriches interaction possibilities, and, combined with a basic level of competence, creates a more appealing game. Design requirements for emotionally intelligent NPCs differ from those for general autonomous agents, with the main goal being a stronger player-agent relationship rather than problem solving and goal assessment. Nevertheless, deploying an affective module in NPCs adds to the complexity of the architecture and its constraints. In addition, using such composite NPCs in games seems beyond current technology, despite some brave attempts. However, a MARPO-type modular architecture would seem a useful starting point for adding emotions.

    Building Embodied Agents That Experience and Express Emotions: A Football Supporter as an Example

    We present Obie, an embodied agent that experiences and expresses emotions. Obie has an adaptive, quantitative, and domain-independent emotion component which appraises events to trigger emotions. Obie's emotions are expressed through his utterances or his facial expressions. Expression through utterances is done by a simple mapping from emotions to text fragments; the mapping from emotions to facial expressions is done by a fuzzy rule-based system. Obie's utterances and facial expressions are presented on his 3D talking head. In the research described in this paper, Obie was implemented as a football-supporter agent. We show how Obie experiences different emotions during a football match, and we indicate how Obie experiences emotions differently when given different personalities.
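    One way personality could modulate appraisal, as the abstract suggests, is by scaling the emotion intensities an event triggers. The sketch below is an illustrative interpretation, not Obie's actual appraisal model; all names, gains, and event values are hypothetical.

    ```python
    # Illustrative sketch: the same match event yields different emotion
    # intensities under different per-emotion personality gains.

    def appraise(event_valence, personality):
        """Map an event's valence (-1..1) to joy/distress intensities in [0, 1]."""
        joy = max(0.0, event_valence) * personality["joy_gain"]
        distress = max(0.0, -event_valence) * personality["distress_gain"]
        return {"joy": min(joy, 1.0), "distress": min(distress, 1.0)}

    optimist = {"joy_gain": 1.0, "distress_gain": 0.4}
    pessimist = {"joy_gain": 0.5, "distress_gain": 1.0}

    goal_for = 0.9       # the supported team scores
    goal_against = -0.5  # the opposing team scores

    print(appraise(goal_for, optimist))       # strong joy
    print(appraise(goal_for, pessimist))      # weaker joy, same event
    print(appraise(goal_against, pessimist))  # strong distress
    ```
    
    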