
    Interactive Virtual Training: Implementation for Early Career Teachers to Practice Classroom Behavior Management

    Teachers who are equipped with the skills to manage and prevent disruptive behaviors increase the potential for their students to succeed academically and socially. Student success increases when prevention strategies and effective classroom behavior management (CBM) are implemented in the classroom. However, early career teachers (ECTs), those with fewer than five years of experience, are often ill-equipped to handle disruptive students. ECTs describe disruptive behaviors as a major source of stress given their limited training in CBM; as a result, they report disruptive behaviors as one of the main reasons for leaving the field. Virtual training environments (VTEs), combined with advances in virtual social agents, can support CBM training. Although VTEs for teachers already exist, requirements to guide future research and development of similar training systems have not been defined. We propose a set of six requirements for VTEs for teachers, established from a survey of the literature and from the iterative lifecycle activities of building our own VTE for teachers. We present evaluations of our VTE using methodologies and metrics we developed to assess whether all requirements were met. Our VTE simulates interactions with animated virtual students based on real classroom situations to help ECTs practice their CBM. We enhanced our classroom simulator to further explore two aspects of the requirements: interaction devices and emotional virtual agents. Interaction devices were explored by comparing the effect of immersive technologies on user experience (UX) factors such as presence, co-presence, engagement, and believability. We adapted our VTE, originally built for desktop computers, to be compatible with two immersive VR platforms. Results show that our VTE generates high levels of UX across all VR platforms. Furthermore, we enhanced our virtual students to display emotions through facial expressions, as current studies do not address whether emotional virtual agents provide the same level of UX across different VR platforms. We assessed the effects of VR platform and emotion display on UX; our analysis shows that facial expressions have greater impact on a desktop computer. We propose future work on immersive VTEs using emotional virtual agents.
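    The platform-by-emotion comparison described above is essentially a two-factor aggregation of questionnaire scores. As a minimal sketch, assuming hypothetical platform names, trial data, and a 1-7 Likert UX scale (none of which the abstract specifies), the per-condition means could be computed like this:

```python
# Hypothetical sketch: mean UX score per (VR platform, emotion display)
# condition. Platform names, scores, and scale are illustrative only.
from itertools import product
from statistics import mean

# Each trial: (platform, emotions_displayed, UX score on a 1-7 Likert scale)
trials = [
    ("desktop", True, 5.9), ("desktop", True, 5.7),
    ("desktop", False, 4.8), ("desktop", False, 5.0),
    ("hmd", True, 5.6), ("hmd", True, 5.5),
    ("hmd", False, 5.5), ("hmd", False, 5.3),
    ("cave", True, 5.7), ("cave", False, 5.4),
]

for platform, emotions in product(("desktop", "hmd", "cave"), (True, False)):
    scores = [s for p, e, s in trials if p == platform and e == emotions]
    print(f"{platform:8s} emotions={emotions!s:5s} mean UX = {mean(scores):.2f}")
```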

    The perception of emotion in artificial agents

    Given recent technological developments in robotics, artificial intelligence and virtual reality, it is perhaps unsurprising that the arrival of emotionally expressive and reactive artificial agents is imminent. However, if such agents are to become integrated into our social milieu, it is imperative to establish an understanding of whether and how humans perceive emotion in artificial agents. In this review, we incorporate recent findings from social robotics, virtual reality, psychology, and neuroscience to examine how people recognize and respond to emotions displayed by artificial agents. First, we review how people perceive emotions expressed by an artificial agent, such as facial and bodily expressions and vocal tone. Second, we evaluate the similarities and differences in the consequences of perceived emotions in artificial compared to human agents. Besides accurately recognizing the emotional state of an artificial agent, it is critical to understand how humans respond to those emotions. Does interacting with an angry robot induce the same responses in people as interacting with an angry person? Similarly, does watching a robot rejoice when it wins a game elicit similar feelings of elation in the human observer? Here we provide an overview of the current state of emotion expression and perception in social robotics, as well as a clear articulation of the challenges and guiding principles to be addressed as we move ever closer to truly emotional artificial agents.

    Dynamic Facial Expression of Emotion Made Easy

    Facial emotion expression for virtual characters (VCs) is used in a wide variety of areas. Often, the primary reason to use emotion expression is not to study emotion expression generation per se, but to use emotion expression in an application or research project. What is then needed is an easy-to-use, flexible, and validated mechanism for doing so. In this report we present such a mechanism. It enables developers to build virtual characters with dynamic affective facial expressions. The mechanism is based on the Facial Action Coding System (FACS), is easy to implement, and its code is available for download. To show the validity of the expressions generated with the mechanism, we tested recognition accuracy for six basic emotions (joy, anger, sadness, surprise, disgust, fear) and four blend emotions (enthusiastic, furious, frustrated, and evil). Additionally, we investigated the effects of VC distance (z-coordinate), the VC's face morphology (male vs. female), lateral versus frontal presentation of the expression, and expression intensity. Participants (n=19, Western and Asian subjects) rated the intensity of each expression for each condition (within-subject setup) in a non-forced-choice manner. All of the basic emotions were uniquely perceived as such, and the blends and the confusion patterns among basic emotions are compatible with findings in psychology.
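    A FACS-based mechanism of this kind reduces to mapping each emotion label to a set of Action Unit (AU) activations and blending those sets for mixed emotions. The sketch below uses common textbook AU approximations (e.g. joy as AU6 + AU12) rather than the report's validated parameters, and the additive blending rule is an assumption for illustration:

```python
# Illustrative FACS-style emotion-to-Action-Unit mapping. The AU sets are
# common approximations from the FACS literature, not the report's values.
BASIC_EMOTIONS = {
    "joy":      {6: 1.0, 12: 1.0},                  # cheek raiser, lip corner puller
    "anger":    {4: 1.0, 5: 0.8, 7: 0.8, 23: 1.0},  # brow lowerer, lid/lip tighteners
    "sadness":  {1: 1.0, 4: 0.6, 15: 1.0},          # inner brow raiser, corner depressor
    "surprise": {1: 1.0, 2: 1.0, 5: 0.8, 26: 1.0},  # brow raisers, jaw drop
    "disgust":  {9: 1.0, 15: 0.6, 16: 0.5},         # nose wrinkler
    "fear":     {1: 1.0, 2: 0.8, 4: 0.6, 5: 1.0, 20: 0.8},  # plus lip stretcher
}

def blend(weights):
    """Combine weighted basic emotions into one AU activation map,
    e.g. a 'frustrated' blend from anger and sadness."""
    out = {}
    for emotion, w in weights.items():
        for au, activation in BASIC_EMOTIONS[emotion].items():
            out[au] = min(1.0, out.get(au, 0.0) + w * activation)
    return out

frustrated = blend({"anger": 0.6, "sadness": 0.4})
print(frustrated)  # AU activations to feed the character's animation layer
```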

    Cultural dialects of real and synthetic emotional facial expressions

    In this article we discuss aspects of designing facial expressions for virtual humans (VHs) of a specific culture. First, we explore the notion of culture and its relevance for applications involving a VH. Then we give a general scheme for designing emotional facial expressions and identify the stages where a human is involved, either as a real person with a specific role or as a VH displaying facial expressions. We discuss how the display and the emotional meaning of facial expressions may be measured in objective ways, and how the culture of the displayers and of the judges may influence the process of analyzing human facial expressions and evaluating synthesized ones. We review psychological experiments on cross-cultural perception of emotional facial expressions. By identifying the culturally critical issues of data collection and interpretation with both real humans and VHs, we aim to provide a methodological reference and inspiration for further research.
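    One objective measure of the kind discussed here is per-culture recognition accuracy over judges' emotion labels, with a confusion breakdown. A minimal sketch, assuming a hypothetical record format of (judge culture, intended emotion, judged emotion):

```python
# Hypothetical sketch: quantifying cross-cultural differences in emotion
# judgments as per-culture recognition accuracy plus confusions.
from collections import Counter, defaultdict

# (judge_culture, intended_emotion, judged_emotion) -- illustrative data
judgments = [
    ("Western", "anger", "anger"), ("Western", "anger", "disgust"),
    ("Asian",   "anger", "anger"), ("Asian",   "anger", "anger"),
    ("Western", "joy",   "joy"),   ("Asian",   "joy",   "joy"),
]

confusion = defaultdict(Counter)
for culture, intended, judged in judgments:
    confusion[(culture, intended)][judged] += 1

for (culture, intended), counts in sorted(confusion.items()):
    accuracy = counts[intended] / sum(counts.values())
    print(f"{culture}/{intended}: accuracy={accuracy:.2f}, judged={dict(counts)}")
```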

    Towards the improvement of self-service systems via emotional virtual agents

    Affective computing and emotional agents have been found to have a positive effect on human-computer interactions. In order to develop an acceptable emotional agent for use in a self-service interaction, two stages of research were identified and carried out: the first to determine which facial expressions are present in such an interaction, and the second to determine which emotional agent behaviours are perceived as appropriate during a problematic self-service shopping task. In the first stage, facial expressions associated with negative affect were found to occur during self-service shopping interactions, indicating that facial expression detection is suitable for detecting negative affective states during self-service interactions. In the second stage, user perceptions of the emotional facial expressions displayed by an emotional agent during a problematic self-service interaction were gathered. Overall, the expression of disgust was perceived as inappropriate while emotionally neutral behaviour was perceived as appropriate; however, gender differences suggested that female users also perceived surprise as inappropriate. Results suggest that agents should adapt their behaviour and appearance to user characteristics such as gender.
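    The concluding suggestion, that an agent adapt its behaviour to user characteristics, can be read as a simple policy over its expressive repertoire. A hypothetical sketch loosely following the reported findings (disgust inappropriate for everyone; surprise additionally perceived as inappropriate by female users):

```python
# Hypothetical policy sketch: restrict the agent's expressive repertoire
# by user characteristics, loosely following the reported findings.
def allowed_expressions(user_gender: str) -> list[str]:
    repertoire = {"neutral", "sadness", "surprise"}  # candidate behaviours
    blocked = {"disgust"}                            # inappropriate for all users
    if user_gender == "female":
        blocked.add("surprise")                      # also inappropriate here
    return sorted(repertoire - blocked)

print(allowed_expressions("female"))  # ['neutral', 'sadness']
print(allowed_expressions("male"))    # ['neutral', 'sadness', 'surprise']
```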

    On combining the facial movements of a talking head

    We present work on Obie, an embodied conversational agent framework. An embodied conversational agent, or talking head, consists of three main components. The graphical component consists of a face model and a facial muscle model. Alongside the graphical component, we have implemented an emotion model and a mapping from emotions to facial expressions. The animation component of the framework focuses on combining different facial movements over time. In this paper we propose a scheme for temporally combining facial movements on a 3D talking head.
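    The core problem such a scheme addresses is resolving concurrent movement channels (speech visemes, emotional expressions, blinks) that drive overlapping muscles. A minimal sketch, assuming a hypothetical channel representation and weighted-average resolution, which is not necessarily Obie's actual combination scheme:

```python
# Hypothetical sketch: combine concurrent facial movement channels into one
# set of muscle contractions per frame via a per-muscle weighted average.
def combine(channels, t):
    """channels: list of (weight, fn), where fn(t) -> {muscle: contraction}."""
    total, weight_sum = {}, {}
    for weight, fn in channels:
        for muscle, value in fn(t).items():
            total[muscle] = total.get(muscle, 0.0) + weight * value
            weight_sum[muscle] = weight_sum.get(muscle, 0.0) + weight
    return {m: total[m] / weight_sum[m] for m in total}

def smile(t):
    return {"zygomatic_major": min(1.0, t / 0.3)}  # ease in over 300 ms

def speech(t):
    return {"orbicularis_oris": 0.6, "zygomatic_major": 0.2}

# At t = 0.15 s the smile and the speech viseme both claim zygomatic_major;
# the weighted average resolves the conflict.
print(combine([(1.0, smile), (0.8, speech)], t=0.15))
```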

    Designing and Implementing Embodied Agents: Learning from Experience

    In this paper, we provide an overview of part of our experience in designing and implementing some of the embodied agents and talking faces that we have used for our research into human-computer interaction. We focus on the techniques that were used and evaluate them with respect to the purpose the agents and faces were to serve and the costs involved in producing and maintaining the software. We also discuss the role of this research and development in the educational programme of our graduate students.