An expressive ECA showing complex emotions

Abstract

Embodied Conversational Agents (ECAs) are a new paradigm of computer interface with a human-like appearance that allows users to interact with the machine through natural speech, gestures, facial expressions, and gaze. In this paper we present a head animation system for our ECA Greta and focus on two of its aspects: the expressivity of movement and the computation of complex facial expressions. The system synchronises the nonverbal behaviours of the agent with the verbal stream of her speech; moreover, it allows us to qualitatively modify the animation of the agent, that is, to add expressivity to the agent's movements. Our model of facial expressions embeds not only the expressions of the basic emotions (e.g., anger, sadness, fear) but also different types of complex expressions such as fake, inhibited, and masked expressions.
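To give a concrete sense of what a "complex expression" such as a masked expression involves, the sketch below shows one plausible way to compose it by assigning different facial regions to the felt and the displayed emotion, in the spirit of the idea that a felt emotion can leak through the upper face while a fake one is shown on the lower face. The region names, emotion labels, activation values, and composition rule are illustrative assumptions, not the actual model described in the paper.

# Illustrative sketch only: compose a "masked" expression by assigning facial
# regions either to the felt or to the displayed emotion. Names and values are
# hypothetical, not the paper's model.

from dataclasses import dataclass

# Facial regions treated independently when blending expressions (assumed split).
REGIONS = ("brows", "eyes", "cheeks", "mouth")

@dataclass
class Expression:
    emotion: str
    # Per-region activation values in [0, 1] (stand-ins for low-level
    # animation parameters such as MPEG-4 FAPs).
    regions: dict

def mask(felt: Expression, displayed: Expression) -> Expression:
    """Build a masked expression: the displayed (fake) emotion covers the
    lower face, while the felt emotion leaks through the upper face."""
    upper = {"brows", "eyes"}
    blended = {
        r: felt.regions[r] if r in upper else displayed.regions[r]
        for r in REGIONS
    }
    return Expression(f"{felt.emotion} masked by {displayed.emotion}", blended)

if __name__ == "__main__":
    sadness = Expression("sadness", {"brows": 0.8, "eyes": 0.6, "cheeks": 0.2, "mouth": 0.3})
    joy = Expression("joy", {"brows": 0.1, "eyes": 0.4, "cheeks": 0.7, "mouth": 0.9})
    print(mask(felt=sadness, displayed=joy))

A fake or inhibited expression could be sketched the same way, by replacing one of the two inputs with a neutral expression; the point is only that such expressions are composed from more than one underlying emotional display.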
