Creating and annotating affect databases from face and body display: A contemporary survey

Abstract

Databases containing representative samples of human multi-modal expressive behavior are needed for the development of affect recognition systems. At present, however, publicly available databases exist mainly for single expressive modalities such as facial expressions, static and dynamic hand postures, and dynamic hand gestures. Only recently has the first bimodal affect database, consisting of expressive face and upper-body displays, been released. To foster the development of affect recognition systems, this paper presents a comprehensive survey of the current state of the art in affect database creation from face and body display and elicits the requirements of an ideal multi-modal affect database. © 2006 IEEE
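To make the annotation problem the abstract raises more concrete, the sketch below shows what a single record in a bimodal (face plus upper-body) affect database might look like, together with a simple inter-annotator agreement measure. This is a minimal illustration only; all field names, the label set, and the `AffectSample` class are assumptions, as the paper does not prescribe a schema.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class AffectSample:
    """One annotated recording in a hypothetical bimodal affect database."""
    subject_id: str
    face_clip: str    # path to the facial-expression video
    body_clip: str    # path to the synchronized upper-body video
    affect_label: str  # consensus label, e.g. "anger"; label set is illustrative
    annotator_labels: List[str] = field(default_factory=list)  # one label per annotator

    def agreement(self) -> float:
        """Fraction of annotators whose label matches the majority label."""
        if not self.annotator_labels:
            return 0.0
        majority = max(set(self.annotator_labels), key=self.annotator_labels.count)
        return self.annotator_labels.count(majority) / len(self.annotator_labels)


# Example: three annotators, two of whom agree on "anger"
sample = AffectSample("s01", "s01_face.avi", "s01_body.avi", "anger",
                      annotator_labels=["anger", "anger", "fear"])
print(sample.agreement())  # 0.666...
```

Storing per-annotator labels alongside the consensus label, as sketched here, is one way a database could expose the labeling reliability that a survey of annotation practice would examine.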
