    The MPI Facial Expression Database — A Validated Database of Emotional and Conversational Facial Expressions

    The ability to communicate is one of the core aspects of human life. For this, we use not only verbal but also nonverbal signals of remarkable complexity. Among the latter, facial expressions are among the most important information channels. Despite the large variety of facial expressions we use in daily life, research on facial expressions has so far focused mostly on the emotional aspect. Consequently, most databases of facial expressions available to the research community include only emotional expressions, neglecting the largely unexplored aspect of conversational expressions. To fill this gap, we present the MPI facial expression database, which contains a large variety of natural emotional and conversational expressions. The database contains 55 different facial expressions performed by 19 German participants. Expressions were elicited with the help of a method-acting protocol, which guarantees both well-defined and natural facial expressions. The method-acting protocol was based on everyday scenarios, which are used to define the necessary context information for each expression. All facial expressions are available in three repetitions, in two intensities, as well as from three different camera angles. A detailed frame annotation is provided, from which a dynamic and a static version of the database have been created. In addition to describing the database in detail, we also present the results of an experiment with two conditions that serve to validate the context scenarios as well as the naturalness and recognizability of the video sequences. Our results provide clear evidence that conversational expressions can be recognized surprisingly well from visual information alone. The MPI facial expression database will enable researchers from different research fields (including the perceptual and cognitive sciences, but also affective computing and computer vision) to investigate the processing of a wider range of natural facial expressions.
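    The recording dimensions reported above (55 expressions, 19 actors, 3 repetitions, 2 intensities, 3 camera angles) imply an upper bound on the number of video sequences, assuming a full factorial design; the abstract does not state that every combination exists, so this is a bound, not an exact count:

    ```python
    # Upper bound on the number of video sequences in the MPI database,
    # assuming every combination of the reported dimensions was recorded
    # (an assumption; the abstract does not guarantee a full factorial design).
    expressions = 55
    actors = 19
    repetitions = 3
    intensities = 2
    camera_angles = 3

    max_sequences = expressions * actors * repetitions * intensities * camera_angles
    print(max_sequences)  # 18810
    ```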

    Cross-Cultural Perspectives on Emotion Expressive Humanoid Robotic Head: Recognition of Facial Expressions and Symbols

    This article is closed access. Emotion display through facial expressions is an important channel of communication. However, humans differ in the way they assign meaning to facial cues, depending on their cultural background. This leads to a gap in recognition rates of expressions, and the problem also arises when displaying expressions on a robotic face: a robot's facial expressions are often hampered by this cultural divide, and poor recognition rates may lead to poor acceptance and interaction. It would therefore be desirable if robots could flexibly switch their output facial configuration to adapt to different cultural backgrounds. To achieve this, we developed a generation system that produces facial expressions and applied it to the 24-degrees-of-freedom head of the humanoid social robot KOBIAN-R. Drawing on the work of illustrators and cartoonists, the system can generate two versions of the same expression, so that it is easily recognizable by both Japanese and Western subjects. As a tool for making recognition easier, the display of Japanese comic symbols on the robotic face has also been introduced and evaluated. In this work, we conducted a cross-cultural study aimed at assessing this gap in recognition and finding solutions for it. The investigation was also extended to Egyptian subjects as a sample of another culture. Results confirmed the differences in recognition rates, the effectiveness of customizing expressions, and the usefulness of symbol display, suggesting that this approach may be valuable for robots that will interact in multi-cultural environments in the future.
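    The core adaptation idea described above — the same emotion mapped to different display configurations depending on the observer's culture, with a fallback when no culture-specific variant exists — can be sketched as a simple lookup. All names and variant labels here are illustrative assumptions, not taken from the KOBIAN-R system:

    ```python
    # Illustrative sketch (not the authors' implementation) of culture-adaptive
    # expression selection: one emotion, multiple culture-tuned display variants.
    def select_expression(emotion: str, culture: str) -> str:
        # Hypothetical variant table; real systems would map to actuator
        # configurations for the robot's degrees of freedom.
        variants = {
            ("anger", "japanese"): "anger_jp",   # variant tuned for Japanese observers
            ("anger", "western"): "anger_west",  # variant tuned for Western observers
        }
        # Fall back to a default display when no culture-specific variant exists.
        return variants.get((emotion, culture), f"{emotion}_default")

    print(select_expression("anger", "japanese"))  # anger_jp
    print(select_expression("fear", "western"))    # fear_default
    ```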