Enabling dynamics in face analysis

Abstract

Most approaches to automatic face analysis rely solely on static appearance. However, temporal analysis of expressions reveals patterns that static appearance alone cannot capture. For a better understanding of the human face, this thesis focuses on temporal changes in the face and on the dynamic patterns of expressions. In addition to improving the state of the art in several areas of automatic face analysis, the thesis presents new and significant findings on facial dynamics. The contributions on temporal analysis and understanding of faces can be summarized as follows: 1) an accurate facial landmarking method is proposed to enable detailed analysis of facial movements; 2) dynamic feature descriptors are introduced to reveal the temporal patterns of facial expressions; 3) several frameworks are proposed to exploit temporal information and facial dynamics in expression spontaneity analysis, age estimation, and kinship verification; 4) an affect-responsive system is designed to create an adaptive application empowered by face-to-face human-computer interaction. We believe that affective technologies will shape the future by providing a more natural form of human-machine interaction. To this end, the proposed methods and ideas may lead to more efficient uses of temporal information and dynamic features in face processing and affective computing.