    Latent Facial Topics for affect analysis

    Recent years have seen a growing need in the affective computing community to understand an emotion space beyond the seven basic expressions, leading to explorations of an emotion-space continuum spanned by dimensions such as valence and arousal. While there has been substantial research on identifying facial Action Units as building blocks for the basic expressions, there is a new need to discover fine-grained facial descriptors that can explain variation across the continuum of emotions. We propose a methodology to extract Latent Facial Topics (LFTs) from facial videos by adapting the Latent Dirichlet Allocation and supervised Latent Dirichlet Allocation topic models to facial affect analysis. In this work, we study the application of topic models to both discrete emotion recognition and continuous emotion prediction tasks. We show that meaningful and visualizable LFTs can be extracted and used successfully for emotion recognition. We report recognition results on the widely known Cohn-Kanade Plus (CK+) and AVEC 2012 FCSC challenge data sets; the results show promise for both discrete and continuous emotion recognition problems.
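
    The abstract does not specify the facial features or the inference procedure, so the following is only a minimal sketch of how an LDA-based pipeline of this kind could look: per-frame facial descriptors are quantized into a visual-word vocabulary, each video becomes a bag of visual words, LDA recovers latent facial topics, and the per-video topic proportions feed a discrete emotion classifier. The synthetic data, the vocabulary size, and the use of scikit-learn's LatentDirichletAllocation are assumptions made for illustration, not the authors' implementation.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.decomposition import LatentDirichletAllocation
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Stand-in for per-frame facial descriptors (e.g. landmark or appearance
    # features): one (num_frames, feature_dim) array per video, plus a discrete
    # expression label per video. Real data would replace this block.
    videos = [rng.normal(size=(int(rng.integers(30, 60)), 64)) for _ in range(40)]
    labels = rng.integers(0, 7, size=len(videos))  # seven basic-expression classes

    # 1. Quantize all frame descriptors into a visual-word vocabulary (assumed size).
    vocab_size = 50
    codebook = KMeans(n_clusters=vocab_size, n_init=10, random_state=0)
    codebook.fit(np.vstack(videos))

    # 2. Represent each video as a bag of visual words (a word-count vector).
    def to_counts(frames):
        words = codebook.predict(frames)
        return np.bincount(words, minlength=vocab_size)

    X_counts = np.stack([to_counts(v) for v in videos])

    # 3. Fit LDA: each video becomes a mixture over latent facial topics (LFTs).
    lda = LatentDirichletAllocation(n_components=8, random_state=0)
    topic_proportions = lda.fit_transform(X_counts)  # (num_videos, num_topics)

    # 4. Use the topic proportions as features for discrete emotion recognition.
    clf = LogisticRegression(max_iter=1000).fit(topic_proportions, labels)
    print("training accuracy:", clf.score(topic_proportions, labels))

    The supervised variant (sLDA) mentioned in the abstract would instead model the label, or a continuous valence/arousal response, jointly with the topics; scikit-learn has no sLDA implementation, so that step is omitted from this sketch.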