3 research outputs found

    Computational analysis of smile weight distribution across the face for accurate distinction between genuine and posed smiles

    In this paper, we report the results of our recent research into how a smile is distributed across the face, in particular how the weight distribution differs between a genuine and a posed smile. To do this, we have developed a computational framework for analysing the dynamic motion of various parts of the face during a facial expression, in particular the smile. The heart of our dynamic smile analysis framework is the use of optical flow intensity variation across the face during a smile, which can be utilised to efficiently map the dynamic motion of individual regions of the face, such as the mouth, the cheeks and the areas around the eyes. Thus, through our computational framework, we infer the exact distribution of smile weights across the face. Further, through the utilisation of two publicly available datasets, namely the CK+ dataset with 83 subjects expressing posed smiles and the MUG dataset with 35 subjects expressing genuine smiles, we show that there is far greater activity, or weight, around the regions of the eyes in the case of a genuine smile.
    Supported in part by the European Union's Horizon 2020 Programme H2020-MSCA-RISE-2017, under the project PDE-GIR with grant number 778035.
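    The per-region analysis the abstract describes can be illustrated with a short sketch. The snippet below is a minimal illustration, not the authors' implementation: it accumulates dense Farneback optical-flow magnitude over a few hand-picked face regions and normalises the totals into a weight distribution. The region bounding boxes, the video path and the flow parameters are all assumptions; in a real pipeline the regions would come from a facial-landmark detector.

```python
# Minimal sketch: per-region optical-flow "smile weight" from a face video.
# Region boxes and Farneback parameters are illustrative assumptions.
import cv2
import numpy as np

# Hypothetical face-region bounding boxes as (x, y, w, h) in pixel coordinates;
# in practice these would be derived from facial landmarks per frame.
REGIONS = {
    "mouth":       (60, 140, 80, 40),
    "left_cheek":  (40, 100, 40, 40),
    "right_cheek": (120, 100, 40, 40),
    "eyes":        (40, 50, 120, 40),
}

def region_flow_weights(video_path: str) -> dict:
    """Accumulate mean optical-flow magnitude per region across the clip."""
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        raise IOError(f"cannot read {video_path}")
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    totals = {name: 0.0 for name in REGIONS}
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Dense Farneback optical flow between consecutive frames.
        flow = cv2.calcOpticalFlowFarneback(
            prev_gray, gray, None, 0.5, 3, 15, 3, 5, 1.2, 0)
        mag = np.linalg.norm(flow, axis=2)  # per-pixel flow magnitude
        for name, (x, y, w, h) in REGIONS.items():
            totals[name] += float(mag[y:y + h, x:x + w].mean())
        prev_gray = gray
    cap.release()
    # Normalise so the region weights sum to 1: the "weight distribution".
    s = sum(totals.values()) or 1.0
    return {name: v / s for name, v in totals.items()}
```

    Under this reading, the paper's finding corresponds to the "eyes" weight being markedly larger for genuine-smile clips than for posed ones.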

    On Gender Identification Using the Smile Dynamics

    Gender classification has multiple applications including, but not limited to, face perception, age, ethnicity and identity analysis, video surveillance and smart human-computer interaction. The majority of computer-based gender classification algorithms analyse the appearance of facial features, predominantly based on the texture of a static image of the face. In this paper, we propose a novel algorithm for gender classification using the smile dynamics, without resorting to any facial texture information. Our experiments suggest that this method has great potential for finding indicators of gender dimorphism. Our approach was tested on two databases, namely the CK+ and the MUG, consisting of a total of 80 subjects. Using the KNN algorithm along with 10-fold cross-validation, we achieve a gender classification accuracy of 80% based solely on the dynamics of a person's smile.
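    The classification step lends itself to a compact sketch. The snippet below is an illustrative outline only, assuming each subject's smile has already been reduced to a fixed-length dynamics feature vector; the random features, the binary labels and the choice of k are placeholders, not the paper's data or settings.

```python
# Minimal sketch: KNN with 10-fold cross-validation on smile-dynamics features.
# X and y are synthetic placeholders standing in for the paper's 80 subjects.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 16))    # 80 subjects x 16 smile-dynamics features (assumed)
y = rng.integers(0, 2, size=80)  # 0 = female, 1 = male (labels assumed)

knn = KNeighborsClassifier(n_neighbors=5)   # k = 5 is an assumption; not stated
scores = cross_val_score(knn, X, y, cv=10)  # 10-fold cross-validation
print(f"mean accuracy: {scores.mean():.2%}")
```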