
    Sparsity in Dynamics of Spontaneous Subtle Emotions: Analysis & Application

    Spontaneous subtle emotions are expressed through micro-expressions: tiny, sudden, and short-lived movements of facial muscles that pose a great challenge for visual recognition. The abrupt but significant dynamics relevant to the recognition task are temporally sparse, while the rest, the irrelevant dynamics, are temporally redundant. In this work, we analyze and enforce sparsity constraints to learn significant temporal and spectral structures while eliminating the irrelevant facial dynamics of micro-expressions, which eases the challenge of visually recognizing spontaneous subtle emotions. The hypothesis is confirmed through experimental results of automatic spontaneous subtle emotion recognition at several sparsity levels on CASME II and SMIC, the only two publicly available spontaneous subtle emotion databases. The overall performance of automatic subtle emotion recognition is boosted when only the significant dynamics are preserved from the original sequences.
    Comment: IEEE Transactions on Affective Computing (2016)
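    The sparsification idea in this abstract can be illustrated as hard-thresholding in the spectral domain: keep only the largest-magnitude frequency coefficients of a temporal signal and discard the rest as redundant dynamics. The NumPy sketch below is illustrative only, not the paper's actual method; the function name and keep ratio are hypothetical.

```python
import numpy as np

def sparsify_temporal_signal(signal, keep_ratio=0.05):
    """Keep only the largest-magnitude DFT coefficients of a 1-D temporal
    signal and reconstruct it, discarding the redundant dynamics.
    (Illustrative sketch; the name and the keep ratio are hypothetical.)"""
    coeffs = np.fft.fft(signal)
    k = max(1, int(len(coeffs) * keep_ratio))
    top = np.argsort(np.abs(coeffs))[-k:]  # indices of the k strongest coefficients
    sparse = np.zeros_like(coeffs)
    sparse[top] = coeffs[top]
    return np.fft.ifft(sparse).real

# A dominant sinusoidal "burst" buried in noise: thresholding the spectrum
# recovers the significant dynamics and suppresses the irrelevant ones.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 256, endpoint=False)
clean = np.sin(2 * np.pi * 5 * t)
x = clean + 0.3 * rng.standard_normal(256)
x_sparse = sparsify_temporal_signal(x)
```

    A reconstruction from only 5% of the coefficients stays close to the underlying clean signal because the significant dynamics occupy few spectral bins, which is the sparsity property the paper exploits.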

    Spontaneous facial expression analysis using optical flow

    © 2017 IEEE. Investigation of emotions manifested through facial expressions has valuable applications in predictive behavioural studies. This has piqued interest in developing intelligent visual surveillance that couples facial expression analysis with Closed Circuit Television (CCTV). However, the goal of a facial recognition program tailored to evaluating facial behaviour for forensic and security purposes can only be met if general patterns of emotion can be detected. The present study assesses whether emotional expression derived from frontal or profile views of the face can be used to distinguish three emotions: Amusement, Sadness, and Fear, using the optical flow technique. The analysis took the form of emotion maps constructed from feature vectors obtained with the Lucas-Kanade implementation of optical flow; these feature vectors were selected as inputs for classification. It was anticipated that the findings would help improve the optical flow algorithm for feature extraction. However, further data analyses are necessary to confirm whether different types of emotion can be identified clearly using optical flow or similar techniques.
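    The Lucas-Kanade method referenced in this abstract estimates motion at a point by solving a least-squares brightness-constancy system over a local window. The following is a minimal from-scratch NumPy sketch of that single-point computation (the study presumably used an existing implementation); the function name and the synthetic frames are hypothetical.

```python
import numpy as np

def lucas_kanade_point(frame1, frame2, y, x, win=9):
    """Estimate the (vy, vx) optical flow at one pixel by solving the
    Lucas-Kanade least-squares system Ix*vx + Iy*vy = -It over a window.
    (Illustrative single-point sketch; real trackers use pyramids etc.)"""
    Iy, Ix = np.gradient(frame1.astype(float))       # spatial gradients
    It = frame2.astype(float) - frame1.astype(float)  # temporal gradient
    h = win // 2
    sl = (slice(y - h, y + h + 1), slice(x - h, x + h + 1))
    A = np.stack([Iy[sl].ravel(), Ix[sl].ravel()], axis=1)
    b = -It[sl].ravel()
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v  # (vy, vx)

# Synthetic check: a smooth blob shifted one pixel to the right should
# yield flow close to (vy, vx) = (0, 1) near the blob.
yy, xx = np.mgrid[0:40, 0:40]
blob = np.exp(-((yy - 20) ** 2 + (xx - 20) ** 2) / 30.0)
shifted = np.exp(-((yy - 20) ** 2 + (xx - 21) ** 2) / 30.0)
vy, vx = lucas_kanade_point(blob, shifted, 20, 18)
```

    Collecting such (vy, vx) vectors at many facial landmarks over a sequence yields the kind of per-frame feature vectors the study feeds into its emotion maps and classifier.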

    EMPATH: A Neural Network that Categorizes Facial Expressions

    There are two competing theories of facial expression recognition. Some researchers have suggested that it is an example of "categorical perception." In this view, expression categories are considered to be discrete entities with sharp boundaries, and discrimination of nearby pairs of expressive faces is enhanced near those boundaries. Other researchers, however, suggest that facial expression perception is more graded and that facial expressions are best thought of as points in a continuous, low-dimensional space, where, for instance, "surprise" expressions lie between "happiness" and "fear" expressions due to their perceptual similarity. In this article, we show that a simple yet biologically plausible neural network model, trained to classify facial expressions into six basic emotions, predicts data used to support both of these theories. Without any parameter tuning, the model matches a variety of psychological data on categorization, similarity, reaction times, discrimination, and recognition difficulty, both qualitatively and quantitatively. We thus explain many of the seemingly complex psychological phenomena related to facial expression perception as natural consequences of the task's implementation in the brain.
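    The kind of model this abstract describes, a simple network trained to classify inputs into six basic emotions, can be sketched as a single softmax layer trained by gradient descent. The NumPy toy below uses hypothetical clustered features as a stand-in for the model's perceptual inputs; it is not EMPATH's architecture or data, only an illustration of the training objective.

```python
import numpy as np

rng = np.random.default_rng(0)
EMOTIONS = ["happiness", "sadness", "fear", "anger", "surprise", "disgust"]

# Hypothetical stand-in features: one tight cluster per emotion category.
centers = rng.normal(size=(6, 20))
X = np.vstack([c + 0.1 * rng.normal(size=(50, 20)) for c in centers])
y = np.repeat(np.arange(6), 50)

# A single softmax layer trained with batch gradient descent on
# cross-entropy loss, mapping features to the six emotion categories.
W = np.zeros((20, 6))
b = np.zeros(6)
onehot = np.eye(6)[y]
lr = 0.5
for _ in range(500):
    logits = X @ W + b
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)          # softmax probabilities
    grad = (p - onehot) / len(X)               # cross-entropy gradient
    W -= lr * (X.T @ grad)
    b -= lr * grad.sum(axis=0)

acc = (np.argmax(X @ W + b, axis=1) == y).mean()
```

    The graded softmax outputs are what make both readings possible: thresholding them yields sharp category boundaries, while the raw probabilities place expressions in a continuous similarity space.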