    Morphological variants of silent bared-teeth displays have different social interaction outcomes in crested macaques (Macaca nigra)

    Objectives: While it has been demonstrated that even subtle variation in human facial expressions can lead to significant changes in the meaning and function of expressions, relatively few studies have examined primate facial expressions using similarly objective and rigorous analysis. Constructions of primate facial expression repertoires may, therefore, be oversimplified, with expressions often arbitrarily pooled and/or split into subjective pigeonholes. Our objective is to assess whether subtle variation in primate facial expressions is linked to variation in function, and hence to inform future attempts to quantify the complexity of facial communication. Materials and Methods: We used the Macaque Facial Action Coding System (MaqFACS), an anatomically based and hence more objective tool, to quantify “silent bared‐teeth” (SBT) expressions produced by wild crested macaques engaging in spontaneous behavior, and utilized discriminant analysis and bootstrapping analysis to look for morphological differences between SBT produced in four different contexts, defined by the outcome of interactions: Affiliation, Copulation, Play, and Submission. Results: We found that SBT produced in these contexts could be distinguished at significantly above‐chance rates, indicating that the expressions produced in these four contexts differ morphologically. We identified the specific facial movements that were typically used in each context, and found that the variability and intensity of facial movements also varied between contexts. Discussion: These results indicate that nonhuman primate facial expressions share the human characteristic of exhibiting meaningful subtle differences. The complexity of facial communication may not be accurately represented simply by building repertoires of distinct expressions, so further work should attempt to take this subtle variability into account.
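The analysis described above can be sketched in miniature: discriminant analysis classifies expression feature vectors by context, and a permutation-style resampling test checks whether classification accuracy exceeds chance. This is a hypothetical illustration with synthetic stand-ins for the MaqFACS action-unit codings; the feature dimensions, sample sizes, and resampling scheme are assumptions, not the authors' actual pipeline.

```python
# Sketch: can expression contexts be distinguished above chance?
# Features here are synthetic stand-ins for MaqFACS action-unit
# intensities; all dimensions and counts are illustrative.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
contexts = ["Affiliation", "Copulation", "Play", "Submission"]

# 40 expressions per context, 6 synthetic "action unit" intensities
# each, with a context-specific mean shift so classes are separable.
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(40, 6)) for i in range(4)])
y = np.repeat(contexts, 40)

# Cross-validated accuracy of a linear discriminant classifier.
observed = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean()

# Null distribution: re-run the classifier on shuffled context labels.
null = np.array([
    cross_val_score(LinearDiscriminantAnalysis(), X, rng.permutation(y), cv=5).mean()
    for _ in range(200)
])
p_value = (null >= observed).mean()
print(f"observed accuracy {observed:.2f}, chance ~0.25, p = {p_value:.3f}")
```

With four balanced contexts, chance accuracy is 0.25; an observed accuracy far into the upper tail of the shuffled-label null distribution is what "distinguished at significantly above-chance rates" means operationally.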

    Emotions in context: examining pervasive affective sensing systems, applications, and analyses

    Pervasive sensing has opened up new opportunities for measuring our feelings and understanding our behavior by monitoring our affective states while mobile. This review paper surveys pervasive affect sensing by examining and considering three major elements of affective pervasive systems, namely “sensing”, “analysis”, and “application”. Sensing investigates the different sensing modalities that are used in existing real-time affective applications; Analysis explores different approaches to emotion recognition and visualization based on different types of collected data; and Application investigates different leading areas of affective applications. For each of the three aspects, the paper includes an extensive survey of the literature and finally outlines some of the challenges and future research opportunities of affective sensing in the context of pervasive computing.

    Group-level Emotion Recognition using Transfer Learning from Face Identification

    In this paper, we describe our algorithmic approach, which was used for submissions in the fifth Emotion Recognition in the Wild (EmotiW 2017) group-level emotion recognition sub-challenge. We extracted feature vectors of detected faces using a Convolutional Neural Network trained for the face identification task, rather than the traditional pre-training on emotion recognition problems. In the final pipeline, an ensemble of Random Forest classifiers was learned to predict the emotion score using the available training set. In cases where no faces were detected, one member of our ensemble extracts features from the whole image. During our experimental study, the proposed approach showed the lowest error rate when compared to other explored techniques. In particular, we achieved 75.4% accuracy on the validation data, which is 20% higher than the handcrafted feature-based baseline. The source code using the Keras framework is publicly available. Comment: 5 pages, 3 figures, accepted for publication at ICMI17 (EmotiW Grand Challenge).
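The pipeline above can be sketched as follows: embeddings from a face-identification CNN (replaced here by random stand-in vectors, since the abstract does not name the network) feed an ensemble of Random Forest classifiers, and per-face scores are averaged into a group-level prediction. Class names, embedding size, and ensemble details are illustrative assumptions, not the paper's actual configuration.

```python
# Hypothetical sketch of the described pipeline: CNN face embeddings
# (stand-in random vectors here) -> Random Forest ensemble -> averaged
# per-face probabilities -> one group-level emotion label.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
classes = ["negative", "neutral", "positive"]  # illustrative label set

# Stand-in for CNN face embeddings: 300 training faces, 128-D each,
# with an injected signal so the forests have something to learn.
X_train = rng.normal(size=(300, 128))
y_train = rng.integers(0, 3, size=300)
X_train[np.arange(300), y_train] += 3.0

# Ensemble of Random Forests differing only in random seed.
forests = [
    RandomForestClassifier(n_estimators=50, random_state=s).fit(X_train, y_train)
    for s in range(3)
]

def group_emotion(face_embeddings):
    """Average class probabilities over all detected faces and all forests."""
    probs = np.mean([f.predict_proba(face_embeddings) for f in forests], axis=(0, 1))
    return classes[int(np.argmax(probs))]

# A "group photo" with four faces whose embeddings lean toward class 2.
faces = rng.normal(size=(4, 128))
faces[:, 2] += 3.0
print(group_emotion(faces))
```

Averaging probabilities across faces, rather than voting on hard per-face labels, is one natural way to pool individual predictions into a single group-level score; the paper's whole-image fallback for undetected faces would slot in as one more ensemble member.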