    Discrete Choice Models for Static Facial Expression Recognition

    In this paper we propose the use of Discrete Choice Analysis (DCA) for static facial expression classification. Facial expressions are described with expression descriptive units (EDUs), consisting of a set of high-level features derived from an active appearance model (AAM). The discrete choice model (DCM) is built by considering the 6 universal facial expressions plus the neutral one as the set of available alternatives. Each alternative is described by a utility function, defined as the sum of a linear combination of EDUs and a random term capturing the uncertainty. The utilities provide a measure of how likely a combination of EDUs is to represent a certain facial expression, and they offer a natural way for the modeler to formalize her prior knowledge of the process. The model parameters are learned through maximum likelihood estimation, and classification is performed by assigning each test sample to the alternative with the maximum utility. We compare the performance of the DCM classifier against Linear Discriminant Analysis (LDA), Generalized Discriminant Analysis (GDA), Relevant Component Analysis (RCA) and Support Vector Machines (SVM). Preliminary quantitative results show good and encouraging performance of the DCM approach, both in terms of recognition rate and discriminatory power.
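The DCM described above, with utilities that are a linear combination of features plus a Gumbel-distributed random term, reduces to a multinomial logit model. A minimal sketch of that pipeline follows, using synthetic stand-in data: the EDU feature vectors, their dimensionality, and the generating parameters are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch of discrete-choice classification of facial expressions:
# utility V_j = beta_j . x plus Gumbel noise -> multinomial logit,
# parameters fit by maximum likelihood, prediction by maximum utility.
# All data below are synthetic stand-ins, not the paper's EDU features.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
EXPRESSIONS = ["happiness", "surprise", "fear", "disgust",
               "anger", "sadness", "neutral"]
N_ALT, N_EDU, N_OBS = len(EXPRESSIONS), 5, 700  # 5 EDUs is hypothetical

# Synthetic stand-in for EDU feature vectors and expression labels,
# generated from the model family itself.
X = rng.normal(size=(N_OBS, N_EDU))
true_beta = rng.normal(size=(N_EDU, N_ALT))
y = np.argmax(X @ true_beta + rng.gumbel(size=(N_OBS, N_ALT)), axis=1)

def neg_log_likelihood(beta_flat):
    # Deterministic utilities V_j = beta_j . x; the Gumbel random term
    # yields the softmax (logit) choice probabilities.
    V = X @ beta_flat.reshape(N_EDU, N_ALT)
    V -= V.max(axis=1, keepdims=True)  # numerical stability
    logp = V - np.log(np.exp(V).sum(axis=1, keepdims=True))
    return -logp[np.arange(N_OBS), y].sum()

# Maximum likelihood estimation of the taste parameters.
# (Strictly, one alternative's coefficients should be normalized to zero
# for identifiability; for prediction the unnormalized fit behaves the same.)
res = minimize(neg_log_likelihood, np.zeros(N_EDU * N_ALT), method="L-BFGS-B")
beta_hat = res.x.reshape(N_EDU, N_ALT)

# Classification: assign each sample to the alternative with maximum utility.
pred = np.argmax(X @ beta_hat, axis=1)
accuracy = (pred == y).mean()
```

Because the synthetic labels come from the same logit family, the fitted classifier should clearly beat the 1/7 chance rate on this data.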

    Machine Analysis of Facial Expressions

    No abstract

    Machine Understanding of Human Behavior

    A widely accepted prediction is that computing will move to the background, weaving itself into the fabric of our everyday living spaces and projecting the human user into the foreground. If this prediction is to come true, then next-generation computing, which we will call human computing, should be about anticipatory user interfaces that are human-centered, built for humans based on human models. These interfaces should transcend the traditional keyboard and mouse to include natural, human-like interactive functions, including understanding and emulating certain human behaviors such as affective and social signaling. This article discusses a number of components of human behavior, how they might be integrated into computers, and how far we are from realizing the front end of human computing, that is, from enabling computers to understand human behavior.

    Four not six: revealing culturally common facial expressions of emotion

    As a highly social species, humans generate complex facial expressions to communicate a diverse range of emotions. Since Darwin’s work, identifying which of these complex patterns are common across cultures and which are culture-specific has remained a central question in psychology, anthropology, philosophy, and more recently machine vision and social robotics. Classic approaches to addressing this question typically tested the cross-cultural recognition of theoretically motivated facial expressions representing six emotions, and reported universality. Yet variable recognition accuracy across cultures suggests a narrower cross-cultural communication, supported by sets of simpler expressive patterns embedded in more complex facial expressions. We explore this hypothesis by modelling the facial expressions of over 60 emotions across two cultures and segregating out the latent expressive patterns. Using a multi-disciplinary approach, we first map the conceptual organization of a broad spectrum of emotion words by building semantic networks in two cultures. For each emotion word in each culture, we then model and validate its corresponding dynamic facial expression, producing over 60 culturally valid facial expression models. We then apply a multivariate data reduction technique to the pooled models, revealing four latent and culturally common facial expression patterns, each of which communicates specific combinations of valence, arousal and dominance. Finally, we reveal the face movements that accentuate each latent expressive pattern to create complex facial expressions. Our data question the widely held view that six facial expression patterns are universal, suggesting instead four latent expressive patterns with direct implications for emotion communication, social psychology, cognitive neuroscience, and social robotics.
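The data-reduction step above, pooling the culturally valid expression models and extracting a small number of latent expressive patterns, can be sketched as follows. The abstract does not name the exact reduction technique, so PCA via singular value decomposition stands in here; the matrix of face-movement activations, its dimensions, and the number of retained components are all illustrative assumptions.

```python
# Hedged sketch of multivariate data reduction over pooled facial
# expression models. Each row is one expression model described by
# (hypothetical) action-unit activation strengths; random data stand in
# for the paper's culturally validated models.
import numpy as np

rng = np.random.default_rng(1)
n_models, n_action_units = 120, 42  # e.g. 60+ emotions x 2 cultures

M = rng.normal(size=(n_models, n_action_units))

# Center the pooled models and take the SVD (equivalent to PCA).
Mc = M - M.mean(axis=0)
U, s, Vt = np.linalg.svd(Mc, full_matrices=False)

# Retain the leading components as candidate latent expressive patterns.
k = 4
latent_patterns = Vt[:k]          # each row: one pattern over action units
scores = Mc @ latent_patterns.T   # how strongly each model expresses each pattern
explained = (s[:k] ** 2).sum() / (s ** 2).sum()  # variance captured by k patterns
```

On real expression data one would choose k from the explained-variance profile rather than fixing it in advance; here k = 4 simply mirrors the four patterns the study reports.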