
    EquiFACS: the Equine Facial Action Coding System

    Although previous studies of horses have investigated their facial expressions in specific contexts, e.g. pain, until now there has been no methodology available that documents all the possible facial movements of the horse and provides a way to record all potential facial configurations. This is essential for an objective description of horse facial expressions across a range of contexts that reflect different emotional states. Facial Action Coding Systems (FACS) provide a systematic methodology of identifying and coding facial expressions on the basis of underlying facial musculature and muscle movement. FACS are anatomically based and document all possible facial movements rather than a configuration of movements associated with a particular situation. Consequently, FACS can be applied as a tool for a wide range of research questions. We developed FACS for the domestic horse (Equus caballus) through anatomical investigation of the underlying musculature and subsequent analysis of naturally occurring behaviour captured on high-quality video. Discrete facial movements were identified and described in terms of the underlying muscle contractions, in correspondence with previous FACS systems. Inter-coder reliability in learning this system (EquiFACS) and consistently coding behavioural sequences was high, including among people with no previous experience of horses. A wide range of facial movements were identified, including many that are also seen in primates and other domestic animals (dogs and cats). EquiFACS provides a method that can now be used to document the facial movements associated with different social contexts and thus to address questions relevant to understanding social cognition and comparative psychology, as well as informing current veterinary and animal welfare practices.
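    The abstract reports high inter-coder reliability for EquiFACS. A common way to quantify pairwise agreement in FACS reliability work is an index of twice the number of jointly scored action units over the total number of AU codes from both coders. A minimal sketch, assuming set-valued codings per event (the AU labels in the example are illustrative, not taken from the paper):

    ```python
    def facs_agreement(coder_a, coder_b):
        """Pairwise agreement index often used in FACS reliability studies:
        twice the number of AUs scored by both coders, divided by the total
        number of AU codes from either coder (1.0 = perfect agreement)."""
        a, b = set(coder_a), set(coder_b)
        total = len(a) + len(b)
        return 2 * len(a & b) / total if total else 1.0

    # Illustrative example: two coders scoring the same behavioural event.
    # One codes AU1 + AU2 + AU5, the other AU1 + AU2: index = 2*2 / 5 = 0.8.
    print(facs_agreement(["AU1", "AU2", "AU5"], ["AU1", "AU2"]))
    ```

    Averaging this index over many coded events gives a single reliability figure per coder pair.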

    Automatic analysis of facial actions: a survey

    As one of the most comprehensive and objective ways to describe facial expressions, the Facial Action Coding System (FACS) has recently received significant attention. Over the past 30 years, extensive research has been conducted by psychologists and neuroscientists on various aspects of facial expression analysis using FACS. Automating FACS coding would make this research faster and more widely applicable, opening up new avenues to understanding how we communicate through facial expressions. Such an automated process can also potentially increase the reliability, precision and temporal resolution of coding. This paper provides a comprehensive survey of research into machine analysis of facial actions. We systematically review all components of such systems: pre-processing, feature extraction and machine coding of facial actions. In addition, the existing FACS-coded facial expression databases are summarised. Finally, challenges that have to be addressed to make automatic facial action analysis applicable in real-life situations are extensively discussed. There are two underlying motivations for us to write this survey paper: the first is to provide an up-to-date review of the existing literature, and the second is to offer some insights into the future of machine recognition of facial actions, namely the challenges and opportunities that researchers in the field face.
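    The three-stage decomposition the survey reviews (pre-processing, feature extraction, machine coding) can be sketched end to end. Everything below is a toy illustration rather than any system from the survey: the landmark indices, the displacement features, the linear per-AU detectors and the threshold are all assumptions.

    ```python
    import numpy as np

    def preprocess(landmarks):
        """Pre-processing: register the face by removing translation and
        normalising scale by inter-ocular distance (indices 0 and 1 are
        assumed to be the eye centres; this is illustrative)."""
        pts = np.asarray(landmarks, dtype=float)
        pts = pts - pts.mean(axis=0)              # remove translation
        iod = np.linalg.norm(pts[0] - pts[1])     # inter-ocular distance
        return pts / iod                          # remove scale

    def extract_features(frame, neutral):
        """Feature extraction: geometric features as per-landmark
        displacement of the current frame from a neutral-face frame."""
        return (preprocess(frame) - preprocess(neutral)).ravel()

    def code_aus(features, weights, threshold=0.5):
        """Machine coding: one linear detector per AU (rows of `weights`),
        thresholded to a binary present/absent decision."""
        scores = weights @ features
        return (scores > threshold).astype(int)

    # Toy 3-landmark face; the third landmark rises, mimicking a brow raise.
    neutral = [[-1.0, 0.0], [1.0, 0.0], [0.0, 1.0]]
    frame   = [[-1.0, 0.0], [1.0, 0.0], [0.0, 2.0]]
    feats = extract_features(frame, neutral)
    print(code_aus(feats, np.array([[0, 0, 0, 0, 0, 3.0]])))
    ```

    Real systems replace each stage with far richer machinery (face detection and alignment, appearance descriptors or learned features, and trained classifiers or regressors per AU), but the data flow is the same.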

    Assessing the convergent validity between the automated emotion recognition software Noldus FaceReader 7 and Facial Action Coding System Scoring

    This study validates automated emotion and action unit (AU) coding by applying FaceReader 7 to a dataset of standardized facial expressions of six basic emotions (Standardized and Motivated Facial Expressions of Emotion). Percentages of correctly and falsely classified expressions are reported. The validity of coding AUs is provided by correlations between the automated analysis and manual Facial Action Coding System (FACS) scoring for 20 AUs. On average 80% of the emotional facial expressions are correctly classified. The overall validity of coding AUs is moderate, with the highest validity indicators for AUs 1, 5, 9, 17 and 27. These results are compared to the performance of FaceReader 6 in previous research, with our results yielding comparable validity coefficients. Practical implications and limitations of the automated method are discussed.
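    The two validity measures the study reports, a classification rate for emotion labels and a per-AU correlation between automated and manual FACS scoring, are straightforward to compute. A minimal sketch with made-up data (the example labels and intensity values are illustrative, not from the study):

    ```python
    import numpy as np

    def classification_rate(predicted, actual):
        """Fraction of expressions the automated coder labels correctly."""
        predicted, actual = np.asarray(predicted), np.asarray(actual)
        return float((predicted == actual).mean())

    def au_validity(automated, manual):
        """Pearson correlation between automated and manually FACS-scored
        AU intensities: one coefficient per AU (per column)."""
        automated = np.asarray(automated, dtype=float)
        manual = np.asarray(manual, dtype=float)
        return [float(np.corrcoef(automated[:, j], manual[:, j])[0, 1])
                for j in range(automated.shape[1])]

    # Hypothetical labels for five expressions: 4 of 5 correct = 0.8.
    preds = ["happy", "sad", "fear", "anger", "happy"]
    truth = ["happy", "sad", "anger", "anger", "happy"]
    print(classification_rate(preds, truth))
    ```

    In the study's setup, `au_validity` would be run over all stimuli for each of the 20 AUs, and the resulting coefficients compared against the earlier FaceReader 6 figures.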

    The Face of Ambivalence: Simultaneous Expressions of Positive and Negative Emotions During Cue-elicited Craving in Heavy Smokers

    This study used the Facial Action Coding System (FACS; P. Ekman & W. V. Friesen, 1978) to examine abstinent smokers' immediate facial responses while exposed to smoking cues. The aim was to investigate potential associations between facial expressions thought to be linked to ambivalence and more traditional measures of ambivalence about their smoking habits. Ambivalence during cue exposure was operationalized as the simultaneous occurrence of positive and negative affect-related facial expressions. Thirty-four nicotine-deprived dependent smokers were presented with in vivo smoking cues, and their facial expressions were coded with the FACS; participants also completed self-report measures related to ambivalence about smoking. Smokers who displayed ambivalent facial expressions during smoking cue exposure reported significantly higher scores on three out of four measures of smoking ambivalence than did those who did not display ambivalent facial expressions. This effect was unique to those smokers displaying simultaneous positive and negative affect-related facial expressions, and the effect was not demonstrated in smokers displaying just positive, just negative, or sequential instances of positive and negative affect-related expressions.

    Imitating individualized facial expressions in a human-like avatar through a hybrid particle swarm optimization - tabu search algorithm

    This thesis describes a machine learning method for automatically imitating a particular person's facial expressions in a human-like avatar through a hybrid Particle Swarm Optimization - Tabu Search algorithm. The muscular structures of the facial expressions are measured by Ekman and Friesen's Facial Action Coding System (FACS). Using a neutral face as a reference, the minute movements of the Action Units, used in FACS, are automatically tracked and mapped onto the avatar using a hybrid method. The hybrid algorithm is composed of Kennedy and Eberhart's Particle Swarm Optimization algorithm (PSO) and Glover's Tabu Search (TS). Distinguishable features portrayed on the avatar ensure a personalized, realistic imitation of the facial expressions. To evaluate the feasibility of using PSO-TS in this approach, a proof-of-concept test is employed on the system using the OGRE avatar. This method is analyzed in depth to ensure its proper functionality and to evaluate its performance compared to previous work.
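    The general shape of a PSO-Tabu Search hybrid can be sketched on a toy objective. This is an illustrative combination under common textbook definitions, not the thesis's implementation: standard PSO velocity and position updates, plus a tabu list of coarse search-space cells with an aspiration criterion (a tabu move is still accepted if it improves the global best). The fitness function, cell size and all parameters are assumptions.

    ```python
    import random

    def hybrid_pso_ts(fitness, dim, n_particles=15, iters=150,
                      w=0.6, c1=1.4, c2=1.4, cell=0.25, seed=0):
        """Minimise `fitness` over [-1, 1]^dim with PSO, using a tabu list
        of visited coarse cells to force diversification."""
        rng = random.Random(seed)
        pos = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(n_particles)]
        vel = [[0.0] * dim for _ in range(n_particles)]
        pbest = [p[:] for p in pos]
        pbest_f = [fitness(p) for p in pos]
        g = min(range(n_particles), key=lambda i: pbest_f[i])
        gbest, gbest_f = pbest[g][:], pbest_f[g]
        tabu = set()
        for _ in range(iters):
            for i in range(n_particles):
                for d in range(dim):
                    # Standard PSO update: inertia + cognitive + social terms.
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                                 + c2 * rng.random() * (gbest[d] - pos[i][d]))
                    pos[i][d] += vel[i][d]
                f = fitness(pos[i])
                key = tuple(round(x / cell) for x in pos[i])
                if key in tabu and f >= gbest_f:
                    # Tabu cell and no aspiration: restart the particle elsewhere.
                    pos[i] = [rng.uniform(-1, 1) for _ in range(dim)]
                    continue
                tabu.add(key)
                if f < pbest_f[i]:
                    pbest[i], pbest_f[i] = pos[i][:], f
                    if f < gbest_f:
                        gbest, gbest_f = pos[i][:], f
        return gbest, gbest_f
    ```

    In the thesis's setting the decision vector would hold avatar AU activations and the fitness would measure the mismatch against the tracked facial movements; here a simple quadratic distance to a target vector stands in for that objective.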

    FACSGen 2.0 animation software: Generating 3D FACS-valid facial expressions for emotion research

    In this article, we present FACSGen 2.0, new animation software for creating static and dynamic three-dimensional facial expressions on the basis of the Facial Action Coding System (FACS). FACSGen permits total control over the action units (AUs), which can be animated at all levels of intensity and applied alone or in combination to an infinite number of faces. In two studies, we tested the validity of the software for the AU appearance defined in the FACS manual and the conveyed emotionality of FACSGen expressions. In Experiment 1, four FACS-certified coders evaluated the complete set of 35 single AUs and 54 AU combinations for AU presence or absence, appearance quality, intensity, and asymmetry. In Experiment 2, lay participants performed a recognition task on emotional expressions created with FACSGen software and rated the similarity of expressions displayed by human and FACSGen faces. Results showed good to excellent classification levels for all AUs by the four FACS coders, suggesting that the AUs are valid exemplars of FACS specifications. Lay participants' recognition rates for nine emotions were high, and human and FACSGen expressions were rated as highly similar. The findings demonstrate the effectiveness of the software in producing reliable and emotionally valid expressions, and suggest its application in numerous scientific areas, including perception, emotion, and clinical and neuroscience research.