
    EMPATH: A Neural Network that Categorizes Facial Expressions

    There are two competing theories of facial expression recognition. Some researchers have suggested that it is an example of "categorical perception." In this view, expression categories are considered to be discrete entities with sharp boundaries, and discrimination of nearby pairs of expressive faces is enhanced near those boundaries. Other researchers, however, suggest that facial expression perception is more graded and that facial expressions are best thought of as points in a continuous, low-dimensional space, where, for instance, "surprise" expressions lie between "happiness" and "fear" expressions due to their perceptual similarity. In this article, we show that a simple yet biologically plausible neural network model, trained to classify facial expressions into six basic emotions, predicts data used to support both of these theories. Without any parameter tuning, the model matches a variety of psychological data on categorization, similarity, reaction times, discrimination, and recognition difficulty, both qualitatively and quantitatively. We thus explain many of the seemingly complex psychological phenomena related to facial expression perception as natural consequences of the tasks' implementations in the brain.
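
    The abstract gives no implementation details, so the following is only a minimal sketch of the kind of model described: a small feedforward classifier mapping face-image features to six basic-emotion categories, whose graded class probabilities can be read against both the categorical and the continuous account. The data, layer size, and emotion ordering below are assumptions for illustration, not the authors' EMPATH model.

    # Minimal sketch (hypothetical, not the EMPATH architecture): flattened
    # face images -> one hidden layer -> probabilities over six basic emotions.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

    rng = np.random.default_rng(0)
    # Placeholder data: 300 flattened 48x48 grayscale face crops with labels.
    X = rng.random((300, 48 * 48))
    y = rng.integers(0, len(EMOTIONS), size=300)

    model = MLPClassifier(hidden_layer_sizes=(64,), max_iter=300, random_state=0)
    model.fit(X, y)

    # Peaked output distributions look "categorical"; intermediate blends
    # (e.g. between fear and surprise) look graded, which is how a single
    # classifier can speak to both theories.
    for name, p in zip(EMOTIONS, model.predict_proba(X[:1])[0]):
        print(f"{name}: {p:.2f}")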

    Human Automotive Interaction: Affect Recognition for Motor Trend Magazine's Best Driver Car of the Year

    Observational analysis of vehicle operators has the potential to address the growing trend of motor vehicle accidents. Methods are needed to automatically detect heavy cognitive load and distraction in order to warn drivers in a poor psychophysiological state. Existing methods to monitor a driver have included prediction from steering behavior, smartphone warning systems, gaze detection, and electroencephalography. We build upon these approaches by detecting cues that indicate inattention and stress from video. The system is developed and tested on data from Motor Trend Magazine's Best Driver Car of the Year 2014 and 2015. It was found that face detection and facial feature encoding posed the most difficult challenges to automatic facial emotion recognition in practice. The chapter focuses on two important parts of the facial emotion recognition pipeline: (1) face detection and (2) facial appearance features. We propose a face detector that unifies state‐of‐the‐art approaches and provides quality control for face detection results, called reference‐based face detection. We also propose a novel method for facial feature extraction that compactly encodes the spatiotemporal behavior of the face and removes background texture, called local anisotropic‐inhibited binary patterns in three orthogonal planes. Real‐world results show promise for the automatic observation of driver inattention and stress.
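
    The abstract does not describe the anisotropic-inhibited variant in detail, so the sketch below shows only the plain LBP-on-three-orthogonal-planes baseline that such spatiotemporal descriptors build on: per-plane local binary pattern histograms over the XY, XT, and YT slices of a face video cube, concatenated into one feature vector. The cube shape, LBP parameters, and helper names are assumptions, not the authors' method.

    # Baseline spatiotemporal texture features on the three orthogonal planes
    # of a T x H x W face cube (in the spirit of LBP-TOP), NOT the paper's
    # anisotropic-inhibited variant.
    import numpy as np
    from skimage.feature import local_binary_pattern

    def lbp_hist(plane, points=8, radius=1):
        # Uniform-LBP histogram of one 2-D plane (points + 2 bins).
        codes = local_binary_pattern(plane, points, radius, method="uniform")
        n_bins = points + 2
        hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
        return hist

    def top_features(cube):
        # Mean LBP histogram per plane family: XY (appearance), XT and YT (motion).
        t, h, w = cube.shape
        xy = np.mean([lbp_hist(cube[i]) for i in range(t)], axis=0)
        xt = np.mean([lbp_hist(cube[:, j, :]) for j in range(h)], axis=0)
        yt = np.mean([lbp_hist(cube[:, :, k]) for k in range(w)], axis=0)
        return np.concatenate([xy, xt, yt])

    # Placeholder face cube: 30 frames of 64x64 aligned face crops.
    cube = np.random.default_rng(0).integers(0, 256, size=(30, 64, 64), dtype=np.uint8)
    print(top_features(cube).shape)  # 3 planes x 10 bins = (30,)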

    Machine Analysis of Facial Expressions

    No abstract

    Facial Emotion Recognition Based on Empirical Mode Decomposition and Discrete Wavelet Transform Analysis

    This paper presents a new framework that combines empirical mode decomposition (EMD) and the discrete wavelet transform (DWT), with an application to facial emotion recognition. EMD is a multi-resolution technique that decomposes a complicated signal into a small set of intrinsic mode functions (IMFs) through a sifting process. In this framework, EMD is applied to facial images to extract informative features by decomposing each image into a set of IMFs and a residue. The selected IMFs are then subjected to the DWT, which decomposes their instantaneous frequency content into four sub-bands. The approximation coefficients (cA1) at the first decomposition level are extracted and used as significant features for recognizing facial emotion. Because the number of coefficients is large, principal component analysis (PCA) is applied to the extracted features. A k-nearest neighbor classifier is then used to classify seven facial emotions (anger, disgust, fear, happiness, neutral, sadness, and surprise). The JAFFE database is employed to evaluate the effectiveness of the proposed method, which achieves a recognition rate of 80.28%.
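
    As a rough illustration of the downstream pipeline described above, the sketch below runs a level-1 2-D DWT, keeps the approximation coefficients (cA1), reduces them with PCA, and classifies with k-NN. The EMD step is represented only by placeholder "selected IMF" images, since the abstract does not specify the 2-D decomposition in detail; all data, shapes, and parameters are assumptions, not the JAFFE results.

    # DWT(cA1) -> PCA -> k-NN sketch over seven emotion classes; the input
    # images stand in for the IMFs that EMD would normally provide.
    import numpy as np
    import pywt
    from sklearn.decomposition import PCA
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline

    EMOTIONS = ["anger", "disgust", "fear", "happiness", "neutral", "sadness", "surprise"]

    def ca1_features(selected_imf):
        # Approximation coefficients of a single-level 2-D DWT, flattened.
        cA1, _details = pywt.dwt2(selected_imf, "db1")
        return cA1.ravel()

    rng = np.random.default_rng(0)
    # Placeholder: 140 "selected IMF" images of size 64x64, with labels.
    images = rng.random((140, 64, 64))
    labels = rng.integers(0, len(EMOTIONS), size=140)

    X = np.stack([ca1_features(img) for img in images])
    clf = make_pipeline(PCA(n_components=30), KNeighborsClassifier(n_neighbors=3))
    clf.fit(X[:100], labels[:100])
    print("held-out accuracy:", clf.score(X[100:], labels[100:]))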

    Automatic Emotion Recognition from Mandarin Speech

    No abstract
