3 research outputs found

    Feature Extraction Techniques for Human Emotion Identification from Face Images

    Emotion recognition has been one of the most challenging problems over the years due to the complexity of models and the unpredictability between expression categories. Many emotion detection algorithms have been developed in the last two decades, yet they still face problems in accuracy, complexity, and real-world implementation. In this paper, we propose two feature extraction techniques: mouth-region-based feature extraction and the Maximally Stable Extremal Regions (MSER) method. In the mouth-based method, the mouth area is calculated and the emotions are classified based on that value. In the MSER method, features are extracted using connected components and then passed to a simple ANN for classification. Experimental results show that the mouth-area-based feature extraction method gives 86% accuracy, while the MSER-based method outperforms it, achieving 89% accuracy on DEAP. Thus, it can be concluded that the proposed methods can be used effectively for emotion detection.
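    The abstract does not give implementation details, so the following is a minimal sketch of one plausible reading of the MSER-plus-ANN pipeline, assuming OpenCV's MSER detector and a small scikit-learn MLP. The particular region statistics used as features and the network size are illustrative assumptions, not the authors' configuration.

```python
# Sketch: MSER-based feature extraction followed by a small ANN classifier.
# Assumes OpenCV and scikit-learn; the feature statistics are hypothetical.
import cv2
import numpy as np
from sklearn.neural_network import MLPClassifier

def mser_features(gray_image):
    """Summarize MSER regions of a grayscale face crop as a feature vector."""
    mser = cv2.MSER_create()
    regions, _ = mser.detectRegions(gray_image)
    if len(regions) == 0:
        return np.zeros(3)
    areas = np.array([len(r) for r in regions], dtype=float)
    # Hypothetical features: region count, mean region size, size spread.
    return np.array([len(regions), areas.mean(), areas.std()])

def train_classifier(face_images, labels):
    """Train a simple ANN on MSER features of labeled face images."""
    X = np.vstack([mser_features(img) for img in face_images])
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500)
    clf.fit(X, labels)
    return clf
```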

    Smart classroom monitoring using novel real-time facial expression recognition system

    Featured Application: The proposed automatic emotion recognition system has been deployed in a classroom environment (education), but it can be used anywhere emotions of humans need to be monitored, e.g., health, banking, industry, and social welfare.
    Abstract: Emotions play a vital role in education. Technological advances in computer vision using deep learning models have improved automatic emotion recognition. In this study, a real-time automatic emotion recognition system incorporating novel salient facial features is developed for classroom assessment using a deep learning model. The proposed novel facial features for each emotion are initially detected using HOG for face recognition, and automatic emotion recognition is then performed by training a convolutional neural network (CNN) that takes real-time input from a camera deployed in the classroom. The proposed emotion recognition system analyzes the facial expressions of each student during learning. The selected emotional states are happiness, sadness, and fear, along with the cognitive–emotional states of satisfaction, dissatisfaction, and concentration. These states are tested against the variables gender, department, lecture time, seating position, and the difficulty of the subject. The proposed system contributes to improving classroom learning.
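    As a rough illustration of the described pipeline (HOG-based face detection feeding a CNN on live camera frames), the sketch below uses dlib's HOG frontal-face detector and a small Keras CNN. The 48x48 crop size, the network layout, and the six-class label list are assumptions made for demonstration, not the paper's actual architecture or training setup.

```python
# Sketch: HOG face detection + CNN emotion classification on camera frames.
# Assumes dlib, OpenCV, and TensorFlow/Keras; classes and sizes are assumed.
import cv2
import dlib
import numpy as np
import tensorflow as tf

EMOTIONS = ["happiness", "sadness", "fear",
            "satisfaction", "dissatisfaction", "concentration"]

detector = dlib.get_frontal_face_detector()  # HOG + linear SVM face detector

def build_cnn(input_shape=(48, 48, 1), n_classes=len(EMOTIONS)):
    """Small CNN that maps a cropped face to one emotional state."""
    return tf.keras.Sequential([
        tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=input_shape),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(n_classes, activation="softmax"),
    ])

def classify_frame(frame_bgr, model):
    """Detect faces in one camera frame and predict an emotion for each."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    predictions = []
    for rect in detector(gray, 1):
        face = gray[max(rect.top(), 0):rect.bottom(),
                    max(rect.left(), 0):rect.right()]
        face = cv2.resize(face, (48, 48)).astype("float32") / 255.0
        probs = model.predict(face[np.newaxis, ..., np.newaxis], verbose=0)[0]
        predictions.append(EMOTIONS[int(np.argmax(probs))])
    return predictions
```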