4 research outputs found

    Emotion Recognition from Skeletal Movements

    Automatic emotion recognition has become an important trend in many artificial intelligence (AI) based applications and has been widely explored in recent years. Most research in automated emotion recognition is based on facial expressions or speech signals. Although the influence of emotional state on body movement is undeniable, this source of expression remains underexplored in automatic analysis. In this paper, we propose a novel method to recognise seven basic emotional states, namely happy, sad, surprise, fear, anger, disgust and neutral, using body movement. We analyse motion capture data recorded under these seven emotional states by professional actors and actresses using a Microsoft Kinect v2 sensor. We propose a new representation of affective movements based on sequences of body joints. The proposed algorithm creates a sequential model of affective movement from low-level features inferred from the spatial location and orientation of joints within the tracked skeleton. In the experiments, different deep neural networks were employed and compared to recognise the emotional state of the acquired motion sequences. The results show the feasibility of automatic emotion recognition from sequences of body gestures, which can serve as an additional source of information in multimodal emotion recognition.
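
A per-frame feature pipeline of the kind the abstract describes can be sketched roughly as follows. The pairwise-distance features and array shapes are illustrative assumptions, not the paper's actual representation (Kinect v2 does track 25 skeleton joints):

```python
import numpy as np

def frame_features(joints_xyz):
    """Low-level features for one skeleton frame: pairwise joint
    distances as a simple posture proxy (assumed, not the paper's
    exact feature set, which also uses joint orientations)."""
    j = np.asarray(joints_xyz)                 # shape (n_joints, 3)
    diffs = j[:, None, :] - j[None, :, :]      # pairwise offset vectors
    dists = np.linalg.norm(diffs, axis=-1)     # pairwise Euclidean distances
    iu = np.triu_indices(len(j), k=1)          # upper triangle, no diagonal
    return dists[iu]

def sequence_features(frames):
    """Stack per-frame features into a (time, feature) matrix that a
    sequential model such as an RNN could consume."""
    return np.stack([frame_features(f) for f in frames])

# Kinect v2 tracks 25 joints; 40 random frames stand in for real capture.
seq = np.random.rand(40, 25, 3)
X = sequence_features(seq)
# X has shape (40, 300): 25 * 24 / 2 = 300 pairwise distances per frame.
```

The resulting (time, feature) matrix is the shape of input a recurrent or temporal-convolution classifier would take per motion clip.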

    Two-stage Recognition and Beyond for Compound Facial Emotion Recognition

    Facial emotion recognition is an inherently complex problem due to individual diversity in facial features and racial and cultural differences. Moreover, facial expressions typically reflect a mixture of emotional states, which can be expressed as compound emotions. Compound facial emotion recognition makes the problem even harder because the discrimination between dominant and complementary emotions is usually weak. To address compound emotion recognition, we have created a database of 31,250 facial images with different emotions from 115 subjects whose gender distribution is almost uniform. In addition, we organized a competition based on the proposed dataset, held at the FG 2020 workshop. This paper analyzes the winner's approach: a two-stage recognition method (first stage, coarse recognition; second stage, fine recognition) that enhances the classification of symmetrical emotion labels.
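
A coarse-to-fine scheme of the general kind described can be sketched minimally as below. The label names, the "dominant_complementary" naming convention, and the probability dictionaries are hypothetical placeholders, not the competition's actual label set or the winner's implementation:

```python
# Hypothetical compound-label scheme ("dominant_complementary").
COMPOUND_LABELS = ["happy_surprised", "happy_disgusted",
                   "sad_angry", "sad_fearful"]

def two_stage_predict(coarse_probs, fine_probs):
    """Stage 1 (coarse): pick the dominant emotion. Stage 2 (fine):
    re-rank only the compound labels sharing that dominant emotion."""
    dominant = max(coarse_probs, key=coarse_probs.get)
    candidates = [l for l in COMPOUND_LABELS if l.startswith(dominant + "_")]
    return max(candidates, key=lambda l: fine_probs.get(l, 0.0))

# In practice these scores would come from two trained classifiers.
coarse = {"happy": 0.7, "sad": 0.3}
fine = {"happy_surprised": 0.4, "happy_disgusted": 0.55, "sad_angry": 0.9}
pred = two_stage_predict(coarse, fine)  # → "happy_disgusted"
```

Note how the fine stage ignores "sad_angry" despite its high score, because the coarse stage already committed to "happy" as the dominant emotion.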

    How are you Feeling? Inferring Emotions through Movements in the Metaverse

    Metaverses are immersive virtual worlds in which people interact as avatars. There is emerging interest in understanding how metaverse users behave and perceive activities and tasks, but our understanding of user behavior within metaverses is limited. This study examines the role of emotions in individuals' movements. We implement a metaverse setting using virtual reality technology and development tools, manipulate negative emotions, and track our participants' movements. We show how negative emotion influences movement in a metaverse setting. Based on a literature review, we select and calculate movement features to train a support vector machine. As a result, we present a novel way to infer the negative emotions of metaverse users, which can help create more engaging and immersive experiences that cater to users' emotions and behaviors. Our study provides preliminary evidence for the potential utilization of movement data in the metaverse.
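
Hand-crafted movement features of the kind one might feed to an SVM could be computed as in this sketch. The specific features (speed and acceleration statistics) and the 90 Hz tracking-rate assumption are illustrative, not taken from the study:

```python
import numpy as np

def movement_features(positions, dt=1 / 90):
    """Summary statistics of a tracked position stream (e.g. a VR
    headset), assuming a 90 Hz sampling rate. Illustrative feature
    choices; the study's exact feature list is not reproduced here."""
    p = np.asarray(positions)              # (time, 3) tracked positions
    vel = np.diff(p, axis=0) / dt          # finite-difference velocity
    speed = np.linalg.norm(vel, axis=1)
    acc = np.diff(vel, axis=0) / dt        # finite-difference acceleration
    acc_mag = np.linalg.norm(acc, axis=1)
    return np.array([speed.mean(), speed.std(),
                     acc_mag.mean(), acc_mag.max()])

# A random walk stands in for a real head-position recording.
track = np.cumsum(np.random.randn(200, 3) * 0.01, axis=0)
feats = movement_features(track)           # one 4-dim vector per clip
```

One such fixed-length vector per recording clip, paired with an emotion label, is the standard input format for training an SVM classifier.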

    A Comprehensive Study on State-Of-Art Learning Algorithms in Emotion Recognition

    The potential uses of emotion recognition in domains such as human-robot interaction, marketing, emotional gaming, and human-computer interfaces have made it a prominent research subject. A better understanding of emotions enables technologies that can accurately interpret and respond to human emotions, which in turn yields better user experiences. This paper presents a thorough analysis of developments in emotion recognition techniques, with an emphasis on the use of multiple sensors and computational algorithms. Our results show that combining more than one modality improves emotion recognition performance across a variety of metrics and computational techniques. This paper contributes to the body of knowledge by thoroughly examining and contrasting several state-of-the-art computational techniques and measurements for emotion recognition. The study emphasizes how crucial it is to use multiple modalities together with cutting-edge machine learning algorithms in order to attain more precise and trustworthy emotion assessment. Additionally, we identify prospective avenues for further investigation, including the incorporation of multimodal data and the exploration of novel features and fusion methodologies. By offering practitioners and academics in the field of emotion recognition insightful guidance, this study contributes to the development of technology that can better comprehend and react to human emotions.
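
Late fusion, averaging per-modality class probabilities, is one common way to combine modalities in multimodal emotion recognition. This sketch uses hypothetical face and voice scores and is not drawn from any specific surveyed system; learned fusion layers are a frequent alternative:

```python
def late_fusion(modality_probs, weights=None):
    """Weighted average of per-modality class-probability dicts
    (simple late fusion). Returns the fused label and scores."""
    labels = modality_probs[0].keys()
    n = len(modality_probs)
    weights = weights or [1.0 / n] * n      # default: uniform weighting
    fused = {l: sum(w * p[l] for w, p in zip(weights, modality_probs))
             for l in labels}
    return max(fused, key=fused.get), fused

# Hypothetical outputs from a face model and a voice model.
face = {"happy": 0.6, "sad": 0.4}
voice = {"happy": 0.3, "sad": 0.7}
label, scores = late_fusion([face, voice])  # → "sad", since 0.55 > 0.45
```

Here the fused decision differs from the face model alone, which is exactly the complementarity the survey's multimodal results point to.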