
    Towards Emotion Recognition: A Persistent Entropy Application

    Emotion recognition and classification is a very active area of research. In this paper, we present a first approach to emotion classification using persistent entropy and support vector machines. A topology-based model is applied to obtain a single real number from each raw signal, and these values are used as input to a support vector machine that classifies signals into eight different emotions (calm, happy, sad, angry, fearful, disgust and surprised).
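The pipeline described above can be sketched in a few lines. The persistent-entropy formula (Shannon entropy over normalized bar lengths of a persistence barcode) is standard; the barcode values below are made up for illustration, and the paper's actual signal-to-barcode processing is not reproduced here.

```python
# Hedged sketch: persistent entropy of a persistence barcode.
# Each signal is reduced to one real number, which would then serve
# as the input feature to an SVM classifier (e.g. sklearn.svm.SVC).
import math

def persistent_entropy(intervals):
    """Shannon entropy of a barcode given as (birth, death) pairs.

    Each bar's length l_i = death - birth is normalized by the total
    length L, and E = -sum(p_i * log(p_i)) with p_i = l_i / L.
    """
    lengths = [d - b for b, d in intervals if d > b]
    total = sum(lengths)
    probs = [l / total for l in lengths]
    return -sum(p * math.log(p) for p in probs)

# Illustrative barcode (made-up numbers, not from the paper).
barcode = [(0.0, 1.0), (0.2, 0.7), (0.5, 0.6)]
feature = persistent_entropy(barcode)  # single real number per signal
```

A barcode of n equally long bars gives the maximum entropy log(n), while a single bar gives 0, so the feature measures how evenly topological features persist.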

    Facial Expression Recognition from World Wild Web

    Recognizing facial expressions in the wild remains a challenging task in computer vision. The World Wide Web is a rich source of facial images, most of which are captured in uncontrolled conditions. In fact, the Internet is a World Wild Web of facial images with expressions. This paper presents the results of a new study on collecting, annotating, and analyzing wild facial expressions from the web. Three search engines were queried using 1,250 emotion-related keywords in six different languages, and the retrieved images were mapped by two annotators to the six basic expressions plus neutral. Deep neural networks and noise modeling were used in three different training scenarios to determine how accurately facial expressions can be recognized when training on noisy images collected from the web via query terms (e.g. "happy face", "laughing man"). The results of our experiments show that deep neural networks can recognize wild facial expressions with an accuracy of 82.12%.
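The "noise modeling" step can be illustrated with the common noise-adaptation idea: the distribution of labels harvested from web queries is modeled as a confusion matrix applied to the true-class posterior. This is a hedged sketch of that general technique, not the paper's implementation; the matrix values and class split are made up.

```python
# Illustrative label-noise model: T[i][j] = P(observed label j | true class i).
# Training against T-adjusted outputs lets a network learn from noisy web labels.

def noisy_posterior(clean_probs, T):
    """Map a clean class posterior through the label-noise matrix T."""
    k = len(T[0])
    return [sum(clean_probs[i] * T[i][j] for i in range(len(clean_probs)))
            for j in range(k)]

# Two classes for brevity: "happy" vs "not happy" (hypothetical numbers).
T = [[0.9, 0.1],   # a truly happy face is labeled "happy" 90% of the time
     [0.2, 0.8]]   # a non-happy face is mislabeled "happy" 20% of the time
clean = [0.7, 0.3]                    # network's clean posterior
observed = noisy_posterior(clean, T)  # distribution the web labels follow
```

Fitting the network through such a matrix separates what the model believes from how the annotation process corrupts it, which is the core of noise-aware training.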

    Face Expression Classification in Children Using CNN

    Emotions can be recognized from facial expressions. Compared with adults, children's facial expressions are more expressive for positive emotions and more ambiguous for negative ones, which makes them much harder to recognize. For example, when children are angry they sometimes show an expressionless face, making it difficult to know which emotion the child is experiencing. We therefore propose research using a Convolutional Neural Network with the ResNet-50 architecture. According to [1], CNN ResNet-50 is superior to other facial recognition methods, specifically for the classification of facial expressions. CNN ResNet-50 generates a model during the training process, and this model is then used during testing. The dataset used is the Children's Spontaneous facial Expressions (LIRIS-CSE) dataset proposed by [2]. CNN ResNet-50 identifies children's expressions well, including anger, disgust, fear, happiness, sadness and surprise. The results show a very significant increase in accuracy: on the test data, accuracy reached 99.89%.
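The testing step described above reduces to comparing the trained model's predicted expression labels against the ground truth. A minimal sketch, with made-up predictions over the six expressions listed in the abstract:

```python
# Toy evaluation sketch (hypothetical labels, not LIRIS-CSE data).
LABELS = ["anger", "disgust", "fear", "happy", "sad", "surprise"]

def accuracy(y_true, y_pred):
    """Fraction of test samples whose predicted label matches ground truth."""
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

y_true = ["happy", "sad", "fear", "happy", "anger", "surprise"]
y_pred = ["happy", "sad", "fear", "happy", "disgust", "surprise"]
acc = accuracy(y_true, y_pred)  # 5 of 6 correct
```

The paper's 99.89% figure is this same ratio computed over the full test split of the model's predictions.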

    Emotional Chatting Machine: Emotional Conversation Generation with Internal and External Memory

    Perception and expression of emotion are key factors in the success of dialogue systems and conversational agents. However, this problem has not yet been studied in large-scale conversation generation. In this paper, we propose the Emotional Chatting Machine (ECM), which can generate responses that are appropriate not only in content (relevant and grammatical) but also in emotion (emotionally consistent). To the best of our knowledge, this is the first work to address the emotion factor in large-scale conversation generation. ECM addresses this factor using three new mechanisms that respectively (1) model the high-level abstraction of emotion expressions by embedding emotion categories, (2) capture the change of implicit internal emotion states, and (3) use explicit emotion expressions from an external emotion vocabulary. Experiments show that the proposed model can generate responses appropriate not only in content but also in emotion. Comment: Accepted in AAAI 201
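The three mechanisms can be caricatured on toy numbers. The real ECM is a sequence-to-sequence network; every vector, vocabulary, and function name below is an illustrative stand-in, not the paper's implementation.

```python
# Hedged sketch of ECM's three mechanisms (all values hypothetical).

# (1) Emotion category embedding: each category maps to a learned vector.
EMOTION_EMBED = {"happy": [0.9, 0.1], "sad": [0.1, 0.9]}

def decay(internal_state, rate=0.5):
    """(2) The implicit internal emotion state decays at each decoding step,
    so the emotional "charge" is gradually spent as the response is written."""
    return [s * rate for s in internal_state]

# (3) External memory: an explicit emotion vocabulary alongside generic words.
GENERIC_VOCAB = {"the", "is", "a"}
EMOTION_VOCAB = {"happy": {"glad", "wonderful"}, "sad": {"sorry", "awful"}}

def choose_word(emotion, gate):
    """A gate decides whether to emit an emotion word or a generic word."""
    if gate > 0.5:
        return sorted(EMOTION_VOCAB[emotion])[0]
    return sorted(GENERIC_VOCAB)[0]

state = EMOTION_EMBED["happy"]
state = decay(state)                   # internal state after one decoding step
word = choose_word("happy", gate=0.8)  # external-memory word is selected
```

In the actual model the gate and the decay are learned and differentiable; the point of the sketch is only how the three mechanisms divide the work: category conditioning, a depleting internal state, and an explicit choice between emotional and generic words.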