
    Predicting and improving the recognition of emotions

    The technological world is moving towards more effective and friendly human-computer interaction. A key factor in these emerging requirements is the ability of future systems to recognise human emotions, since emotional information is an important part of human-human communication and is therefore expected to be essential in natural and intelligent human-computer interaction. Extensive research has been done on emotion recognition using facial expressions, but these methods rely mainly on the results of some classifier applied to the apparent expressions. However, the classifier's results may be badly affected by noise, including occlusions, inappropriate lighting conditions, sudden movements of the head and body, talking, and other possible problems. In this paper, we propose a system using exponential moving averages and a Markov chain to improve the classifier results and, to some extent, predict future emotions by taking into account the current as well as previous emotions.
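    As a rough sketch of the approach described above, the example below smooths noisy per-frame classifier probabilities with an exponential moving average and predicts the next emotion from a first-order Markov chain. The emotion list, the smoothing factor alpha, and the transition matrix are illustrative placeholders, not values from the paper.

```python
import numpy as np

EMOTIONS = ["happy", "anger", "disgust", "fear", "sadness", "surprise"]

def ema_smooth(prob_history, alpha=0.3):
    """Exponentially smooth a sequence of per-frame class-probability
    vectors so transient noise (occlusion, head motion, talking) does
    not flip the recognized emotion."""
    smoothed = prob_history[0]
    for p in prob_history[1:]:
        smoothed = alpha * p + (1 - alpha) * smoothed
    return smoothed

def predict_next(current_label, transition_matrix):
    """Predict the most likely next emotion from a first-order Markov
    chain whose transitions were estimated from past label sequences."""
    i = EMOTIONS.index(current_label)
    return EMOTIONS[int(np.argmax(transition_matrix[i]))]

# Toy usage: three frames of classifier output, one of them noisy.
history = [np.array([0.70, 0.10, 0.05, 0.05, 0.05, 0.05]),
           np.array([0.20, 0.60, 0.05, 0.05, 0.05, 0.05]),  # noisy frame
           np.array([0.75, 0.05, 0.05, 0.05, 0.05, 0.05])]
current = EMOTIONS[int(np.argmax(ema_smooth(history)))]
T = np.full((6, 6), 0.04) + np.eye(6) * 0.8  # placeholder transition matrix
print(current, "->", predict_next(current, T))
```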

    Recognition of Facial Expressions by Cortical Multi-scale Line and Edge Coding

    Face-to-face communication between humans involves emotions, which are often unconsciously conveyed by facial expressions and body gestures. Intelligent human-machine interfaces, for example in cognitive robotics, need to recognize emotions. This paper addresses facial expressions and their neural correlates on the basis of a model of the visual cortex: the multi-scale line and edge coding. The recognition model links the cortical representation with Paul Ekman's Action Units, which are related to the different facial muscles. The model applies a top-down categorization using trends and magnitudes of displacements of the mouth and eyebrows, based on expected displacements relative to a neutral expression. The happy vs. not-happy categorization yielded a correct recognition rate of 91%, whereas final recognition of the six expressions happy, anger, disgust, fear, sadness and surprise resulted in a rate of 78%.
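    The top-down categorization described above is driven by displacement trends of the mouth and eyebrows relative to a neutral face. Below is a minimal, purely illustrative sketch of the happy vs. not-happy split; the feature names and threshold are assumptions, not values from the paper.

```python
def happy_vs_not_happy(mouth_corner_dy, mouth_width_dx, thresh=2.0):
    """Top-down happy/not-happy split: mouth corners displaced upward
    and the mouth widened relative to the neutral face signal 'happy'.
    Displacements are in pixels; the threshold is illustrative."""
    if mouth_corner_dy > thresh and mouth_width_dx > thresh:
        return "happy"
    return "not-happy"

print(happy_vs_not_happy(mouth_corner_dy=4.1, mouth_width_dx=3.5))   # happy
print(happy_vs_not_happy(mouth_corner_dy=-1.0, mouth_width_dx=0.2))  # not-happy
```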

    Emotion-reacting fashion design: intelligent garment and accessory recognizing facial expressions

    Although mental disorders have emerged as serious social challenges, social stigma, including prejudice and misunderstanding, hinders suitable treatment for patients. It is crucial to monitor our internal psychological and emotional states to avoid the unconscious progression of mental disorders. This research aims to achieve emotion-reacting garments and accessories based on a passive, continuous, real-time emotion recognition system. First, this study proposes a systematic design for emotion-reacting garments and accessories, which employs emotion estimation based on facial expressions. Next, emotion-reacting fashion design is discussed for intelligent garments and accessories that interact with our bodies and minds. To realize this system, a functionally extended collar made of transparent polycarbonate material is designed for integration with the digital camera modules. In addition, this study discusses how to create a physical stimulus on emotion-reacting garments and accessories. The intelligent garments and accessories use RGB-LEDs to create visual effects that reflect emotions. For audio effects, emotion-related keywords are employed to select the music played by the intelligent garments. Finally, prototypes reacting to emotions are shown.
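    One concrete piece of such a system is the mapping from a recognized emotion to an RGB-LED color. A minimal sketch follows, assuming a hypothetical palette; the abstract does not specify the colors used by the prototypes.

```python
# Illustrative emotion-to-color palette for the RGB-LEDs (an assumption,
# not the palette from the paper).
EMOTION_RGB = {
    "happy":    (255, 200, 0),
    "sadness":  (0, 80, 255),
    "anger":    (255, 0, 0),
    "fear":     (128, 0, 128),
    "surprise": (0, 255, 180),
    "neutral":  (255, 255, 255),
}

def led_color(emotion, brightness=0.5):
    """Scale the mapped color by a brightness factor in [0, 1] before
    sending it to the garment's LED driver."""
    r, g, b = EMOTION_RGB.get(emotion, EMOTION_RGB["neutral"])
    return tuple(int(c * brightness) for c in (r, g, b))

print(led_color("happy"))  # (127, 100, 0)
```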

    Facial emotion recognition using min-max similarity classifier

    Recognition of human emotions from imaging templates is useful in a wide variety of human-computer interaction and intelligent-systems applications. However, the automatic recognition of facial expressions using image template matching techniques suffers from natural variability in facial features and recording conditions. In spite of the progress achieved in facial emotion recognition in recent years, an effective and computationally simple feature selection and classification technique for emotion recognition is still an open problem. In this paper, we propose an efficient and straightforward facial emotion recognition algorithm that reduces the problem of inter-class pixel mismatch during classification. The proposed method applies pixel normalization to remove intensity offsets, followed by a Min-Max metric in a nearest neighbor classifier that is capable of suppressing feature outliers. The results indicate an improvement in recognition performance from 92.85% to 98.57% for the proposed Min-Max classification method when tested on the JAFFE database. The proposed emotion recognition technique outperforms existing template matching methods.
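    A minimal sketch of the described pipeline, assuming the common Min-Max (Ruzicka-style) similarity, i.e. the sum of elementwise minima over the sum of elementwise maxima; the paper's exact formulation may differ.

```python
import numpy as np

def normalize(img):
    """Pixel normalization: rescale to [0, 1] to remove the intensity
    offsets introduced by different recording conditions."""
    img = img.astype(np.float64)
    return (img - img.min()) / (img.max() - img.min() + 1e-12)

def min_max_similarity(a, b):
    """Min-Max similarity: sum of elementwise minima over sum of
    elementwise maxima (in [0, 1] for non-negative inputs); an outlying
    pixel contributes to both sums, which damps its influence."""
    a, b = a.ravel(), b.ravel()
    return np.sum(np.minimum(a, b)) / np.sum(np.maximum(a, b))

def classify(test_img, templates, labels):
    """Nearest-neighbor classification: pick the label of the template
    most similar to the normalized test image."""
    test = normalize(test_img)
    scores = [min_max_similarity(test, normalize(t)) for t in templates]
    return labels[int(np.argmax(scores))]

# Toy usage with two 2x2 "templates" and a test image.
templates = [np.array([[10, 200], [10, 200]]),
             np.array([[200, 10], [200, 10]])]
print(classify(np.array([[12, 190], [15, 195]]), templates, ["happy", "sad"]))
```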

    Dynamic Facial Expression of Emotion Made Easy

    Facial emotion expression for virtual characters is used in a wide variety of areas. Often, the primary reason to use emotion expression is not to study emotion expression generation per se, but to use emotion expression in an application or research project. What is then needed is an easy-to-use, flexible, and validated mechanism for doing so. In this report we present such a mechanism. It enables developers to build virtual characters with dynamic affective facial expressions. The mechanism is based on Facial Action Coding. It is easy to implement, and code is available for download. To show the validity of the expressions generated with the mechanism, we tested the recognition accuracy for 6 basic emotions (joy, anger, sadness, surprise, disgust, fear) and 4 blend emotions (enthusiastic, furious, frustrated, and evil). Additionally, we investigated the effect of virtual character (VC) distance (z-coordinate), the effect of the VC's face morphology (male vs. female), the effect of a lateral versus a frontal presentation of the expression, and the effect of the intensity of the expression. Participants (n=19, Western and Asian subjects) rated the intensity of each expression for each condition (within-subject setup) in a non-forced-choice manner. All of the basic emotions were uniquely perceived as such. Further, the blends and the confusion details of the basic emotions are compatible with findings in psychology.
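    Since the mechanism is based on Facial Action Coding, an expression can be represented as a set of Action Units (AUs) driven at some intensity. The sketch below uses commonly cited EMFACS-style AU combinations for the six basic emotions; the report's actual AU recipes and intensity handling may differ.

```python
# Commonly cited EMFACS-style AU combinations (an approximation; not
# necessarily the exact recipes used in the report).
EMOTION_AUS = {
    "joy":      [6, 12],               # cheek raiser + lip corner puller
    "anger":    [4, 5, 7, 23],         # brow lowerer, lid raiser/tightener, lip tightener
    "sadness":  [1, 4, 15],            # inner brow raiser, brow lowerer, lip corner depressor
    "surprise": [1, 2, 5, 26],         # brow raisers, upper lid raiser, jaw drop
    "disgust":  [9, 15],               # nose wrinkler, lip corner depressor
    "fear":     [1, 2, 4, 5, 20, 26],  # raised/lowered brows, lid raiser, lip stretcher, jaw drop
}

def expression(emotion, intensity=1.0):
    """Return (AU, activation) pairs scaled by an intensity in [0, 1],
    ready to drive a virtual character's facial rig."""
    return [(au, intensity) for au in EMOTION_AUS[emotion]]

print(expression("joy", 0.7))  # [(6, 0.7), (12, 0.7)]
```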

    Affective Medicine: a review of Affective Computing efforts in Medical Informatics

    Background: Affective computing (AC) is concerned with emotional interactions performed with and through computers. It is defined as “computing that relates to, arises from, or deliberately influences emotions”. AC enables investigation and understanding of the relation between human emotions and health, as well as the application of assistive and useful technologies in the medical domain. Objectives: 1) To review the general state of the art in AC and its applications in medicine, and 2) to establish synergies between the research communities of AC and medical informatics. Methods: Aspects related to the human affective state as a determinant of human health are discussed, coupled with an illustration of significant AC research and related literature output. Moreover, affective communication channels are described and their range of application fields is explored through illustrative examples. Results: The conferences, European research projects and research publications presented illustrate the recent increase of interest in the AC area within the medical community. Tele-home healthcare, ambient intelligence (AmI), ubiquitous monitoring, e-learning and virtual communities with emotionally expressive characters for elderly or impaired people are a few of the areas where the potential of AC has been realized and applications have emerged. Conclusions: A number of gaps can potentially be overcome through the synergy of AC and medical informatics. The application of AC technologies parallels the advancement of the existing state of the art and the introduction of new methods. The body of work and projects reviewed in this paper attests to an ambitious and optimistic synergetic future for the affective medicine field.

    Emotion based Facial Animation using Four Contextual Control Modes

    An Embodied Conversational Agent (ECA) is an intelligent agent that interacts with users through verbal and nonverbal expressions. When used as the interface of software applications, the presence of these agents creates a positive impact on user experience. Due to their potential for providing online assistance in areas such as e-commerce, there is an increasing need to make ECAs more believable for the user, which has been achieved mainly through realistic facial animation and emotions. This thesis presents a new approach to ECA modeling that empowers intelligent agents with synthesized emotions. The approach applies the Contextual Control Model for the construction of an emotion generator that uses information obtained from dialogue to select one of four modes for the emotion: the Scrambled, Opportunistic, Tactical, and Strategic modes. The emotions are produced in the format of the Ortony, Clore & Collins (OCC) model for emotion expressions.
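    A minimal sketch of how such a mode selector could look, assuming two hypothetical dialogue-derived cues (time available and situation familiarity); the thesis's actual features and thresholds are not given in the abstract.

```python
# Hypothetical mode selection inspired by the Contextual Control Model;
# the cue names, weights, and cutoffs below are illustrative assumptions.
def select_mode(time_available, situation_familiarity):
    """Pick a control mode from two dialogue-derived cues, each in
    [0, 1]: more time and familiarity favor more deliberate modes."""
    score = 0.5 * time_available + 0.5 * situation_familiarity
    if score < 0.25:
        return "Scrambled"       # reactive, near-random emotional response
    if score < 0.50:
        return "Opportunistic"   # cue-driven, short-horizon response
    if score < 0.75:
        return "Tactical"        # rule- and plan-following response
    return "Strategic"           # long-horizon, goal-driven response

print(select_mode(time_available=0.9, situation_familiarity=0.8))  # Strategic
```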