    Affective games: a multimodal classification system

    Affective gaming is a relatively new field of research that exploits human emotions to influence gameplay for an enhanced player experience. Changes in players' psychology are reflected in their behaviour and physiology, so recognising such variation is a core element of affective games. Complementary sources of affect offer more reliable recognition, especially in contexts where one modality is partial or unavailable. As multimodal recognition systems, affect-aware games are subject to the practical difficulties faced by traditional trained classifiers. In addition, game-specific challenges in data collection and performance arise while attempting to sustain an acceptable level of immersion. Most existing scenarios employ sensors that offer limited freedom of movement, resulting in less realistic experiences. Recent advances now offer technology that allows players to communicate more freely and naturally with the game and, furthermore, to control it without the use of input devices. However, the affective game industry is still in its infancy and needs to catch up with the life-like level of adaptation already provided by graphics and animation.
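
    As a minimal illustration of the complementary-modality idea above, the sketch below (synthetic data and assumed feature sets, not taken from the paper) trains one classifier per modality and fuses their class probabilities, so that a confident prediction from one source can compensate for a weak or noisy one.

    # Hypothetical late-fusion sketch: average class probabilities from a
    # facial-feature classifier and a physiological-signal classifier.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X_face = rng.normal(size=(200, 32))    # stand-in for facial-expression features
    X_physio = rng.normal(size=(200, 8))   # stand-in for heart-rate / skin-conductance features
    y = rng.integers(0, 4, size=200)       # four assumed emotion classes

    face_clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_face, y)
    physio_clf = SVC(probability=True, random_state=0).fit(X_physio, y)

    def fuse(face_feats, physio_feats, w_face=0.5):
        """Weighted average of per-modality class probabilities (late fusion)."""
        p = (w_face * face_clf.predict_proba(face_feats)
             + (1 - w_face) * physio_clf.predict_proba(physio_feats))
        return p.argmax(axis=1)

    print(fuse(X_face[:5], X_physio[:5]))  # fused emotion predictions for 5 samples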

    Thermal emotion recognition

    To improve human-computer interaction in the areas of healthcare, e-learning, and video games, many researchers have studied recognizing emotions from text, speech, facial expression, or electroencephalography (EEG) signals. Among these, emotion recognition using EEG has achieved satisfying accuracy. However, wearing an electroencephalography device limits the user's range of movement, so a non-invasive method is needed to facilitate emotion detection and its applications. We therefore proposed using a thermal camera to capture changes in skin temperature and then applying machine learning algorithms to classify the corresponding emotion changes. This thesis contains two studies on thermal emotion detection, each compared with EEG-based emotion detection. The first aimed to identify thermal emotion detection profiles in comparison with EEG-based emotion detection technology; the second was to implement an application with deep learning algorithms to visually display the accuracy and performance of both thermal and EEG-based emotion detection. In the first study, we applied hidden Markov models (HMMs) to thermal emotion recognition and, after comparison with EEG-based emotion detection, identified emotion-related features of skin temperature in terms of intensity and rapidity. In the second study, we implemented an emotion detection application supporting both thermal and EEG-based emotion detection using the deep learning methods Convolutional Neural Network (CNN) and Long Short-Term Memory (LSTM). The accuracy of thermal-image-based emotion detection reached 52.59% and that of EEG-based detection reached 67.05%. In future work, we will investigate tuning the machine learning algorithms to improve the accuracy of thermal emotion detection.
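
    To make the second study's setup concrete, the sketch below shows one plausible way to instantiate the two model families named in the abstract: a small CNN over thermal-image frames and an LSTM over EEG windows, each ending in a softmax over emotion classes. The input shapes, channel counts, and layer sizes are assumptions for illustration, not the thesis configuration.

    # Minimal sketch (assumed shapes, not the thesis code): a CNN for thermal
    # frames and an LSTM for EEG sequences, each classifying emotion classes.
    from tensorflow import keras
    from tensorflow.keras import layers

    n_classes = 4  # assumed number of emotion classes

    # CNN branch for thermal images (assumed 64x64 single-channel frames).
    thermal_cnn = keras.Sequential([
        keras.Input(shape=(64, 64, 1)),
        layers.Conv2D(16, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(n_classes, activation="softmax"),
    ])

    # LSTM branch for EEG (assumed 128 time steps x 14 channels per window).
    eeg_lstm = keras.Sequential([
        keras.Input(shape=(128, 14)),
        layers.LSTM(64),
        layers.Dense(n_classes, activation="softmax"),
    ])

    for model in (thermal_cnn, eeg_lstm):
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
    # Each model would then be trained separately with model.fit(x, y, epochs=...).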

    Face Emotion Recognition Based on Machine Learning: A Review

    Computers can now detect, understand, and evaluate emotions thanks to recent developments in machine learning and information fusion. Researchers across various sectors are increasingly intrigued by emotion identification, utilizing facial expressions, words, body language, and posture as means of discerning an individual's emotions. Nevertheless, the effectiveness of the first three methods may be limited, as individuals can consciously or unconsciously suppress their true feelings. This article explores various feature extraction techniques together with the development of machine learning classifiers such as k-nearest neighbour, naive Bayes, support vector machine, and random forest, in accordance with the established standard for emotion recognition. The paper has three primary objectives: first, to offer a comprehensive overview of affective computing by outlining essential theoretical concepts; second, to describe the current state of the art in emotion recognition in detail; and third, to highlight important findings and conclusions from the literature, with an emphasis on key obstacles and possible future directions, especially the development of state-of-the-art machine learning algorithms for emotion identification.
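
    As a minimal illustration of the classifier families the review covers, the sketch below cross-validates k-NN, naive Bayes, SVM, and random forest models on synthetic stand-ins for facial-expression features; the feature layout, class count, and resulting scores are illustrative assumptions, not results from the review.

    # Illustrative comparison of classical emotion classifiers on synthetic data.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.naive_bayes import GaussianNB
    from sklearn.svm import SVC
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(1)
    X = rng.normal(size=(300, 68 * 2))   # e.g., flattened facial landmark coordinates
    y = rng.integers(0, 7, size=300)     # seven basic emotion classes (assumed)

    classifiers = {
        "k-NN": KNeighborsClassifier(n_neighbors=5),
        "Naive Bayes": GaussianNB(),
        "SVM": SVC(kernel="rbf"),
        "Random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    }
    for name, clf in classifiers.items():
        scores = cross_val_score(clf, X, y, cv=5)   # 5-fold cross-validation accuracy
        print(f"{name}: mean accuracy {scores.mean():.2f}")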

    Emotion-reacting fashion design: intelligent garment and accessory recognizing facial expressions

    Although mental disorders have emerged as serious social challenges, social stigma, including prejudice and misunderstanding, hinders suitable treatment for patients. It is crucial to monitor our internal psychological and emotional states to avoid the unconscious progression of mental disorders. This research aims to achieve emotion-reacting garments and accessories based on a passive, continuous, real-time emotion recognition system. First, this study proposes a systematic design for emotion-reacting garments and accessories that employs emotion estimation based on facial expressions. Next, emotion-reacting fashion design is discussed for intelligent garments and accessories that interact with our bodies and minds. To achieve this system, a functionally extended collar made of transparent polycarbonate is designed for integration with the digital camera modules. In addition, this study discusses how to create physical stimuli on emotion-reacting garments and accessories. The intelligent garments and accessories use RGB LEDs to create visual effects that reflect emotions; for audio effects, emotion-related keywords are employed to select the music played by the intelligent garments. Finally, prototypes reacting to emotions are shown.
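
    A hypothetical sketch of the reaction step is given below: it maps an estimated emotion label to an RGB-LED colour and a music search keyword. The specific colour palette and keywords are illustrative assumptions, not the design choices reported in the paper.

    # Hypothetical emotion-to-output mapping for an emotion-reacting garment.
    from typing import Tuple

    EMOTION_TO_RGB = {            # (R, G, B), 0-255; assumed palette
        "happy":   (255, 200, 0),
        "sad":     (0, 80, 255),
        "angry":   (255, 0, 0),
        "neutral": (255, 255, 255),
    }
    EMOTION_TO_MUSIC_KEYWORD = {  # assumed keyword list for music selection
        "happy": "upbeat", "sad": "calm", "angry": "soothing", "neutral": "ambient",
    }

    def react_to_emotion(label: str) -> Tuple[Tuple[int, int, int], str]:
        """Return the LED colour and music keyword for a detected emotion label."""
        rgb = EMOTION_TO_RGB.get(label, EMOTION_TO_RGB["neutral"])
        keyword = EMOTION_TO_MUSIC_KEYWORD.get(label, "ambient")
        return rgb, keyword

    print(react_to_emotion("happy"))   # ((255, 200, 0), 'upbeat')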

    IoT DEVELOPMENT FOR HEALTHY INDEPENDENT LIVING

    The rise of internet-connected devices has brought the home a vast range of enhancements that make life more convenient. These internet-connected devices can form a community of devices known as the Internet of Things (IoT). IoT devices hold great value for promoting healthy independent living for older adults. Fall-related injuries have been one of the leading causes of death in older adults; for example, every year more than a third of people over 65 in the U.S. experience a fall, of which up to 30 percent result in moderate to severe injury. Therefore, this thesis proposes an IoT-based fall detection system for smart home environments that not only sends out alerts but also launches interaction models such as voice assistance and camera monitoring. Such connectivity could allow older adults to interact with the system without worrying about a learning curve. The proposed IoT-based fall detection system will enable family and caregivers to be notified of the event immediately and to monitor the individual remotely. Integrated within a smart home environment, the proposed IoT-based fall detection system can improve the quality of life of older adults. Along with physical health concerns, psychological stress is also a great concern among older adults. Stress has been linked to emotional and physical conditions such as depression, anxiety, heart attacks, and stroke. Increased susceptibility to stress may accelerate cognitive decline, resulting in the conversion of cognitively normal older adults to mild cognitive impairment (MCI), and of MCI to dementia. Thus, if stress can be measured, countermeasures can be put in place to reduce stress and its negative effects on the psychological and physical health of older adults. This thesis presents a framework that can be used to collect and pre-process physiological data for the purpose of validating galvanic skin response (GSR), heart rate (HR), and emotional valence (EV) measurements against cortisol and self-reporting benchmarks for stress detection. The results of this framework can be used for feature extraction to feed a regression model for validating each combination of physiological measurements. The framework's potential to automate stress protocols such as the Trier Social Stress Test (TSST) could also pave the way for an IoT-based platform for automated stress detection and management.
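
    To illustrate the validation idea in the stress-detection framework, the sketch below extracts simple window statistics from GSR and heart-rate streams and regresses them against a stress benchmark such as a cortisol measurement. The signals, window length, features, and regression model are illustrative assumptions rather than the thesis implementation.

    # Sketch: window statistics from GSR and HR regressed against a stress benchmark.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(2)
    gsr = rng.normal(size=(20, 240))            # 20 participants x 240 GSR samples (synthetic)
    hr = rng.normal(70, 5, size=(20, 240))      # heart-rate samples (synthetic)
    stress_benchmark = rng.normal(size=20)      # e.g., cortisol level or self-report score

    def window_features(signal):
        """Mean, standard deviation, and slope of each participant's signal window."""
        t = np.arange(signal.shape[1])
        slope = np.polyfit(t, signal.T, 1)[0]   # per-participant linear trend
        return np.column_stack([signal.mean(axis=1), signal.std(axis=1), slope])

    X = np.hstack([window_features(gsr), window_features(hr)])
    model = LinearRegression().fit(X, stress_benchmark)
    print("R^2 against benchmark:", model.score(X, stress_benchmark))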

    Emotional Brain-Computer Interfaces

    Research on brain-computer interfaces (BCIs) has increased significantly during the last few years. In addition to their initial role as assistive devices for the physically challenged, BCIs are now proposed for a wider range of applications. As in any HCI application, BCIs can also benefit from adapting their operation to the emotional state of the user. BCIs have the advantage of having access to brain activity, which can provide significant insight into the user's emotional state. This information can be utilized in two manners. 1) Knowledge of the influence of the emotional state on brain activity patterns can allow the BCI to adapt its recognition algorithms, so that the intention of the user is still correctly interpreted in spite of signal deviations induced by the subject's emotional state. 2) The ability to recognize emotions can be used in BCIs to provide the user with more natural ways of controlling the BCI through affective modulation. Thus, controlling a BCI by recollecting a pleasant memory can be possible and can potentially lead to higher information transfer rates. These two approaches to emotion utilization in BCI are elaborated in detail in this paper in the framework of noninvasive EEG-based BCIs.
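
    Both uses above start from affect-related features of the EEG itself. The sketch below shows one common way such features can be computed, average power in the theta, alpha, and beta bands per channel; the sampling rate, band limits, and channel count are assumptions for illustration and are not taken from the paper.

    # Sketch: EEG band-power features that an affect-aware BCI could feed to a classifier.
    import numpy as np
    from scipy.signal import welch

    FS = 256                      # assumed sampling rate (Hz)
    BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

    def band_powers(eeg_window):
        """Average power per frequency band for a (channels x samples) EEG window."""
        freqs, psd = welch(eeg_window, fs=FS, nperseg=FS)
        feats = []
        for lo, hi in BANDS.values():
            mask = (freqs >= lo) & (freqs < hi)
            feats.append(psd[:, mask].mean(axis=1))
        return np.concatenate(feats)   # one feature vector per window

    window = np.random.default_rng(3).normal(size=(14, 2 * FS))  # 14 channels, 2 s of signal
    print(band_powers(window).shape)   # (3 bands x 14 channels,) = (42,)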

    Brain Computer Interfaces and Emotional Involvement: Theory, Research, and Applications

    This reprint is dedicated to the study of brain activity related to emotional and attentional involvement as measured by brain–computer interface (BCI) systems designed for different purposes. A BCI system can translate brain signals (e.g., electrical or hemodynamic indicators of brain activity) into a command to execute an action in the BCI application (e.g., a wheelchair, the cursor on a screen, a spelling device, or a game). These tools have the advantage of real-time access to the individual's ongoing brain activity, which can provide insight into the user's emotional and attentional states by training a classification algorithm to recognize mental states. The success of BCI systems in contemporary neuroscientific research relies on the fact that they allow one to “think outside the lab”. The integration of technological solutions, artificial intelligence, and cognitive science has allowed, and will continue to allow, researchers to envision more and more applications for the future. Clinical and everyday uses are described with the aim of inviting readers to imagine potential further developments.