
    Critical Analysis on Multimodal Emotion Recognition in Meeting the Requirements for Next Generation Human Computer Interactions

    Emotion recognition remains a gap in today’s Human Computer Interaction (HCI): the inability of these systems to effectively recognize, express, and respond to emotion limits their interaction with humans, and they still lack adequate sensitivity to human emotions. Multimodal emotion recognition attempts to address this gap by measuring emotional state from gestures, facial expressions, acoustic characteristics, and textual expressions. Multimodal data acquired from video, audio, sensors, etc. are combined using various techniques to classify basic human emotions such as happiness, joy, neutrality, surprise, sadness, disgust, fear, and anger. This work presents a critical analysis of multimodal emotion recognition approaches in meeting the requirements of next generation human computer interactions. The study first explores and defines the requirements of next generation human computer interactions, and then critically analyzes the existing multimodal emotion recognition approaches in addressing those requirements.
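    The decision-level ("late") fusion this abstract alludes to can be sketched as follows: each modality produces its own probability distribution over a set of basic emotions, and the distributions are combined by a weighted average before picking the most likely class. The emotion list, equal weights, and per-modality probabilities below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Basic emotion categories, as commonly used in multimodal emotion recognition.
EMOTIONS = ["happiness", "surprise", "sadness", "disgust", "fear", "anger", "neutral"]

def fuse_predictions(per_modality_probs, weights=None):
    """Decision-level fusion: weighted average of per-modality class probabilities."""
    probs = np.asarray(per_modality_probs, dtype=float)  # shape (n_modalities, n_classes)
    if weights is None:
        weights = np.ones(len(probs)) / len(probs)       # equal weight per modality
    fused = np.average(probs, axis=0, weights=weights)
    return EMOTIONS[int(np.argmax(fused))], fused

# Hypothetical outputs of separate facial, acoustic, and textual classifiers:
face  = [0.60, 0.05, 0.05, 0.05, 0.05, 0.05, 0.15]
audio = [0.40, 0.10, 0.10, 0.05, 0.05, 0.10, 0.20]
text  = [0.50, 0.05, 0.10, 0.05, 0.05, 0.05, 0.20]
label, fused = fuse_predictions([face, audio, text])  # label → "happiness"
```

    Late fusion is only one of the combination techniques the survey covers; feature-level (early) fusion concatenates per-modality features into a single vector before classification instead.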

    The perception of emotion in artificial agents

    Given recent technological developments in robotics, artificial intelligence and virtual reality, it is perhaps unsurprising that the arrival of emotionally expressive and reactive artificial agents is imminent. However, if such agents are to become integrated into our social milieu, it is imperative to establish an understanding of whether and how humans perceive emotion in artificial agents. In this review, we incorporate recent findings from social robotics, virtual reality, psychology, and neuroscience to examine how people recognize and respond to emotions displayed by artificial agents. First, we review how people perceive emotions expressed by an artificial agent, such as facial and bodily expressions and vocal tone. Second, we evaluate the similarities and differences in the consequences of perceived emotions in artificial compared to human agents. Besides accurately recognizing the emotional state of an artificial agent, it is critical to understand how humans respond to those emotions. Does interacting with an angry robot induce the same responses in people as interacting with an angry person? Similarly, does watching a robot rejoice when it wins a game elicit similar feelings of elation in the human observer? Here we provide an overview of the current state of emotion expression and perception in social robotics, as well as a clear articulation of the challenges and guiding principles to be addressed as we move ever closer to truly emotional artificial agents.

    Ubiquitous Technologies for Emotion Recognition

    Emotions play a very important role in how we think and behave. As such, the emotions we feel every day can compel us to act and influence the decisions and plans we make about our lives. Being able to measure, analyze, and better comprehend how or why our emotions may change is thus highly relevant to understanding human behavior and its consequences. Despite the great efforts made in the past in the study of human emotions, it is only now, with the advent of wearable, mobile, and ubiquitous technologies, that we can aim to sense and recognize emotions continuously and in real time. This book brings together the latest experiences, findings, and developments regarding ubiquitous sensing, modeling, and the recognition of human emotions.

    Affective automotive user interfaces

    Technological progress in the fields of ubiquitous sensing and machine learning has been fueling the development of user-aware human-computer interaction in recent years. Especially natural user interfaces, like digital voice assistants, can benefit from understanding their users in order to provide a more naturalistic experience. Such systems can, for example, detect the emotional state of users and accordingly act in an empathic way. One major research field working on this topic is Affective Computing, where psycho-physiological measures, speech input, and facial expressions are used to sense human emotions. Affective data allows natural user interfaces to respond to emotions, providing promising perspectives not only for user experience design but also for safety aspects. In automotive environments, informed estimations of the driver’s state can potentially avoid dangerous errors, and evoking positive emotions can improve the experience of driving. This dissertation explores Affective Automotive User Interfaces using two basic interaction paradigms: firstly, emotion regulation systems react to the current emotional state of the user based on live sensing data, allowing for quick interventions. Secondly, emotional interaction synthesizes experiences which resonate with the user on an emotional level. The constituted goals of these two interaction approaches are the promotion of safe behavior and an improvement of user experience. Promoting safe behavior through emotion regulation: Systems which detect and react to the driver’s state are expected to have great potential for improving road safety. This work presents a model and methods needed to investigate such systems and an exploration of several approaches to keep the driver in a safe state. The presented methods include techniques to induce emotions and to sample the emotional state of drivers. Three driving simulator studies investigate the impacts of emotion-aware interventions in the form of implicit cues, visual mirroring, and empathic speech synthesis. We envision emotion-awareness as a safety feature which can detect if a driver is unfit or in need of support, based on the propagation of robust emotion detection technology. Improving user experience with emotional interaction: Emotional perception is an essential part of user experience. This thesis entails methods to build emotional experiences derived from a variety of lab and simulator studies, expert feedback, car-storming sessions, and design thinking workshops. Systems capable of adapting to the user’s preferences and traits in order to create an emotionally satisfactory user experience do not require the input of emotion detection. They rather create value through general knowledge about the user by adapting the output they generate. During this research, cultural and generational influences became evident, which have to be considered when implementing affective automotive user interfaces in future cars. We argue that the future of user-aware interaction lies in adapting not only to the driver’s preferences and settings but also to their current state. This paves the way for the regulation of safe behavior, especially in safety-critical environments like cars, and an improvement of the driving experience.
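    The first interaction paradigm described above, an emotion regulation system reacting to live sensing data, might look roughly like the following loop: watch the stream of detected driver states and trigger an intervention when an unsafe state persists. The state names, the persistence window, and the intervention type are illustrative assumptions; the dissertation does not specify its systems at this level of detail.

```python
# States treated as unsafe for driving; hypothetical labels for illustration.
UNSAFE = {"anger", "anxiety"}

def regulation_loop(detected_states, persistence=3):
    """Return (time_step, intervention) events for sustained runs of unsafe states.

    An intervention fires only once per run, when the unsafe state has
    persisted for `persistence` consecutive detections, to avoid reacting
    to single noisy classifier outputs.
    """
    events, run = [], 0
    for t, state in enumerate(detected_states):
        run = run + 1 if state in UNSAFE else 0
        if run == persistence:
            events.append((t, "empathic_speech_prompt"))
    return events

# One sustained angry episode triggers exactly one intervention:
events = regulation_loop(["neutral", "anger", "anger", "anger", "neutral", "anger"])
```

    The debouncing via the persistence window reflects the abstract's caveat that such safety features depend on robust emotion detection: a single misclassified frame should not trigger an intervention.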

    Approaches, applications, and challenges in physiological emotion recognition — a tutorial overview

    An automatic emotion recognition system can serve as a fundamental framework for various applications in daily life, from monitoring emotional well-being to improving the quality of life through better emotion regulation. Understanding the process of emotion manifestation becomes crucial for building emotion recognition systems. An emotional experience results in changes not only in interpersonal behavior but also in physiological responses. Physiological signals are one of the most reliable means for recognizing emotions, since individuals cannot consciously manipulate them for a long duration. These signals can be captured by medical-grade wearable devices, as well as commercial smartwatches and smart bands. With the shift in research direction from the laboratory to unrestricted daily life, commercial devices have been employed ubiquitously. However, this shift has introduced several challenges, such as low data quality, dependency on subjective self-reports, and unconstrained movement-related changes and artifacts in physiological signals. This tutorial provides an overview of practical aspects of emotion recognition, such as experiment design, properties of different physiological modalities, existing datasets, suitable machine learning algorithms for physiological data, and several applications. It aims to provide the necessary psychological and physiological backgrounds through various emotion theories and the physiological manifestation of emotions, thereby laying a foundation for emotion recognition. Finally, the tutorial discusses open research directions and possible solutions.
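    As a rough illustration of the machine-learning step such a tutorial covers, the sketch below extracts simple statistical features (mean and standard deviation) from windows of a physiological signal, here hypothetical heart-rate readings in beats per minute, and classifies them with a nearest-centroid rule. The feature choice, class labels, and toy training data are assumptions for illustration, not from the tutorial itself.

```python
import numpy as np

def window_features(signal):
    """Summarize one signal window as (mean, standard deviation)."""
    s = np.asarray(signal, dtype=float)
    return np.array([s.mean(), s.std()])

class NearestCentroid:
    """Assign a window to the class whose feature centroid is closest."""
    def fit(self, X, y):
        self.labels_ = sorted(set(y))
        self.centroids_ = {c: np.mean([x for x, t in zip(X, y) if t == c], axis=0)
                           for c in self.labels_}
        return self
    def predict(self, x):
        return min(self.labels_, key=lambda c: np.linalg.norm(x - self.centroids_[c]))

# Toy heart-rate windows (bpm) with self-reported labels:
calm     = [window_features([62, 63, 61, 64, 62]), window_features([60, 61, 60, 62, 61])]
stressed = [window_features([95, 99, 102, 97, 100]), window_features([90, 94, 96, 92, 95])]
clf = NearestCentroid().fit(calm + stressed, ["calm"] * 2 + ["stressed"] * 2)
pred = clf.predict(window_features([93, 97, 95, 98, 96]))  # → "stressed"
```

    Real pipelines of this kind add artifact removal and per-person baseline normalization before feature extraction, precisely because of the data-quality and movement-artifact challenges the abstract names.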


    Brain Computer Interfaces and Emotional Involvement: Theory, Research, and Applications

    This reprint is dedicated to the study of brain activity related to emotional and attentional involvement as measured by brain–computer interface (BCI) systems designed for different purposes. A BCI system can translate brain signals (e.g., electric or hemodynamic brain activity indicators) into a command to execute an action in the BCI application (e.g., a wheelchair, the cursor on the screen, a spelling device or a game). These tools have the advantage of having real-time access to the ongoing brain activity of the individual, which can provide insight into the user’s emotional and attentional states by training a classification algorithm to recognize mental states. The success of BCI systems in contemporary neuroscientific research relies on the fact that they allow one to “think outside the lab”. The integration of technological solutions, artificial intelligence and cognitive science has allowed, and will continue to allow, researchers to envision more and more applications for the future. The clinical and everyday uses are described with the aim of inviting readers to imagine potential further developments.
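    The translation step described above, from a classified mental state to an application command, can be sketched minimally: pick the most probable state and emit its command only when the classifier is sufficiently confident, otherwise do nothing. The state names, command table, and confidence threshold are illustrative assumptions, not from any particular BCI described in the reprint.

```python
# Hypothetical mapping from recognized mental states to cursor commands.
COMMANDS = {"imagine_left": "cursor_left", "imagine_right": "cursor_right", "rest": "no_op"}

def translate(probabilities, threshold=0.6):
    """Map a classifier's per-state probabilities to one application command.

    Falls back to "no_op" when no state is confident enough, so that
    uncertain classifications do not move the cursor.
    """
    state = max(probabilities, key=probabilities.get)
    if probabilities[state] < threshold:
        return "no_op"
    return COMMANDS[state]

cmd = translate({"imagine_left": 0.72, "imagine_right": 0.18, "rest": 0.10})  # → "cursor_left"
```

    The confidence gate is a common design choice in online BCIs: issuing no command is usually cheaper than issuing a wrong one.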
