6,829 research outputs found

    Affective automotive user interfaces

    Technological progress in the fields of ubiquitous sensing and machine learning has been fueling the development of user-aware human-computer interaction in recent years. Natural user interfaces in particular, such as digital voice assistants, can benefit from understanding their users in order to provide a more naturalistic experience. Such systems can, for example, detect the emotional state of users and act empathically in response. One major research field working on this topic is Affective Computing, where psycho-physiological measures, speech input, and facial expressions are used to sense human emotions. Affective data allows natural user interfaces to respond to emotions, opening promising perspectives not only for user experience design but also for safety. In automotive environments, informed estimates of the driver’s state can potentially prevent dangerous errors, and evoking positive emotions can improve the experience of driving. This dissertation explores Affective Automotive User Interfaces using two basic interaction paradigms: first, emotion regulation systems react to the current emotional state of the user based on live sensing data, allowing for quick interventions; second, emotional interaction synthesizes experiences which resonate with the user on an emotional level. The goals of these two interaction approaches are the promotion of safe behavior and an improvement of user experience.

    Promoting safe behavior through emotion regulation: systems which detect and react to the driver’s state are expected to have great potential for improving road safety. This work presents a model and the methods needed to investigate such systems, along with an exploration of several approaches to keeping the driver in a safe state. The presented methods include techniques to induce emotions and to sample the emotional state of drivers. Three driving simulator studies investigate the impact of emotion-aware interventions in the form of implicit cues, visual mirroring, and empathic speech synthesis. We envision emotion-awareness as a safety feature which can detect whether a driver is unfit or in need of support, provided robust emotion detection technology becomes widely available.

    Improving user experience with emotional interaction: emotional perception is an essential part of user experience. This thesis presents methods for building emotional experiences, derived from a variety of lab and simulator studies, expert feedback, car-storming sessions, and design thinking workshops. Systems that adapt to the user’s preferences and traits in order to create an emotionally satisfying user experience do not require emotion detection as input; they rather create value through general knowledge about the user by adapting the output they generate. During this research, cultural and generational influences became evident which have to be considered when implementing affective automotive user interfaces in future cars. We argue that the future of user-aware interaction lies in adapting not only to the driver’s preferences and settings but also to their current state. This paves the way for the regulation of safe behavior, especially in safety-critical environments like cars, and for an improvement of the driving experience.
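
    A minimal sketch of the first paradigm, an emotion-regulation loop, may help make this concrete. It assumes a hypothetical upstream detector that yields valence/arousal estimates; the thresholds and intervention names below are illustrative, not taken from the thesis:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EmotionEstimate:
    """Live driver-state estimate in the valence/arousal plane, as it
    might be produced by an upstream emotion detection stack."""
    valence: float  # -1 (negative) .. +1 (positive)
    arousal: float  #  0 (calm)     ..  1 (highly aroused)

def choose_intervention(state: EmotionEstimate) -> Optional[str]:
    """Map the detected state to one of the intervention styles studied
    in the dissertation: implicit cues, visual mirroring, empathic speech.
    All thresholds here are illustrative, not empirically derived."""
    if state.arousal > 0.8 and state.valence < -0.3:
        return "empathic_speech"   # de-escalating voice prompt, e.g. for anger
    if state.arousal < 0.2:
        return "implicit_cue"      # subtle ambient cue to counter low alertness
    if state.valence < -0.5:
        return "visual_mirroring"  # reflect the sensed state back to the driver
    return None                    # driver appears to be in a safe state

print(choose_intervention(EmotionEstimate(valence=-0.6, arousal=0.9)))
# -> 'empathic_speech'
```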

    Human-Car confluence: “Socially-Inspired driving mechanisms”

    With self-driving vehicles announced for the 2020s, today’s challenges in Intelligent Transportation Systems (ITS) lie in problems of negotiation and decision making in (spontaneously formed) car collectives. Due to the close coupling and interconnectedness of the involved driver-vehicle entities, effects on the local level, induced by the cognitive capacities, behavioral patterns, and social context of drivers, directly cause changes on the macro scale. To illustrate, a driver’s fatigue or emotion can influence the local driver-vehicle feedback loop, which translates directly into his or her driving style and, in turn, can affect the driving styles of all nearby drivers. These transient yet collective changes in driver state and driving style give rise to global traffic phenomena such as jams and collective aggressiveness. To allow the harmonious coexistence of autonomous and human-driven vehicles, we investigate in this chapter the effects of socially-inspired driving and discuss the potential beneficial effects its application could have on collective traffic.
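
    To illustrate how such local driver-state effects can scale up, here is a toy simulation (not a model from the chapter) that couples the standard Intelligent Driver Model with a hypothetical per-driver arousal parameter and a simple contagion term:

```python
import numpy as np

def idm_acceleration(v, gap, dv, arousal,
                     a_max=1.5, b=2.0, v0=30.0, T=1.5, s0=2.0):
    """Intelligent Driver Model (IDM) acceleration, modulated by a
    per-driver 'arousal' in [0, 1]: higher arousal raises the desired
    speed and shortens the desired time headway -- a crude stand-in
    for the driver-state effects discussed in the chapter."""
    v0_eff = v0 * (1.0 + 0.2 * arousal)
    T_eff = T * (1.0 - 0.4 * arousal)
    s_star = s0 + v * T_eff + v * dv / (2.0 * np.sqrt(a_max * b))
    return a_max * (1.0 - (v / v0_eff) ** 4 - (s_star / gap) ** 2)

# Single-lane ring road with 20 driver-vehicle entities.
n, length, dt = 20, 600.0, 0.1
rng = np.random.default_rng(1)
x = np.linspace(0.0, length, n, endpoint=False)   # evenly spaced start
v = np.full(n, 25.0)
arousal = rng.uniform(0.0, 1.0, n)

for _ in range(int(120 / dt)):                    # two simulated minutes
    gap = (np.roll(x, -1) - x) % length           # distance to leader
    dv = v - np.roll(v, -1)                       # closing speed on leader
    v = np.maximum(v + idm_acceleration(v, gap, dv, arousal) * dt, 0.0)
    x = (x + v * dt) % length
    # Toy 'social contagion': arousal drifts toward the leader's arousal.
    arousal += 0.005 * (np.roll(arousal, -1) - arousal)

print(f"mean speed {v.mean():.1f} m/s, spread {v.std():.1f} m/s")
```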

    On driver behavior recognition for increased safety: A roadmap

    Advanced Driver-Assistance Systems (ADASs) are used to increase safety in the automotive domain, yet current ADASs notably operate without taking the driver’s state into account, e.g., whether he or she is emotionally fit to drive. In this paper, we first review the state of the art of emotional and cognitive analysis for ADAS: we consider psychological models, the sensors needed for capturing physiological signals, and the typical algorithms used for human emotion classification. Our investigation highlights a lack of advanced Driver Monitoring Systems (DMSs) for ADASs, which could increase driving quality and safety for both drivers and passengers. We then present our view on a novel perception architecture for driver monitoring, built around the concept of the Driver Complex State (DCS). The DCS relies on multiple non-obtrusive sensors and Artificial Intelligence (AI) to uncover the driver’s state, and uses it to implement innovative Human–Machine Interface (HMI) functionalities. This concept will be implemented and validated in the recently EU-funded NextPerception project, which is briefly introduced.
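
    The abstract does not specify how the DCS combines modalities; a minimal late-fusion sketch, with all sensor names, state labels, and reliability weights hypothetical, could look like this:

```python
import numpy as np

# Hypothetical per-modality classifier outputs: probability distributions
# over driver states.
STATES = ("alert", "distracted", "drowsy", "agitated")

def fuse_driver_state(modality_probs, reliabilities):
    """Late fusion of per-sensor estimates by reliability-weighted
    log-probability averaging (a common, simple fusion scheme)."""
    logp = np.zeros(len(STATES))
    for probs, w in zip(modality_probs, reliabilities):
        logp += w * np.log(np.clip(probs, 1e-6, 1.0))
    p = np.exp(logp - logp.max())   # renormalize in a numerically stable way
    return p / p.sum()

# Example: camera-, steering-pattern-, and heart-rate-based estimates.
camera   = np.array([0.10, 0.70, 0.15, 0.05])
steering = np.array([0.20, 0.50, 0.25, 0.05])
heart    = np.array([0.30, 0.30, 0.10, 0.30])

fused = fuse_driver_state([camera, steering, heart],
                          reliabilities=[1.0, 0.8, 0.5])
print(dict(zip(STATES, fused.round(3))))  # most mass on "distracted"
```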

    Towards a Personalized Multi-Domain Digital Neurophenotyping Model for the Detection and Treatment of Mood Trajectories

    The commercial availability of many real-life smart sensors, wearables, and mobile apps provides a valuable source of information about a wide range of human behavioral, physiological, and social markers that can be used to infer the user’s mental state and mood. However, there are currently no commercial digital products that integrate these psychosocial metrics with the real-time measurement of neural activity. In particular, electroencephalography (EEG) is a well-validated and highly sensitive neuroimaging method that yields robust markers of mood and affective processing, and has been widely used in mental health research for decades. The integration of wearable neuro-sensors into existing multimodal sensor arrays could hold great promise for deep digital neurophenotyping in the detection and personalized treatment of mood disorders. In this paper, we propose a multi-domain digital neurophenotyping model based on the socioecological model of health. The proposed model presents a holistic approach to digital mental health, leveraging recent neuroscientific advances, and could deliver highly personalized diagnoses and treatments. The technological and ethical challenges of this model are discussed.
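
    As one example of the kind of EEG mood marker such a model could ingest, the sketch below (an illustration, not the authors’ pipeline) computes frontal alpha asymmetry, a widely used EEG correlate of affective style, from two frontal channels:

```python
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate in Hz

def alpha_power(channel, fs=FS, band=(8.0, 13.0)):
    """Mean alpha-band PSD of one EEG channel, via Welch's method."""
    freqs, psd = welch(channel, fs=fs, nperseg=2 * fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def frontal_alpha_asymmetry(f3, f4):
    """ln(right) - ln(left) frontal alpha power. Since alpha is inversely
    related to cortical activity, higher scores index relatively greater
    left-frontal activity, commonly linked to approach-related affect."""
    return np.log(alpha_power(f4)) - np.log(alpha_power(f3))

# Demo on synthetic stand-ins for 10 s of F3/F4 recordings.
t = np.arange(0, 10, 1 / FS)
rng = np.random.default_rng(0)
f3 = 1.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.5, t.size)
f4 = 0.5 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.5, t.size)
print(f"FAA score: {frontal_alpha_asymmetry(f3, f4):+.3f}")
```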

    Tune in to your emotions: a robust personalized affective music player

    The emotional power of music is exploited in a personalized affective music player (AMP) that selects music for mood enhancement. A biosignal approach is used to measure listeners’ personal emotional reactions to their own music as input for affective user models. Regression and kernel density estimation are applied to model the physiological changes the music elicits. Using these models, personalized music selections based on an affective goal state can be made. The AMP was validated in real-world trials over the course of several weeks. Results show that our models can cope with noisy situations and handle large inter-individual differences in the music domain. The AMP augments music listening by enabling automated affect guidance. Our approach provides valuable insights for affective computing and user modeling, for which the AMP is a suitable carrier application.
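
    A minimal sketch of the kernel-density idea, assuming per-song physiological response samples and a one-dimensional goal state (the paper’s actual features and models are richer):

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical training data: per-song physiological responses for one
# listener, e.g. normalized skin conductance level while the song played.
responses = {
    "song_a": np.array([0.10, 0.15, 0.12, 0.20, 0.18]),
    "song_b": np.array([0.60, 0.55, 0.70, 0.65, 0.60]),
    "song_c": np.array([0.40, 0.35, 0.45, 0.50, 0.38]),
}

# Model each song's elicited response with a kernel density estimate.
models = {song: gaussian_kde(x) for song, x in responses.items()}

def pick_song(goal_state, models):
    """Choose the song whose response model puts the most density
    on the desired physiological goal state."""
    return max(models, key=lambda s: models[s](np.atleast_1d(goal_state))[0])

print(pick_song(0.17, models))  # calming goal    -> song_a
print(pick_song(0.62, models))  # energizing goal -> song_b
```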

    Emotion-aware voice interfaces based on speech signal processing

    Voice interfaces (VIs) will become increasingly widespread in daily life as AI techniques progress. VIs can be incorporated into smart devices like smartphones, as well as integrated into cars, home automation systems, computer operating systems, and home appliances. Current speech interfaces, however, are unaware of users’ emotional states and hence cannot support genuine communication. To overcome these limitations, it is necessary to implement emotional awareness in future VIs. This thesis focuses on how speech signal processing (SSP) and speech emotion recognition (SER) can enable VIs to gain emotional awareness. Following an explanation of what emotion is and how neural networks are implemented, this thesis presents the results of several user studies and surveys. Emotions are complex and are typically characterized using categorical and dimensional models; they can be expressed verbally or nonverbally. Although existing voice interfaces are unaware of users’ emotional states and cannot support natural conversations, future VIs can perceive users’ emotions from speech based on SSP. One section of this thesis, based on SSP, investigates mental restorative effects on humans and how they can be measured from speech signals. SSP is less intrusive and more accessible than traditional measures such as attention scales or response tests, and it can provide a reliable assessment of attention and mental restoration. SSP can be implemented in future VIs and used in future HCI user research. The thesis then presents a novel attention neural network based on sparse correlation features. Its detection accuracy for emotions in continuous speech was demonstrated in a user study using recordings from a real classroom, with promising results. In SER research, it is unknown whether existing emotion detection methods detect acted emotions or the genuine emotions of the speaker. Another section of this thesis is therefore concerned with humans’ ability to act out emotions. In a user study, participants were instructed to imitate five fundamental emotions. The results revealed that they struggled with this task; nevertheless, certain emotions were easier to replicate than others. A further research concern is how VIs should respond to users’ emotions once SER techniques are implemented in VIs and can recognize users’ emotions. The thesis includes research on ways of dealing with users’ emotions. In a user study, users were instructed to make sad, angry, and frightened VI avatars happy and were asked whether they would like to be treated the same way if the situation were reversed. According to the results, the majority of participants tended to respond to these unpleasant emotions with a neutral emotion, but there is a difference between genders in emotion selection. For a human-centered design approach, it is important to understand users’ preferences for future VIs. A questionnaire-based survey on users’ attitudes towards and preferences for emotion-aware VIs was conducted in three distinct cultures. It revealed almost no gender differences, and cluster analysis found three fundamental user types that exist in all cultures: Enthusiasts, Pragmatists, and Sceptics. Future VI development should therefore consider these different types of users.
    In conclusion, future VI systems should be designed for various types of users and should be able to detect users’ disguised or genuine emotions using SER and SSP technologies. Furthermore, many other applications, such as restorative-effects assessments, can be included in VI systems.
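
    For readers unfamiliar with SER pipelines, here is a deliberately simple sketch of the overall recipe, using generic MFCC statistics and an SVM rather than the thesis’s attention network on sparse correlation features; the synthetic “corpus” is purely illustrative:

```python
import numpy as np
import librosa
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

SR = 16000

def utterance_features(y, sr=SR):
    """Clip-level features: mean and std of 13 MFCCs over time. A simple
    stand-in for the sparse correlation features used in the thesis,
    which the abstract does not detail."""
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

# Synthetic one-second 'utterances' stand in for a labeled SER corpus.
rng = np.random.default_rng(0)
def fake_clip(pitch):  # toy: 'emotion' encoded only as fundamental frequency
    t = np.arange(SR) / SR
    return np.sin(2 * np.pi * pitch * t) + 0.1 * rng.normal(size=SR)

X = np.stack([utterance_features(fake_clip(p))
              for p in (110, 120, 220, 240)])
y = ["neutral", "neutral", "angry", "angry"]

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)
print(clf.predict(utterance_features(fake_clip(230))[None, :]))  # ['angry']
```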

    Emotion Recognition in Immersive Virtual Reality: From Statistics to Affective Computing

    Emotions play a critical role in our daily lives, so the understanding and recognition of emotional responses is crucial for human research. Affective computing research has mostly used non-immersive two-dimensional (2D) images or videos to elicit emotional states. However, immersive virtual reality, which allows researchers to simulate environments in controlled laboratory conditions with a high level of sense of presence and interactivity, is becoming more popular in emotion research. Moreover, its synergy with implicit measurements and machine-learning techniques has the potential for transversal impact across many research areas, opening new opportunities for the scientific community. This paper presents a systematic review of the emotion recognition research undertaken with physiological and behavioural measures using head-mounted displays as elicitation devices. The results highlight the evolution of the field, give a clear perspective using aggregated analysis, reveal the current open issues, and provide guidelines for future research. This research was funded by the European Commission, grant number H2020-825585 HELIOS. Marín-Morales, J.; Llinares Millán, M. D. C.; Guixeres Provinciale, J.; Alcañiz Raya, M. L. (2020). Emotion Recognition in Immersive Virtual Reality: From Statistics to Affective Computing. Sensors, 20(18), 1-26. https://doi.org/10.3390/s20185163

    Exploring emotion responses toward pedestrian crossing actions for designing in-vehicle empathic interfaces

    While affective non-verbal communication between pedestrians and drivers has been shown to improve on-road safety and driving experiences, it remains a challenge to design driver assistance systems that can automatically capture these affective cues. In this early work, we identify users’ emotional self-report responses to commonly occurring pedestrian actions while crossing a road. We conducted a crowd-sourced web-based survey (N=91) in which respondents with prior driving experience viewed videos of 25 pedestrian interaction scenarios selected from the JAAD (Joint Attention for Autonomous Driving) dataset and then provided valence and arousal self-reports. We found that participants’ emotion self-reports (especially valence) were strongly influenced by actions including hand waving, nodding, impolite hand gestures, and inattentive pedestrians crossing while engaged with a phone. Our findings provide a first step towards designing in-vehicle empathic interfaces that can assist in driver emotion regulation during on-road interactions, with the identified pedestrian actions serving as future driver emotion induction stimuli.
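
    Analysis of such self-reports can be as simple as ranking actions by their mean valence and arousal; a sketch with hypothetical survey rows:

```python
import pandas as pd

# Hypothetical self-report table: one row per (respondent, scenario),
# with the pedestrian action shown and valence/arousal ratings.
reports = pd.DataFrame({
    "respondent": [1, 1, 2, 2, 3, 3],
    "action": ["hand_wave", "phone_distracted",
               "hand_wave", "impolite_gesture",
               "nod", "phone_distracted"],
    "valence": [8, 3, 7, 2, 7, 4],
    "arousal": [5, 6, 4, 7, 3, 6],
})

# Rank actions by mean valence to find candidate emotion-induction stimuli.
summary = (reports.groupby("action")[["valence", "arousal"]]
           .agg(["mean", "std"])
           .sort_values(("valence", "mean")))
print(summary)
```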