Affective automotive user interfaces
Technological progress in the fields of ubiquitous sensing and machine learning has been fueling the development of user-aware human-computer interaction in recent years. Natural user interfaces in particular, such as digital voice assistants, can benefit from understanding their users in order to provide a more naturalistic experience. Such systems can, for example, detect the emotional state of users and act accordingly in an empathic way. One major research field working on this topic is Affective Computing, in which psycho-physiological measures, speech input, and facial expressions are used to sense human emotions.
Affective data allows natural user interfaces to respond to emotions, providing promising perspectives not only for user experience design but also for safety. In automotive environments, informed estimations of the driver's state can potentially avoid dangerous errors, and evoking positive emotions can improve the experience of driving.
This dissertation explores Affective Automotive User Interfaces using two basic interaction paradigms: firstly, emotion regulation systems react to the current emotional state of the user based on live sensing data, allowing for quick interventions. Secondly, emotional interaction synthesizes experiences which resonate with the user on an emotional level. The respective goals of these two interaction approaches are the promotion of safe behavior and the improvement of user experience.
Promoting safe behavior through emotion regulation: Systems which detect and react to the driver's state are expected to have great potential for improving road safety. This work presents a model and the methods needed to investigate such systems, along with an exploration of several approaches to keep the driver in a safe state. The presented methods include techniques to induce emotions and to sample the emotional state of drivers. Three driving simulator studies investigate the impact of emotion-aware interventions in the form of implicit cues, visual mirroring, and empathic speech synthesis. We envision emotion-awareness as a safety feature which can detect whether a driver is unfit or in need of support, provided that robust emotion detection technology becomes widely available.
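The first paradigm can be illustrated with a minimal sketch: a rule-based trigger that maps a detected driver state to one of the intervention types mentioned above. All labels, thresholds, and intervention names here are illustrative assumptions, not the dissertation's actual implementation.

```python
# Hypothetical sketch of an emotion-regulation trigger: map a detected
# driver state (e.g. from facial-expression analysis) to an intervention.
# Labels, thresholds, and intervention names are illustrative assumptions.

def choose_intervention(emotion: str, confidence: float):
    """Return an intervention for unsafe states, or None for safe states."""
    if confidence < 0.6:  # ignore low-confidence detections
        return None
    interventions = {
        "anger": "empathic_speech",      # calming voice-assistant response
        "frustration": "implicit_cue",   # e.g. ambient light or music change
        "sadness": "visual_mirroring",   # reflect the state back to the driver
    }
    return interventions.get(emotion)    # neutral/happy states: no action

print(choose_intervention("anger", 0.8))      # empathic_speech
print(choose_intervention("happiness", 0.9))  # None
```

A real system would of course replace the string labels with a continuous state estimate and debounce interventions over time; the point is only the detect-then-intervene loop.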
Improving user experience with emotional interaction: Emotional perception is an essential part of user experience. This thesis presents methods to build emotional experiences, derived from a variety of lab and simulator studies, expert feedback, car-storming sessions, and design thinking workshops. Systems capable of adapting to the user's preferences and traits in order to create an emotionally satisfactory user experience do not require the input of emotion detection. They rather create value through general knowledge about the user by adapting the output they generate. During this research, cultural and generational influences became evident which have to be considered when implementing affective automotive user interfaces in future cars.
We argue that the future of user-aware interaction lies in adapting not only to the driver's preferences and settings but also to their current state. This paves the way for the regulation of safe behavior, especially in safety-critical environments like cars, and an improvement of the driving experience.
SVG for Automotive User Interfaces
In car cockpits, a wide range of graphic displays, from low-end multi-functional devices to the most advanced reconfigurable clusters, represent an increasing part of the on-board information systems. Within the EDONA HMI project, we address the modeling of such human-machine interfaces (HMIs) and the development of an integrated HMI design environment that would improve current development practices. In this article we specifically discuss modeling issues: we explain why the SVG format was selected as the basis of the HMI graphic content description and present domain-specific extensions, mostly related to the HMI functional description, that provide support for consistent modeling of HMI components.
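To make the idea of SVG with domain-specific extensions concrete, the sketch below generates a minimal SVG gauge needle carrying a namespaced attribute that binds its rotation to a vehicle signal. The `edona:` prefix, the extension namespace URI, and the attribute names are invented for illustration; they are not the actual EDONA format.

```python
# Illustrative only: build a minimal SVG speedometer needle whose rotation
# is bound to a vehicle signal via a made-up extension attribute. The
# "edona:" namespace and attribute names are NOT the real EDONA extensions.
import xml.etree.ElementTree as ET

SVG = "http://www.w3.org/2000/svg"
EXT = "http://example.org/edona-hmi"  # hypothetical extension namespace
ET.register_namespace("", SVG)
ET.register_namespace("edona", EXT)

svg = ET.Element(f"{{{SVG}}}svg", {"width": "200", "height": "200"})
needle = ET.SubElement(svg, f"{{{SVG}}}line", {
    "x1": "100", "y1": "100", "x2": "100", "y2": "20",
    "stroke": "red",
    # Functional binding: rotate the needle according to the speed signal.
    f"{{{EXT}}}bind-rotation": "vehicle.speed",
    f"{{{EXT}}}range": "0,240",
})
print(ET.tostring(svg, encoding="unicode"))
```

Because SVG is plain XML, such extension attributes can be ignored by standard renderers while an HMI toolchain interprets them, which is one reason the format suits this kind of layered description.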
Future cars as a space for work & play
The objective of this CHI course is to provide CHI attendees with an introduction and overview of the rapidly evolving field of automotive user interfaces (AutomotiveUI). The course will focus on UI aspects in the transition towards automated driving. In particular, we will also discuss the opportunities of cars as a new space for non-driving-related activities, such as work, relaxation, and play. For newcomers and for experts from other HCI fields, we will present the special properties of this field of HCI and provide an overview of new opportunities, as well as general design and evaluation aspects of novel automotive user interfaces.
CUI @ Auto-UI:Exploring the Fortunate and Unfortunate Futures of Conversational Automotive User Interfaces
This work aims to connect the Automotive User Interfaces (Auto-UI) and Conversational User Interfaces (CUI) communities through discussion of their shared view of the future of automotive conversational user interfaces. The workshop aims to encourage creative consideration of optimistic and pessimistic futures, encouraging attendees to explore the opportunities and barriers that lie ahead through a game. Considerations of the future will be mapped out in greater detail through the drafting of research agendas, by which attendees will get to know each other's expertise and networks of resources. The two-day workshop, consisting of two 90-minute sessions, will facilitate greater communication and collaboration between these communities, connecting researchers to work together to influence the futures they imagine in the workshop.
Comment: Workshop published and presented at Automotive User Interfaces 2021 (AutoUI '21).
An Evaluation of Input Controls for In-Car Interactions
The way drivers operate in-car systems is rapidly changing as traditional physical controls, such as buttons and dials, are being replaced by touchscreens and touch-sensing surfaces. This has the potential to increase driver distraction and error as controls may be harder to find and use. This paper presents an in-car, on-the-road driving study which examined three key types of input controls to investigate their effects: a physical dial, pressure-based input on a touch surface, and touch input on a touchscreen. The physical dial and pressure-based input were also evaluated with and without haptic feedback. The study was conducted with users performing a list-based targeting task using the different controls while driving on public roads. Eye-gaze was recorded to measure distraction from the primary task of driving. The results showed that target accuracy was high across all input methods (greater than 94%). Pressure-based targeting was the slowest, while directly tapping on the targets was the fastest selection method. Pressure-based input also caused the largest number of glances towards the touchscreen, but the duration of each glance was shorter than when directly touching the screen. Our study will enable designers to make more appropriate design choices for future in-car interactions.
An Evaluation of Touch and Pressure-Based Scrolling and Haptic Feedback for In-car Touchscreens
An in-car study was conducted to examine different input techniques for list-based scrolling tasks and the effectiveness of haptic feedback for in-car touchscreens. The use of physical switchgear on centre consoles is decreasing, which allows designers to develop new ways to interact with in-car applications. However, these new methods need to be evaluated to ensure they are usable. Therefore, three input techniques were tested: direct scrolling, pressure-based scrolling and scrolling using onscreen buttons on a touchscreen. The results showed that direct scrolling was less accurate than using onscreen buttons and pressure input, but took almost half the time when compared to the onscreen buttons and was almost three times quicker than pressure input. Vibrotactile feedback did not improve input performance but was preferred by the users. Understanding the speed vs. accuracy trade-off between these input techniques will allow better decisions when designing safer in-car interfaces for scrolling applications.
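As a rough illustration of how pressure input can drive scrolling, the sketch below maps touch pressure to a scroll rate. The deadzone, maximum pressure, and quadratic curve are invented parameters for illustration, not values from either study.

```python
# Hypothetical pressure-to-scroll transfer function: harder presses scroll
# faster. All thresholds and rates are illustrative, not from the studies.
def scroll_rate(pressure_n: float, max_pressure_n: float = 4.0) -> float:
    """Map a touch pressure in newtons to a scroll rate in items/second."""
    DEADZONE = 0.3   # light touches should not scroll at all
    MAX_RATE = 20.0  # cap the rate so long lists stay controllable
    if pressure_n <= DEADZONE:
        return 0.0
    # Normalise into [0, 1], then apply a quadratic curve: fine control at
    # low pressures, fast traversal at high pressures.
    t = min((pressure_n - DEADZONE) / (max_pressure_n - DEADZONE), 1.0)
    return MAX_RATE * t * t
```

A deadzone plus a non-linear curve is one common way to reconcile the speed vs. accuracy trade-off the abstract describes: slow, precise scrolling near the threshold and quick jumps under firm pressure.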
Evaluating Multimodal Driver Displays of Varying Urgency
Previous studies have evaluated audio, visual and tactile warnings for drivers, highlighting the importance of conveying the appropriate level of urgency through the signals. However, these modalities have never been combined exhaustively with different urgency levels and tested while using a driving simulator. This paper describes two experiments investigating all multimodal combinations of such warnings along three different levels of designed urgency. The warnings were first evaluated in terms of perceived urgency and perceived annoyance in the context of a driving simulator. The results showed that the perceived urgency matched the designed urgency of the warnings. More urgent warnings were also rated as more annoying, but annoyance increased less strongly than urgency. The warnings were then tested for recognition time when presented during a simulated driving task. It was found that warnings of high urgency induced quicker and more accurate responses than warnings of medium and of low urgency. In both studies, the number of modalities used in warnings (one, two or three) affected both subjective and objective responses. More modalities led to higher ratings of urgency and annoyance, again with a smaller effect on annoyance than on urgency. More modalities also led to quicker responses. These results provide implications for multimodal warning design and reveal how modalities and modality combinations can influence participant responses during a simulated driving task.
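The design space the paper explores can be sketched as a lookup from designed urgency level to warning parameters, where higher urgency uses more modalities and a faster pulse rate. The level names, modality sets, and rates below are invented for illustration, not the paper's actual parameters.

```python
# Illustrative sketch (not the paper's actual design): encode three designed
# urgency levels as multimodal warning parameters. Higher urgency combines
# more modalities and pulses faster; all values are assumptions.
WARNING_LEVELS = {
    # level: (modalities, pulse_rate_hz)
    "low":    (("visual",),                     1.0),
    "medium": (("visual", "audio"),             2.5),
    "high":   (("visual", "audio", "tactile"),  5.0),
}

def build_warning(urgency: str) -> dict:
    """Assemble the parameter set for a warning of the given urgency."""
    modalities, rate = WARNING_LEVELS[urgency]
    return {"modalities": modalities, "pulse_rate_hz": rate}

print(build_warning("high"))
```

Mapping urgency to a small, discrete parameter table like this also makes it straightforward to enumerate all modality-by-urgency combinations for an exhaustive evaluation of the kind the paper reports.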
Sustainability, transport and design: reviewing the prospects for safely encouraging eco-driving
Private vehicle use contributes a disproportionately large amount to the degradation of the environment we inhabit. Technological advancement is of course critical to the mitigation of climate change; however, it alone will not suffice: we must also see behavioural change. This paper argues for the application of Ergonomics to the design of private vehicles, particularly low-carbon vehicles (e.g. hybrid and electric), to encourage this behavioural change. A brief review of the literature is offered concerning the effect of the design of a technological object on behaviour, the inter-related nature of goals and feedback in guiding performance, the effect of different driving styles on fuel economy, and the various challenges brought by hybrid and electric vehicles, including range anxiety, workload and distraction, complexity, and novelty. This is followed by a discussion of the potential applicability of a particular design framework, namely Ecological Interface Design, to the design of in-vehicle interfaces that encourage energy-conserving driving behaviours whilst minimising distraction and workload, thus ensuring safety.
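The goal-and-feedback loop the review describes can be sketched as a simple feedback signal that compares instantaneous fuel economy against a driver-set goal and emits a low-distraction interface cue. The goal value, tolerance band, and colour cues are hypothetical choices for illustration only.

```python
# Hypothetical eco-driving feedback signal: compare instantaneous fuel
# economy against a driver-set goal and emit a simple, glanceable cue.
# The goal, the 20% tolerance band, and the colours are assumptions.
def eco_feedback(litres_per_100km: float, goal: float = 6.0) -> str:
    """Return a traffic-light cue for the current driving style."""
    if litres_per_100km <= goal:
        return "green"   # meeting the economy goal
    if litres_per_100km <= goal * 1.2:
        return "amber"   # slightly over: gentle prompt to ease off
    return "red"         # well over: suggest smoother driving
```

Keeping the feedback to a single glanceable state, rather than a stream of numbers, is one way to encourage energy-conserving behaviour while limiting the workload and distraction the paper warns about.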