1,648 research outputs found
Affective Man-Machine Interface: Unveiling human emotions through biosignals
As has been known for centuries, humans exhibit an electrical profile. This profile is altered by various psychological and physiological processes, which can be measured through biosignals, e.g., electromyography (EMG) and electrodermal activity (EDA). These biosignals can reveal our emotions and, as such, can serve as an advanced man-machine interface (MMI) for empathic consumer products. However, such an MMI requires the correct classification of biosignals into emotion classes. This chapter starts with an introduction to biosignals for emotion detection. Next, a state-of-the-art review of automatic emotion classification is presented, followed by guidelines for affective MMI. Subsequently, a study is presented that explores the use of EDA and three facial EMG signals to determine neutral, positive, negative, and mixed emotions, using recordings of 21 people. A range of techniques is tested, resulting in a generic framework for automated emotion classification with up to 61.31% correct classification of the four emotion classes, without the need for personal profiles. Among various other directives for future research, the results emphasize the need for parallel processing of multiple biosignals.
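The classification step of such a framework can be sketched as a nearest-centroid classifier over biosignal features. This is a minimal illustration, not the chapter's actual method: the feature values, class means, and the centroid rule are all assumptions made up for the example.

```python
import math
import random

random.seed(0)

EMOTIONS = ["neutral", "positive", "negative", "mixed"]

def fit_centroids(train):
    """Compute one mean feature vector ("centroid") per emotion class."""
    cents = {}
    for label, vecs in train.items():
        dim = len(vecs[0])
        cents[label] = [sum(v[i] for v in vecs) / len(vecs) for i in range(dim)]
    return cents

def classify(cents, sample):
    """Nearest-centroid rule: assign the class whose mean is closest."""
    return min(cents, key=lambda lb: math.dist(cents[lb], sample))

# Synthetic 4-dimensional feature vectors (stand-ins for one EDA and three
# facial EMG features); each class clusters around a different mean.
train = {
    lb: [[base + random.gauss(0, 0.1) for _ in range(4)] for _ in range(10)]
    for lb, base in zip(EMOTIONS, (0.0, 1.0, 2.0, 3.0))
}
cents = fit_centroids(train)
print(classify(cents, [1.05, 0.95, 1.1, 1.0]))  # lies near the "positive" cluster
```

A real pipeline would replace the synthetic vectors with features extracted from preprocessed EDA/EMG recordings and would likely use a stronger classifier; the centroid rule is only the simplest instance of the classification stage.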
Defining, measuring, and modeling passenger's in-vehicle experience and acceptance of automated vehicles
Automated vehicle acceptance (AVA) has been measured mostly subjectively by
questionnaires and interviews, with a main focus on drivers inside automated
vehicles (AVs). For AVs to be widely accepted by the public, the acceptance
of both drivers and passengers is key. The in-vehicle experience
of passengers will determine the extent to which AVs will be accepted by
passengers. A comprehensive understanding of potential assessment methods to
measure the passenger experience in AVs is needed to improve the in-vehicle
experience of passengers and thereby the acceptance. The present work provides
an overview of assessment methods that were used to measure a driver's
behavior, and cognitive and emotional states during (automated) driving. The
results of the review have shown that these assessment methods can be
classified by type of data-collection method (e.g., questionnaires, interviews,
direct input devices, sensors), object of their measurement (i.e., perception,
behavior, state), time of measurement, and degree of objectivity of the data
collected. A conceptual model synthesizes the results of the literature review,
formulating relationships between the factors constituting the in-vehicle
experience and AVA. It is theorized that the in-vehicle experience
influences the intention to use, with intention to use serving as predictor of
actual use. The model also formulates relationships between actual use and
well-being. A combined approach of using both subjective and objective
assessment methods is needed to provide more accurate estimates for AVA, and
advance the uptake and use of AVs.
Comment: 22 pages, 1 figure
Ubiquitous emotion-aware computing
Emotions are a crucial element of personal and ubiquitous computing. What to sense and how to sense it, however, remain a challenge. This study explores the rare combination of speech, electrocardiogram, and a revised Self-Assessment Manikin to assess people’s emotions. Forty people watched 30 International Affective Picture System pictures in either an office or a living-room environment. Additionally, their personality traits neuroticism and extroversion and demographic information (i.e., gender, nationality, and level of education) were recorded. The resulting data were analyzed using both basic emotion categories and the valence–arousal model, which enabled a comparison between the two representations. The combination of heart rate variability and three speech measures (i.e., variability of the fundamental frequency of pitch (F0), intensity, and energy) explained 90% (p < .001) of the participants’ experienced valence–arousal, with 88% for valence and 99% for arousal (ps < .001). The six basic emotions could also be discriminated (p < .001), although the explained variance was much lower: 18–20%. Environment (or context), the personality trait neuroticism, and gender proved to be useful when a nuanced assessment of people’s emotions was needed. Taken together, this study provides a significant leap toward robust, generic, and ubiquitous emotion-aware computing.
Applications of Affective Computing in Human-Robot Interaction: state-of-the-art and challenges for manufacturing
The introduction of collaborative robots aims to make production more flexible, promoting greater interaction between humans and robots, also from a physical point of view. However, working closely with a robot may create stressful situations for the operator, which can negatively affect task performance.
In Human-Robot Interaction (HRI), robots are expected to be socially intelligent, i.e., capable of understanding and reacting appropriately to human social and affective cues. This ability can be exploited by implementing affective computing, which concerns the development of systems able to recognize, interpret, process, and simulate human affects. Social intelligence is essential for robots to establish a natural interaction with people in several contexts, including the manufacturing sector with the emergence of Industry 5.0.
In order to take full advantage of human-robot collaboration, the robotic system should be able to perceive the psycho-emotional and mental state of the operator through different sensing modalities (e.g., facial expressions, body language, voice, or physiological signals) and to adapt its behaviour accordingly. The development of socially intelligent collaborative robots in the manufacturing sector can lead to a symbiotic human-robot collaboration, raising several research challenges that still need to be addressed.
The goals of this paper are the following: (i) providing an overview of affective computing implementation in HRI; (ii) analyzing the state of the art on this topic in different application contexts (e.g., healthcare, service applications, and manufacturing); (iii) highlighting research challenges for the manufacturing sector.