
    Affective Man-Machine Interface: Unveiling human emotions through biosignals

    As has been known for centuries, humans exhibit an electrical profile. This profile is altered through various psychological and physiological processes, which can be measured through biosignals; e.g., electromyography (EMG) and electrodermal activity (EDA). These biosignals can reveal our emotions and, as such, can serve as an advanced man-machine interface (MMI) for empathic consumer products. However, such an MMI requires the correct classification of biosignals into emotion classes. This chapter starts with an introduction to biosignals for emotion detection. Next, a state-of-the-art review of automatic emotion classification is presented, along with guidelines for affective MMI. Subsequently, a study is presented that explores the use of EDA and three facial EMG signals to determine neutral, positive, negative, and mixed emotions, using recordings of 21 people. A range of techniques is tested, resulting in a generic framework for automated emotion classification with up to 61.31% correct classification of the four emotion classes, without the need for personal profiles. Among various other directives for future research, the results emphasize the need for parallel processing of multiple biosignals.

    Biosignals as an Advanced Man-Machine Interface

    As has been known for centuries, humans exhibit an electrical profile. This profile is altered through various physiological processes, which can be measured through biosignals; e.g., electromyography (EMG) and electrodermal activity (EDA). These biosignals can reveal our emotions and, as such, can serve as an advanced man-machine interface (MMI) for empathic consumer products. However, such an MMI requires the correct classification of biosignals into emotion classes. This paper explores the use of EDA and three facial EMG signals to determine neutral, positive, negative, and mixed emotions, using recordings of 24 people. A range of techniques is tested, resulting in a generic framework for automated emotion classification with up to 61.31% correct classification of the four emotion classes, without the need for personal profiles. Among various other directives for future research, the results emphasize the need for both personalized biosignal profiles and the recording of multiple biosignals in parallel.
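
    The pipeline described above (biosignal features in, one of four emotion classes out) can be made concrete with a short sketch. The feature set, the SVM classifier, and the random placeholder data below are illustrative assumptions, not the authors' actual method:

        # Minimal sketch of a generic biosignal-to-emotion pipeline: simple
        # time-domain features per channel (EDA + three facial EMG), then an
        # SVM. All choices here are illustrative assumptions.
        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        def segment_features(segment: np.ndarray) -> np.ndarray:
            """Time-domain statistics per channel (rows: EDA, EMG1-EMG3)."""
            return np.concatenate([
                segment.mean(axis=1),                           # mean level
                segment.std(axis=1),                            # variability
                np.abs(np.diff(segment, axis=1)).mean(axis=1),  # mean absolute first difference
            ])

        rng = np.random.default_rng(0)
        # Placeholder recordings: 84 segments of 4 channels x 2000 samples.
        X_raw = [rng.standard_normal((4, 2000)) for _ in range(84)]
        y = np.repeat(["neutral", "positive", "negative", "mixed"], 21)

        X = np.stack([segment_features(s) for s in X_raw])
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
        print(cross_val_score(clf, X, y, cv=5).mean())  # chance level is 0.25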

    Bimodal Emotion Recognition using Speech and Physiological Changes

    With exponentially evolving technology it is no exaggeration to say that any interface fo…

    Beyond Biometrics

    Throughout the last 40 years, the essence of automated user identification has remained the same. In this article, a new class of biometrics is proposed that is founded on processing biosignals, as opposed to images. After a brief introduction to biometrics, biosignals are discussed, including their advantages, disadvantages, and guidelines for obtaining them. This new class of biometrics increases biometrics’ robustness and enables cross-validation. Next, the use of biosignals is illustrated by two biosignal-based biometrics: voice identification and handwriting recognition. Additionally, the concept of a digital human model is introduced. Lastly, some issues are touched upon that will arise when biosignal-based biometrics are brought into practice.

    Ubiquitous emotion-aware computing

    Emotions are a crucial element of personal and ubiquitous computing. What to sense and how to sense it, however, remain a challenge. This study explores the rare combination of speech, electrocardiogram, and a revised Self-Assessment Mannequin to assess people’s emotions. Forty people watched 30 International Affective Picture System pictures in either an office or a living-room environment. Additionally, their personality traits neuroticism and extroversion and demographic information (i.e., gender, nationality, and level of education) were recorded. The resulting data were analyzed using both basic emotion categories and the valence–arousal model, which enabled a comparison between the two representations. The combination of heart rate variability and three speech measures (i.e., variability of the fundamental frequency of pitch (F0), intensity, and energy) explained 90% (p < .001) of the participants’ experienced valence–arousal, with 88% for valence and 99% for arousal (ps < .001). The six basic emotions could also be discriminated (p < .001), although the explained variance was much lower: 18–20%. Environment (or context), the personality trait neuroticism, and gender proved useful when a nuanced assessment of people’s emotions was needed. Taken together, this study provides a significant leap toward robust, generic, and ubiquitous emotion-aware computing.
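
    The core of the reported result is a regression from physiological and speech features onto valence–arousal ratings. A minimal sketch of that idea follows; ordinary least squares, the feature layout, and the placeholder data are assumptions, not the study's actual procedure:

        # Hedged sketch: regress valence and arousal ratings on heart rate
        # variability plus three speech measures (F0 variability, intensity,
        # energy). Model choice and data are placeholders, not the study's.
        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(1)
        n = 40  # one feature vector per participant
        # Columns: HRV, F0 variability, intensity, energy (placeholders).
        features = rng.standard_normal((n, 4))
        valence_arousal = rng.standard_normal((n, 2))  # self-report ratings

        model = LinearRegression().fit(features, valence_arousal)
        print(model.score(features, valence_arousal))  # explained variance (R^2)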

    Physiological signal-based emotion recognition from wearable devices

    Interest in computers recognizing human emotions has been increasing in recent years. Many studies have addressed recognizing emotions from physical signals, such as facial expressions, or from written text, with good results. However, recognizing emotions from physiological signals, such as heart rate from wearable devices, without physical signals has been challenging, although some studies have given good, or at least promising, results. The challenge for emotion recognition is to understand how the human body actually reacts to different emotional triggers and to find common factors among people. The aim of this study is to find out whether it is possible to accurately recognize human emotions and stress from physiological signals using supervised machine learning. Further, we consider which types of biosignals are most informative for making such predictions. The performance of Support Vector Machine and Random Forest classifiers is experimentally evaluated on the task of separating stress and no-stress signals from three different biosignals: ECG, PPG, and EDA. The challenges with these biosignals, from acquisition to pre-processing, are addressed, and their connection to emotional experience is discussed. In addition, the challenges and problems of the experimental setups used in previous studies are addressed, especially the usability problems of the dataset. The models implemented in this thesis were not able to accurately classify emotions from the dataset used; they did not perform remarkably better than randomly chosen labels. The PPG signal, however, performed slightly better than ECG or EDA for stress detection.
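
    The evaluation described above (per-signal SVM and Random Forest classifiers for stress versus no-stress) maps onto a short scikit-learn sketch. The stubbed-out features and labels are random placeholders; only the structure of the comparison follows the abstract:

        # Sketch of the evaluation structure: SVM and Random Forest separating
        # stress from no-stress windows, one model per biosignal.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(2)
        # Placeholder feature matrices: 200 windows x 10 features per signal.
        signals = {name: rng.standard_normal((200, 10)) for name in ("ECG", "PPG", "EDA")}
        y = rng.integers(0, 2, size=200)  # 1 = stress, 0 = no stress

        for name, X in signals.items():
            for label, clf in [
                ("SVM", make_pipeline(StandardScaler(), SVC())),
                ("RF", RandomForestClassifier(n_estimators=100, random_state=0)),
            ]:
                acc = cross_val_score(clf, X, y, cv=5).mean()
                print(f"{name} / {label}: {acc:.2f}")  # ~0.50 on random data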

    Biosensing and Actuation—Platforms Coupling Body Input-Output Modalities for Affective Technologies

    Research on the use of ubiquitous technologies, tracking systems, and wearables within mental health domains is on the rise. In recent years, affective technologies have gained traction and garnered the interest of interdisciplinary fields as research on such technologies has matured. However, while the role of movement and bodily experience in affective experience is well established, it has been unclear how best to address movement and engagement beyond measuring cues and signals in technology-driven interactions. In a joint industry-academia effort, we aim to remodel how affective technologies can help address body and emotional self-awareness. We present an overview of biosignals that have become standard in low-cost physiological monitoring and show how these can be matched with methods and engagements used by interaction designers skilled in designing for bodily engagement and aesthetic experiences. Taking both strands of work together offers unprecedented design opportunities that inspire further research. Through first-person soma design, an approach that draws upon the designer’s felt experience and puts the sentient body at the forefront, we outline a comprehensive approach to the creation of novel interactions in the form of couplings that combine biosensing and body-feedback modalities of relevance to affective health. These couplings lie within the creation of design toolkits that have the potential to render rich embodied interactions to the designer/user. As a result, we introduce the concept of “orchestration”. By orchestration, we refer to the design of the overall interaction: coupling sensors to actuation of relevance to the affective experience; initiating and closing the interaction; habituating; helping improve the users’ body awareness and engagement with emotional experiences; and soothing, calming, or energising, depending on the affective health condition and the intentions of the designer. Through the creation of a range of prototypes and couplings, we elicited requirements for broader orchestration mechanisms. First-person soma design lets researchers look afresh at biosignals that, when experienced through the body, can reshape affective technologies with novel ways to interpret biodata, feel it, understand it, and reflect upon our bodies.
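
    A “coupling” in the sense used above pairs one biosensing stream with one body-feedback modality under some orchestration logic. The toy loop below is a hypothetical illustration only; the breathing-rate sensor, haptic actuator, and mapping are all stand-ins, not anything from the paper's toolkits:

        # Toy "coupling": one biosignal stream drives one body-feedback
        # modality. Sensor, actuator, and mapping are hypothetical stand-ins.
        import random
        import time

        def read_breathing_rate() -> float:
            """Placeholder sensor: breaths per minute."""
            return random.uniform(8, 20)

        def actuate_haptic(intensity: float) -> None:
            """Placeholder actuator: would drive, e.g., a vibration motor."""
            print(f"haptic intensity: {intensity:.2f}")

        def orchestrate(steps: int = 5) -> None:
            """Couple sensing to actuation: slower breathing, gentler feedback."""
            for _ in range(steps):
                bpm = read_breathing_rate()
                actuate_haptic((bpm - 8) / 12)  # map 8-20 bpm onto 0-1
                time.sleep(0.1)

        orchestrate()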

    Multimodal emotion evaluation: a physiological model for cost-effective emotion classification

    Emotional responses are associated with distinct bodily alterations and are crucial to foster adaptive responses, well-being, and survival. Emotion identification may improve people’s emotion regulation strategies and their interaction with multiple life contexts. Several studies have investigated emotion classification systems, but most of them are based on the analysis of only one, a few, or isolated physiological signals. Understanding how informative the individual signals are, and how their combination works, would allow the development of more cost-effective, informative, and objective systems for emotion detection, processing, and interpretation. In the present work, electrocardiogram, electromyogram, and electrodermal activity were processed in order to find a physiological model of emotions. Both a unimodal and a multimodal approach were used to analyze which signal, or combination of signals, may best describe an emotional response, using a sample of 55 healthy subjects. The method was divided into: (1) signal preprocessing; (2) feature extraction; (3) classification using random forest and neural networks. Results suggest that the electrocardiogram (ECG) signal is the most effective for emotion classification. Yet, the combination of all signals provides the best emotion identification performance, with every signal providing crucial information for the system. This physiological model of emotions has important research and clinical implications, providing valuable information about the value and weight of physiological signals for emotion classification, which can critically drive effective evaluation, monitoring, and intervention regarding emotional processing and regulation across multiple contexts.
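
    The three-step method above (preprocessing, feature extraction, classification with random forest and neural networks) and the unimodal-versus-multimodal comparison can be sketched as follows. All features, labels, and model settings are placeholder assumptions; only the comparison structure mirrors the abstract:

        # Sketch of the unimodal-vs-multimodal comparison: score each signal's
        # features alone, then their concatenation, with random forest and a
        # small neural network (MLP).
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(3)
        n = 55  # sample size reported above
        # Placeholder features per modality: ECG, EMG, EDA.
        feats = {name: rng.standard_normal((n, 8)) for name in ("ECG", "EMG", "EDA")}
        y = np.tile(np.arange(4), 14)[:n]  # placeholder emotion labels

        models = {
            "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
            "neural net": MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
        }
        conditions = {**feats, "multimodal": np.hstack(list(feats.values()))}
        for sig_name, X in conditions.items():
            for model_name, clf in models.items():
                acc = cross_val_score(clf, X, y, cv=5).mean()
                print(f"{sig_name} / {model_name}: {acc:.2f}")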