
    Affective Man-Machine Interface: Unveiling human emotions through biosignals

    As has been known for centuries, humans exhibit an electrical profile. This profile is altered through various psychological and physiological processes, which can be measured through biosignals; e.g., electromyography (EMG) and electrodermal activity (EDA). These biosignals can reveal our emotions and, as such, can serve as an advanced man-machine interface (MMI) for empathic consumer products. However, such an MMI requires the correct classification of biosignals to emotion classes. This chapter starts with an introduction to biosignals for emotion detection. Next, a state-of-the-art review is presented on automatic emotion classification. Moreover, guidelines are presented for affective MMI. Subsequently, a study is presented that explores the use of EDA and three facial EMG signals to determine neutral, positive, negative, and mixed emotions, using recordings of 21 people. A range of techniques was tested, which resulted in a generic framework for automated emotion classification with up to 61.31% correct classification of the four emotion classes, without the need for personal profiles. Among various other directives for future research, the results emphasize the need for parallel processing of multiple biosignals.
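To illustrate the classification step such a framework performs (a minimal sketch only, not the chapter's actual method; all feature values and centroids below are hypothetical), a nearest-centroid baseline could map a four-dimensional feature vector, e.g., one EDA level plus three facial EMG amplitudes, onto the four emotion classes:

```python
import math

def nearest_centroid(sample, centroids):
    """Return the class label whose centroid lies closest (Euclidean) to sample."""
    return min(centroids, key=lambda label: math.dist(sample, centroids[label]))

# Illustrative class centroids over [EDA, EMG1, EMG2, EMG3] features.
centroids = {
    "neutral":  [0.2, 0.1, 0.1, 0.1],
    "positive": [0.4, 0.5, 0.2, 0.1],
    "negative": [0.6, 0.1, 0.5, 0.4],
    "mixed":    [0.5, 0.4, 0.4, 0.3],
}

print(nearest_centroid([0.58, 0.12, 0.48, 0.38], centroids))
```

A real system would of course learn such class representatives (or a more expressive classifier) from the labeled recordings rather than fix them by hand.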

    Prerequisites for Affective Signal Processing (ASP) - Part V: A response to comments and suggestions

    In four papers, a set of eleven prerequisites for affective signal processing (ASP) was identified (van den Broek et al., 2010): validation, triangulation, a physiology-driven approach, contributions of the signal processing community, identification of users, theoretical specification, integration of biosignals, physical characteristics, historical perspective, temporal construction, and real-world baselines. Additionally, a review (in two parts) of affective computing was provided. Prompted by the reactions to these four papers, we now present: i) an extension of the review, ii) a post-hoc analysis of Picard et al. (2001) based on the eleven prerequisites, and iii) a more detailed discussion and illustration of temporal aspects of ASP.

    Prerequisites for Affective Signal Processing (ASP) - Part III

    This is the third part in a series on prerequisites for affective signal processing (ASP). So far, six prerequisites have been identified: validation (e.g., the mapping of constructs onto signals), triangulation, a physiology-driven approach, and contributions of the signal processing community (van den Broek et al., 2009), as well as identification of users and theoretical specification (van den Broek et al., 2010). Here, two additional prerequisites are identified: integration of biosignals and physical characteristics.

    Biometrics for Emotion Detection (BED): Exploring the combination of Speech and ECG

    The paradigm Biometrics for Emotion Detection (BED) is introduced, which enables unobtrusive emotion recognition that takes varying environments into account. It uses the electrocardiogram (ECG) and speech, a powerful but rarely used combination, to unravel people’s emotions. BED was applied in two environments (i.e., office and home-like) in which 40 people watched 6 film scenes. It is shown that both heart rate variability (derived from the ECG) and, when people’s gender is taken into account, the standard deviation of the fundamental frequency of speech indicate people’s experienced emotions. As such, these measures validate each other. Moreover, it is found that people’s environment can indeed influence the emotions they experience. These results indicate that BED might become an important paradigm for unobtrusive emotion detection.
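To illustrate the kind of measures involved (a minimal sketch, assuming the R-peaks have already been detected in the ECG; all values below are illustrative and not from the study), heart rate variability is commonly summarized as the standard deviation of successive R-R intervals (SDNN), and the speech measure is simply the standard deviation of F0 samples:

```python
from statistics import stdev

def sdnn(rr_intervals_ms):
    """SDNN: standard deviation of R-R intervals (ms), a common HRV summary."""
    return stdev(rr_intervals_ms)

rr = [812, 790, 845, 801, 830, 795]          # illustrative R-R intervals (ms)
f0 = [118.0, 121.5, 115.2, 124.8, 119.3]     # illustrative F0 samples (Hz)

print(round(sdnn(rr), 2))    # HRV measure
print(round(stdev(f0), 2))   # F0 variability measure
```

Both are single-number summaries per recording, which is what makes them easy to compare (and cross-validate) across the two modalities.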

    Biofeedback systems for stress reduction: Towards a Bright Future for a Revitalized Field

    Stress has recently been baptized the black death of the 21st century, which illustrates its threat to current health standards. This article proposes biofeedback systems as a means to reduce stress. A concise state-of-the-art introduction to biofeedback systems is given, the field of mental health informatics is introduced, and a compact state-of-the-art introduction to stress (reduction) is provided. A pragmatic solution to the pressing societal problem of illness due to chronic stress is then provided in terms of closed-loop biofeedback systems, and a concise set of such biofeedback systems for stress reduction is presented. We end with the identification of several development phases and ethical concerns.

    Ubiquitous emotion-aware computing

    Emotions are a crucial element for personal and ubiquitous computing. What to sense and how to sense it, however, remain a challenge. This study explores the rare combination of speech, electrocardiogram, and a revised Self-Assessment Mannequin to assess people’s emotions. 40 people watched 30 International Affective Picture System pictures in either an office or a living-room environment. Additionally, their personality traits neuroticism and extroversion and demographic information (i.e., gender, nationality, and level of education) were recorded. The resulting data were analyzed using both basic emotion categories and the valence-arousal model, which enabled a comparison between the two representations. The combination of heart rate variability and three speech measures (i.e., variability of the fundamental frequency (F0), intensity, and energy) explained 90% (p < .001) of the participants’ experienced valence-arousal, with 88% for valence and 99% for arousal (ps < .001). The six basic emotions could also be discriminated (p < .001), although the explained variance was much lower: 18-20%. Environment (or context), the personality trait neuroticism, and gender proved to be useful when a nuanced assessment of people’s emotions was needed. Taken together, this study provides a significant leap toward robust, generic, and ubiquitous emotion-aware computing.
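The "explained variance" figures reported above are typically R² values of a fitted regression model. A minimal sketch of that computation, with made-up numbers rather than the study's data or model:

```python
def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 - SS_residual / SS_total."""
    mean_y = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot

# Illustrative: observed valence ratings vs. a model's predictions.
valence   = [2.0, 4.0, 6.0, 8.0]
predicted = [2.3, 3.8, 6.1, 7.9]

print(round(r_squared(valence, predicted), 3))
```

An R² of 0.90 then means the physiological and speech predictors account for 90% of the variance in the reported valence-arousal ratings.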

    Biosignals as an Advanced Man-Machine Interface

    As has been known for centuries, humans exhibit an electrical profile. This profile is altered through various physiological processes, which can be measured through biosignals; e.g., electromyography (EMG) and electrodermal activity (EDA). These biosignals can reveal our emotions and, as such, can serve as an advanced man-machine interface (MMI) for empathic consumer products. However, such an MMI requires the correct classification of biosignals to emotion classes. This paper explores the use of EDA and three facial EMG signals to determine neutral, positive, negative, and mixed emotions, using recordings of 24 people. A range of techniques was tested, which resulted in a generic framework for automated emotion classification with up to 61.31% correct classification of the four emotion classes, without the need for personal profiles. Among various other directives for future research, the results emphasize the need for both personalized biosignal profiles and the recording of multiple biosignals in parallel.

    Therapy Progress Indicator (TPI): Combining speech parameters and the subjective unit of distress

    Posttraumatic stress disorder (PTSD) is a severe handicap in daily life, and its treatment is complex. To evaluate the success of treatments, an objective and unobtrusive expert system was envisioned: a therapy progress indicator (TPI). Speech was considered an excellent candidate for providing an objective, unobtrusive measure of emotion. Speech of 26 PTSD patients was recorded while they participated in two reliving sessions: re-experiencing their last panic attack and their last joyful occasion. As a subjective measure, the subjective unit of distress was determined, which enabled the validation of the derived speech features. A set of parameters of the speech features (signal, power, zero-crossing ratio, and pitch) was found to discriminate between the two sessions. A regression model involving these parameters was able to distinguish between positive and negative distress. This model lays the foundation for a TPI for patients with PTSD, which enables objective and unobtrusive evaluation of therapies.

    A multimodal dataset for authoring and editing multimedia content: the MAMEM project

    We present a dataset that combines multimodal biosignals and eye-tracking information gathered under a human-computer interaction framework. The dataset was developed within the MAMEM project, which aims to endow people with motor disabilities with the ability to edit and author multimedia content through mental commands and gaze activity. The dataset includes EEG, eye-tracking, and physiological (GSR and heart rate) signals collected from 34 individuals (18 able-bodied and 16 motor-impaired). Data were collected during interaction with a specifically designed interface for web browsing and multimedia content manipulation, and during imaginary-movement tasks. The presented dataset will contribute towards the development and evaluation of modern human-computer interaction systems that would foster the reintegration of people with severe motor impairments into society.