
    Multi-modal Approach for Affective Computing

    Throughout the past decade, many studies have classified human emotions using a single sensing modality, such as face video, electroencephalogram (EEG), electrocardiogram (ECG), or galvanic skin response (GSR). The results of these studies are constrained by the limitations of each modality, such as the absence of physiological biomarkers in face-video analysis, the poor spatial resolution of EEG, and the poor temporal resolution of GSR. Scant research has compared the merits of these modalities or examined how best to use them individually and jointly. Using the multi-modal AMIGOS dataset, this study compares the performance of human emotion classification across multiple computational approaches applied to face videos and various bio-sensing modalities. Using a novel method for compensating for the physiological baseline, we show an increase in the classification accuracy of the approaches we evaluate. Finally, we present a multi-modal emotion-classification approach for affective computing research.
    Comment: Published in the IEEE 40th International Engineering in Medicine and Biology Conference (EMBC) 2018
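    The abstract does not spell out how the baseline compensation works; a common and minimal interpretation is to subtract features computed on a pre-stimulus rest window from the same features computed on the stimulus window, so that the classifier sees change relative to each participant's resting state. The sketch below assumes this subtractive scheme and hypothetical feature names; it is an illustration, not the paper's actual method.

```python
import numpy as np

def baseline_compensate(trial_feats: np.ndarray, baseline_feats: np.ndarray) -> np.ndarray:
    """Subtract per-trial baseline features from stimulus-window features.

    trial_feats:    (n_trials, n_features) features from the stimulus windows
    baseline_feats: (n_trials, n_features) features from pre-stimulus rest windows
    """
    return trial_feats - baseline_feats

# Toy example: two trials, three hypothetical features
# (e.g. mean heart rate, GSR level, an EEG band power)
trials = np.array([[72.0, 1.8, 0.30],
                   [80.0, 2.4, 0.45]])
rest = np.array([[70.0, 1.5, 0.25],
                 [70.0, 1.5, 0.25]])
compensated = baseline_compensate(trials, rest)
print(compensated)
```

    The compensated features, rather than the raw ones, would then be fed to the emotion classifiers being compared.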

    A dataset of continuous affect annotations and physiological signals for emotion analysis

    From a computational viewpoint, emotions remain intriguingly hard to understand. Direct, real-time inspection in realistic settings is not possible in research, so discrete, indirect, post-hoc recordings are the norm; as a result, proper emotion assessment remains problematic. The Continuously Annotated Signals of Emotion (CASE) dataset provides a solution by focusing on real-time, continuous annotation of emotions as experienced by the participants while watching various videos. For this purpose, a novel, intuitive joystick-based annotation interface was developed that allows simultaneous reporting of valence and arousal, dimensions that are otherwise often annotated independently. In parallel, eight high-quality, synchronized physiological recordings (1000 Hz, 16-bit ADC) were made of ECG, BVP, EMG (3x), GSR (or EDA), respiration, and skin temperature. The dataset comprises the physiological and annotation data of 30 participants (15 male, 15 female) who watched several validated video stimuli. The validity of the emotion induction, as evidenced by the annotation and physiological data, is also presented.
    Comment: Dataset available at: https://rmc.dlr.de/download/CASE_dataset/CASE_dataset.zi
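    Continuous joystick annotations are typically sampled far more slowly than the 1000 Hz physiological channels, so a common first analysis step is to interpolate the annotation stream onto the signal timebase. The sketch below assumes a hypothetical 20 Hz annotation rate and a synthetic valence trace; the actual CASE sampling rates and file layout should be taken from the dataset documentation.

```python
import numpy as np

def align_annotations(ann_t: np.ndarray, ann_v: np.ndarray, sig_t: np.ndarray) -> np.ndarray:
    """Linearly interpolate a continuous annotation onto signal timestamps."""
    return np.interp(sig_t, ann_t, ann_v)

# Hypothetical: annotations at 20 Hz, physiology at 1000 Hz, over 1 s
ann_t = np.arange(0.0, 1.0, 0.05)          # 20 annotation samples
ann_valence = np.sin(2 * np.pi * ann_t)    # synthetic valence trace
sig_t = np.arange(0.0, 1.0, 0.001)         # 1000 signal samples
valence_1khz = align_annotations(ann_t, ann_valence, sig_t)
print(valence_1khz.shape)  # (1000,)
```

    After alignment, each physiological sample carries a valence (and, analogously, arousal) label, which is what makes continuous-annotation datasets like CASE usable for regression-style emotion analysis.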

    Social interactions, emotion and sleep: a systematic review and research agenda

    Sleep and emotion are closely linked; however, the effects of sleep on socio-emotional task performance have only recently been investigated. Sleep loss and insomnia have been found to affect emotional reactivity and social functioning, although the results, taken together, are somewhat contradictory. Here we review this advancing literature, aiming to 1) systematically review the relevant literature on sleep and socio-emotional functioning, with reference to the extant literature on emotion and social interactions, 2) summarize the results and outline ways in which emotion, social interactions, and sleep may interact, and 3) suggest key limitations of and future directions for this field. In the reviewed literature, sleep deprivation is associated with diminished emotional expressivity and impaired emotion recognition, which has particular relevance for social interactions. Sleep deprivation also increases emotional reactivity, an effect most apparent in neuroimaging studies investigating amygdala activity and its prefrontal regulation. Evidence of emotional dysregulation in insomnia and poor sleep has also been reported. In general, the limitations of this literature include how performance measures are linked to self-reports and how results are linked to socio-emotional functioning. We conclude by suggesting possible future directions for this field.

    Emotion Recognition using Wireless Signals

    This paper demonstrates a new technology that can infer a person's emotions from RF signals reflected off their body. EQ-Radio transmits an RF signal and analyzes its reflections off a person's body to recognize their emotional state (happy, sad, etc.). The key enabler underlying EQ-Radio is a new algorithm for extracting individual heartbeats from the wireless signal at an accuracy comparable to on-body ECG monitors. The resulting beats are then used to compute emotion-dependent features that feed a machine-learning emotion classifier. We describe the design and implementation of EQ-Radio and demonstrate through a user study that its emotion-recognition accuracy is on par with state-of-the-art emotion recognition systems that require a person to be hooked to an ECG monitor.
    Keywords: Wireless Signals; Wireless Sensing; Emotion Recognition; Affective Computing; Heart Rate Variability
    Funding: National Science Foundation (U.S.); United States Air Force
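    The abstract's pipeline, beat extraction followed by emotion-dependent features, rests on heart rate variability (HRV) computed from inter-beat intervals. The sketch below shows two standard HRV features (SDNN and RMSSD) computed from beat timestamps; the function name and the toy beat series are illustrative, not EQ-Radio's actual feature set.

```python
import numpy as np

def hrv_features(beat_times_s: np.ndarray) -> dict:
    """Compute simple HRV features from beat timestamps in seconds."""
    ibi = np.diff(beat_times_s) * 1000.0           # inter-beat intervals, ms
    sdnn = np.std(ibi, ddof=1)                     # overall variability
    rmssd = np.sqrt(np.mean(np.diff(ibi) ** 2))    # beat-to-beat variability
    return {"mean_ibi_ms": float(ibi.mean()),
            "sdnn_ms": float(sdnn),
            "rmssd_ms": float(rmssd)}

# Toy beat series: cumulative sum of inter-beat gaps (seconds)
beats = np.cumsum([0.0, 0.80, 0.82, 0.78, 0.85, 0.79])
features = hrv_features(beats)
print(features)
```

    Feature vectors of this kind, whether the beats come from an ECG electrode or from a wireless reflection, are what a downstream emotion classifier consumes; the claimed contribution is extracting the beats accurately enough from RF alone.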