
    Video elicited physiological signal dataset considering indoor temperature factors

    Introduction: Human emotions vary with temperature. However, most studies on emotion recognition from physiological signals overlook the influence of temperature. This article proposes a video-elicited physiological signal dataset (VEPT) that takes indoor temperature into account, in order to explore the impact of different indoor temperatures on emotion.
    Methods: The dataset contains galvanic skin response (GSR) recordings from 25 subjects at three different indoor temperatures. We selected 25 video clips and 3 temperature conditions (hot, comfortable, and cold) as elicitation materials. Using SVM, LSTM, and ACRNN classification methods, emotion classification was performed on the data from the three indoor temperatures to analyze the effect of temperature on emotion.
    Results: The recognition rates under the three indoor temperatures show that, at the hot temperature, anger and fear were recognized best among the five emotions, while joy was recognized worst. At the comfortable temperature, joy and calmness were recognized best, while fear and sadness were recognized worst. At the cold temperature, sadness and fear were recognized best, while anger and joy were recognized worst.
    Discussion: This article uses classification to recognize emotions from physiological signals under the three temperatures above. Comparing the recognition rates of the emotions across the three temperatures shows that positive emotions are enhanced at a comfortable temperature, while negative emotions are enhanced at hot and cold temperatures. The experimental results indicate a correlation between indoor temperature and physiologically expressed emotion.
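    As a hedged illustration of the per-temperature analysis described above, the sketch below trains an SVM on GSR feature vectors separately for each temperature condition and compares per-emotion recall. The feature set, data, and loader are invented stand-ins (synthetic arrays), not the VEPT format or the authors' code.

```python
# Minimal sketch (not the authors' pipeline): emotion classification on GSR
# features, evaluated separately per indoor-temperature condition.
# Synthetic data stands in for the actual VEPT recordings.
import numpy as np
from sklearn.model_selection import cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import recall_score

EMOTIONS = ["anger", "fear", "joy", "sadness", "calmness"]
rng = np.random.default_rng(0)

def fake_condition_data(n_trials=125, n_features=8):
    """Stand-in for per-trial GSR feature vectors (e.g., mean level, peak rate)."""
    X = rng.normal(size=(n_trials, n_features))
    y = rng.choice(EMOTIONS, size=n_trials)
    return X, y

for temp in ("hot", "comfortable", "cold"):
    X, y = fake_condition_data()
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    pred = cross_val_predict(clf, X, y, cv=5)
    per_class = recall_score(y, pred, labels=EMOTIONS, average=None)
    # Comparing per-emotion recall across conditions mirrors the paper's analysis.
    print(temp, dict(zip(EMOTIONS, np.round(per_class, 2))))
```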

    Neurophysiological Assessment of Affective Experience

    In the field of Affective Computing, the affective experience (AX) of the user during interaction with computers is of great interest. The automatic recognition of the affective state, or emotion, of the user is one of the big challenges. In this proposal I focus on affect recognition via physiological and neurophysiological signals. Long-standing evidence from psychophysiological research, and more recently from research in affective neuroscience, suggests that both body and brain physiology can indicate the current affective state of a subject. However, regarding the classification of AX, several questions remain unanswered. That AX classification is possible in principle has been shown repeatedly, but its generalisation over different task contexts, eliciting stimulus modalities, subjects, or time is seldom addressed. In this proposal I discuss a possible agenda for the further exploration of physiological and neurophysiological correlates of AX over different elicitation modalities and task contexts.

    Affective Man-Machine Interface: Unveiling human emotions through biosignals

    As has been known for centuries, humans exhibit an electrical profile. This profile is altered by various psychological and physiological processes, which can be measured through biosignals, e.g., electromyography (EMG) and electrodermal activity (EDA). These biosignals can reveal our emotions and, as such, can serve as an advanced man-machine interface (MMI) for empathic consumer products. However, such an MMI requires the correct classification of biosignals into emotion classes. This chapter starts with an introduction to biosignals for emotion detection. Next, a state-of-the-art review of automatic emotion classification is presented, along with guidelines for affective MMI. Subsequently, a study is presented that explores the use of EDA and three facial EMG signals to determine neutral, positive, negative, and mixed emotions, using recordings of 21 people. A range of techniques is tested, resulting in a generic framework for automated emotion classification with up to 61.31% correct classification of the four emotion classes, without the need for personal profiles. Among various other directives for future research, the results emphasize the need for parallel processing of multiple biosignals.
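    The chapter's emphasis on parallel processing of multiple biosignals can be pictured with a small sketch: per-channel statistical features are computed independently for EDA and the three facial EMG channels, then concatenated into one vector for a generic, profile-free classifier. The specific features, window sizes, and sampling rates below are assumptions, not those of the chapter.

```python
# Illustrative sketch, not the chapter's framework: simple time-domain features
# extracted in parallel from EDA and three facial EMG channels, then fused.
import numpy as np

def window_features(signal: np.ndarray) -> np.ndarray:
    """A few common time-domain descriptors for one biosignal window."""
    diff = np.diff(signal)
    return np.array([signal.mean(), signal.std(),
                     np.abs(diff).mean(), signal.max() - signal.min()])

def fuse_channels(eda: np.ndarray, emg_channels: list[np.ndarray]) -> np.ndarray:
    """Per-channel features computed independently, concatenated into one vector."""
    parts = [window_features(eda)] + [window_features(ch) for ch in emg_channels]
    return np.concatenate(parts)

# Example: synthetic 5 s windows standing in for real recordings.
rng = np.random.default_rng(1)
vec = fuse_channels(rng.normal(size=160), [rng.normal(size=160) for _ in range(3)])
print(vec.shape)  # (16,) -> input to any classifier over {neutral, positive, negative, mixed}
```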

    Emotion Detection Using Noninvasive Low Cost Sensors

    Emotion recognition from biometrics is relevant to a wide range of application domains, including healthcare. Existing approaches usually adopt multi-electrode sensors that can be expensive or uncomfortable to use in real-life situations. In this study, we investigate whether we can reliably recognize high vs. low emotional valence and arousal by relying on noninvasive, low-cost EEG, EMG, and GSR sensors. We report the results of an empirical study involving 19 subjects. We achieve state-of-the-art classification performance for both valence and arousal even in a cross-subject classification setting, which eliminates the need for individual training and tuning of classification models. (To appear in Proceedings of ACII 2017, the Seventh International Conference on Affective Computing and Intelligent Interaction, San Antonio, TX, USA, Oct. 23-26, 2017.)
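    The cross-subject setting the abstract reports is typically implemented as leave-one-subject-out evaluation: train on all subjects but one, test on the held-out subject. Below is a minimal sketch of that protocol with synthetic placeholder features and labels; the classifier choice and feature dimensions are assumptions, not the study's pipeline.

```python
# Hedged sketch of leave-one-subject-out (cross-subject) valence classification.
# Features and labels are synthetic placeholders, not the study's data.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n_subjects, trials_per_subject, n_features = 19, 20, 12
X = rng.normal(size=(n_subjects * trials_per_subject, n_features))  # EEG/EMG/GSR features
y = rng.integers(0, 2, size=len(X))                                 # high vs. low valence
groups = np.repeat(np.arange(n_subjects), trials_per_subject)       # subject IDs

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
# Each fold holds out one subject entirely, so no per-user training or tuning occurs.
scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
print(f"mean cross-subject accuracy: {scores.mean():.2f}")
```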

    Biometrics for Emotion Detection (BED): Exploring the combination of Speech and ECG

    The paradigm Biometrics for Emotion Detection (BED) is introduced, which enables unobtrusive emotion recognition that takes varying environments into account. It uses the electrocardiogram (ECG) and speech, a powerful but rarely used combination, to unravel people's emotions. BED was applied in two environments (i.e., office and home-like) in which 40 people watched 6 film scenes. It is shown that both heart rate variability (derived from the ECG) and, when people's gender is taken into account, the standard deviation of the fundamental frequency of speech indicate people's experienced emotions. As such, these measures validate each other. Moreover, it is found that people's environment can indeed influence experienced emotions. These results indicate that BED might become an important paradigm for unobtrusive emotion detection.
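    The two measures the abstract pairs can be sketched directly: time-domain heart rate variability from ECG R-R intervals, and the standard deviation of speech F0. The values below are invented for illustration; SDNN and RMSSD are standard HRV measures, not necessarily the exact ones the study used.

```python
# Minimal sketch: HRV from R-R intervals plus the standard deviation of F0.
# The R-R and F0 series are invented example values, not study data.
import numpy as np

def hrv_metrics(rr_ms: np.ndarray) -> dict:
    """Two standard time-domain HRV measures from R-R intervals (milliseconds)."""
    sdnn = rr_ms.std(ddof=1)                       # overall beat-interval variability
    rmssd = np.sqrt(np.mean(np.diff(rr_ms) ** 2))  # beat-to-beat variability
    return {"SDNN": sdnn, "RMSSD": rmssd}

rr = np.array([812, 795, 830, 788, 805, 820, 798], dtype=float)  # ms between R peaks
f0 = np.array([118, 125, 132, 121, 140, 128], dtype=float)       # Hz, voiced frames

print(hrv_metrics(rr))
print("F0 std (Hz):", f0.std(ddof=1))  # per the abstract, interpreted per gender group
```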