
    Emotion Detection Using Noninvasive Low Cost Sensors

    Emotion recognition from biometrics is relevant to a wide range of application domains, including healthcare. Existing approaches usually adopt multi-electrode sensors that can be expensive or uncomfortable to use in real-life situations. In this study, we investigate whether we can reliably recognize high vs. low emotional valence and arousal by relying on noninvasive, low-cost EEG, EMG, and GSR sensors. We report the results of an empirical study involving 19 subjects. We achieve state-of-the-art classification performance for both valence and arousal even in a cross-subject classification setting, which eliminates the need for individual training and tuning of classification models.
    Comment: To appear in Proceedings of ACII 2017, the Seventh International Conference on Affective Computing and Intelligent Interaction, San Antonio, TX, USA, Oct. 23-26, 2017.
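
    A minimal sketch of what such a cross-subject evaluation can look like, assuming per-trial feature vectors have already been extracted from the EEG, EMG, and GSR recordings. The feature layout, per-subject trial count, and classifier below are illustrative assumptions, not the paper's actual pipeline.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

    rng = np.random.default_rng(0)
    n_subjects, n_trials, n_features = 19, 20, 12  # 19 subjects as in the study; trials/features assumed
    X = rng.normal(size=(n_subjects * n_trials, n_features))  # placeholder feature matrix
    y = rng.integers(0, 2, size=n_subjects * n_trials)        # high (1) vs. low (0) valence labels
    subjects = np.repeat(np.arange(n_subjects), n_trials)     # subject ID for each trial

    # Leave-one-subject-out: every fold tests on a subject never seen during
    # training, which is what makes the evaluation "cross-subject".
    scores = cross_val_score(RandomForestClassifier(n_estimators=100),
                             X, y, groups=subjects, cv=LeaveOneGroupOut())
    print(f"mean cross-subject accuracy: {scores.mean():.2f}")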

    Prerequisites for Affective Signal Processing (ASP)

    Although emotions have long been embraced by science, their recognition has not yet reached a satisfactory level. Through a concise overview of affect, its signals, features, and classification methods, we provide an understanding of the problems encountered. Next, we identify the prerequisites for successful Affective Signal Processing (ASP): validation (e.g., mapping of constructs onto signals), triangulation, a physiology-driven approach, and contributions from the signal processing community. Using these directives, a critical analysis of a real-world case is provided. This illustrates that the prerequisites can become a valuable guide for ASP.

    Affective Man-Machine Interface: Unveiling human emotions through biosignals

    As has been known for centuries, humans exhibit an electrical profile. This profile is altered through various psychological and physiological processes, which can be measured through biosignals, e.g., electromyography (EMG) and electrodermal activity (EDA). These biosignals can reveal our emotions and, as such, can serve as an advanced man-machine interface (MMI) for empathic consumer products. However, such an MMI requires the correct classification of biosignals into emotion classes. This chapter starts with an introduction to biosignals for emotion detection. Next, a state-of-the-art review of automatic emotion classification is presented, along with guidelines for affective MMI. Subsequently, a study is presented that explores the use of EDA and three facial EMG signals to determine neutral, positive, negative, and mixed emotions, using recordings of 21 people. A range of techniques is tested, resulting in a generic framework for automated emotion classification with up to 61.31% correct classification of the four emotion classes, without the need for personal profiles. Among various other directives for future research, the results emphasize the need for parallel processing of multiple biosignals.
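
    As a hedged illustration of the kind of pipeline the chapter describes, the sketch below computes simple time-domain features from one EDA channel and three facial EMG channels, then trains a subject-independent four-class classifier. The specific features, classifier, and data layout are assumptions for illustration, not the chapter's method.

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    def trial_features(eda, emg_channels):
        """Mean and standard deviation per channel (a common baseline feature set)."""
        feats = [eda.mean(), eda.std()]
        for emg in emg_channels:
            feats += [emg.mean(), emg.std()]
        return np.array(feats)

    rng = np.random.default_rng(0)
    # Fake recordings: 84 trials, i.e. 21 people x 4 emotion classes (assumed layout).
    X = np.stack([trial_features(rng.normal(size=512),
                                 [rng.normal(size=512) for _ in range(3)])
                  for _ in range(84)])
    y = np.tile(np.arange(4), 21)  # 0 neutral, 1 positive, 2 negative, 3 mixed

    clf = make_pipeline(StandardScaler(), SVC()).fit(X, y)
    print(clf.predict(X[:4]))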

    Linking recorded data with emotive and adaptive computing in an eHealth environment

    Telecare, and particularly lifestyle monitoring, currently relies on the ability to detect and respond to changes in individual behaviour using data derived from sensors around the home. This means that a significant aspect of behaviour, that of an individual's emotional state, is not accounted for in reaching a conclusion as to the form of response required. The linked concepts of emotive and adaptive computing offer an opportunity to include information about emotional state, and the paper considers how current developments in this area have the potential to be integrated within telecare and other areas of eHealth. In doing so, it examines the development and current state of the art of both emotive and adaptive computing, including their conceptual background, and places them into an overall eHealth context for application and development.

    CLASSIFYING EMOTION USING STREAMING OF PHYSIOLOGICAL CORRELATES OF EMOTION

    The ability for a computer to recognize emotions would have many uses. In the field of human-computer interaction, it would be useful if computers could sense whether a user is frustrated and offer help (Lisetti & Nasoz, 2002), or it could be used in cars to predict stress or road rage (Nasoz, Lisetti, & Vasilakos, 2010). It also has uses in the medical field, such as emotional therapy or monitoring patients (Rebenitsch, Owen, Brohil, Biocca, & Ferydiansyah, 2010). Emotion recognition is a complex subject that combines psychology and computer science, but it is not a new problem. When the question was first posed, researchers examined physiological signals that could help differentiate emotions (Schachter & Singer, 1962). As the research progressed, researchers examined ways in which computers could recognize emotions, many of which were successful. Previous research has not yet treated the emotional data as streaming data, or attempted to classify emotion in real time. This thesis extracts features from a window of simulated streaming data to attempt to classify emotions in real time. As a corollary, this method can also be used to attempt to identify the earliest point at which an emotion can be predicted. The results show that emotions can be classified in real time, and that applying a window and feature extraction leads to better classification success. They show that this method may be used to determine whether an emotion can be predicted before it is cognitively experienced, but it could not predict the emotional transition state. More research is required before that goal can be achieved.
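
    A minimal sketch of the windowed streaming approach described above, assuming a single simulated physiological stream: a fixed-size window slides over the incoming samples, features are extracted per window, and each window is classified as it arrives. The window size, features, and classifier are illustrative assumptions, not the thesis's exact method.

    from collections import deque
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    WINDOW = 128  # samples per window (assumed)

    def window_features(w):
        """Simple per-window features: mean, standard deviation, and range."""
        w = np.asarray(w)
        return [w.mean(), w.std(), w.max() - w.min()]

    rng = np.random.default_rng(0)
    # A pretrained classifier stands in for a model fit on labeled training windows.
    clf = LogisticRegression().fit(rng.normal(size=(200, 3)),
                                   rng.integers(0, 2, size=200))

    buffer = deque(maxlen=WINDOW)
    for t, sample in enumerate(rng.normal(size=1024)):  # simulated stream
        buffer.append(sample)
        if len(buffer) == WINDOW and t % WINDOW == 0:   # classify once per full window
            label = clf.predict([window_features(buffer)])[0]
            print(f"t={t}: predicted emotion class {label}")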