14 research outputs found

    Towards emotional interaction: using movies to automatically learn users’ emotional states

    The HCI community is actively seeking novel methodologies to gain insight into the user's experience during interaction with both the application and the content. We propose an emotion recognition engine capable of automatically recognizing a set of human emotional states using psychophysiological measures of the autonomic nervous system, including galvanic skin response, respiration, and heart rate. A novel pattern recognition system, based on discriminant analysis and support vector machine classifiers, is trained using movie scenes selected to induce emotions ranging across the valence dimension from positive to negative, including happiness, anger, disgust, sadness, and fear. In this paper we introduce an emotion recognition system and evaluate its accuracy by presenting the results of an experiment conducted with three physiological sensors.
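
    The abstract names its pipeline only at a high level (discriminant analysis followed by support vector machine classification of physiological features). A minimal sketch of that kind of pipeline, using scikit-learn with synthetic stand-in data — the actual features, parameters, and dataset are not given in the abstract and are assumptions here:

```python
# Hypothetical sketch only: synthetic data and default scikit-learn settings
# stand in for the paper's unspecified features and classifier parameters.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
emotions = ["happiness", "anger", "disgust", "sadness", "fear"]

# Synthetic per-scene feature vectors, e.g. mean GSR, respiration rate, heart rate.
X = rng.normal(size=(200, 3))
y = rng.integers(0, len(emotions), size=200)

# Discriminant analysis as a supervised dimensionality reduction, then an SVM.
clf = make_pipeline(StandardScaler(),
                    LinearDiscriminantAnalysis(n_components=2),
                    SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

    On random data the cross-validated accuracy hovers near chance (0.2 for five classes); the point is the pipeline shape, not the score.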

    Affective Man-Machine Interface: Unveiling human emotions through biosignals

    As has been known for centuries, humans exhibit an electrical profile. This profile is altered through various psychological and physiological processes, which can be measured through biosignals, e.g., electromyography (EMG) and electrodermal activity (EDA). These biosignals can reveal our emotions and, as such, can serve as an advanced man-machine interface (MMI) for empathic consumer products. However, such an MMI requires the correct classification of biosignals into emotion classes. This chapter starts with an introduction to biosignals for emotion detection. Next, a state-of-the-art review of automatic emotion classification is presented, along with guidelines for affective MMI. Subsequently, a study is presented that explores the use of EDA and three facial EMG signals to determine neutral, positive, negative, and mixed emotions, using recordings of 21 people. A range of techniques is tested, resulting in a generic framework for automated emotion classification with up to 61.31% correct classification of the four emotion classes, without the need for personal profiles. Among various other directives for future research, the results emphasize the need for parallel processing of multiple biosignals.
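
    The chapter's closing point is the parallel processing of multiple biosignals (here EDA plus three facial EMG channels). A minimal sketch of windowed feature extraction across parallel channels — signal names, sampling rate, window length, and the mean/RMS feature choice are all illustrative assumptions, not the chapter's actual method:

```python
# Hypothetical sketch: per-window features computed over four parallel biosignal
# channels, concatenated into one feature vector per window.
import numpy as np

FS = 256          # assumed sampling rate (Hz)
WINDOW = 5 * FS   # assumed 5-second non-overlapping analysis windows

rng = np.random.default_rng(1)
signals = {
    "eda": rng.normal(size=60 * FS),
    "emg_frontalis": rng.normal(size=60 * FS),
    "emg_corrugator": rng.normal(size=60 * FS),
    "emg_zygomaticus": rng.normal(size=60 * FS),
}

def window_features(x, window=WINDOW):
    """Mean and RMS per non-overlapping window; one row per window."""
    n = len(x) // window
    w = x[: n * window].reshape(n, window)
    return np.column_stack([w.mean(axis=1), np.sqrt((w ** 2).mean(axis=1))])

# Concatenate per-channel features side by side: the channels are independent
# and could be computed in parallel before classification.
X = np.hstack([window_features(s) for s in signals.values()])
print(X.shape)  # (12, 8): 12 windows, 2 features x 4 channels
```

    The resulting matrix is what a classifier such as the one in the previous abstract would consume, one labeled emotion per window.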

    Design of an Emotion Elicitation Framework for Arabic Speakers

    The automatic detection of human affective states has been of great interest lately, for its applications not only in the field of Human-Computer Interaction but also in physiological, neurobiological, and sociological studies. Several standardized techniques to elicit emotions have been used, with emotion-eliciting movie clips being the most popular. To date, only four studies have been carried out to validate emotional movie clips, using three languages (English, French, Spanish) and cultures (French, Italian, British/American). The context of language and culture is an underexplored area in affective computing. Considering the cultural and language differences between Western and Arab countries, it is possible that some of the validated clips, even when dubbed, will not achieve similar results. Given the unique and conservative cultures of the Arab countries, a standardized and validated framework for affect studies is needed in order to be comparable with current studies of other cultures and languages. In this paper, we describe a framework, and its prerequisites, for eliciting emotions that could be used in affect studies on an Arab population. We present some aspects of Arab cultural values that might affect the selection and acceptance of emotion-eliciting video clips. Methods for rating and validating Arab emotional clips are presented in order to arrive at a list of clips that could be used in the proposed emotion elicitation framework. A pilot study conducted to evaluate a basic version of our framework showed great potential for eliciting emotions.

    Emotion in Motion: A Study of Music and Affective Response

    'Emotion in Motion' is an experiment designed to understand people's emotional reactions to a variety of musical excerpts, via self-report questionnaires and the recording of electrodermal response (EDR) and pulse oximetry (HR) signals. The experiment ran for three months as part of a public exhibition, attracting nearly 4,000 participants and over 12,000 listening samples. This paper presents the methodology used by the authors to approach this research, as well as preliminary results derived from the self-report data and the physiology.