23 research outputs found

    Affective Man-Machine Interface: Unveiling human emotions through biosignals

    As has been known for centuries, humans exhibit an electrical profile. This profile is altered through various psychological and physiological processes, which can be measured through biosignals, e.g., electromyography (EMG) and electrodermal activity (EDA). These biosignals can reveal our emotions and, as such, can serve as an advanced man-machine interface (MMI) for empathic consumer products. However, such an MMI requires the correct classification of biosignals into emotion classes. This chapter starts with an introduction to biosignals for emotion detection. Next, a state-of-the-art review of automatic emotion classification is presented, followed by guidelines for affective MMI. Subsequently, a study is presented that explores the use of EDA and three facial EMG signals to determine neutral, positive, negative, and mixed emotions, using recordings of 21 people. A range of techniques was tested, resulting in a generic framework for automated emotion classification with up to 61.31% correct classification of the four emotion classes, without the need for personal profiles. Among various other directives for future research, the results emphasize the need for parallel processing of multiple biosignals.
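    As a rough illustration of the kind of pipeline the abstract describes (window-level biosignal features fed to a subject-independent classifier over four emotion classes), the sketch below wires one up in Python with scikit-learn. The feature set, the named facial EMG sites, the random-forest classifier, and the synthetic data are illustrative assumptions, not the chapter's actual method.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score, GroupKFold

def window_features(eda, emg_frontalis, emg_corrugator, emg_zygomaticus):
    """Summarise one analysis window of EDA + three facial EMG channels."""
    feats = []
    for sig in (eda, emg_frontalis, emg_corrugator, emg_zygomaticus):
        feats.extend([np.mean(sig), np.std(sig), np.ptp(sig)])  # level, variability, range
    return np.array(feats)

# Placeholder data: 21 participants x 10 windows each, 4 channels, 256 samples per window.
rng = np.random.default_rng(0)
raw = rng.normal(size=(210, 4, 256))
X = np.array([window_features(*w) for w in raw])
y = rng.integers(0, 4, size=210)        # 0=neutral, 1=positive, 2=negative, 3=mixed
groups = np.repeat(np.arange(21), 10)   # participant id per window

# Grouped cross-validation approximates "no personal profiles": each test fold
# contains only participants unseen during training.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=GroupKFold(n_splits=7), groups=groups)
print(f"subject-independent accuracy: {scores.mean():.3f}")
```

    Keeping each participant's windows confined to a single fold is one way to estimate performance for unseen users, which is what classification "without personal profiles" implies.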

    Virtual Environment for Monitoring Emotional Behaviour in Driving

    Facial Expression Recognition Using HLAC Features and WPCA

    Research on the Personalized Interaction Model Driven by User Behavior

    An Analysis of Facial Description in Static Images and Video Streams

    Emotion-based control of cooperating heterogeneous mobile robots

    Quantifying driver frustration to improve road safety
