Physiological signal-based emotion recognition from wearable devices

Abstract

Interest in enabling computers to recognize human emotions has been growing in recent years. Many studies have addressed recognizing emotions from physical signals, such as facial expressions, or from written text, with good results. However, recognizing emotions from physiological signals such as heart rate, measured by wearable devices and without access to physical signals, has proven challenging, although some studies have reported good or at least promising results. The core challenge in emotion recognition is understanding how the human body actually reacts to different emotional triggers and finding common factors among people. The aim of this study is to find out whether human emotions and stress can be accurately recognized from physiological signals using supervised machine learning. Further, we consider the question of which types of biosignals are most informative for making such predictions. The performance of Support Vector Machine and Random Forest classifiers is experimentally evaluated on the task of separating stress from no-stress signals using three different biosignals: electrocardiography (ECG), photoplethysmography (PPG) and electrodermal activity (EDA). The challenges with these biosignals, from acquisition to pre-processing, are addressed, and their connection to emotional experience is discussed. In addition, the challenges and problems of the experimental setups used in previous studies are addressed, in particular the usability problems of the dataset. The models implemented in this thesis were not able to accurately classify emotions from the dataset used, and did not perform markedly better than random label assignment. The PPG signal, however, performed slightly better than ECG or EDA for stress detection.
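To illustrate the evaluation setup described above, the following is a minimal sketch of a binary stress vs. no-stress comparison between an SVM, a Random Forest and a random-label baseline using scikit-learn. The feature matrix here is synthetic placeholder data, not the thesis dataset, and the classifier settings are assumed defaults rather than the configurations actually used in the thesis; the point is only to show the shape of the comparison, including the chance-level baseline against which the reported results are judged.

```python
# Hypothetical sketch of the stress vs. no-stress classification comparison.
# Features are synthetic placeholders standing in for ECG/PPG/EDA-derived
# features (e.g. heart-rate or skin-conductance statistics per window).
import numpy as np
from sklearn.dummy import DummyClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))          # 200 signal windows, 10 features each
y = rng.integers(0, 2, size=200)        # 0 = no stress, 1 = stress

classifiers = {
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
    "Random Forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "Random labels": DummyClassifier(strategy="uniform", random_state=0),
}

for name, clf in classifiers.items():
    # 5-fold cross-validated accuracy; a useful model should clearly exceed
    # the random-label baseline (~0.5 for a balanced two-class problem).
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: {scores.mean():.2f} +/- {scores.std():.2f}")
```

Including the uniform random baseline makes the abstract's conclusion concrete: a classifier whose cross-validated accuracy does not exceed this baseline has not learned a usable mapping from the biosignal features to the stress labels.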
