Fine-Grained Emotion Recognition Using Brain-Heart Interplay Measurements and eXplainable Convolutional Neural Networks

Abstract

Emotion recognition from electrophysiological signals is an important research topic across multiple scientific domains. While a multimodal input may provide additional information that increases emotion recognition performance, an optimal processing pipeline for such a vectorial input has yet to be defined. Moreover, algorithm performance often involves a trade-off between the ability to generalize over an emotional dimension and the explainability associated with its recognition accuracy. This study proposes a novel explainable artificial intelligence architecture for 9-level valence recognition from electroencephalographic (EEG) and electrocardiographic (ECG) signals. Synchronous EEG-ECG information is combined to derive vectorial brain-heart interplay features, which are rearranged into a sparse matrix (image) and then classified through an explainable convolutional neural network. The proposed architecture is tested on the publicly available MAHNOB dataset and compared against the use of a vectorial EEG input alone. Results, also expressed in terms of confusion matrices, outperform the current state of the art, especially in terms of recognition accuracy. In conclusion, we demonstrate the effectiveness of the proposed approach, embedding multimodal brain-heart dynamics in an explainable fashion.
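The rearrangement step described above, mapping a vector of brain-heart interplay features onto fixed pixel positions of a mostly-zero 2D matrix suitable for a CNN input, can be sketched as follows. This is a minimal illustration only: the image size, the feature values, and the coordinate mapping are all hypothetical placeholders, not the mapping used in the paper.

```python
import numpy as np

def features_to_sparse_image(features, positions, image_size=(16, 16)):
    """Place each feature value at its assigned (row, col) pixel,
    leaving all remaining pixels at zero (a sparse 'image')."""
    img = np.zeros(image_size, dtype=np.float32)
    for value, (r, c) in zip(features, positions):
        img[r, c] = value
    return img

# Hypothetical example: 6 interplay features assigned to fixed
# pixel coordinates (the actual mapping is an assumption here).
feats = np.array([0.2, -1.3, 0.7, 0.05, 2.1, -0.4])
coords = [(0, 0), (3, 5), (7, 2), (10, 10), (12, 4), (15, 15)]
image = features_to_sparse_image(feats, coords)
```

The resulting array can then be fed to a 2D convolutional network as a single-channel image; keeping each feature at a fixed position lets explainability methods attribute the network's decision back to individual features.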