TSception: A Deep Learning Framework for Emotion Detection Using EEG
In this paper, we propose a deep learning framework, TSception, for emotion
detection from electroencephalogram (EEG). TSception consists of temporal and
spatial convolutional layers, which learn discriminative representations in the
time and channel domains simultaneously. The temporal learner consists of
multi-scale 1D convolutional kernels whose lengths are related to the sampling
rate of the EEG signal, allowing it to learn multiple temporal and frequency
representations. The spatial learner takes advantage of the asymmetry property
of emotion responses at the frontal brain area to learn the discriminative
representations from the left and right hemispheres of the brain. In our study,
a system is designed to elicit emotional arousal in an immersive virtual
reality (VR) environment. EEG data were collected from 18 healthy subjects
using this system to evaluate the performance of the proposed deep learning
network for the classification of low and high emotional arousal states. The
proposed method is compared with SVM, EEGNet, and LSTM. TSception achieves a
high classification accuracy of 86.03%, which outperforms the prior methods
significantly (p<0.05). The code is available at
https://github.com/deepBrains/TSception

Comment: Author information updated only. Accepted for publication in: 2020
International Joint Conference on Neural Networks (IJCNN), Glasgow, July
19--24, 2020, part of the 2020 IEEE World Congress on Computational Intelligence
(IEEE WCCI 2020).
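The abstract's idea of tying multi-scale temporal kernel lengths to the EEG sampling rate can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the scale ratios, the averaging kernel, and the function name are assumptions for demonstration only.

```python
import numpy as np

def multiscale_temporal_features(eeg, fs, ratios=(0.5, 0.25, 0.125)):
    """Hypothetical sketch of multi-scale temporal filtering.

    eeg:    1D signal from a single EEG channel
    fs:     sampling rate in Hz; each kernel length is a fraction of fs,
            so each scale spans a different time window (and hence
            emphasizes a different frequency range)
    ratios: illustrative fractions, not the paper's exact values
    """
    feats = []
    for r in ratios:
        k = max(1, int(fs * r))      # kernel length tied to the sampling rate
        kernel = np.ones(k) / k      # simple averaging kernel as a stand-in
        feats.append(np.convolve(eeg, kernel, mode="same"))
    return np.stack(feats)           # shape: (num_scales, num_samples)

fs = 128                             # e.g. a 128 Hz EEG recording
t = np.arange(fs * 2) / fs
sig = np.sin(2 * np.pi * 10 * t)     # 10 Hz toy oscillation, 2 s long
out = multiscale_temporal_features(sig, fs)
print(out.shape)                     # (3, 256)
```

Because each kernel covers a different fraction of one second of signal, the stacked output gives the downstream spatial learner several temporal/frequency views of the same channel at once.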