Motor kinematics decoding (MKD) using brain signals is essential for developing
brain-computer interface (BCI) systems for rehabilitation and prosthesis devices.
Surface electroencephalogram (EEG) signals have been widely utilized for MKD.
However, kinematic decoding from cortical sources is sparsely explored. In this
work, the feasibility of hand kinematics decoding using EEG cortical source
signals is explored for a grasp-and-lift task. In particular, the pre-movement
EEG segment is utilized.
EEG segment is utilized. A residual convolutional neural network (CNN) - long
short-term memory (LSTM) based kinematics decoding model is proposed that
utilizes motor neural information present in pre-movement brain activity.
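A residual CNN-LSTM decoder of the kind described above can be sketched as follows. This is a minimal illustration in PyTorch; the layer widths, kernel sizes, EEG channel count, and number of kinematic outputs are assumptions for the sketch, not the configuration used in this work.

```python
import torch
import torch.nn as nn

class ResidualCNNLSTM(nn.Module):
    """Sketch of a residual CNN-LSTM kinematics decoder.

    Assumed (illustrative) shapes: 32 EEG channels in, 3 kinematic
    dimensions out (e.g. x/y/z hand position).
    """
    def __init__(self, n_channels=32, n_kinematics=3, hidden=64):
        super().__init__()
        # Temporal convolutions over the pre-movement EEG window
        self.conv1 = nn.Conv1d(n_channels, hidden, kernel_size=5, padding=2)
        self.conv2 = nn.Conv1d(hidden, hidden, kernel_size=5, padding=2)
        self.relu = nn.ReLU()
        # LSTM captures temporal dependencies in the convolutional features
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_kinematics)

    def forward(self, x):
        # x: (batch, n_channels, time)
        h = self.relu(self.conv1(x))
        h = self.relu(self.conv2(h)) + h   # residual skip connection
        h = h.transpose(1, 2)              # (batch, time, hidden) for the LSTM
        out, _ = self.lstm(h)
        return self.head(out[:, -1])       # kinematics at the window end

# Example: decode a batch of 8 pre-movement windows, 200 samples long
model = ResidualCNNLSTM()
y = model(torch.randn(8, 32, 200))
print(y.shape)  # torch.Size([8, 3])
```

The residual connection lets the second convolution learn a refinement of the first block's features, which tends to ease optimization of deeper temporal stacks.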
Various EEG windows at 50 ms prior to movement onset are utilized for hand
kinematics decoding. The correlation value (CV) between the actual and predicted
hand kinematics is used as the performance metric in both the source and sensor
domains. The performance of the proposed deep learning model is compared across
the sensor and source domains. The results demonstrate the viability of hand
kinematics decoding using pre-movement EEG cortical source data.
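The correlation-value metric can be sketched as the Pearson correlation between an actual and a predicted kinematic trace; this NumPy snippet is an illustrative reading of the metric, and the sinusoidal "hand trajectory" is synthetic example data only.

```python
import numpy as np

def correlation_value(actual, predicted):
    """Pearson correlation between actual and predicted kinematic traces."""
    a = np.asarray(actual, dtype=float)
    p = np.asarray(predicted, dtype=float)
    a = a - a.mean()
    p = p - p.mean()
    return float((a @ p) / (np.linalg.norm(a) * np.linalg.norm(p)))

# Synthetic example: a noisy prediction of a sinusoidal hand trajectory
t = np.linspace(0.0, 1.0, 200)
actual = np.sin(2 * np.pi * t)
predicted = actual + 0.1 * np.random.randn(t.size)
print(correlation_value(actual, predicted))  # close to 1 for a good decoder
```

A CV near 1 indicates that the predicted kinematics track the shape of the true movement, regardless of any constant offset or scale.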