Paving the Way for Motor Imagery-Based Tele-Rehabilitation through a Fully Wearable BCI System

Abstract

The present study introduces a brain–computer interface designed and prototyped to be wearable and usable in daily life. Eight dry electroencephalographic sensors were adopted to acquire the brain activity associated with motor imagery. Multimodal feedback in extended reality was exploited to improve the online detection of the neurological phenomena of interest. Twenty-seven healthy subjects used the proposed system across five sessions to investigate the effects of feedback on motor imagery. The sample was divided into two equal-sized groups: a “neurofeedback” group, which performed motor imagery while receiving feedback, and a “control” group, which performed motor imagery with no feedback. Questionnaires were administered to investigate the usability of the proposed system and each participant’s ability to imagine movements. The highest mean classification accuracy across subjects was approximately 62% (with an associated type A uncertainty of 3%) for the control group and 69% (3% uncertainty) for the neurofeedback group. Moreover, in some cases the accuracy was significantly higher for the neurofeedback group. Perceived usability was high across all participants. Overall, the study aimed to highlight the advantages and the pitfalls of using a wearable brain–computer interface with dry sensors. Notably, this technology can be adopted for safe and economically viable tele-rehabilitation.
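The group-level figures above pair a mean classification accuracy with its type A uncertainty, i.e. the experimental standard deviation of the mean estimated from the per-subject accuracies. As a minimal illustrative sketch of how such values are typically computed (the subject accuracies and group sizes below are placeholders, not data from the study):

    import numpy as np

    # Hypothetical per-subject online classification accuracies (placeholders).
    control_acc = np.array([0.58, 0.61, 0.64, 0.60, 0.63, 0.65, 0.59, 0.62])
    neurofeedback_acc = np.array([0.66, 0.70, 0.71, 0.68, 0.72, 0.67, 0.69, 0.70])

    def mean_with_type_a_uncertainty(acc):
        """Mean accuracy and its type A standard uncertainty
        (sample standard deviation divided by sqrt(n))."""
        mean = acc.mean()
        u_a = acc.std(ddof=1) / np.sqrt(acc.size)
        return mean, u_a

    for name, acc in (("control", control_acc), ("neurofeedback", neurofeedback_acc)):
        m, u = mean_with_type_a_uncertainty(acc)
        print(f"{name}: mean accuracy = {m:.1%} ± {u:.1%} (type A)")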
