    Wireless brain-computer interface for wheelchair control by using fast machine learning and real-time hyper-dimensional classification

    This paper presents a noninvasive, brain-controlled, P300-based wheelchair driven by EEG signals, intended for tetraplegic and paralytic users. The P300, an Event-Related Potential (ERP), is deliberately elicited by visual stimuli. The developed Brain-Computer Interface (BCI) consists of: (i) an acquisition unit; (ii) a processing unit; and (iii) a navigation unit. The acquisition unit is a wireless 32-channel EEG headset collecting data from 6 electrodes over the parietal-cortex area. The processing unit is a dedicated µPC performing stimulus delivery, data gathering, Machine Learning (ML), and real-time hyper-dimensional classification, leading to the interpretation of the user's intention. The ML stage is based on a custom algorithm (t-RIDE) which trains the subsequent classification stage on user-tuned P300 reference features. The real-time classification uses a functional approach to time-domain feature extraction, which reduces the amount of data to be analyzed. The Raspberry Pi-based navigation unit executes the received commands and supports the wheelchair motion using peripheral sensors (a USB camera for video processing and ultrasound sensors). Unlike related works, the proposed stimulation protocol is aware of the environment. The experimental results, based on a dataset of 5 subjects, demonstrate that: (i) the implemented ML algorithm achieves a complete P300 spatio-temporal characterization in 1.95 s using only 22 target brain visual stimuli (88 s/direction); (ii) the complete classification chain (from feature extraction to validation) takes at worst only 19.65 ± 10.1 ms, allowing real-time control; (iii) the classification accuracy of the implemented BCI is 80.5 ± 4.1% on single trials.
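    The classification chain summarized in the abstract (time-domain feature extraction that reduces the data, followed by a decision against user-tuned P300 reference features) can be sketched roughly as below. This is a minimal illustration under stated assumptions, not the paper's t-RIDE algorithm: the windowed-mean features, the nearest-template decision rule, and all parameters (6 channels, 200-sample epochs, 10 windows, the synthetic P300 bump) are hypothetical choices for demonstration only.

    ```python
    import numpy as np

    def extract_features(epoch, n_windows=10):
        """Time-domain feature extraction: reduce a (channels, samples)
        EEG epoch to the mean amplitude of each non-overlapping window
        per channel, shrinking the data to be classified."""
        n_ch, n_s = epoch.shape
        win = n_s // n_windows
        trimmed = epoch[:, :win * n_windows]
        return trimmed.reshape(n_ch, n_windows, win).mean(axis=2).ravel()

    def classify(epoch, target_template, nontarget_template):
        """Single-trial decision: label the epoch as a P300 (target)
        response if its features lie closer to the user-tuned target
        template than to the non-target template."""
        f = extract_features(epoch)
        d_target = np.linalg.norm(f - target_template)
        d_nontarget = np.linalg.norm(f - nontarget_template)
        return d_target < d_nontarget

    def make_epoch(rng, is_target, n_ch=6, n_s=200):
        """Synthetic 6-channel epoch: Gaussian noise, plus a positive
        deflection around sample 75 (a stand-in P300) for targets."""
        epoch = rng.normal(0.0, 1.0, (n_ch, n_s))
        if is_target:
            t = np.arange(n_s)
            epoch += 5.0 * np.exp(-((t - 75) ** 2) / (2 * 10.0 ** 2))
        return epoch
    ```

    In this sketch the "training" stage simply averages features over labeled epochs to obtain the two templates, standing in for the calibration that t-RIDE performs on real target stimuli.
    
    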