Brain Computer Interface for Gesture Control of a Social Robot: an Offline Study
Brain computer interface (BCI) provides promising applications in
neuroprosthesis and neurorehabilitation by controlling computers and robotic
devices based on the patient's intentions. Here, we have developed a novel BCI
platform that controls a personalized social robot using noninvasively acquired
brain signals. Scalp electroencephalogram (EEG) signals are collected from a
user in real-time during tasks of imaginary movements. The imagined body
kinematics are decoded using a regression model to calculate the user-intended
velocity. Then, the decoded kinematic information is mapped to control the
gestures of a social robot. The platform may be utilized as a
human-robot-interaction framework by combining it with neurofeedback
mechanisms to enhance the cognitive capabilities of persons with dementia.
Comment: Presented in: 25th Iranian Conference on Electrical Engineering (ICEE)
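The abstract above describes decoding imagined body kinematics with a regression model and mapping the decoded velocity to robot gestures. A minimal sketch of that idea, assuming ordinary least-squares regression on EEG-derived features and a hypothetical threshold-based gesture mapping (the feature extraction, gesture names, and threshold are illustrative assumptions, not the authors' exact pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training data: 200 trials x 16 EEG features (e.g. band power per
# channel), with a 1-D user-intended velocity as the regression target.
X_train = rng.standard_normal((200, 16))
true_w = rng.standard_normal(16)
y_train = X_train @ true_w + 0.1 * rng.standard_normal(200)

# Ordinary least squares fit: w = argmin ||X w - y||^2
w, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)

def decode_velocity(features):
    """Predict intended velocity from one EEG feature vector."""
    return float(features @ w)

def velocity_to_gesture(v, threshold=0.5):
    """Map decoded velocity to a discrete robot gesture (assumed mapping)."""
    if v > threshold:
        return "raise_arm"
    if v < -threshold:
        return "lower_arm"
    return "idle"

v = decode_velocity(X_train[0])
print(velocity_to_gesture(v))
```

In a real-time system the feature vector would be recomputed on each incoming EEG window, so the decoded velocity stream could drive continuous gesture control rather than one-off commands.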
Upper Limb Movement Recognition utilising EEG and EMG Signals for Rehabilitative Robotics
Upper limb movement classification, which maps input signals to the target
activities, is a key building block in the control of rehabilitative robotics.
Classifiers are trained so that the rehabilitative system can comprehend the
intentions of patients whose upper limbs do not function properly. Electromyography
(EMG) signals and Electroencephalography (EEG) signals are used widely for
upper limb movement classification. By analysing the classification results of
the real-time EEG and EMG signals, the system can understand the intention of
the user and predict the events that one would like to carry out. Accordingly,
it can provide appropriate external assistance. However, noise introduced
during real-time EEG and EMG data collection degrades data quality and
undermines classification performance. Moreover, not all patients produce
strong EMG signals, owing to muscle damage or neuromuscular disorders. To address
these issues, this paper explores different feature extraction techniques and
machine learning and deep learning models for EEG and EMG signals
classification and proposes a novel decision-level multisensor fusion technique
to integrate EEG signals with EMG signals. The fused system draws
complementary information from both sources to understand and predict the
user's intent, and thus provide assistance. By evaluating the proposed technique on the publicly available
WAY-EEG-GAL dataset, which contains EEG and EMG signals that were recorded
simultaneously, we demonstrate the feasibility and effectiveness of the
proposed system.
Comment: 20 pages, 11 figures, 2 tables; Thesis for Undergraduate Research
Project in Computing, NUS; Accepted by Future of Information and
Communication Conference 2023, San Francisco
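The second abstract proposes decision-level fusion of EEG and EMG classifiers. A minimal sketch of one common decision-level scheme, a confidence-weighted average of per-class probabilities; the class labels, weights, and averaging rule are illustrative assumptions, not the paper's exact fusion technique:

```python
import numpy as np

CLASSES = ["grasp", "lift", "release"]  # hypothetical movement classes

def fuse_decisions(p_eeg, p_emg, w_eeg=0.5, w_emg=0.5):
    """Combine per-class probabilities from separate EEG and EMG classifiers
    by weighted averaging, then return the winning class and the fused
    distribution."""
    p_eeg = np.asarray(p_eeg, dtype=float)
    p_emg = np.asarray(p_emg, dtype=float)
    fused = w_eeg * p_eeg + w_emg * p_emg
    fused /= fused.sum()  # renormalise to a probability distribution
    return CLASSES[int(np.argmax(fused))], fused

# Example: a patient's EMG is weak (muscle damage), so EEG is weighted higher.
label, probs = fuse_decisions([0.7, 0.2, 0.1], [0.3, 0.4, 0.3],
                              w_eeg=0.8, w_emg=0.2)
print(label)  # "grasp" wins once the EEG classifier is trusted more
```

Down-weighting the weaker modality, as here, is one way such a fusion rule can stay useful for patients who cannot produce strong EMG signals.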