
    Decoding Complex Imagery Hand Gestures

    Brain-computer interfaces (BCIs) offer individuals suffering from major disabilities an alternative method to interact with their environment. Sensorimotor rhythm (SMR)-based BCIs can successfully perform control tasks; however, the traditional SMR paradigms intuitively disconnect the control action from the real task, making them non-ideal for complex control scenarios. In this study, we design a new, intuitively connected motor imagery (MI) paradigm that uses hierarchical common spatial patterns (HCSP) and context information to effectively predict intended hand grasps from electroencephalogram (EEG) data. Experiments with 5 participants yielded an aggregate classification accuracy (intended grasp prediction probability) of 64.5% for 8 different hand gestures, more than 5 times the chance level. Comment: This work has been submitted to EMBC 201
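
    The abstract's hierarchical common spatial patterns (HCSP) scheme is not spelled out here, but the general idea of stacking CSP+LDA stages can be sketched as follows. This is a minimal illustration on synthetic data, assuming a two-level hierarchy and an arbitrary coarse grouping of gestures; it is not the authors' exact implementation.

```python
# Hypothetical two-level hierarchical CSP sketch: a top-level CSP+LDA assigns
# an epoch to a coarse gesture group, then a group-specific CSP+LDA refines
# the prediction. Synthetic EEG-shaped data stands in for real recordings.
import numpy as np
from mne.decoding import CSP
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import Pipeline

rng = np.random.default_rng(0)
n_epochs, n_channels, n_times = 160, 32, 256
X = rng.standard_normal((n_epochs, n_channels, n_times))
y = rng.integers(0, 4, n_epochs)                 # 4 hand gestures (illustrative)
groups = {0: 0, 1: 0, 2: 1, 3: 1}                # assumed coarse grouping of gestures
y_group = np.array([groups[int(k)] for k in y])

def make_csp_lda():
    return Pipeline([("csp", CSP(n_components=4, log=True)),
                     ("lda", LinearDiscriminantAnalysis())])

top = make_csp_lda().fit(X, y_group)             # level 1: which coarse group?
subs = {g: make_csp_lda().fit(X[y_group == g], y[y_group == g]) for g in (0, 1)}

def predict(epoch):
    g = int(top.predict(epoch[None])[0])         # pick a branch of the hierarchy
    return int(subs[g].predict(epoch[None])[0])  # refine within that branch

print("example prediction:", predict(X[0]))
```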

    Unimanual versus bimanual motor imagery classifiers for assistive and rehabilitative brain computer interfaces

    Bimanual movements are an integral part of everyday activities and are often included in rehabilitation therapies. Yet electroencephalography (EEG)-based assistive and rehabilitative brain computer interface (BCI) systems typically rely on motor imagination (MI) of one limb at a time. In this study we present a classifier which discriminates between uni- and bimanual MI. Ten able-bodied participants took part in cue-based motor execution (ME) and MI tasks of the left (L), right (R) and both (B) hands. A 32-channel EEG was recorded. Three linear discriminant analysis classifiers, based on MI of L-R, L-B and R-B hands, were created, with features based on wide-band (8-30 Hz) Common Spatial Patterns (CSP) and band-specific Common Spatial Patterns (CSPb). Event-related desynchronization (ERD) was significantly stronger during bimanual compared to unimanual ME on both hemispheres. Bimanual MI resulted in bilateral, parietally shifted ERD of similar intensity to unimanual MI. The average classification accuracy for CSP and CSPb was comparable for the L-R task (73±9% and 75±10%, respectively) and for the L-B task (73±11% and 70±9%, respectively). However, for the R-B task (67±3% and 72±6%, respectively) it was significantly higher for CSPb (p=0.0351). Six participants whose L-R classification accuracy exceeded 70% were included in an on-line task a week later, using the unmodified offline CSPb classifier, achieving 69±3% and 66±3% accuracy for the L-R and R-B tasks, respectively. A combined uni- and bimanual BCI could be used for restoration of motor function in highly disabled patients and for motor rehabilitation of patients with motor deficits.
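
    The comparison of wide-band CSP against band-specific CSP (CSPb) features can be sketched roughly as below, assuming MNE's CSP implementation, illustrative band edges (mu and beta) and synthetic two-class epochs; the original study's exact bands and pipeline may differ.

```python
# Sketch: wide-band (8-30 Hz) CSP vs. band-specific CSP ("CSPb") features,
# each fed to an LDA and cross-validated on synthetic two-class MI epochs.
# Band edges and component counts are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt
from mne.decoding import CSP
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline

rng = np.random.default_rng(1)
fs = 250
X = rng.standard_normal((120, 32, 2 * fs))       # epochs x channels x samples
y = rng.integers(0, 2, 120)                      # e.g. unimanual vs. bimanual MI

def bandpass(data, lo, hi):
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, data, axis=-1)

# Wide-band CSP: one spatial-filter set trained on 8-30 Hz data.
wide = Pipeline([("csp", CSP(n_components=6, log=True)),
                 ("lda", LinearDiscriminantAnalysis())])
acc_wide = cross_val_score(wide, bandpass(X, 8, 30), y, cv=5).mean()

# Band-specific CSP: a separate CSP per sub-band, log-power features concatenated.
# (For an unbiased estimate the CSP fits should happen inside each CV fold; the
# shortcut here only keeps the sketch compact.)
def cspb_features(data, labels, bands=((8, 12), (16, 24))):
    feats = [CSP(n_components=4, log=True).fit_transform(bandpass(data, lo, hi), labels)
             for lo, hi in bands]
    return np.hstack(feats)

acc_cspb = cross_val_score(LinearDiscriminantAnalysis(), cspb_features(X, y), y, cv=5).mean()
print(f"wide-band CSP: {acc_wide:.2f}  band-specific CSP: {acc_cspb:.2f}")
```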

    EEG Classification of Different Imaginary Movements within the Same Limb

    The task of discriminating the motor imagery of different movements within the same limb using electroencephalography (EEG) signals is challenging because these imaginary movements have close spatial representations on the motor cortex area. There is, however, a pressing need to succeed in this task: the ability to classify different same-limb imaginary movements could increase the number of control dimensions of a brain-computer interface (BCI). In this paper, we propose a 3-class BCI system that discriminates EEG signals corresponding to rest, imaginary grasp movements, and imaginary elbow movements. In addition, the differences between simple motor imagery and goal-oriented motor imagery in terms of their topographical distributions and classification accuracies are investigated. To the best of our knowledge, neither problem has been explored in the literature. Based on the EEG data recorded from 12 able-bodied individuals, we have demonstrated that same-limb motor imagery classification is possible. For the binary classification of imaginary grasp and elbow (goal-oriented) movements, the average accuracy achieved is 66.9%. For the 3-class problem of discriminating rest against imaginary grasp and elbow movements, the average classification accuracy achieved is 60.7%, which is greater than the random classification accuracy of 33.3%. Our results also show that goal-oriented imaginary elbow movements lead to better classification performance than simple imaginary elbow movements. This proposed BCI system could potentially be used to control a robotic rehabilitation system that assists stroke patients in performing task-specific exercises.
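
    The abstract compares the 3-class accuracy (60.7%) against the 33.3% chance level. A quick way to check whether an observed accuracy exceeds chance is a one-sided binomial test over the test trials, sketched below with an illustrative trial count and assuming independent, balanced trials.

```python
# Is an observed classification accuracy above chance? A one-sided binomial
# test, assuming independent trials; the trial count here is illustrative.
from scipy.stats import binomtest

n_trials = 300                    # hypothetical number of single trials
accuracy = 0.607                  # e.g. the 3-class accuracy quoted above
chance = 1.0 / 3.0                # 3 balanced classes -> 33.3 % chance level
n_correct = round(accuracy * n_trials)

result = binomtest(n_correct, n_trials, p=chance, alternative="greater")
print(f"{n_correct}/{n_trials} correct, p = {result.pvalue:.2e} against {chance:.1%} chance")
```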

    Single-trial EEG Discrimination between Wrist and Finger Movement Imagery and Execution in a Sensorimotor BCI

    A brain-computer interface (BCI) may be used to control a prosthetic or orthotic hand using neural activity from the brain. The core of this sensorimotor BCI lies in the interpretation of the neural information extracted from the electroencephalogram (EEG). Improved interpretation of the EEG is desired to allow people with neuromuscular disorders to perform daily activities. This paper investigates the possibility of discriminating between the EEG associated with wrist and finger movements. The EEG was recorded from test subjects as they executed and imagined five essential hand movements using both hands. Independent component analysis (ICA) and time-frequency techniques were used to extract spectral features based on event-related (de)synchronisation (ERD/ERS), while the Bhattacharyya distance (BD) was used for feature reduction. Mahalanobis distance (MD) clustering and artificial neural networks (ANN) were used as classifiers, obtaining average accuracies of 65% and 71%, respectively. This shows that EEG discrimination between wrist and finger movements is possible. The research introduces a new combination of motor tasks to BCI research. Comment: 33rd Annual International IEEE EMBS Conference 201
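
    A rough sketch of the feature-reduction and classification steps named above: Bhattacharyya-distance ranking of single features under a univariate Gaussian assumption, followed by a nearest-class-mean Mahalanobis classifier. The data, feature counts and pooled-covariance choice are assumptions, not the paper's exact settings.

```python
# Sketch: rank candidate features by the univariate Gaussian Bhattacharyya
# distance between two classes, then classify with a Mahalanobis-distance
# rule (nearest class mean under a pooled within-class covariance).
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((200, 40))            # trials x candidate ERD/ERS features
X[100:, :5] += 1.0                            # make a few features class-dependent
y = np.r_[np.zeros(100, int), np.ones(100, int)]

def bhattacharyya(f0, f1):
    """Bhattacharyya distance between two univariate Gaussian feature distributions."""
    m0, m1, v0, v1 = f0.mean(), f1.mean(), f0.var(), f1.var()
    return 0.25 * np.log(0.25 * (v0 / v1 + v1 / v0 + 2)) + 0.25 * (m0 - m1) ** 2 / (v0 + v1)

scores = np.array([bhattacharyya(X[y == 0, j], X[y == 1, j]) for j in range(X.shape[1])])
keep = np.argsort(scores)[::-1][:8]           # retain the 8 most separable features
Xr = X[:, keep]

# Mahalanobis-distance classifier: nearest class mean under a pooled covariance.
means = np.stack([Xr[y == c].mean(axis=0) for c in (0, 1)])
pooled = sum((np.sum(y == c) - 1) * np.cov(Xr[y == c], rowvar=False)
             for c in (0, 1)) / (len(y) - 2)
inv_cov = np.linalg.inv(pooled)

def classify(x):
    d2 = [(x - m) @ inv_cov @ (x - m) for m in means]
    return int(np.argmin(d2))

pred = np.array([classify(x) for x in Xr])
print("training-set accuracy:", (pred == y).mean())
```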

    Investigation of differences in cortical activation during wrist flexion and extension performed under real, passive and motor imagined paradigms

    The neuromuscular control of flexion and extension of the upper extremities has been compared in a number of studies. It has been speculated that differences in the corticospinal pathway between flexion and extension may play a role in the cortical differences detected between the two movements, resulting in higher cortical activation for extension. However, it is still unclear what roles these pathways play and to what degree other factors (muscle force activation, sensory feedback, frequency of movement, structural and/or functional differences) might influence cortical activation in the brain. It has been speculated that the difference in cortico-muscular pathways arises because flexion movements are used more often in day-to-day activities and therefore require less cortical activation. Through the investigation of the cortical differences present during different movement types, a deeper understanding of the differences between flexion and extension may be obtained. No previous study has compared the cortical differences between flexion and extension of the upper extremities during different movement types. In this study, an offline investigation of wrist flexion and extension is conducted during real, passive and motor imagery movements with the help of a servo-controlled hand device. Simultaneous recordings of EEG, EMG and wrist dynamics (velocity, angle, strain) were made on fifteen healthy right-handed subjects performing 60 randomized repetitions of right wrist flexion and extension for kinaesthetic motor imagery, passively moved, and voluntary real (active) movements. Real movements were conducted at 10% of each subject's maximum voluntary contraction (MVC). A servo-controlled hand device was used to regulate the dynamic force applied during real movements and to provide motion during passive movements. The use of different movement types with the aid of a servo-controlled hand device may give a deeper understanding of the effects of muscle force activation, rate of movement and the corticospinal pathway on flexion and extension. In order to investigate the cortical differences between flexion and extension, subjects' perceived difficulty, movement dynamics, movement-related cortical potential (MRCP), event-related desynchronization and synchronization (ERD/ERS), and phase locking value (PLV) were measured. Each measurement examines a different aspect of the cortical activation present in the brain during the different movement types. Although relative muscle force activation between real wrist flexion and extension was similar, motor cortex activation during extension was higher than during flexion, as measured by MRCP and mu-band ERD, and subjects also perceived real wrist extension to be more difficult to perform. Passive movements showed higher motor cortex activation for flexion (MRCP, beta-band ERD); however, higher somatosensory cortical activation was present during extension, as measured by mu-band ERS and PLV. Motor imagery showed higher cortical activation during wrist flexion, by MRCP and beta-band ERD. Although numerous variables were tested (each in different frequency bands), with some being significant and others non-significant, overall the results suggest higher cortical activation for extension.
The higher cortical activation during wrist extension movements may be due to differences in the corticospinal and somatosensory motor control pathways to the motor neuron pools and from the sensory neuron pools of the extensor/flexor muscles and muscle spindles of the upper extremities. This investigation contributes to the current literature on cortical differences between flexion and extension of the upper extremities by including the real, passive and motor imagery differences between flexion and extension.
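
    ERD/ERS, one of the measures listed above, is conventionally quantified as the percentage change of band power relative to a reference interval. A minimal single-channel sketch on synthetic epochs follows; the band edges, reference window and sampling rate are assumptions.

```python
# Sketch: classical ERD/ERS quantification for one channel: band-pass filter,
# square, average over trials, and express the power change relative to a
# pre-movement reference interval. Band, windows and sampling rate are
# assumptions; the data are synthetic.
import numpy as np
from scipy.signal import butter, filtfilt

rng = np.random.default_rng(3)
fs = 250
t = np.arange(-2 * fs, 3 * fs) / fs               # -2 s .. +3 s around onset
epochs = rng.standard_normal((60, t.size))        # trials x samples, one channel

def erd_ers(epochs, band=(8, 12), ref=(-1.5, -0.5)):
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    power = filtfilt(b, a, epochs, axis=-1) ** 2  # instantaneous band power
    avg = power.mean(axis=0)                      # average across trials
    r = avg[(t >= ref[0]) & (t < ref[1])].mean()  # reference-interval power
    return 100.0 * (avg - r) / r                  # % change; negative = ERD

curve = erd_ers(epochs)
print(f"mu-band power change at t = 1 s: {curve[np.searchsorted(t, 1.0)]:.1f} %")
```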

    Classification of different reaching movements from the same limb using EEG

    Objective. Brain–computer interfaces (BCIs) have been proposed not only as assistive technologies but also as rehabilitation tools for lost functions. However, due to the stochastic nature, poor spatial resolution and low signal-to-noise ratio of electroencephalography (EEG), multidimensional decoding has been the main obstacle to implementing non-invasive BCIs in real-life rehabilitation scenarios. This study explores the classification of several functional reaching movements from the same limb using EEG oscillations in order to create a more versatile BCI for rehabilitation. Approach. Nine healthy participants performed four 3D center-out reaching tasks in four different sessions while wearing a passive robotic exoskeleton on their right upper limb. Kinematics data were acquired from the robotic exoskeleton. Multiclass extensions of Filter Bank Common Spatial Patterns (FBCSP) and a linear discriminant analysis (LDA) classifier were used to classify the EEG activity into four forward reaching movements (from a starting position towards four target positions), a backward movement (from any of the targets to the starting position) and rest. Recalibrating the classifier using data from a previous session or the same session was also investigated and compared. Main results. Average EEG decoding accuracies were significantly above chance, with 67%, 62.75% and 50.3% when decoding three, four and six tasks from the same limb, respectively. Furthermore, classification accuracy could be increased by using data from the beginning of each session as training data to recalibrate the classifier. Significance. Our results demonstrate that classification of several functional movements performed by the same limb is possible with acceptable accuracy using EEG oscillations, especially if data from the same session are used to recalibrate the classifier. Therefore, an ecologically valid decoding could be used to control assistive or rehabilitation multi-degree-of-freedom (DoF) robotic devices using EEG data. These results have important implications for the control of assistive and rehabilitative neuroprostheses in paralyzed patients.
This study was funded by the Baden-Württemberg Stiftung (GRUENS), the Deutsche Forschungsgemeinschaft (DFG, Koselleck and SP-1533/2-1), the Bundesministerium für Bildung und Forschung (BMBF) MOTORBIC (FKZ 13GW0053), the fortune-Program of the University of Tübingen (2422-0-0), and AMORSA (FKZ 16SV7754). A. Sarasola-Sanz's work is supported by the La Caixa-DAAD scholarship, and N. Irastorza-Landa's work by the Basque Government and IKERBASQUE, Basque Foundation for Science.
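
    The recalibration result (training data from the start of each session improves accuracy) can be illustrated independently of the FBCSP step. The sketch below compares an LDA trained only on a previous session's features with one recalibrated on a short block from the current session; the drift model and feature dimensions are invented for illustration.

```python
# Sketch: session-to-session recalibration. An LDA trained on a previous
# session's (FBCSP-style) features is compared with one recalibrated using a
# short block recorded at the start of the new session. The features and the
# between-session drift are synthetic; feature extraction is assumed done.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(4)

def fake_session(shift, n=200, n_feat=16, n_classes=4):
    y = rng.integers(0, n_classes, n)
    X = rng.standard_normal((n, n_feat)) + shift      # session-specific offset
    X[np.arange(n), y] += 1.5                         # class-dependent structure
    return X, y

X_prev, y_prev = fake_session(shift=0.0)              # yesterday's recordings
X_curr, y_curr = fake_session(shift=0.8)              # today: non-stationary EEG

calib, test = slice(0, 40), slice(40, None)           # first block recalibrates

old = LinearDiscriminantAnalysis().fit(X_prev, y_prev)
new = LinearDiscriminantAnalysis().fit(np.vstack([X_prev, X_curr[calib]]),
                                       np.r_[y_prev, y_curr[calib]])

print(f"previous-session model: {old.score(X_curr[test], y_curr[test]):.2f}")
print(f"recalibrated model    : {new.score(X_curr[test], y_curr[test]):.2f}")
```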

    Detecting and classifying three different hand movement types through electroencephalography recordings for neurorehabilitation

    Brain–computer interfaces can be used for motor substitution and recovery; therefore, detection and classification of movement intention are crucial for optimal control. In this study, palmar, lateral and pinch grasps were differentiated from the idle state and classified from single-trial EEG using only information prior to the movement onset. Fourteen healthy subjects performed the three grasps 100 times, while EEG was recorded from 25 electrodes. Temporal and spectral features were extracted from each electrode, and feature reduction was performed using sequential forward selection (SFS) and principal component analysis (PCA). The detection problem was investigated as the ability to discriminate between movement preparation and the idle state. Furthermore, all task pairs and the three movements together were classified. The best detection performance across movements (79 ± 8 %) was obtained by combining temporal and spectral features. The best movement–movement discrimination was obtained using spectral features: 76 ± 9 % (2-class) and 63 ± 10 % (3-class). For movement detection and discrimination, the performance was similar across grasp types and task pairs; SFS outperformed PCA. The results show that it is feasible to detect different grasps and classify the distinct movements using only information prior to the movement onset, which may enable brain–computer interface-based neurorehabilitation of upper limb function through Hebbian learning mechanisms.
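
    Sequential forward selection (SFS) versus PCA for feature reduction ahead of a classifier can be sketched with scikit-learn as below. The feature dimensions, class structure and the use of LDA as the wrapped classifier are assumptions for illustration, not the study's exact configuration.

```python
# Sketch: sequential forward selection (SFS) vs. PCA for reducing a feature
# set before an LDA classifier. Trial counts, feature dimensions and the
# number of retained features are illustrative assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline

rng = np.random.default_rng(5)
X = rng.standard_normal((280, 50))        # trials x temporal/spectral features
y = rng.integers(0, 2, 280)               # e.g. movement preparation vs. idle
X[y == 1, :6] += 0.7                      # a handful of informative features

sfs_pipe = Pipeline([
    ("sfs", SequentialFeatureSelector(LinearDiscriminantAnalysis(),
                                      n_features_to_select=8,
                                      direction="forward", cv=3)),
    ("clf", LinearDiscriminantAnalysis()),
])
pca_pipe = Pipeline([("pca", PCA(n_components=8)),
                     ("clf", LinearDiscriminantAnalysis())])

for name, pipe in [("SFS + LDA", sfs_pipe), ("PCA + LDA", pca_pipe)]:
    print(f"{name}: {cross_val_score(pipe, X, y, cv=5).mean():.2f}")
```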

    Perception and Cognition of Cues Used in Synchronous Brain–Computer Interfaces Modify Electroencephalographic Patterns of Control Tasks

    A motor imagery (MI)-based brain–computer interface (BCI) is a system that enables humans to interact with their environment by translating their brain signals into control commands for a target device. In particular, synchronous BCI systems make use of cues to trigger the motor activity of interest. So far, it has been shown that electroencephalographic (EEG) patterns before and after cue onset can reveal the user's cognitive state and enhance the discrimination of MI-related control tasks. However, there has been no detailed investigation of the nature of those EEG patterns. We therefore propose to study the cue effects on MI-related control tasks by selecting the EEG patterns that best discriminate such control tasks and analyzing where those patterns come from. The study was carried out using two methods: standard and all-embracing. The standard method was based on sources (recording sites, frequency bands, and time windows) where the modulation of EEG signals due to motor activity is typically detected. The all-embracing method included a wider variety of sources, in which not only motor activity is reflected. The findings of this study showed that the classification accuracy (CA) of MI-related control tasks did not depend on the type of cue in use. However, the EEG patterns that best differentiated those control tasks emerged from sources well defined by the perception and cognition of the cue in use. An implication of this study is the possibility of obtaining different control commands that can be detected with the same accuracy. Since different cues trigger control tasks that yield similar CAs, and those control tasks produce EEG patterns differentiated by the nature of the cue, brain–computer communication could be accelerated by having a wider variety of detectable control commands. This is an important issue for Neuroergonomics research because neural activity could not only be used to monitor the human mental state, as is typically done, but might also be employed to control the system of interest.
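
    The idea of asking where discriminative patterns come from can be approximated by scoring candidate sources, i.e. (recording site, frequency band, time window) triplets, one at a time. The sketch below does this with single-feature LDA cross-validation on synthetic epochs; the channel names, bands and windows are assumptions, not the study's source sets.

```python
# Sketch: scan candidate "sources" (channel, frequency band, time window) and
# score how well each separates two control tasks. All names, bands and
# windows are illustrative; the cue is assumed to appear 1 s into each epoch.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
fs = 250
epochs = rng.standard_normal((80, 8, 3 * fs))      # trials x channels x samples
labels = rng.integers(0, 2, 80)                    # two MI-related control tasks
channels = ["F3", "Fz", "F4", "C3", "Cz", "C4", "P3", "P4"]
bands = {"mu": (8, 12), "beta": (16, 24)}
windows = {"pre-cue": (0, fs), "post-cue": (fs, 3 * fs)}   # sample indices

def band_power(x, lo, hi):
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return np.log(np.mean(filtfilt(b, a, x, axis=-1) ** 2, axis=-1))

scores = {}
for ci, ch in enumerate(channels):
    for bname, (lo, hi) in bands.items():
        for wname, (s0, s1) in windows.items():
            feat = band_power(epochs[:, ci, s0:s1], lo, hi)[:, None]
            acc = cross_val_score(LinearDiscriminantAnalysis(), feat, labels, cv=5).mean()
            scores[(ch, bname, wname)] = acc

best = max(scores, key=scores.get)
print("most discriminative source:", best, f"(CV accuracy {scores[best]:.2f})")
```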

    Towards improved EEG interpretation in a sensorimotor BCI for the control of a prosthetic or orthotic hand.

    A brain-computer interface (BCI), which reroutes neural signals from the brain to actuators in a prosthetic or orthotic hand, promises to aid those who suffer from hand motor impairments, such as amputees and victims of strokes and spinal cord injuries. Such individuals can greatly benefit from the return of some of the essential functionality of the hand through the renewed performance of the basic hand movements involved in daily activities. These hand movements include wrist extension, wrist flexion, finger extension, finger flexion and the tripod pinch. The core of this sensorimotor BCI solution lies in the interpretation of the neural information for the five essential hand movements extracted from the EEG (electroencephalogram). It is necessary to improve on the interpretation of these EEG signals; hence this research explores the possibility of single-trial EEG discrimination of the five essential hand movements in an offline, synchronous manner. The EEG was recorded from five healthy test subjects as they performed the actual and imagined movements for both hands. The research is divided into three investigations which respectively attempt to differentiate the EEG for: 1) right and left combinations of the different hand movements, 2) wrist and finger movements on the same hand and 3) the five individual movements on the same hand. A general method is applied to all three investigations. It utilizes independent component analysis (ICA) and time-frequency techniques to extract features based on event-related (de)synchronisation (ERD/ERS) and movement-related cortical potentials (MRCP). The Bhattacharyya distance is used for feature reduction, and Mahalanobis distance clustering and artificial neural networks (ANN) are used as classifiers. The best average accuracies of 89 %, 71 % and 57 % for the three respective investigations are obtained using ANNs and features related to ERD/ERS. Along with accuracies around 70 % for a few subjects in the five-movement differentiation investigation, these results indicate the possibility of offline, synchronous differentiation of single-trial EEG for the five essential hand movements. These hand movements can be used, in part or in combination, as imagined and performed motor tasks for BCIs aimed at controlling prosthetic or orthotic hands.
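
    For the ANN classifier mentioned above, a small feed-forward network on pre-extracted ERD/ERS- and MRCP-style features can be sketched with scikit-learn's MLPClassifier. The feature matrix, network size and train/test split are illustrative assumptions rather than the thesis's configuration.

```python
# Sketch: a small feed-forward ANN (scikit-learn's MLPClassifier) applied to
# pre-extracted ERD/ERS- and MRCP-style features for the five hand movements.
# Feature values are synthetic and the network size is an assumption.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
X = rng.standard_normal((250, 12))        # trials x selected features
y = rng.integers(0, 5, 250)               # five essential hand movements
X[np.arange(250), y] += 1.2               # inject some class structure

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0, stratify=y)
ann = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000,
                                  random_state=0))
ann.fit(X_tr, y_tr)
print(f"hold-out accuracy over 5 movements: {ann.score(X_te, y_te):.2f}")
```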

    Characterizing Motor System to Improve Training Protocols Used in Brain-Machine Interfaces Based on Motor Imagery

    A motor imagery (MI)-based brain-machine interface (BMI) is a technology under development that actively modifies users' perception and cognition through mental tasks, so as to decode their intentions from their neural oscillations and thereby bring about some kind of activation. So far, MI as a control task in BMIs has been seen as a skill that must be acquired, but neither user conditions nor controlled learning conditions have been taken into account. As the motor system is a complex mechanism trained over a lifetime, and an MI-based BMI attempts to decode motor intentions from neural oscillations in order to put a device into action, motor mechanisms should be considered when prototyping BMI systems. It is hypothesized that the best way to acquire MI skills is to follow the same rules humans obey when moving around the world. On this basis, new training paradigms consisting of ecological environments, identification of control tasks according to the ecological environment, transparent mapping, and multisensory feedback are proposed in this chapter. These new MI training paradigms take advantage of users' previous knowledge and facilitate the generation of mental images, owing to the automatic development of sensory predictions and motor behavior patterns in the brain. Furthermore, the effectuation of MI as an actual movement would make users feel that their mental images are being executed, and the resulting sensory feedback may allow the forward model to readjust the ongoing imagined movement.