
Decoding Complex Imagery Hand Gestures

Abstract

Brain-computer interfaces (BCIs) offer individuals with major disabilities an alternative way to interact with their environment. Sensorimotor rhythm (SMR)-based BCIs can successfully perform control tasks; however, traditional SMR paradigms intuitively disconnect the control signal from the real task, making them poorly suited to complex control scenarios. In this study, we design a new, intuitively connected motor imagery (MI) paradigm that uses hierarchical common spatial patterns (HCSP) and context information to effectively predict intended hand grasps from electroencephalogram (EEG) data. Experiments with 5 participants yielded an aggregate classification accuracy (the probability of predicting the intended grasp) of 64.5% for 8 different hand gestures, more than 5 times the chance level.

Comment: This work has been submitted to EMBC 201
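
To make the hierarchical CSP idea concrete, below is a minimal Python sketch of one building block: a binary node that learns CSP spatial filters and an LDA decision between two groups of gesture labels, which could be stacked into a tree over the 8 gestures. All function names, the grouping of labels, and the synthetic data are illustrative assumptions; this is not the authors' HCSP implementation and does not include their context-information step.

    import numpy as np
    from scipy.linalg import eigh
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis


    def csp_filters(X_a, X_b, n_pairs=3):
        """CSP spatial filters for two classes of trials shaped (n_trials, n_channels, n_samples)."""
        def avg_cov(trials):
            return np.mean([np.cov(t) for t in trials], axis=0)
        C_a, C_b = avg_cov(X_a), avg_cov(X_b)
        # Generalized eigendecomposition: eigenvalues near 0 or 1 correspond to
        # the most class-discriminative variance ratios.
        eigvals, eigvecs = eigh(C_a, C_a + C_b)
        picks = np.r_[np.arange(n_pairs), np.arange(-n_pairs, 0)]
        return eigvecs[:, picks].T  # (2 * n_pairs, n_channels)


    def csp_features(W, X):
        """Log-variance features of spatially filtered trials."""
        Z = np.einsum('fc,ncs->nfs', W, X)
        var = Z.var(axis=2)
        return np.log(var / var.sum(axis=1, keepdims=True))


    class BinaryNode:
        """One node of a classification tree: CSP filters plus an LDA
        deciding between two groups of gesture labels."""
        def __init__(self, left_labels, right_labels, n_pairs=3):
            self.left, self.right = list(left_labels), list(right_labels)
            self.n_pairs = n_pairs
            self.clf = LinearDiscriminantAnalysis()

        def fit(self, X, y):
            mask_l, mask_r = np.isin(y, self.left), np.isin(y, self.right)
            self.W = csp_filters(X[mask_l], X[mask_r], self.n_pairs)
            F = csp_features(self.W, X[mask_l | mask_r])
            side = np.isin(y[mask_l | mask_r], self.right).astype(int)  # 0 = left group, 1 = right group
            self.clf.fit(F, side)
            return self

        def predict_side(self, X):
            return self.clf.predict(csp_features(self.W, X))


    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        n_ch, n_samp = 16, 256
        # Synthetic stand-in data: two gesture groups with different power in a few channels.
        def make_trials(scale, n=40):
            X = rng.standard_normal((n, n_ch, n_samp))
            X[:, :4, :] *= scale
            return X
        X = np.vstack([make_trials(1.5), make_trials(0.7)])
        y = np.r_[np.zeros(40, dtype=int), np.full(40, 4)]
        root = BinaryNode(left_labels=[0, 1, 2, 3], right_labels=[4, 5, 6, 7]).fit(X, y)
        acc = (root.predict_side(X) == (y >= 4)).mean()
        print(f"root-node training accuracy: {acc:.2f}")

In a full hierarchy, each leaf group would be split again by further CSP + LDA nodes until a single gesture remains; the split structure shown here is purely hypothetical.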
