4 research outputs found

    Early Predictability of Grasping Movements by Neurofunctional Representations: A Feasibility Study

    Human grasping is a relatively fast process, and control signals for upper-limb prosthetics cannot be generated and processed in a sufficiently timely manner. The aim of this study was to examine whether discriminating between different grasping movements at the cortical level can provide information before the actual grasping process, allowing for more intuitive prosthetic control. EEG datasets were captured from 13 healthy subjects who repeatedly performed 16 activities of daily living. Common classifiers were trained on features extracted from the waking-state frequency and total-frequency time domains. Different training scenarios were used to investigate whether classifiers can be pre-trained as base networks and then fine-tuned with data from a target person. A support vector machine with spatial covariance matrices as EEG signal descriptors, based on Riemannian geometry, showed the highest balanced accuracy (0.91 ± 0.05 SD) in discriminating five grasping categories according to the Cutkosky taxonomy in an interval from 1.0 s before to 0.5 s after movement onset. Fine-tuning did not improve any classifier. No significant accuracy differences between the two frequency domains were apparent (p > 0.07). Neurofunctional representations thus enabled highly accurate discrimination of five different grasping movements. Our results indicate that, for upper-limb prosthetics, such representations can be used in a sufficiently timely manner to predict the intended grasping task as a discrete category and kinematically prepare the prosthetic hand.
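    The covariance-based approach the abstract describes can be illustrated with a minimal sketch. This is not the authors' pipeline: it uses synthetic epochs, the simpler log-Euclidean mapping (a common approximation to the Riemannian tangent-space projection), and a nearest-class-mean classifier as a stand-in for the SVM; all names and parameters below are illustrative assumptions.

    ```python
    import numpy as np

    def log_map(C):
        # Matrix logarithm of an SPD covariance matrix via eigendecomposition.
        w, V = np.linalg.eigh(C)
        return (V * np.log(w)) @ V.T

    def covariance_features(epochs):
        # epochs: (n_trials, n_channels, n_samples) -> log-Euclidean feature vectors.
        feats = []
        for X in epochs:
            C = X @ X.T / X.shape[1]                 # sample spatial covariance
            C += 1e-6 * np.eye(C.shape[0])           # regularize: keep C positive definite
            L = log_map(C)
            iu = np.triu_indices_from(L)             # upper triangle (incl. diagonal)
            feats.append(L[iu])
        return np.array(feats)

    def fit_predict(train_X, train_y, test_X):
        # Nearest-class-mean in the log-covariance feature space (SVM stand-in).
        classes = np.unique(train_y)
        means = {c: train_X[train_y == c].mean(axis=0) for c in classes}
        return np.array([
            classes[np.argmin([np.linalg.norm(x - means[c]) for c in classes])]
            for x in test_X
        ])

    # Toy demo: two synthetic "grasp" classes with different spatial covariance.
    rng = np.random.default_rng(0)
    n_ch, n_samp = 8, 128
    A0, A1 = np.eye(n_ch), np.diag(np.linspace(0.5, 2.0, n_ch))
    make = lambda A, n: np.stack([A @ rng.standard_normal((n_ch, n_samp)) for _ in range(n)])
    X = covariance_features(np.concatenate([make(A0, 20), make(A1, 20)]))
    y = np.array([0] * 20 + [1] * 20)
    preds = fit_predict(X[::2], y[::2], X[1::2])     # train on even trials, test on odd
    accuracy = (preds == y[1::2]).mean()
    ```

    Because the two synthetic classes differ only in their channel covariance structure, the covariance descriptors separate them cleanly; the same idea underlies using spatial covariance matrices as EEG trial descriptors.
    
    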

    U-Limb: A multi-modal, multi-center database on arm motion control in healthy and post-stroke conditions

    BACKGROUND: Shedding light on the neuroscientific mechanisms of human upper limb motor control, in both healthy and disease conditions (e.g., after a stroke), can help to devise effective tools for quantitative evaluation of impairment and to properly inform the rehabilitative process. The design and control of mechatronic devices can also benefit from such neuroscientific outcomes, with important implications for assistive and rehabilitation robotics and advanced human-machine interaction. To reach these goals, we believe that an exhaustive data collection on human behavior is a mandatory step. For this reason, we release U-Limb, a large, multi-modal, multi-center data collection on human upper limb movements, with the aim of fostering trans-disciplinary cross-fertilization. CONTRIBUTION: This collection consists of data from 91 able-bodied and 65 post-stroke participants and is organized at three levels: (i) upper limb activities of daily living, during which kinematic and physiological signals (electromyography, electroencephalography, and electrocardiography) were recorded; (ii) force-kinematic behavior during precise manipulation tasks with a haptic device; and (iii) brain activity during hand control, recorded using functional magnetic resonance imaging.
