
    Algorithms for Neural Prosthetic Applications

    abstract: In the last 15 years, there has been a significant increase in the number of motor neural prostheses used to restore limb function lost to neurological disorders or accidents. The aim of this technology is to enable patients to control a motor prosthesis using their residual neural pathways (central or peripheral). Recent studies in non-human primates and humans have shown that a prosthesis can be controlled to accomplish varied tasks such as self-feeding, typing, reaching, grasping, and performing fine dexterous movements. A neural decoding system comprises three main components: (i) sensors to record neural signals, (ii) an algorithm to map neural recordings to upper limb kinematics, and (iii) a prosthetic arm actuated by control signals generated by the algorithm. Machine learning algorithms that map input neural activity to output kinematics (such as finger trajectory) form the core of the neural decoding system. The choice of algorithm is thus mainly dictated by the neural signal of interest and the output parameter being decoded. The main stages of a neural decoding system are neural data acquisition, feature extraction, feature selection, and the machine learning algorithm itself. There have been significant advances in the field of neural prosthetic applications, but challenges remain in translating a neural prosthesis from a laboratory setting to a clinical environment. These challenges must be addressed to achieve a fully functional prosthetic device with maximum user compliance and acceptance. Three such challenges in developing robust neural decoding systems were addressed here: exploring neural variability in the peripheral nervous system during dexterous finger movements, developing feature selection methods based on clinically relevant metrics, and devising a novel method for decoding dexterous finger movements based on ensemble methods.
    Dissertation/Thesis, Doctoral Dissertation, Bioengineering 201
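The pipeline this abstract describes (neural data → feature extraction → feature selection → machine learning algorithm → kinematic output) can be sketched in a few lines. The binning, correlation-based selection, and linear least-squares map below are illustrative assumptions for a minimal decoder, not the dissertation's actual methods:

```python
# Minimal sketch of a neural decoding pipeline: feature extraction,
# feature selection, and a learned map to a kinematic trace.
# All shapes, bin sizes, and methods are illustrative assumptions.
import numpy as np

def extract_features(neural, bin_size=10):
    """Bin raw samples (time x channels) into mean-activity features."""
    t, c = neural.shape
    n_bins = t // bin_size
    return neural[:n_bins * bin_size].reshape(n_bins, bin_size, c).mean(axis=1)

def select_features(X, y, k=4):
    """Keep the k features most correlated with the decoded kinematics."""
    corr = np.abs([np.corrcoef(X[:, i], y)[0, 1] for i in range(X.shape[1])])
    idx = np.argsort(corr)[-k:]
    return X[:, idx], idx

def fit_linear_decoder(X, y):
    """Least-squares map from neural features to a kinematic trace."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append bias column
    w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return w

def decode(X, w):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return Xb @ w

# Usage on synthetic data standing in for recorded neural activity:
rng = np.random.default_rng(0)
neural = rng.standard_normal((1000, 8))     # 1000 samples, 8 channels
feats = extract_features(neural)            # -> (100, 8) feature matrix
kin = feats[:, 2] * 0.5 + 0.1               # toy "finger trajectory"
X_sel, idx = select_features(feats, kin)
w = fit_linear_decoder(X_sel, kin)
pred = decode(X_sel, w)
```

A linear map is only the simplest choice; as the abstract notes, the appropriate algorithm depends on the neural signal of interest and the output parameter being decoded.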

    Learning and adaptation in brain machine interfaces

    Balancing subject learning and decoder adaptation is central to increasing brain machine interface (BMI) performance. We addressed these complementary aspects in two studies: (1) a learning study, in which mice modulated “beta” band activity to control a 1D auditory cursor, and (2) an adaptive decoding study, in which a simple recurrent artificial neural network (RNN) decoded intended saccade targets of monkeys. In the learning study, three mice successfully increased beta band power following trial initiations, and specifically increased beta burst durations from 157 ms to 182 ms, likely contributing to performance. Though the task did not explicitly require specific movements, all three mice appeared to modulate beta activity via active motor control and had consistent vibrissal motor cortex multiunit activity and local field potential relationships with contralateral whisker pad electromyograms. The increased burst durations may therefore be a direct result of increased motor activity. These findings suggest that only a subset of beta rhythm phenomenology can be volitionally modulated (e.g. the tonic “hold” beta), therefore limiting the possible set of successful beta neuromodulation strategies. In the adaptive decoding study, RNNs decoded delay period activity in oculomotor and working memory regions while monkeys performed a delayed saccade task. Adaptive decoding sessions began with brain-controlled trials using pre-trained RNN models, in contrast to static decoding sessions in which 300-500 initial eye-controlled training trials were performed. Closed loop RNN decoding performance was lower than predicted by offline simulations. More consistent delay period activity and saccade paths across trials were associated with higher decoding performance.
Despite the advantage of consistency, one monkey’s delay period activity patterns changed over the first week of adaptive decoding, and the other monkey’s saccades were more erratic during adaptive decoding than during static decoding sessions. It is possible that the altered session paradigm, which eliminated eye-controlled training trials, led to either frustration or exploratory learning, causing the neural and behavioral changes. Considering neural control and decoder adaptation of BMIs in these studies, future work should improve the “two-learner” subject-decoder system by better modeling the interaction between underlying brain states (and possibly their modulation) and the neural signatures representing desired outcomes.
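The recurrent decoder described above, which maps a sequence of delay-period activity vectors to an intended saccade target, can be sketched as a simple Elman-style network. The layer sizes, random stand-in weights, and softmax readout below are illustrative assumptions; the study's actual architecture and training procedure are not specified in this abstract:

```python
# Sketch of a simple recurrent decoder: a sequence of population
# activity vectors is folded into a hidden state, and the final state
# is read out as a distribution over saccade targets.
# Weights are random stand-ins; sizes are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n_units, n_hidden, n_targets = 32, 16, 8

W_in = rng.standard_normal((n_hidden, n_units)) * 0.1
W_rec = rng.standard_normal((n_hidden, n_hidden)) * 0.1
W_out = rng.standard_normal((n_targets, n_hidden)) * 0.1

def decode_trial(activity):
    """activity: (timesteps, n_units) firing-rate vectors for one trial."""
    h = np.zeros(n_hidden)
    for x in activity:                      # recurrent state update
        h = np.tanh(W_in @ x + W_rec @ h)
    logits = W_out @ h                      # read out from final state
    probs = np.exp(logits - logits.max())   # numerically stable softmax
    return probs / probs.sum()

trial = rng.standard_normal((20, n_units))  # synthetic delay-period data
p = decode_trial(trial)
predicted_target = int(np.argmax(p))
```

In a pre-trained adaptive-decoding setting as described above, weights like these would be fit offline and then updated across brain-controlled sessions rather than initialized from fresh eye-controlled training trials.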

    Craniux: A LabVIEW-Based Modular Software Framework for Brain-Machine Interface Research

    This paper presents “Craniux,” an open-access, open-source software framework for brain-machine interface (BMI) research. Developed in LabVIEW, a high-level graphical programming environment, Craniux offers both out-of-the-box functionality and a modular BMI software framework that is easily extendable. Specifically, it allows researchers to take advantage of multiple features inherent to the LabVIEW environment for on-the-fly data visualization, parallel processing, multithreading, and data saving. This paper introduces the basic features and system architecture of Craniux and describes the validation of the system under real-time BMI operation using simulated and real electrocorticographic (ECoG) signals. Our results indicate that Craniux is able to operate consistently in real time, enabling a seamless workflow to achieve brain control of cursor movement. The Craniux software framework is made available to the scientific research community to provide a LabVIEW-based BMI software platform for future BMI research and development.

    Hand Shape Representations in the Human Posterior Parietal Cortex

    Humans shape their hands to grasp and manipulate objects and to communicate. From nonhuman primate studies, we know that visual and motor properties for grasps can be derived from cells in the posterior parietal cortex (PPC). Are non-grasp-related hand shapes in humans represented similarly? Here we show for the first time how single neurons in the PPC of humans are selective for particular imagined hand shapes independent of graspable objects. We find that motor imagery to shape the hand can be successfully decoded from the PPC by implementing a version of the popular Rock-Paper-Scissors game and its extension Rock-Paper-Scissors-Lizard-Spock. By simultaneous presentation of visual and auditory cues, we can discriminate motor imagery from visual information and show differences in auditory and visual information processing in the PPC. These results also demonstrate that neural signals from human PPC can be used to drive a dexterous cortical neuroprosthesis.
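Decoding an imagined hand shape from a population of shape-selective neurons, as described above, amounts to a multiclass classification problem over the five game gestures. The nearest-centroid classifier and synthetic firing-rate data below are illustrative assumptions; the abstract does not specify the actual decoder or data format:

```python
# Hedged sketch: classify imagined hand shapes from population firing
# rates with a nearest-centroid decoder over the five game gestures.
# The data here are synthetic; tuning structure is an assumption.
import numpy as np

SHAPES = ["rock", "paper", "scissors", "lizard", "spock"]

def fit_centroids(trials, labels):
    """trials: (n_trials, n_neurons) rates; labels: shape index per trial."""
    return np.stack([trials[labels == k].mean(axis=0)
                     for k in range(len(SHAPES))])

def classify(rates, centroids):
    """Assign a trial to the shape with the nearest mean rate pattern."""
    d = np.linalg.norm(centroids - rates, axis=1)
    return SHAPES[int(np.argmin(d))]

# Synthetic tuning: each shape drives a distinct population pattern.
rng = np.random.default_rng(2)
n_neurons = 25
prototypes = rng.standard_normal((len(SHAPES), n_neurons)) * 3
labels = np.repeat(np.arange(len(SHAPES)), 20)   # 20 trials per shape
trials = prototypes[labels] + rng.standard_normal((len(labels), n_neurons))

centroids = fit_centroids(trials, labels)
pred = classify(prototypes[0], centroids)        # a clean "rock" pattern
```

A centroid rule is the simplest baseline; driving an actual neuroprosthesis would additionally require real-time spike processing and closed-loop feedback, which this sketch omits.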