
    Incorporating Feedback from Multiple Sensory Modalities Enhances Brain–Machine Interface Control

    In healthy individuals, the brain uses a rich supply of feedback from multiple sensory modalities to control movement. In many individuals, these afferent pathways, as well as their efferent counterparts, are compromised by disease or injury, resulting in significant impairments and reduced quality of life. Brain–machine interfaces (BMIs) offer the promise of recovered functionality to these individuals by allowing them to control a device using their thoughts. Most current BMI implementations use visual feedback for closed-loop control; however, it has been suggested that the inclusion of additional feedback modalities may lead to improvements in control. We demonstrate for the first time that kinesthetic feedback can be used together with vision to significantly improve control of a cursor driven by neural activity of the primary motor cortex (MI). Using an exoskeletal robot, we moved the monkey's arm to passively follow a cortically controlled visual cursor, thereby providing the monkey with kinesthetic information about the motion of the cursor. When visual and proprioceptive feedback were congruent, the time to successfully reach a target decreased and the cursor paths became straighter, compared with incongruent feedback conditions. This enhanced performance was accompanied by a significant increase in the amount of movement-related information contained in the spiking activity of neurons in MI. These findings suggest that BMI control can be significantly improved in paralyzed patients with residual kinesthetic sense and provide the groundwork for augmenting cortically controlled BMIs with multiple forms of natural or surrogate sensory feedback.
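    The decoder itself is not specified in this abstract; as a rough illustration of how binned MI spike counts can drive a 2D cursor, the sketch below fits a ridge-regression velocity decoder to synthetic data. All dimensions, rates, and parameters are assumptions, not values from the study.

```python
# Minimal sketch of a linear cursor decoder of the kind commonly used in
# MI-driven BMIs: ridge regression from binned spike counts to 2D velocity.
# Everything here (bin count, array size, tuning model) is illustrative.
import numpy as np

rng = np.random.default_rng(0)

n_bins, n_neurons = 2000, 96                     # 100 ms bins, hypothetical array size
true_tuning = rng.normal(size=(n_neurons, 2))    # preferred-direction weights
velocity = rng.normal(size=(n_bins, 2))          # 2D cursor velocity per bin
rates = np.exp(0.1 * velocity @ true_tuning.T)   # Poisson rates per neuron and bin
spikes = rng.poisson(rates)                      # binned spike counts

# Ridge-regression decoder: velocity ~= spikes @ W
lam = 1.0
X = spikes - spikes.mean(axis=0)
W = np.linalg.solve(X.T @ X + lam * np.eye(n_neurons), X.T @ velocity)

decoded = X @ W
r = [np.corrcoef(decoded[:, d], velocity[:, d])[0, 1] for d in range(2)]
print(f"decoding correlation (x, y): {r[0]:.2f}, {r[1]:.2f}")
```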

    Learning and adaptation in brain machine interfaces

    Balancing subject learning and decoder adaptation is central to increasing brain machine interface (BMI) performance. We addressed these complementary aspects in two studies: (1) a learning study, in which mice modulated “beta” band activity to control a 1D auditory cursor, and (2) an adaptive decoding study, in which a simple recurrent artificial neural network (RNN) decoded intended saccade targets of monkeys. In the learning study, three mice successfully increased beta band power following trial initiations, and specifically increased beta burst durations from 157 ms to 182 ms, likely contributing to performance. Though the task did not explicitly require specific movements, all three mice appeared to modulate beta activity via active motor control and had consistent vibrissal motor cortex multiunit activity and local field potential relationships with contralateral whisker pad electromyograms. The increased burst durations may therefore be a direct result of increased motor activity. These findings suggest that only a subset of beta rhythm phenomenology can be volitionally modulated (e.g. the tonic “hold” beta), therefore limiting the possible set of successful beta neuromodulation strategies. In the adaptive decoding study, RNNs decoded delay period activity in oculomotor and working memory regions while monkeys performed a delayed saccade task. Adaptive decoding sessions began with brain-controlled trials using pre-trained RNN models, in contrast to static decoding sessions in which 300-500 initial eye-controlled training trials were performed. Closed loop RNN decoding performance was lower than predicted by offline simulations. More consistent delay period activity and saccade paths across trials were associated with higher decoding performance. Despite the advantage of consistency, one monkey’s delay period activity patterns changed over the first week of adaptive decoding, and the other monkey’s saccades were more erratic during adaptive decoding than during static decoding sessions. It is possible that the altered session paradigm eliminating eye-controlled training trials led to either frustration or exploratory learning, causing the neural and behavioral changes. Considering neural control and decoder adaptation of BMIs in these studies, future work should improve the “two-learner” subject-decoder system by better modeling the interaction between underlying brain states (and possibly their modulation) and the neural signatures representing desired outcomes.
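    The abstract does not give the exact burst criteria; a common proxy for measuring beta burst durations is to band-pass the LFP, take the Hilbert amplitude envelope, and time supra-threshold epochs. The sketch below applies that approach to synthetic data; the sampling rate, band edges, and threshold are assumptions.

```python
# Hedged sketch of one way to measure beta burst durations from an LFP trace:
# band-pass 15-30 Hz, Hilbert envelope, threshold, then time contiguous
# supra-threshold epochs. Parameters are illustrative, not the study's.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 1000.0                                    # sampling rate (Hz), assumed
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / fs)
lfp = rng.standard_normal(t.size)              # stand-in for a recorded LFP
lfp[2000:2200] += 3 * np.sin(2 * np.pi * 20 * t[2000:2200])   # injected burst

b, a = butter(4, [15 / (fs / 2), 30 / (fs / 2)], btype="band")
beta = filtfilt(b, a, lfp)
envelope = np.abs(hilbert(beta))

threshold = np.median(envelope) + 2 * np.std(envelope)
above = envelope > threshold
padded = np.concatenate(([False], above, [False]))     # close bursts at the edges
edges = np.diff(padded.astype(int))
starts, stops = np.flatnonzero(edges == 1), np.flatnonzero(edges == -1)
durations_ms = (stops - starts) / fs * 1000
print("burst durations (ms):", np.round(durations_ms, 1))
```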

    Algorithms for Neural Prosthetic Applications

    In the last 15 years, there has been a significant increase in the number of motor neural prostheses used for restoring limb function lost due to neurological disorders or accidents. The aim of this technology is to enable patients to control a motor prosthesis using their residual neural pathways (central or peripheral). Recent studies in non-human primates and humans have shown the possibility of controlling a prosthesis for accomplishing varied tasks such as self-feeding, typing, reaching, grasping, and performing fine dexterous movements. A neural decoding system comprises three main components: (i) sensors to record neural signals, (ii) an algorithm to map neural recordings to upper-limb kinematics, and (iii) a prosthetic arm actuated by control signals generated by the algorithm. Machine learning algorithms that map input neural activity to the output kinematics (such as finger trajectory) form the core of the neural decoding system. The choice of algorithm is thus mainly imposed by the neural signal of interest and the output parameter being decoded. The main stages of such a system are neural data acquisition, feature extraction, feature selection, and the machine learning algorithm. There have been significant advances in the field of neural prosthetic applications, but challenges remain in translating a neural prosthesis from a laboratory setting to a clinical environment. To achieve a fully functional prosthetic device with maximum user compliance and acceptance, these factors need to be addressed. Three challenges in developing robust neural decoding systems are addressed here: exploring neural variability in the peripheral nervous system during dexterous finger movements, feature selection methods based on clinically relevant metrics, and a novel method for decoding dexterous finger movements based on ensemble methods.
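    As a rough illustration of the ensemble-method idea for dexterous finger decoding (the dissertation's actual pipeline, features, and signals are not described in this abstract), the sketch below trains a generic random-forest regressor to map neural features to a finger-flexion trajectory on synthetic data; every name and dimension is illustrative.

```python
# Illustrative ensemble decoder: random-forest regression from windowed
# neural features to a single finger kinematic variable, on synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
n_samples, n_features = 1500, 48                      # hypothetical feature space
X = rng.normal(size=(n_samples, n_features))          # e.g. band-power features
w = rng.normal(size=n_features)
finger_angle = np.tanh(X @ w / np.sqrt(n_features))   # synthetic finger trajectory

split = 1000
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X[:split], finger_angle[:split])
pred = model.predict(X[split:])
corr = np.corrcoef(pred, finger_angle[split:])[0, 1]
print(f"held-out correlation: {corr:.2f}")
```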

    Successfully Controlled BCI Through Minimal Dry Electrodes

    According to the Amputee Coalition, there are approximately 185,000 amputations a year in the United States, and that number is rising. While it is common for someone with a lower-limb amputation to use a prosthetic (approximately 84%), it is less common for people with upper-limb amputations (approximately 56%) (Raichle et al., 2008). The time it takes an amputee to get a prosthetic affects the likelihood of use, in addition to functionality (Miller et al., 2020). The purpose of this project is to show proof of concept of an EEG-controlled prosthetic driven by imagined movements, using only 2 dry electrodes and BCI2000. Eight (N = 8) participants were recruited to complete a pre-training mu task, a 1D cursor training task, a 2D cursor training task, and the main 2D cursor task. After a frequency was established for each participant, they completed 200 trials of the 1D cursor task for three different conditions (left, right, and both hand(s)) or reached a success rate of 80% for 4 trials in a row with random targets. The participants then completed the 2D cursor task with random targets until a success rate of 70% for 4 trials in a row was achieved, followed by a 2D cursor task where the targets were pre-determined. A chi-squared test determined that the goodness of fit for the success rate was significant (p < 0.001) for all participants completing the 1D cursor task. The combined success rate for the participants during task 1 was 30.16% for their right hand, 47.11% for their left hand, and 61.47% for both hands. The combined success rate was 69.40% for task 2 and 79.59% for the main task. Overall, this study successfully showed that 2 dry electrodes can be used to detect imagined movements through BCI. While the accuracy can still be improved by enhancing the equipment and refining the training protocol, both participants who completed the main task surpassed the expected overall accuracy and 4 out of the 6 individual accuracies. Whether it is to control a mechanical arm, leg, or other body part, the framework of this study provides development opportunities for BCI using only a few dry electrodes.
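    A minimal sketch of the kind of chi-squared goodness-of-fit check mentioned above, comparing observed hit/miss counts over 200 one-dimensional cursor trials against a chance-level expectation; the counts and the assumed chance rate below are illustrative, not values reported in the study.

```python
# Chi-squared goodness-of-fit test on a single participant's hit/miss counts
# versus an assumed chance level. Numbers are made up for illustration.
from scipy.stats import chisquare

n_trials = 200
observed_hits = 123                      # e.g. 61.5% success, illustrative
observed = [observed_hits, n_trials - observed_hits]

chance_rate = 0.5                        # assumed chance level for two targets
expected = [n_trials * chance_rate, n_trials * (1 - chance_rate)]

stat, p = chisquare(f_obs=observed, f_exp=expected)
print(f"chi2 = {stat:.2f}, p = {p:.4f}")
```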

    A theory of sensorimotor learning for brain-machine interface control

    A remarkable demonstration of the flexibility of mammalian motor systems is primates’ ability to learn to control brain-machine interfaces (BMIs). This constitutes a completely novel and artificial form of motor behavior, yet primates are capable of learning to control BMIs under a wide range of conditions. BMIs with carefully calibrated decoders, for example, can be learned with only minutes to hours of practice. With a few weeks of practice, even BMIs with random decoders can be learned. What are the biological substrates of this learning process? This thesis proposes a simple theory of the computational principles underlying BMI learning. Through comprehensive numerical and formal analysis, we demonstrate that this theory can provide a unifying explanation for various disparate phenomena observed in three different BMI learning tasks. By explicitly modeling the underlying neural circuitry, the theory reveals an interpretation of these phenomena in terms of the nonlinear dynamics of biological neural circuits.
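    As a toy illustration (not the thesis's actual model) of casting BMI learning as adaptation of an upstream neural mapping against a fixed decoder, the sketch below lets a linearized "subject" adjust its input weights by gradient descent so that the decoded cursor velocity matches the intended velocity.

```python
# Toy gradient-descent account of BMI learning: a fixed linear decoder D reads
# out neural activity, and the subject-side mapping U is adapted to reduce the
# error between decoded and intended cursor velocity. Purely illustrative.
import numpy as np

rng = np.random.default_rng(2)
n_neurons = 30
D = rng.normal(size=(2, n_neurons)) / np.sqrt(n_neurons)   # fixed decoder
U = 0.1 * rng.normal(size=(n_neurons, 2))                   # learnable mapping

lr = 0.05
for step in range(500):
    intent = rng.normal(size=2)            # intended cursor velocity
    activity = U @ intent                  # linearized neural response
    cursor = D @ activity                  # decoded cursor velocity
    err = cursor - intent                  # task error
    # Gradient of 0.5*||err||^2 with respect to U is D^T err intent^T
    U -= lr * (D.T @ err)[:, None] * intent[None, :]
    if step % 100 == 0:
        print(f"step {step:3d}  error {np.linalg.norm(err):.3f}")
```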

    Toward a Full Prehension Decoding from Dorsomedial Area V6A

    Neural prosthetics represent a promising approach to restore movements in patients affected by spinal cord lesions. To drive a fully capable, brain-controlled prosthetic arm, the reaching and grasping components of prehension have to be accurately reconstructed from neural activity. Neurons in the dorsomedial area V6A of the macaque are sensitive to reaching direction, accounting also for the depth dimension, and thus encode positions in the entire 3D space. Moreover, many neurons are sensitive to grip types and wrist orientations. To assess whether these signals are adequate to drive a fully capable neural prosthetic arm, we recorded the spiking activity of neurons in area V6A and used spike counts to train machine learning algorithms to reconstruct reaching and grasping. In a first study, two Macaca fascicularis monkeys were trained to perform an instructed-delay reach-to-grasp task in the dark and in the light toward objects of different shapes. The activity of 89 neurons was used to train and validate a Bayes classifier for decoding objects and grip types. Recognition rates were well above chance level for all the epochs analyzed in this study. In a second study, monkeys were trained to perform reaches to targets located at various depths and directions, and the classifier was tested on whether it could correctly predict the reach goal position from V6A signals. The reach goal location was reliably decoded with accuracy close to optimal (>90%) throughout the task. Together, these results show reliable decoding of hand grips and of the spatial location of reaching goals in the same area, suggesting that V6A is a suitable site for decoding the entire prehension action, with obvious advantages in terms of implant invasiveness. This new posterior parietal cortex (PPC) site, useful for decoding both reaching and grasping, opens new perspectives in the development of human brain-computer interfaces.
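    A minimal sketch of the Bayes-classifier decoding described above, here a Gaussian naive Bayes model applied to synthetic per-trial spike counts from 89 hypothetical V6A units; the study's exact classifier settings, epochs, and grip categories are not specified in the abstract, so every detail below is an assumption.

```python
# Gaussian naive Bayes decoding of grip type from per-trial spike counts,
# evaluated with cross-validation on synthetic data.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n_trials_per_grip, n_neurons, n_grips = 40, 89, 5
means = rng.uniform(2, 10, size=(n_grips, n_neurons))   # per-grip firing rates

X = np.vstack([rng.poisson(means[g], size=(n_trials_per_grip, n_neurons))
               for g in range(n_grips)])
y = np.repeat(np.arange(n_grips), n_trials_per_grip)

scores = cross_val_score(GaussianNB(), X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} (chance = {1 / n_grips:.2f})")
```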

    Heterogeneous recognition of bioacoustic signals for human-machine interfaces

    Human-machine interfaces (HMI) provide a communication pathway between man and machine. Not only do they augment existing pathways, they can substitute or even bypass these pathways where functional motor loss prevents the use of standard interfaces. This is especially important for individuals who rely on assistive technology in their everyday life. Utilising bioacoustic activity can lead to an assistive HMI concept that is unobtrusive, minimally disruptive and cosmetically appealing to the user. However, due to the complexity of the signals, it remains relatively underexplored in the HMI field. This thesis investigates extracting and decoding volition from bioacoustic activity with the aim of generating real-time commands. The developed framework is a systemisation of various processing blocks enabling the mapping of continuous signals into M discrete classes. Class-independent extraction efficiently detects and segments the continuous signals, while class-specific extraction exemplifies each pattern set using a novel template creation process stable to permutations of the data set. These templates are utilised by a generalised single-channel discrimination model, whereby each signal is template aligned prior to classification. The real-time decoding subsystem uses a multichannel heterogeneous ensemble architecture which fuses the output from a diverse set of these individual discrimination models. This enhances the classification performance by elevating both the sensitivity and specificity, with the increased specificity due to a natural rejection capacity based on a non-parametric majority vote. Such a strategy is useful when analysing signals with diverse characteristics, when false positives are prevalent and carry strong consequences, and when limited training data are available. The framework has been developed with generality in mind and wide applicability to a broad spectrum of biosignals. The processing system has been demonstrated on real-time decoding of tongue-movement ear pressure signals using both single- and dual-channel setups. This has included in-depth evaluation of these methods in both offline and online scenarios. During online evaluation, a stimulus-based test methodology was devised, while representative interference was used to contaminate the decoding process in a relevant and realistic fashion. The results of this research provide a strong case for the utility of such techniques in real-world applications of human-machine communication using impulsive bioacoustic signals and biosignals in general.
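    The multichannel majority-vote fusion with a rejection capacity can be sketched as a small decision rule: each per-channel discrimination model votes for one of the M classes (or abstains), and the fused output is accepted only when a majority agrees, otherwise the event is rejected. The vote threshold and abstention handling below are assumptions, not the thesis's exact rule.

```python
# Majority-vote fusion across channel models with a natural rejection capacity.
from collections import Counter

def fuse_votes(votes, n_channels, majority=0.5, reject_label=None):
    """Return the winning class label, or reject_label if no class is backed
    by more than `majority` of the channels. `None` votes mean the channel
    abstained (e.g. its template match fell below threshold)."""
    counted = Counter(v for v in votes if v is not None)
    if not counted:
        return reject_label
    label, count = counted.most_common(1)[0]
    return label if count > majority * n_channels else reject_label

# Example: five channel models classifying a tongue-movement ear pressure event
print(fuse_votes(["left", "left", "left", None, "right"], n_channels=5))  # left
print(fuse_votes(["left", "right", None, None, "up"], n_channels=5))      # rejected
```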