
    A robot hand testbed designed for enhancing embodiment and functional neurorehabilitation of body schema in subjects with upper limb impairment or loss.

    Many upper limb amputees experience incessant post-amputation "phantom limb pain" and report that their missing limbs feel paralyzed in an uncomfortable posture. One hypothesis is that efferent commands no longer generate expected afferent signals, such as proprioceptive feedback from changes in limb configuration, and that the mismatch of motor commands and visual feedback is interpreted as pain. Non-invasive therapeutic techniques for treating phantom limb pain, such as mirror visual feedback (MVF), rely on visualizations of postural changes. Advances in neural interfaces for artificial sensory feedback now make it possible to combine MVF with a high-tech "rubber hand" illusion, in which subjects develop a sense of embodiment with a fake hand when subjected to congruent visual and somatosensory feedback. We discuss clinical benefits that could arise from the confluence of known concepts such as MVF and the rubber hand illusion with new technologies such as neural interfaces for sensory feedback and highly sensorized robot hand testbeds like the "BairClaw" presented here. Our multi-articulating, anthropomorphic robot testbed can be used to study proprioceptive and tactile sensory stimuli during physical finger-object interactions. Conceived for artificial grasp, manipulation, and haptic exploration, the BairClaw could also be used for future studies on the neurorehabilitation of somatosensory disorders due to upper limb impairment or loss. A remote actuation system enables the modular control of tendon-driven hands. The artificial proprioception system enables direct measurement of joint angles and tendon tensions, while temperature, vibration, and skin deformation data are provided by a multimodal tactile sensor. The provision of multimodal sensory feedback that is spatiotemporally consistent with commanded actions could lead to benefits such as reduced phantom limb pain and increased prosthesis use due to improved functionality and reduced cognitive burden.
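
    The abstract's key requirement, feedback that is spatiotemporally consistent with commanded actions, can be made concrete with a short sketch. The following Python snippet is an illustration only, not BairClaw code; all field names, units, and the tolerance are assumptions. It bundles joint-angle, tendon-tension, and multimodal tactile readings into one sample and checks whether the measured posture tracks the commanded one.

    # Minimal sketch (assumptions only, not BairClaw firmware): bundling artificial
    # proprioception and multimodal tactile data into one feedback sample and checking
    # that the measured joint angles track the commanded posture.
    from dataclasses import dataclass

    import numpy as np

    @dataclass
    class FingerFeedback:
        joint_angles_deg: np.ndarray   # direct joint-angle measurements, one per joint
        tendon_tension_n: np.ndarray   # tendon tensions from the remote actuation unit
        skin_deformation: float        # multimodal tactile sensor channels
        vibration_rms: float
        temperature_c: float

    def is_congruent(commanded_deg: np.ndarray, feedback: FingerFeedback,
                     tol_deg: float = 5.0) -> bool:
        """True if the measured joint angles match the commanded posture.

        A persistent mismatch here is the kind of efferent/afferent incongruence
        that the abstract links to phantom limb pain.
        """
        return bool(np.all(np.abs(commanded_deg - feedback.joint_angles_deg) <= tol_deg))

    # Example: a commanded flexion of a three-joint finger vs. one measured sample.
    commanded = np.array([30.0, 45.0, 20.0])
    sample = FingerFeedback(
        joint_angles_deg=np.array([29.0, 44.5, 21.0]),
        tendon_tension_n=np.array([2.1, 1.8]),
        skin_deformation=0.12, vibration_rms=0.03, temperature_c=31.5,
    )
    print(is_congruent(commanded, sample))  # True: feedback matches the commanded action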

    A brain-computer interface with vibrotactile biofeedback for haptic information

    Background: It has been suggested that Brain-Computer Interfaces (BCI) may one day be suitable for controlling a neuroprosthesis. For closed-loop operation of a BCI, a tactile feedback channel that is compatible with neuroprosthetic applications is desired. Operation of an EEG-based BCI using only vibrotactile feedback, a commonly used method to convey haptic senses of contact and pressure, is demonstrated with a high level of accuracy. Methods: A Mu-rhythm based BCI using a motor imagery paradigm was used to control the position of a virtual cursor. The cursor position was shown visually as well as transmitted haptically by modulating the intensity of a vibrotactile stimulus to the upper limb. A total of six subjects operated the BCI in a two-stage targeting task, receiving only vibrotactile biofeedback of performance. The location of the vibration was also systematically varied between the left and right arms to investigate location-dependent effects on performance. Results and Conclusion: Subjects were able to control the BCI using only vibrotactile feedback with an average accuracy of 56% and as high as 72%. These accuracies are significantly higher than the 15% predicted by random chance if the subjects had no voluntary control of their Mu-rhythm. The results of this study demonstrate that vibrotactile feedback is an effective biofeedback modality to operate a BCI using motor imagery. In addition, the study shows that placement of the vibrotactile stimulation on the biceps ipsilateral or contralateral to the motor imagery introduces a significant bias in the BCI accuracy. This bias is consistent with a drop in performance generated by stimulation of the contralateral limb. Users demonstrated the capability to overcome this bias with training.
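
    The haptic channel described above amounts to mapping the decoded cursor position onto a vibration intensity. The sketch below illustrates one such mapping in Python; the [-1, 1] cursor range, amplitude bounds, and linear scaling are assumptions for illustration, not the authors' implementation.

    # Illustrative sketch (not the authors' code): convey the decoded cursor state by
    # modulating vibrotactile intensity on the upper limb.
    import numpy as np

    def cursor_to_vibration(cursor_pos: float, min_amp: float = 0.1,
                            max_amp: float = 1.0) -> float:
        """Map a cursor position in [-1, 1] to a normalized vibration amplitude."""
        pos = float(np.clip(cursor_pos, -1.0, 1.0))
        return min_amp + (max_amp - min_amp) * (pos + 1.0) / 2.0

    # Simulated closed loop: the decoded cursor trajectory drives the stimulus intensity.
    for pos in np.linspace(-1.0, 1.0, 5):
        print(f"cursor={pos:+.2f} -> vibration amplitude={cursor_to_vibration(pos):.2f}")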

    Decoding covert somatosensory attention by a BCI system calibrated with tactile sensation

    Objective: We propose a novel calibration strategy to facilitate the decoding of covert somatosensory attention by exploring the oscillatory dynamics induced by tactile sensation. Methods: It was hypothesized that the similarity of the oscillatory pattern between stimulation sensation (SS, real sensation) and somatosensory attentional orientation (SAO) provides a way to decode covert somatic attention. Subjects were instructed to sense the tactile stimulation, which was applied to the left (SS-L) or the right (SS-R) wrist. The BCI system was calibrated with the sensation data and then applied for online SAO decoding. Results: Both SS and SAO showed oscillatory activation concentrated on the contralateral somatosensory hemisphere. Offline analysis showed that the proposed calibration method led to greater accuracy than the traditional calibration method based on SAO only. This was confirmed by online experiments, where the online accuracy on 15 subjects was 78.8±13.1%, with 12 subjects >70% and 4 subjects >90%. Conclusion: By integrating the stimulus-induced oscillatory dynamics from the sensory cortex, covert somatosensory attention can be reliably decoded by a BCI system calibrated with tactile sensation. Significance: Real tactile sensation is more consistent during calibration than SAO. This brain-computer interfacing approach may find application for stroke and completely locked-in patients with preserved somatic sensation. Funding: University Starter Grant of the University of Waterloo (No. 203859); National Natural Science Foundation of China (Grant No. 51620105002).
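
    The proposed calibration strategy can be illustrated with a minimal sketch: fit a classifier on features from real tactile stimulation trials (SS-L vs. SS-R) and reuse it to decode covert attention (SAO). The feature extraction (log band power) and classifier (LDA) below are assumed stand-ins for the authors' actual pipeline, and the data are synthetic placeholders.

    # Hedged sketch of the calibration idea: train on EEG features from real tactile
    # stimulation (SS-L vs. SS-R), then reuse the model to decode covert attention (SAO).
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)

    def log_bandpower(epochs: np.ndarray) -> np.ndarray:
        """epochs: (n_trials, n_channels, n_samples) -> log variance per channel."""
        return np.log(epochs.var(axis=-1))

    # Calibration phase: epochs recorded during left/right wrist stimulation.
    ss_epochs = rng.standard_normal((80, 16, 500))
    ss_labels = rng.integers(0, 2, size=80)           # 0 = SS-L, 1 = SS-R
    clf = LinearDiscriminantAnalysis().fit(log_bandpower(ss_epochs), ss_labels)

    # Online phase: apply the sensation-calibrated model to covert-attention epochs.
    sao_epochs = rng.standard_normal((20, 16, 500))
    print(clf.predict(log_bandpower(sao_epochs)))     # predicted attended side per trial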

    Sensory threshold neuromuscular electrical stimulation fosters motor imagery performance

    Motor imagery (MI) has been widely studied as a way to enhance motor learning and to restore motor functions. Although it is agreed that users should emphasize kinesthetic imagery during MI, recordings of MI brain patterns are not sufficiently reliable for many subjects. It has been suggested that somatosensory feedback would be more suitable than the standard visual feedback for enhancing MI brain patterns. However, somatosensory feedback should not interfere with the recorded MI brain pattern. In this study we propose a novel feedback modality to guide subjects during MI based on sensory threshold neuromuscular electrical stimulation (St-NMES). St-NMES depolarizes sensory and motor axons without eliciting any muscular contraction. We hypothesize that St-NMES does not induce detectable ERD brain patterns and fosters MI performance. Twelve novice subjects were included in a cross-over design study. We recorded their EEG, comparing St-NMES with visual feedback during MI or resting tasks. We found that St-NMES not only induced significantly larger desynchronization over sensorimotor areas (p<0.05) but also significantly enhanced MI brain connectivity patterns. Moreover, classification accuracy and stability were significantly higher with St-NMES. Importantly, St-NMES alone did not induce detectable artifacts; rather, the changes in the detected patterns were due to increased MI performance. Our findings indicate that St-NMES is a promising feedback modality for fostering MI performance and could be used for online BMI applications.
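
    The comparison above rests on quantifying event-related desynchronization (ERD) during MI relative to rest. A minimal sketch of that computation is given below; the sampling rate, mu-band limits, filter order, and synthetic signals are assumptions, not the study's analysis code.

    # Illustrative sketch: event-related desynchronization (ERD) as the relative drop
    # in mu-band (8-13 Hz) power during motor imagery compared with rest.
    import numpy as np
    from scipy.signal import butter, filtfilt

    FS = 250  # sampling rate in Hz (assumed)

    def mu_band_power(signal: np.ndarray) -> float:
        b, a = butter(4, [8 / (FS / 2), 13 / (FS / 2)], btype="band")
        return float(np.mean(filtfilt(b, a, signal) ** 2))

    def erd_percent(baseline: np.ndarray, task: np.ndarray) -> float:
        """Negative values indicate desynchronization (a power drop) during the task."""
        p_ref = mu_band_power(baseline)
        return 100.0 * (mu_band_power(task) - p_ref) / p_ref

    rng = np.random.default_rng(1)
    rest = rng.standard_normal(4 * FS)                 # 4 s resting baseline
    imagery = 0.7 * rng.standard_normal(4 * FS)        # lower broadband power during MI
    print(f"ERD during imagery: {erd_percent(rest, imagery):.1f}%")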

    Sensing with the Motor Cortex

    The primary motor cortex is a critical node in the network of brain regions responsible for voluntary motor behavior. It has been less appreciated, however, that the motor cortex exhibits sensory responses in a variety of modalities, including vision and somatosensation. We review current work that emphasizes the heterogeneity of sensorimotor responses in the motor cortex and focus on its implications for the cortical control of movement as well as for brain-machine interface development.

    Performance of Brain-Computer Interfacing Based on Tactile Selective Sensation and Motor Imagery

    A large proportion of users do not achieve adequate control with current non-invasive Brain-Computer Interfaces (BCI). This issue has been coined "BCI illiteracy" and is observed across BCI modalities. Here, we compare the performance and BCI-illiteracy rates of tactile selective sensation (SS) and motor imagery (MI) BCIs for large subject samples. We analyzed 80 experimental sessions from 57 subjects with two-class SS protocols. For SS, the group average performance was 79.8±10.6%, with 43 of the 57 subjects (75.4%) exceeding the 70% BCI-illiteracy threshold for left- versus right-hand SS discrimination. Compared with previous results, this tactile BCI outperformed all other tactile BCIs currently available. We also analyzed 63 experimental sessions from 43 subjects with two-class MI BCI protocols, where the group average performance was 77.2±13.3%, with 69.7% of the subjects exceeding the 70% performance threshold for left- versus right-hand MI. In the within-subject comparison of the 24 subjects who participated in both the SS and MI experiments, BCI performance was superior with SS relative to MI, especially in the beta frequency band (p<0.05), with enhanced R2 discriminative information in the somatosensory cortex for the SS modality. Both SS and MI showed a functional dissociation between the lower alpha (8-10 Hz) and upper alpha (10-13 Hz) bands, with BCI performance significantly better in the upper alpha than in the lower alpha band (p<0.05). In summary, we demonstrated that SS is a promising BCI modality with a low BCI-illiteracy rate and great potential for practical applications reaching a large population. Funding: University Starter Grant of the University of Waterloo (No. 203859); National Natural Science Foundation of China (Grant No. 51620105002).
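
    The BCI-illiteracy bookkeeping used above, group mean accuracy plus the fraction of subjects exceeding the 70% control threshold, is simple to make explicit. The sketch below uses made-up accuracy values; only the 70% threshold comes from the text.

    # Minimal sketch of the BCI-illiteracy bookkeeping: group mean, standard deviation,
    # and the fraction of subjects above the 70% control threshold.
    import numpy as np

    def illiteracy_report(accuracies: np.ndarray, threshold: float = 0.70) -> dict:
        above = accuracies >= threshold
        return {
            "mean": float(accuracies.mean()),
            "std": float(accuracies.std(ddof=1)),
            "n_above_threshold": int(above.sum()),
            "fraction_above_threshold": float(above.mean()),
        }

    example_acc = np.array([0.82, 0.66, 0.91, 0.74, 0.58, 0.79, 0.88, 0.71])
    print(illiteracy_report(example_acc))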

    Validation of a model of sensorimotor integration with clinical benefits

    Healthy sensorimotor integration, or how our sense of touch influences our movements, is critical for interacting efficiently with our environment. Yet many aspects of this process are still poorly understood. Importantly, several movement disorders are often considered to originate from purely motor impairments, while a sensory origin could also lead to a similar set of symptoms. To address these issues, we propose a novel biologically based model of the sensorimotor loop, the SMILE model. After describing both the functional and the corresponding neuroanatomical versions of the SMILE, we tested several aspects of its motor component through functional magnetic resonance imaging (fMRI) and transcranial magnetic stimulation (TMS). Both experimental studies yielded outcomes consistent with the SMILE predictions, and they also provided novel scientific findings on topics as broad as the sub-phases of motor imagery, the neural processing of bodily representations, and the extent of the role of the extrastriate body area. In the final sections of this manuscript, we describe some potential clinical applications of the SMILE. The first identifies plausible neuroanatomical origins for focal hand dystonia, a still poorly understood sensorimotor disorder. The last chapter then covers possible improvements to brain-machine interfaces, driven by a better understanding of the sensorimotor system.

    How to Build an Embodiment Lab: Achieving Body Representation Illusions in Virtual Reality

    Advances in computer graphics algorithms and virtual reality (VR) systems, together with the reduction in cost of the associated equipment, have led scientists to consider VR a useful tool for conducting experimental studies in fields such as neuroscience and experimental psychology. In particular, virtual body ownership, where a feeling of ownership over a virtual body is elicited in the participant, has become a useful tool in the study of body representation in cognitive neuroscience and psychology, i.e., how the brain represents the body. Although VR has been shown to be a useful tool for exploring body ownership illusions, integrating the various technologies necessary for such a system can be daunting. In this paper we discuss the technical infrastructure necessary to achieve virtual embodiment. We describe a basic VR system and how it may be used for this purpose, and then extend this system with the introduction of real-time motion capture, a simple haptics system, and the integration of physiological and brain electrical activity recordings.
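
    The integration described above, motion capture driving the virtual body plus congruent haptic feedback on virtual contact, can be sketched as a per-frame update loop. The snippet below is a schematic illustration with hypothetical stand-ins for the device interfaces, not any specific VR engine's API.

    # Schematic sketch with hypothetical device interfaces: each frame, drive the avatar
    # from the motion-capture stream and fire a haptic pulse when the tracked hand touches
    # a virtual object, keeping visual and tactile feedback congruent.
    import numpy as np

    def touching(hand_pos: np.ndarray, object_pos: np.ndarray, radius: float = 0.05) -> bool:
        """Simple sphere-contact test, positions in metres."""
        return bool(np.linalg.norm(hand_pos - object_pos) < radius)

    def embodiment_frame(mocap_joints: dict, avatar: dict, haptic_log: list,
                         object_pos: np.ndarray) -> None:
        # 1) Visuomotor congruence: map tracked joints onto the virtual body.
        for joint_name, position in mocap_joints.items():
            avatar[joint_name] = position
        # 2) Visuotactile congruence: deliver a haptic event on virtual contact.
        if touching(mocap_joints["right_hand"], object_pos):
            haptic_log.append("pulse")   # placeholder for a real haptic driver call

    avatar_state, haptic_log = {}, []
    frame = {"right_hand": np.array([0.31, 1.02, 0.40]), "head": np.array([0.0, 1.70, 0.0])}
    embodiment_frame(frame, avatar_state, haptic_log, object_pos=np.array([0.30, 1.00, 0.42]))
    print(sorted(avatar_state), haptic_log)            # ['head', 'right_hand'] ['pulse']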

    The role of sensorimotor incongruence in pathological pain
