50 research outputs found

    Designing a Virtual Reality Myoelectric Prosthesis Training System for Amputees

    Get PDF
    Electrical signals produced by muscle contractions have proven effective for accurately controlling artificial limbs. Myoelectric-powered prostheses can be more functional and advantageous than passive or body-powered prostheses; however, extensive training is required to take full advantage of a myoelectric prosthesis' usability. In recent years, computer technology has brought new opportunities for improving patients' training, resulting in more usable and functional solutions. Virtual Reality (VR) is a representative example of this type of technology. These preliminary findings suggest that myoelectric training enhanced with VR can simulate a pain-free, natural, enjoyable, and realistic experience for the patient. It was also suggested that VR can complement prosthesis training by improving the functionality of the missing body part. Finally, it was shown that VR can resolve one of the most common challenges for a new prosthesis user: accepting the fitting of the prosthetic device to their own body.

    Behavioral Learning in a Cognitive Neuromorphic Robot: An Integrative Approach

    Get PDF
    We present here a learning system using the iCub humanoid robot and the SpiNNaker neuromorphic chip to solve the real-world task of object-specific attention. Integrating spiking neural networks with robots introduces considerable complexity for questionable benefit if the objective is simply task performance. But, we suggest, in a cognitive robotics context, where the goal is understanding how to compute, such an approach may yield useful insights into neural architecture as well as learned behavior, especially if dedicated neural hardware is available. Recent advances in cognitive robotics and neuromorphic processing now make such systems possible. Using a scalable, structured, modular approach, we build a spiking neural network where the effects and impact of learning can be predicted and tested, and the network can be scaled or extended to new tasks automatically. We introduce several enhancements to a basic network and show how they can be used to direct performance toward behaviorally relevant goals. Results show that using a simple classical spike-timing-dependent plasticity (STDP) rule on selected connections, we can get the robot (and network) to progress from poor task-specific performance to good performance. Behaviorally relevant STDP appears to contribute strongly to positive learning (“do this”) but less to negative learning (“don't do that”). In addition, we observe that the effect of structural enhancements tends to be cumulative. The overall system suggests that it is by being able to exploit combinations of effects, rather than any one effect or property in isolation, that spiking networks can achieve compelling, task-relevant behavior.
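The pair-based STDP rule mentioned in this abstract can be sketched as follows; note that the amplitudes, time constants, and weight bounds here are illustrative assumptions, not the parameters actually used in the paper.

```python
import numpy as np

def stdp_update(w, dt, a_plus=0.01, a_minus=0.012, tau_plus=20.0,
                tau_minus=20.0, w_min=0.0, w_max=1.0):
    """Pair-based STDP weight update for one spike pair.

    dt = t_post - t_pre (ms). Pre-before-post (dt > 0) potentiates the
    synapse; post-before-pre (dt <= 0) depresses it. The update decays
    exponentially with the spike-timing interval and the weight is
    clipped to [w_min, w_max].
    """
    if dt > 0:
        dw = a_plus * np.exp(-dt / tau_plus)   # potentiation window
    else:
        dw = -a_minus * np.exp(dt / tau_minus)  # depression window
    return float(np.clip(w + dw, w_min, w_max))
```

In a full simulation this update would be applied only on the selected plastic connections, with dt taken from the most recent pre/post spike pair.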

    Novel Bidirectional Body - Machine Interface to Control Upper Limb Prosthesis

    Get PDF
    Objective. The journey of a bionic prosthetic user is characterized by the opportunities and limitations involved in adopting a device (the prosthesis) that should enable activities of daily living (ADL). Within this context, experiencing a bionic hand as a functional (and, possibly, embodied) limb constitutes the premise for mitigating the risk of its abandonment through the continuous use of the device. To achieve such a result, different aspects must be considered for making the artificial limb an effective support for carrying out ADLs. Among them, intuitive and robust control is fundamental to improving amputees’ quality of life using upper limb prostheses. Still, as artificial proprioception is essential to perceive the prosthesis movement without constant visual attention, a good control framework may not be enough to restore practical functionality to the limb. To overcome this, bidirectional communication between the user and the prosthesis has been recently introduced and is a requirement of utmost importance in developing prosthetic hands. Indeed, closing the control loop between the user and a prosthesis by providing artificial sensory feedback is a fundamental step towards the complete restoration of the lost sensory-motor functions. Within my PhD work, I proposed the development of a more controllable and sensitive human-like hand prosthesis, i.e., the Hannes prosthetic hand, to improve its usability and effectiveness. Approach. To achieve the objectives of this thesis work, I developed a modular and scalable software and firmware architecture to control the Hannes prosthetic multi-Degree of Freedom (DoF) system and to fit all users’ needs (hand aperture, wrist rotation, and wrist flexion in different combinations). On top of this, I developed several Pattern Recognition (PR) algorithms to translate electromyographic (EMG) activity into complex movements. 
However, stability and repeatability were still unmet requirements in multi-DoF upper limb systems; hence, I started by investigating different strategies to produce more robust control. To do this, EMG signals were collected from trans-radial amputees using an array of up to six sensors placed over the skin. Secondly, I developed a vibrotactile system to implement haptic feedback to restore proprioception and create a bidirectional connection between the user and the prosthesis. Similarly, I implemented object stiffness detection to restore tactile sensation, connecting the user with the external world. This closed-loop control between EMG and vibration feedback is essential to implementing a Bidirectional Body-Machine Interface that can strongly impact amputees’ daily lives. For each of these three activities: (i) implementation of robust pattern recognition control algorithms, (ii) restoration of proprioception, and (iii) restoration of the feeling of the grasped object's stiffness, I performed a study where data from healthy subjects and amputees were collected to demonstrate the efficacy and usability of my implementations. In each study, I evaluated both the algorithms and the subjects’ ability to use the prosthesis by means of the F1Score parameter (offline) and the Target Achievement Control (TAC) test (online). With this test, I analyzed the error rate, path efficiency, and time efficiency in completing different tasks. Main results. Among the several tested methods for Pattern Recognition, Non-Linear Logistic Regression (NLR) proved to be the best algorithm in terms of F1Score (99%, robustness), while offline analyses determined that a minimum of four electrodes was needed for its functioning. Further, I demonstrated that its low computational burden allowed its implementation and integration on a microcontroller running at a sampling frequency of 300 Hz (efficiency).
Finally, the online implementation allowed the subject to simultaneously control the Hannes prosthesis DoFs in a bioinspired and human-like way. In addition, I performed further tests with the same NLR-based control by endowing it with closed-loop proprioceptive feedback. In this scenario, the TAC test yielded an error rate of 15% and a path efficiency of 60% in experiments where no other sources of information (visual or auditory feedback) were available. Such results demonstrated an improvement in the controllability of the system with an impact on user experience. Significance. The obtained results confirmed the hypothesis that the implemented closed-loop approach improves the robustness and efficiency of prosthetic control. The bidirectional communication between the user and the prosthesis can restore lost sensory functionality, with promising implications for direct translation into clinical practice.
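The offline evaluation pipeline described above (a logistic-regression-family classifier over multi-channel EMG features, scored with F1) can be sketched with scikit-learn's LogisticRegression as a generic stand-in for the thesis's NLR classifier. The four-channel setup matches the abstract, but the features, data, and separability here are entirely synthetic assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for windowed EMG features from 4 electrodes:
# one row per analysis window, one column per channel (e.g. mean
# absolute value). Labels are hypothetical gesture classes.
n_windows, n_channels, n_classes = 600, 4, 3  # rest, hand open, hand close
X = rng.normal(size=(n_windows, n_channels))
y = rng.integers(0, n_classes, size=n_windows)
X += y[:, None] * 0.8  # shift class means so classes are separable

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
f1 = f1_score(y_te, clf.predict(X_te), average="macro")
print(round(f1, 2))
```

A real pipeline would replace the synthetic matrix with features extracted from 150–300 ms EMG windows and validate across sessions and subjects; the macro-averaged F1 mirrors the offline metric the thesis reports.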

    From rubber hands to neuroprosthetics: Neural correlates of embodiment

    Get PDF
    © 2023 The Authors. Published by Elsevier Ltd. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/). Our interaction with the world rests on the knowledge that we are a body in space and time which can interact with the environment. This awareness is usually referred to as the sense of embodiment. For the good part of the past 30 years, the rubber hand illusion (RHI) has been a prime tool to study embodiment in healthy individuals and in people with a variety of clinical conditions. In this paper, we provide a critical overview of this research with a focus on the RHI paradigm as a tool to study prosthesis embodiment in individuals with amputation. The RHI relies on well-documented multisensory integration mechanisms based on sensory precision, where parietal areas are involved in resolving the visuo-tactile conflict, and premotor areas in updating the conscious bodily representation. This mechanism may be transferable to prosthesis ownership in amputees. We discuss how these results might transfer to the technological development of sensorised prostheses, which in turn might improve their acceptability by users.

    Can wearable haptic devices foster the embodiment of virtual limbs?

    Get PDF
    Increasing presence is one of the primary goals of virtual reality research. A crucial aspect is that users are capable of distinguishing their self from the external virtual world. The hypothesis we investigate is that wearable haptics play an important role in the body experience and could thereby contribute to the immersion of the user in the virtual environment. A within-subject study (n=32) comparing the embodiment of a virtual hand under different implementations of haptic feedback (force feedback, vibrotactile feedback, and no haptic feedback) is presented. Participants wore a glove with haptic feedback devices at the thumb and index finger. They were asked to put virtual cubes on a moving virtual target. Touching a virtual object caused vibrotactile feedback, force feedback, or no feedback depending on the condition. These conditions were provided both synchronously and asynchronously. Embodiment was assessed quantitatively with the proprioceptive drift and subjectively via a questionnaire. Results show that haptic feedback significantly improves the subjective embodiment of a virtual hand and that force feedback leads to stronger responses on certain subscales of subjective embodiment. These outcomes are useful guidelines for wearable haptic designers and represent a basis for further research concerning human body experience, both in reality and in virtual environments.
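The proprioceptive drift measure used in this study is conventionally computed as the shift in the judged position of the real hand from before to after stimulation, counted positive when it moves toward the artificial (here, virtual) hand. A minimal sketch, with an illustrative sign convention and hypothetical centimetre values:

```python
def proprioceptive_drift(pre_cm, post_cm, virtual_hand_cm):
    """Signed proprioceptive drift for one participant and condition.

    pre_cm / post_cm: judged position of the real hand before and after
    stimulation, on a 1-D axis. virtual_hand_cm: position of the virtual
    hand on the same axis. Positive drift = felt hand position moved
    toward the virtual hand.
    """
    direction = 1.0 if virtual_hand_cm >= pre_cm else -1.0
    return direction * (post_cm - pre_cm)

def mean_drift(samples):
    """Mean drift over (pre, post, virtual) tuples for one condition."""
    return sum(proprioceptive_drift(*s) for s in samples) / len(samples)
```

Comparing mean drift between synchronous and asynchronous conditions is what lets the study separate genuine embodiment from response bias.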

    A Human-Centric Metaverse Enabled by Brain-Computer Interface: A Survey

    Full text link
    The growing interest in the Metaverse has generated momentum for members of academia and industry to innovate toward realizing the Metaverse world. The Metaverse is a unique, continuous, and shared virtual world where humans embody a digital form within an online platform. Through a digital avatar, Metaverse users should have a perceptual presence within the environment and can interact with and control the virtual world around them. Thus, a human-centric design is a crucial element of the Metaverse. The human users are not only the central entity but also the source of multi-sensory data that can be used to enrich the Metaverse ecosystem. In this survey, we study the potential applications of Brain-Computer Interface (BCI) technologies that can enhance the experience of Metaverse users. By directly communicating with the human brain, the most complex organ in the human body, BCI technologies hold the potential for the most intuitive human-machine system operating at the speed of thought. BCI technologies can enable various innovative applications for the Metaverse through this neural pathway, such as user cognitive state monitoring, digital avatar control, virtual interactions, and imagined speech communications. This survey first outlines the fundamental background of the Metaverse and BCI technologies. We then discuss the current challenges of the Metaverse that can potentially be addressed by BCI, such as motion sickness when users experience virtual environments or the negative emotional states of users in immersive virtual applications. After that, we propose and discuss a new research direction called Human Digital Twin, in which digital twins can create an intelligent and interactable avatar from the user's brain signals. We also present the challenges and potential solutions in synchronizing and communicating between virtual and physical entities in the Metaverse.

    Sensorimotor Representation Learning for an “Active Self” in Robots: A Model Survey

    Get PDF
    Safe human-robot interactions require robots to be able to learn how to behave appropriately in spaces populated by people and thus to cope with the challenges posed by our dynamic and unstructured environment, rather than being provided a rigid set of rules for operations. In humans, these capabilities are thought to be related to our ability to perceive our body in space, sensing the location of our limbs during movement, being aware of other objects and agents, and controlling our body parts to interact with them intentionally. Toward the next generation of robots with bio-inspired capacities, in this paper, we first review the developmental processes of underlying mechanisms of these abilities: The sensory representations of body schema, peripersonal space, and the active self in humans. Second, we provide a survey of robotics models of these sensory representations and robotics models of the self; and we compare these models with the human counterparts. Finally, we analyze what is missing from these robotics models and propose a theoretical computational framework, which aims to allow the emergence of the sense of self in artificial agents by developing sensory representations through self-exploration. Deutsche Forschungsgemeinschaft (http://dx.doi.org/10.13039/501100001659); Projekt DEAL.

    A brain-computer interface integrated with virtual reality and robotic exoskeletons for enhanced visual and kinaesthetic stimuli

    Get PDF
    Brain-computer interfaces (BCI) measure brain activity patterns that follow the user’s intent and allow the direct control of robotic devices for neurorehabilitation. In the past two decades, the use of non-invasive techniques such as electroencephalography and motor imagery in BCI has gained traction. However, many of the mechanisms that drive the proficiency of humans in eliciting discernible signals for BCI remain unestablished. The main objective of this thesis is to explore and assess what improvements can be made to an integrated BCI-robotic system for hand rehabilitation. Chapter 2 presents a systematic review of BCI-hand robot systems developed from 2010 to late 2019 in terms of their technical and clinical reports. Around 30 studies were identified as eligible for review; among these, 19 were still in their prototype or pre-clinical stages of development. These systems were often found lacking in providing the necessary visual and kinaesthetic stimuli during motor imagery BCI training. Chapter 3 discusses the theoretical background to arrive at a hypothesis that an enhanced visual and kinaesthetic stimulus, through a virtual reality (VR) game environment and a robotic hand exoskeleton, will improve motor imagery BCI performance in terms of online classification accuracy, class prediction probabilities, and electroencephalography signals. Chapters 4 and 5 focus on designing, developing, integrating, and testing a BCI-VR-robot prototype to address the research aims. Chapter 6 tests the hypothesis by performing a motor imagery BCI paradigm self-experiment with an enhanced visual and kinaesthetic stimulus against a control. A significant increase (p = 0.0422) in classification accuracies is reported among groups with enhanced visual stimulus through VR versus those without. Six out of eight sessions among the VR groups have a median of class probability values exceeding a pre-set threshold value of 0.6.
Finally, the thesis concludes in Chapter 7 with a general discussion of how these findings could suggest the role of new and emerging technologies such as VR and robotics in advancing BCI-robotic systems, and how the contributions of this work may help improve the usability and accessibility of such systems, not only in rehabilitation but also in skills learning and education.
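The group comparison behind the reported p = 0.0422 can be sketched as follows. The abstract does not state which statistical test was used, and the per-session accuracies below are purely hypothetical; a one-sided Mann-Whitney U test on session-level accuracies is just one plausible way to compare a VR-enhanced group against a control.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical online classification accuracies per session (fractions);
# the thesis's actual values and chosen test are not given in the abstract.
vr_sessions = np.array([0.72, 0.68, 0.75, 0.70, 0.66, 0.74, 0.69, 0.71])
control_sessions = np.array([0.58, 0.62, 0.55, 0.60, 0.64, 0.57, 0.61, 0.59])

# One-sided test: is the VR group's accuracy distribution shifted upward?
stat, p = mannwhitneyu(vr_sessions, control_sessions, alternative="greater")
print(p < 0.05)

# The abstract's secondary criterion: how many sessions exceed the
# pre-set median class-probability threshold of 0.6 (illustrative reuse
# of the same hypothetical numbers as probabilities).
n_above = int(np.sum(vr_sessions > 0.6))
```

A non-parametric test is a defensible default here because session-level accuracies from a single subject are few and not obviously normally distributed.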