
    TASK-SPECIFIC VIRTUAL TRAINING FOR IMPROVED PATTERN RECOGNITION-BASED PROSTHESES CONTROL

    The emergence of dexterous prostheses presents the potential to significantly improve amputees’ quality of life. Intuitive pattern recognition algorithms are among the most promising control strategies for dexterous prostheses, with near-perfect classification accuracies demonstrated in laboratory settings. However, recent literature shows a weak correlation between classification accuracy and prosthesis usability. External factors such as varying limb positions affect electromyography signals and consequently degrade prosthesis usability; therefore, task-specific user training is proposed to enhance the usability of pattern recognition-based prostheses. Eight able-bodied subjects and one transradial amputee participated in the study to validate the efficacy of task-specific virtual training and to examine the relationship between prosthesis-use performance in virtual reality and in the real world. Subjects were evaluated on two functional tests, the Modified Box and Block Test and the Reach-Grasp-Release Test, in both virtual and real-world environments, and received five one-hour sessions of one-on-one virtual training. After completing the five virtual training sessions, subjects were evaluated again and showed significant improvement in the functional tests. The amputee subject, despite having worn a pattern recognition-based prosthesis for 5 months, also improved with virtual training, especially in the test that required him to use his prosthesis in postures outside his usual range. In addition, no statistically significant difference was observed between performance in the virtual reality and real-world environments, indicating the potential for virtual reality evaluation to serve as a diagnostic tool for determining an individual’s usability of pattern recognition-based myoelectric prostheses. It was shown that high classification accuracy alone does not guarantee proficiency in prosthesis control; rather, it represents only the capacity for such control. To effectively prepare amputees for pattern recognition-based myoelectric prosthesis control in activities of daily living, task-specific virtual training should be administered prior to prosthesis fitting. For future study, the integration of an accurate, stable motion tracking system with a head-mounted display is suggested to provide a more immersive experience that enables users to practice proper positioning of the terminal device, an essential skill for object interaction with prostheses.
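
    The gap between laboratory classification accuracy and real-world usability described above can be illustrated with a small simulation: a classifier calibrated in a single limb position degrades when the feature distribution shifts, as happens when the limb is raised or lowered. The sketch below is a minimal Python illustration with entirely synthetic EMG features and an assumed position-induced offset; it is not the study's protocol or data.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)
    n_classes, n_features = 4, 8
    # One synthetic feature centroid per motion class (e.g. hand open/close, wrist rotation).
    centres = rng.normal(scale=2.0, size=(n_classes, n_features))

    def sample_features(n_per_class, position_shift):
        # Hypothetical EMG feature windows; position_shift models the limb-position effect.
        X = np.vstack([c + position_shift + rng.normal(scale=0.7, size=(n_per_class, n_features))
                       for c in centres])
        y = np.repeat(np.arange(n_classes), n_per_class)
        return X, y

    # Calibrate in a single (neutral) limb position, as in a typical laboratory session.
    X_train, y_train = sample_features(200, position_shift=0.0)
    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    # Evaluate in the training position and in a new position where features are offset.
    for name, shift in [("same position", 0.0), ("new position ", 1.5)]:
        X_test, y_test = sample_features(200, position_shift=shift)
        print(name, "accuracy:", round(accuracy_score(y_test, clf.predict(X_test)), 3))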

    A Review of Control Strategies in Closed-Loop Neuroprosthetic Systems

    It has been widely recognized that closed-loop neuroprosthetic systems achieve more favourable outcomes for users than equivalent open-loop devices. Improved task performance, better usability and greater embodiment have all been reported in systems utilizing some form of feedback. However, the interdisciplinary nature of work on neuroprosthetic systems can lead to miscommunication due to similarities in well-established nomenclature across different fields. Here we present a review of control strategies in existing experimental, investigational and clinical neuroprosthetic systems in order to establish a baseline and promote a common understanding of different feedback modes and closed-loop controllers. The first section provides a brief discussion of feedback control and control theory. The second section reviews the control strategies of recent Brain Machine Interfaces, neuromodulatory implants, neuroprosthetic systems and assistive neurorobotic devices. The final section examines the different approaches to feedback in current neuroprosthetic and neurorobotic systems.
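
    As a companion to the review's discussion of feedback control, the following minimal sketch contrasts open-loop and closed-loop behaviour on a toy first-order plant: when the plant deviates from its assumed model (as muscles, electrodes, or actuators do), only the feedback controller keeps the output on target. The plant model, PI gains, and step counts are illustrative assumptions, not parameters from any reviewed system.

    def simulate(true_gain=0.8, nominal_gain=0.8, target=1.0, steps=300,
                 feedback=True, kp=0.5, ki=0.5, dt=0.1):
        # Drive a toy first-order plant toward `target`, with or without feedback.
        state, integral = 0.0, 0.0
        for _ in range(steps):
            if feedback:
                # Closed loop: the command is recomputed from the measured error (PI law).
                error = target - state
                integral += error * dt
                command = kp * error + ki * integral
            else:
                # Open loop: the command is planned once from the assumed plant model.
                command = target / nominal_gain
            # First-order plant: the output relaxes toward true_gain * command.
            state += dt * (true_gain * command - state)
        return state

    # With a perfect model, open loop reaches the target; when the plant changes
    # (fatigue, electrode shift, load), only the closed loop compensates.
    print("open loop, nominal plant:    ", round(simulate(feedback=False), 3))
    print("open loop, perturbed plant:  ", round(simulate(true_gain=0.5, feedback=False), 3))
    print("closed loop, perturbed plant:", round(simulate(true_gain=0.5, feedback=True), 3))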

    Are We the Robots?: Man-Machine Integration

    We experience and interact with the world through our body. The founding father of computer science, Alan Turing, correctly realized that one of the most important features of the human being is the interaction between mind and body. Since the original demonstration that the electrical activity of cortical neurons can be employed to directly control a robotic device, research on so-called Brain-Machine Interfaces (BMIs) has grown impressively. For example, current BMIs dedicated to both experimental and clinical studies can translate raw neuronal signals into computational commands to reproduce reaching or grasping in artificial actuators. These developments hold promise for the restoration of limb mobility in paralyzed individuals. However, as the authors review in this chapter, before this goal can be achieved, several hurdles have to be overcome, including developments in real-time computational algorithms and in designing fully implantable and biocompatible devices. Future investigations will have to address the best solutions for restoring sensation to the prosthetic limb, which remains a major challenge to the full integration of the limb into the user's self-image.
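
    The phrase "translate raw neuronal signals into computational commands" usually refers to a decoding step such as the one sketched below: a linear decoder fitted by least squares maps a vector of firing rates to a two-dimensional reach velocity. The linear tuning model and the data are synthetic assumptions for illustration; practical BMIs add spike detection, binning, and richer decoders such as Kalman filters.

    import numpy as np

    rng = np.random.default_rng(1)
    n_neurons, n_samples = 40, 2000

    # Synthetic linearly tuned population: each neuron's firing rate is a noisy
    # linear function of the 2-D hand velocity plus a baseline rate.
    preferred = rng.normal(size=(n_neurons, 2))            # per-neuron tuning vectors
    velocity = rng.normal(size=(n_samples, 2))             # "true" reach velocity
    rates = velocity @ preferred.T + 5.0                   # baseline + tuning
    rates += rng.normal(scale=1.0, size=rates.shape)       # firing-rate noise

    # Fit the linear decoder by least squares: velocity ~= [rates, 1] @ W.
    X = np.hstack([rates, np.ones((n_samples, 1))])
    W, *_ = np.linalg.lstsq(X, velocity, rcond=None)

    # Decode held-out activity into velocity commands for an artificial actuator.
    test_vel = rng.normal(size=(200, 2))
    test_rates = test_vel @ preferred.T + 5.0 + rng.normal(scale=1.0, size=(200, n_neurons))
    decoded = np.hstack([test_rates, np.ones((200, 1))]) @ W
    corr = np.corrcoef(decoded[:, 0], test_vel[:, 0])[0, 1]
    print("decoded vs. true x-velocity correlation:", round(corr, 3))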

    Novel Bidirectional Body-Machine Interface to Control Upper Limb Prosthesis

    Objective. The journey of a bionic prosthesis user is characterized by the opportunities and limitations involved in adopting a device (the prosthesis) that should enable activities of daily living (ADL). Within this context, experiencing a bionic hand as a functional (and, possibly, embodied) limb constitutes the premise for mitigating the risk of its abandonment through continuous use of the device. To achieve such a result, different aspects must be considered to make the artificial limb an effective support for carrying out ADLs. Among them, intuitive and robust control is fundamental to improving the quality of life of amputees using upper limb prostheses. Still, as artificial proprioception is essential to perceive the prosthesis's movement without constant visual attention, a good control framework may not be enough to restore practical functionality to the limb. To overcome this, bidirectional communication between the user and the prosthesis has recently been introduced and is a requirement of utmost importance in developing prosthetic hands. Indeed, closing the control loop between the user and the prosthesis by providing artificial sensory feedback is a fundamental step towards the complete restoration of the lost sensory-motor functions. Within my PhD work, I proposed the development of a more controllable and sensitive human-like hand prosthesis, i.e., the Hannes prosthetic hand, to improve its usability and effectiveness. Approach. To achieve the objectives of this thesis work, I developed a modular and scalable software and firmware architecture to control the Hannes prosthetic multi-Degree-of-Freedom (DoF) system and to fit all users' needs (hand aperture, wrist rotation, and wrist flexion in different combinations). On top of this, I developed several Pattern Recognition (PR) algorithms to translate electromyographic (EMG) activity into complex movements. However, stability and repeatability were still unmet requirements in multi-DoF upper limb systems; hence, I started by investigating different strategies to produce more robust control. To do this, EMG signals were collected from trans-radial amputees using an array of up to six sensors placed over the skin. Secondly, I developed a vibrotactile system to implement haptic feedback, restoring proprioception and creating a bidirectional connection between the user and the prosthesis. Similarly, I implemented object stiffness detection to restore a tactile sensation connecting the user with the external world. This closed loop between EMG control and vibration feedback is essential to implementing a Bidirectional Body-Machine Interface that can strongly impact amputees' daily lives. For each of these three activities: (i) implementation of robust pattern recognition control algorithms, (ii) restoration of proprioception, and (iii) restoration of the feeling of the grasped object's stiffness, I performed a study in which data from healthy subjects and amputees were collected in order to demonstrate the efficacy and usability of my implementations. In each study, I evaluated both the algorithms and the subjects' ability to use the prosthesis by means of the F1-score parameter (offline) and the Target Achievement Control (TAC) test (online). With this test, I analyzed the error rate, path efficiency, and time efficiency in completing different tasks. Main results. Among the several tested methods for Pattern Recognition, Non-Linear Logistic Regression (NLR) proved to be the best algorithm in terms of F1-score (99%, robustness), and the minimum number of electrodes needed for its functioning was determined to be four in the offline analyses. Further, I demonstrated that its low computational burden allowed its implementation and integration on a microcontroller running at a sampling frequency of 300 Hz (efficiency). Finally, the online implementation allowed the subject to simultaneously control the Hannes prosthesis DoFs in a bioinspired and human-like way. In addition, I performed further tests with the same NLR-based control by endowing it with closed-loop proprioceptive feedback. In this scenario, the TAC test yielded an error rate of 15% and a path efficiency of 60% in experiments where no other sources of information were available (no visual and no audio feedback). Such results demonstrated an improvement in the controllability of the system with an impact on user experience. Significance. The obtained results confirmed the hypothesis that the implemented closed-loop approach improves the robustness and efficiency of prosthetic control. The bidirectional communication between the user and the prosthesis is capable of restoring the lost sensory functionality, with promising implications for direct translation into clinical practice.
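
    A heavily simplified sketch of the pattern-recognition stage described above is given below: time-domain features (mean absolute value and waveform length) extracted from windowed multichannel EMG feed a logistic-regression classifier that is scored with the F1 metric. The synthetic signals, feature set, class labels, and plain logistic regression are assumptions standing in for the thesis's Non-Linear Logistic Regression and its embedded implementation on the Hannes firmware.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import f1_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(2)
    fs = 300                          # 300 Hz sampling, as reported for the microcontroller
    win = int(0.2 * fs)               # 200 ms analysis windows (assumed window length)
    n_channels, n_classes = 4, 3      # e.g. rest, hand close, wrist rotation (assumed)

    def window_features(emg):
        # Mean absolute value and waveform length per channel for one window.
        mav = np.mean(np.abs(emg), axis=0)
        wl = np.sum(np.abs(np.diff(emg, axis=0)), axis=0)
        return np.concatenate([mav, wl])

    # Synthetic EMG: each motion class modulates the channel amplitudes differently.
    gains = rng.uniform(0.5, 2.0, size=(n_classes, n_channels))
    X, y = [], []
    for c in range(n_classes):
        for _ in range(150):
            emg = rng.normal(size=(win, n_channels)) * gains[c]
            X.append(window_features(emg))
            y.append(c)
    X, y = np.array(X), np.array(y)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
    clf = LogisticRegression(max_iter=2000).fit(X_tr, y_tr)
    print("macro F1-score:", round(f1_score(y_te, clf.predict(X_te), average="macro"), 3))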

    Towards Natural Control of Artificial Limbs

    The use of implantable electrodes has long been regarded as the solution for more natural control of artificial limbs, as these electrodes offer access to long-term stable and physiologically appropriate sources of control, as well as the possibility to elicit appropriate sensory feedback via neurostimulation. Although these ideas have been explored since the 1960s, the lack of a long-term stable human-machine interface has prevented the utilization of even the simplest implanted electrodes in clinically viable limb prostheses. In this thesis, a novel human-machine interface for bidirectional communication between implanted electrodes and the artificial limb was developed and clinically implemented. The long-term stability was achieved via osseointegration, which has been shown to provide stable skeletal attachment. By enhancing this technology as a communication gateway, the longest clinical implementation of prosthetic control sourced from implanted electrodes has been achieved, as well as the first in modern times. The first recipient has used it uninterruptedly in daily and professional activities for over one year. Prosthetic control was found to improve in resolution while requiring less muscular effort, and to be resilient to motion artifacts, limb position, and environmental conditions. In order to support this work, the literature was reviewed in search of reliable and safe neuromuscular electrodes that could be immediately used in humans. Additional work was conducted to improve the signal-to-noise ratio and increase the amount of information retrievable from extraneural recordings. Different signal processing and pattern recognition algorithms were investigated and further developed towards real-time and simultaneous prediction of limb movements. These algorithms were used to demonstrate that higher functionality could be restored by intuitive control of distal joints, and that such control remains viable over time when using epimysial electrodes. Lastly, the long-term viability of direct nerve stimulation to produce intuitive sensory feedback was also demonstrated. The possibility to permanently and reliably access implanted electrodes, thus making them viable for prosthetic control, is potentially the main contribution of this work. Furthermore, the opportunity to chronically record from and stimulate the neuromuscular system offers new avenues for the prediction of complex limb motions and an increased understanding of somatosensory perception. Therefore, the technology developed here, combining stable attachment with permanent and reliable human-machine communication, is considered by the author to be a critical step towards more functional artificial limbs.
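
    "Simultaneous prediction of limb movements" contrasts with classifying one motion at a time: a regressor estimates the activation of several degrees of freedom in parallel. The sketch below shows the idea with ridge regression on synthetic EMG envelopes; the linear mixing model and the regressor are illustrative assumptions, not the algorithms developed in the thesis.

    import numpy as np
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(3)
    n_channels, n_samples = 6, 1000

    # Synthetic data: two degrees of freedom (e.g. hand open/close and wrist rotation)
    # are active at the same time and mix linearly into the EMG envelope of each channel.
    dof = rng.uniform(0.0, 1.0, size=(n_samples, 2))
    mixing = rng.uniform(0.2, 1.5, size=(2, n_channels))
    envelopes = dof @ mixing + rng.normal(scale=0.05, size=(n_samples, n_channels))

    # A single regressor predicts both DoF activations in parallel from each window.
    model = Ridge(alpha=1.0).fit(envelopes[:800], dof[:800])
    pred = model.predict(envelopes[800:])
    rmse = np.sqrt(np.mean((pred - dof[800:]) ** 2, axis=0))
    print("per-DoF RMSE:", np.round(rmse, 3))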

    HoloPHAM: An Augmented Reality Training System For Upper Limb Myoelectric Prosthesis Users

    From hook-shaped prosthetic devices to myoelectric prostheses with increased functional capabilities such as the Modular Prosthetic Limb (MPL), upper limb prostheses have come a long way. However, the user acceptance rate does not show a similar increasing trend. Functional use training is incorporated into occupational therapy for myoelectric prosthesis users to bridge this gap. Advancements in virtual and augmented reality technology enable the application of immersive virtual environments in prosthesis user training. Such training systems have been shown to result in higher user performance and participation in training exercises. The work presented here introduces the application of augmented reality (AR) in myoelectric prosthesis user training. This was done through the development of HoloPHAM, an AR training tool designed to mimic a real-world training protocol called the Prosthetic Hand Assessment Measure (PHAM). This AR system was built for use with the Microsoft HoloLens, thus requiring a motion tracking system that could enable the user to move around freely in a room. The Bluetooth Orientation Tracking System (BOTS) was developed as an inertial measurement unit (IMU)-based wireless motion tracking system for this purpose. The performance of BOTS as a motion tracker was evaluated by comparison with the Microsoft Kinect sensor. Results showed that BOTS outperformed the Kinect sensor as a motion tracking system for our intended application in HoloPHAM. BOTS and the Myo armband were combined to form a human-machine interface (HMI) to control the virtual arm of HoloPHAM, enabling virtual object manipulation. This HMI, together with the virtual PHAM set-up, makes HoloPHAM a portable AR training environment that can be applied to prosthesis user training or to the evaluation of new myoelectric control strategies.
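
    An IMU-based tracker such as BOTS has to fuse gyroscope and accelerometer readings into a drift-limited orientation estimate. The sketch below shows the textbook complementary-filter idea on a single axis; the filter gain, sample rate, and one-axis simplification are assumptions and do not describe the actual BOTS implementation.

    import random

    def complementary_filter(gyro_rates, accel_angles, dt=0.01, alpha=0.98):
        # Blend the integrated gyro rate (fast but drifting) with the accelerometer
        # tilt estimate (noisy but drift-free) into one pitch angle.
        angle = accel_angles[0]
        estimates = []
        for rate, acc_angle in zip(gyro_rates, accel_angles):
            angle = alpha * (angle + rate * dt) + (1.0 - alpha) * acc_angle
            estimates.append(angle)
        return estimates

    # Toy data: the forearm pitches up at 0.5 rad/s for 2 s; the gyro carries a small
    # constant bias and the accelerometer-derived angle is noisy.
    random.seed(0)
    true_rate, bias, dt, n = 0.5, 0.05, 0.01, 200
    gyro = [true_rate + bias for _ in range(n)]
    accel = [true_rate * dt * i + random.gauss(0.0, 0.05) for i in range(n)]
    est = complementary_filter(gyro, accel, dt=dt)
    print("fused pitch estimate (rad):", round(est[-1], 3),
          "| gyro-only would drift to:", round((true_rate + bias) * dt * n, 3),
          "| true angle:", round(true_rate * dt * n, 3))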