12 research outputs found

    Wireless intraoral tongue control of an assistive robotic arm for individuals with tetraplegia

    Abstract Background For an individual with tetraplegia, assistive robotic arms provide a potentially invaluable opportunity for rehabilitation. However, there is a lack of available control methods that allow these individuals to fully control the assistive arms. Methods Here we show that it is possible for an individual with tetraplegia to use the tongue to fully control all 14 movements of an assistive robotic arm in three-dimensional space using a wireless intraoral control system, thus allowing for numerous activities of daily living. We developed a tongue-based robotic control method incorporating a multi-sensor inductive tongue interface. One able-bodied individual and one individual with tetraplegia performed a proof-of-concept study by controlling the robot with their tongue using direct actuator control and endpoint control, respectively. Results After 30 min of training, the able-bodied participant tongue-controlled the assistive robot to pick up a roll of tape in 80% of the attempts. Further, the individual with tetraplegia succeeded in fully tongue-controlling the assistive robot to reach for and touch a roll of tape in 100% of the attempts and to pick up the roll in 50% of the attempts. Furthermore, she controlled the robot to grasp a bottle of water and pour its contents into a cup; her first functional action in 19 years. Conclusion To our knowledge, this is the first time that an individual with tetraplegia has been able to fully control an assistive robotic arm using a wireless intraoral tongue interface. The tongue interface used to control the robot is currently available for control of computers and powered wheelchairs, and the robot employed in this study is also commercially available. Therefore, the presented results may translate into available solutions within a reasonable time.

    Hybrid Tongue-Myoelectric Control Improves Functional Use of a Robotic Hand Prosthesis


    Testing silicone digit extensions as a way to suppress natural sensation to evaluate supplementary tactile feedback

    Dexterous use of the hands depends critically on sensory feedback, so it is generally agreed that functional supplementary feedback would greatly improve the use of hand prostheses. Much research still focuses on improving non-invasive feedback that could potentially become available to all prosthesis users. However, few studies on supplementary tactile feedback for hand prostheses have demonstrated a functional benefit. We suggest that confounding factors impede accurate assessment of feedback, e.g., testing non-amputee participants who inevitably focus intently on learning EMG control, the EMG's susceptibility to noise and delays, and the limited dexterity of hand prostheses. To assess the effect of feedback free from these constraints, we used silicone digit extensions to suppress natural tactile feedback from the fingertips and thus used the tactile-feedback-deprived human hand as an approximation of an ideal feed-forward tool. Our non-amputee participants wore the extensions and performed a simple pick-and-lift task with known weight, followed by a more difficult pick-and-lift task with changing weight. They then repeated these tasks with one of three kinds of audio feedback. The tests were repeated over three days. We also conducted a similar experiment on a person with severe sensory neuropathy to test the feedback without the extensions. Furthermore, we used a questionnaire based on the NASA Task Load Index to gauge the subjective experience. Unexpectedly, we did not find any meaningful differences between the feedback groups, in either the objective or the subjective measurements. It is possible that the digit extensions did not fully suppress sensation, but since the participant with impaired sensation also did not improve with the supplementary feedback, we conclude that the feedback failed to provide relevant grasping information in our experiments. The study highlights the complex interaction between task, feedback variable, feedback delivery, and control, which seemingly rendered even rich, high-bandwidth acoustic feedback redundant, despite substantial sensory impairment.

    Tongue Control of Upper-Limb Exoskeletons For Individuals With Tetraplegia


    Développement d’algorithmes et d’outils logiciels pour l’assistance technique et le suivi en réadaptation

    This Master's thesis presents two development projects on algorithms and software tools providing practical solutions to situations commonly encountered in rehabilitation. The first project is the development of a sequence-matching algorithm that can be integrated into the control interfaces most commonly used in practice. The implementation of this algorithm provides a flexible solution that can be adapted to any assistive technology user. Controlling such devices is challenging because they usually have high dimensionality (i.e., many degrees of freedom, modes, or commands) yet are operated through interfaces based on low-dimensionality sensors, so the number of distinct physical commands the user can perform is low. The proposed algorithm is therefore based on recognizing short temporal signals that can be arranged into sequences; the range of possible combinations then increases the effective dimensionality of the interface. Two applications of the algorithm were developed and tested: the first with a sip-and-puff control interface for a robotic assistive arm, and the second with a hand gesture interface for controlling a computer's mouse and keyboard. The second project addresses the collection and analysis of data in rehabilitation. Whether in clinical settings, in the laboratory, or at home, there are many situations that require gathering data. The proposed solution is a connected application ecosystem that includes a server along with web, mobile, and embedded applications. These custom-made software tools offer a unique, inexpensive, lightweight, and fast workflow for collecting, visualizing, and retrieving data. The document describes a first version by detailing the architecture, the technologies used, and the reasons behind those choices, while guiding future iterations.
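The sequence-matching idea described above can be illustrated with a minimal sketch: a two-primitive interface (e.g., sip and puff) is expanded combinatorially by assigning short primitive sequences to commands. All names and the command set are illustrative assumptions, not the thesis' implementation.

```python
# Hypothetical sketch of sequence-based command expansion for a
# low-dimensional interface. Two physical primitives ("sip", "puff")
# are combined into short sequences, each mapped to one robot command.
from itertools import product

PRIMITIVES = ["sip", "puff"]  # distinct physical inputs the sensor detects

def build_command_map(commands, max_len=3):
    """Assign each command a unique primitive sequence, shortest first."""
    seqs = (seq for n in range(1, max_len + 1)
            for seq in product(PRIMITIVES, repeat=n))
    return {seq: command for command, seq in zip(commands, seqs)}

def match_sequence(detected, mapping):
    """Return the command for a detected primitive sequence, if any."""
    return mapping.get(tuple(detected))

# Illustrative command set for an assistive arm (assumed, not from the thesis).
arm_commands = ["open_gripper", "close_gripper", "move_up",
                "move_down", "move_left", "move_right"]
cmd_map = build_command_map(arm_commands)
print(match_sequence(["sip", "puff"], cmd_map))  # prints: move_down
```

A real implementation would also need a timeout to decide when a sequence ends, since a short sequence can be a prefix of a longer one.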

    Control of a Robotic Hand Using a Tongue Control System-A Prosthesis Application

    Objective: The aim of this study was to investigate the feasibility of using an inductive tongue control system (ITCS) for controlling robotic/prosthetic hands and arms. Methods: This study presents a novel dual-modal control scheme for multigrasp robotic hands combining standard electromyogram (EMG) control with the ITCS. The performance of the ITCS control scheme was evaluated in a comparative study. Ten healthy subjects used both the ITCS control scheme and a conventional EMG control scheme to complete grasping exercises with the IH1 Azzurra robotic hand, which implements five grasps. The time to activate a desired function or grasp (activation time, AT) was used as the performance metric. Results: Statistically significant differences were found when comparing the performance of the two control schemes. On average, the ITCS control scheme was 1.15 s faster than the EMG control scheme, corresponding to a 35.4% reduction in activation time. The largest difference was for grasp 5, with a mean AT reduction of 45.3% (2.38 s). Conclusion: The findings indicate that the ITCS control scheme could allow faster activation of specific grasps or functions compared with a conventional EMG control scheme. Significance: For transhumeral and especially bilateral amputees, the ITCS control scheme could have a significant impact on prosthesis control. In addition, the ITCS would provide bilateral amputees with the added advantage of environmental and computer control, for which the ITCS was originally developed.
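The dual-modal scheme can be pictured as a small state machine: one channel (the tongue interface) selects a grasp directly, while the other (EMG) drives open/close of the selected grasp. The sketch below is an assumed illustration of that split of responsibilities; the grasp names and event API are hypothetical, not taken from the paper.

```python
# Illustrative state machine for a dual-modal control scheme:
# tongue events select one of five grasps directly (no mode cycling),
# EMG events open or close the currently selected grasp.
GRASPS = ["power", "precision", "lateral", "tripod", "hook"]  # assumed names

class DualModalHand:
    def __init__(self):
        self.grasp = GRASPS[0]
        self.closed = False

    def tongue_event(self, grasp_index):
        """A tongue-sensor activation selects a grasp in one step."""
        self.grasp = GRASPS[grasp_index]

    def emg_event(self, signal):
        """A supra-threshold EMG burst closes or opens the selected grasp."""
        if signal == "close":
            self.closed = True
        elif signal == "open":
            self.closed = False

hand = DualModalHand()
hand.tongue_event(3)    # select the tripod grasp directly
hand.emg_event("close")
print(hand.grasp, hand.closed)  # prints: tripod True
```

Direct selection is what plausibly saves time over EMG-only schemes, which typically cycle through grasps with repeated muscle contractions.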

    Novel Bidirectional Body - Machine Interface to Control Upper Limb Prosthesis

    Objective. The journey of a bionic prosthesis user is characterized by the opportunities and limitations involved in adopting a device (the prosthesis) that should enable activities of daily living (ADL). In this context, experiencing a bionic hand as a functional (and, possibly, embodied) limb is the premise for mitigating the risk of abandonment through continuous use of the device. To achieve this, several aspects must be considered to make the artificial limb an effective support for carrying out ADLs. Among them, intuitive and robust control is fundamental to improving the quality of life of upper limb prosthesis users. Still, since artificial proprioception is essential for perceiving prosthesis movement without constant visual attention, a good control framework alone may not be enough to restore practical functionality to the limb. To overcome this, bidirectional communication between the user and the prosthesis has recently been introduced and is a requirement of utmost importance in the development of prosthetic hands. Indeed, closing the control loop between the user and the prosthesis by providing artificial sensory feedback is a fundamental step towards complete restoration of the lost sensory-motor functions. Within my PhD work, I proposed the development of a more controllable and sensitive human-like hand prosthesis, the Hannes prosthetic hand, to improve its usability and effectiveness. Approach. To achieve the objectives of this thesis, I developed a modular and scalable software and firmware architecture to control the Hannes multi-degree-of-freedom (DoF) system and to fit all users' needs (hand aperture, wrist rotation, and wrist flexion in different combinations). On top of this, I developed several pattern recognition (PR) algorithms to translate electromyographic (EMG) activity into complex movements.
However, stability and repeatability were still unmet requirements in multi-DoF upper limb systems; hence, I began by investigating different strategies to produce more robust control. To do this, EMG signals were collected from trans-radial amputees using an array of up to six sensors placed on the skin. Secondly, I developed a vibrotactile system implementing haptic feedback to restore proprioception and create a bidirectional connection between the user and the prosthesis. Similarly, I implemented object stiffness detection to restore tactile sensation connecting the user with the external world. This closed loop between EMG control and vibration feedback is essential for implementing a bidirectional body-machine interface that can strongly impact amputees' daily lives. For each of these three activities, (i) implementation of robust pattern recognition control algorithms, (ii) restoration of proprioception, and (iii) restoration of the feeling of the grasped object's stiffness, I performed a study in which data from healthy subjects and amputees were collected to demonstrate the efficacy and usability of my implementations. In each study, I evaluated both the algorithms and the subjects' ability to use the prosthesis by means of the F1 score (offline) and the Target Achievement Control (TAC) test (online). With this test, I analyzed the error rate, path efficiency, and time efficiency in completing different tasks. Main results. Among the several pattern recognition methods tested, non-linear logistic regression (NLR) proved to be the best algorithm in terms of F1 score (99%, robustness), and the minimum number of electrodes needed for its functioning was determined to be four in the offline analyses. Further, I demonstrated that its low computational burden allowed its implementation and integration on a microcontroller running at a sampling frequency of 300 Hz (efficiency).
Finally, the online implementation allowed subjects to simultaneously control the Hannes prosthesis DoFs in a bioinspired, human-like way. In addition, I performed further tests with the same NLR-based control, endowing it with closed-loop proprioceptive feedback. In this scenario, the TAC test yielded an error rate of 15% and a path efficiency of 60% in experiments where no other sources of information were available (no visual and no audio feedback). These results demonstrated an improvement in the controllability of the system with an impact on user experience. Significance. The obtained results confirmed the hypothesis that the robustness and efficiency of prosthetic control improve thanks to the implemented closed-loop approach. The bidirectional communication between the user and the prosthesis can restore the lost sensory functionality, with promising implications for direct translation into clinical practice.
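The offline evaluation pipeline described above (windowed EMG features, classification, F1 scoring) can be sketched minimally. The mean-absolute-value feature and the window length are generic stand-ins commonly used in EMG processing, not the thesis' NLR implementation; only the 300 Hz sampling rate and the F1 metric come from the abstract.

```python
# Minimal sketch of an offline EMG evaluation pipeline: feature extraction
# over non-overlapping windows, then F1 scoring of classifier predictions.
# The MAV feature and window size are assumed; the classifier is omitted.
import numpy as np

def mav_features(emg, window=90):
    """Mean-absolute-value feature per channel over non-overlapping windows.
    At 300 Hz sampling, a 90-sample window spans 300 ms (assumed choice)."""
    n = emg.shape[0] // window
    return np.abs(emg[:n * window]).reshape(n, window, -1).mean(axis=1)

def f1_score(y_true, y_pred, positive=1):
    """F1 = harmonic mean of precision and recall for one class."""
    tp = np.sum((y_pred == positive) & (y_true == positive))
    fp = np.sum((y_pred == positive) & (y_true != positive))
    fn = np.sum((y_pred != positive) & (y_true == positive))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

# Toy check: 2 true positives, 1 false negative, 0 false positives.
y_true = np.array([0, 1, 1, 0, 1])
y_pred = np.array([0, 1, 0, 0, 1])
print(f1_score(y_true, y_pred))  # prints: 0.8
```

In a multi-grasp setting such as Hannes', the per-class F1 scores would typically be averaged across the gesture classes.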