
    Accurate Prosthetic Hand

    The purpose of this project is to explore a method to improve the dexterity of artificial hands by closely mimicking the biomechanics of the human hand. The mechanical system of the device is actuated by several stepper motors controlled by electroencephalography (EEG) and electromyography (EMG) signals. The majority of the device's motions are controlled using EEG, with three distinct thoughts executing three distinct grips: pinch, hook, and point. EMG signals are used for finer motor control, such as modulating the strength of each grip pattern. The project resulted in a prosthetic hand prototype with nine degrees of freedom and a control system that relies on sensory input from the mind and body.
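
    As a rough illustration of the hybrid scheme described above, the following sketch (Python; the class labels, smoothing window, contraction threshold, and force limit are hypothetical and not taken from the project) maps an EEG-decoded class to one of the three grips and scales grip force from a smoothed EMG envelope.

        import numpy as np

        # Hypothetical mapping from EEG-decoded classes to the three grips named in the abstract.
        GRIPS = {0: "pinch", 1: "hook", 2: "point"}

        def emg_envelope(emg_window, fs=1000.0):
            """Rectify and smooth a raw EMG window to estimate contraction level."""
            rectified = np.abs(emg_window - np.mean(emg_window))
            kernel_len = int(0.1 * fs)                      # ~100 ms moving average
            kernel = np.ones(kernel_len) / kernel_len
            return float(np.convolve(rectified, kernel, mode="same").mean())

        def control_step(eeg_class, emg_window, mvc_level=0.5, max_force_n=20.0):
            """Select a grip from the EEG class and scale its force from the EMG envelope."""
            grip = GRIPS.get(eeg_class, "rest")
            activation = min(emg_envelope(emg_window) / mvc_level, 1.0)  # normalise to an assumed MVC
            return grip, activation * max_force_n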

    Application of Artificial Intelligence (AI) in Prosthetic and Orthotic Rehabilitation

    The technological integration of Artificial Intelligence (AI) and machine learning into the prosthetic and orthotic industry and the field of assistive technology has become a boon for Persons with Disabilities. The concept of neural networks has been used by leading manufacturers of rehabilitation aids to simulate various anatomical and biomechanical functions of the lost parts of the human body. The interaction of humans with various agents, i.e. electronic circuitry, software, robotics, etc., has made a revolutionary impact in the rehabilitation field, enabling devices such as the bionic leg, mind- or thought-controlled prostheses, and exoskeletons. The application of Artificial Intelligence and robotics technology has a huge impact on achieving independent mobility and enhances the quality of life of Persons with Disabilities (PwDs).

    Pattern recognition-based real-time myoelectric control for anthropomorphic robotic systems : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Mechatronics at Massey University, Manawatū, New Zealand

    All copyrighted figures have been removed but may be accessed via the sources cited in their respective captions.

    Advanced human-computer interaction (HCI) or human-machine interaction (HMI) aims to help humans interact with computers smartly. Biosignal-based technology is one of the most promising approaches to developing intelligent HCI systems. As a convenient and non-invasive means of biosignal-based intelligent control, myoelectric control identifies human movement intentions from electromyogram (EMG) signals recorded on muscles to realise intelligent control of robotic systems. Although myoelectric control has been researched for more than half a century, commercial myoelectric-controlled devices are still mostly based on early threshold-based methods. The emerging pattern recognition-based myoelectric control has remained an active research topic in laboratories because of insufficient reliability and robustness. This research focuses on pattern recognition-based myoelectric control. Up to now, most of the effort in pattern recognition-based myoelectric control research has been invested in improving EMG pattern classification accuracy. However, high classification accuracy does not directly lead to high controllability and usability for EMG-driven systems. This suggests that a complete system composed of the relevant modules, including EMG acquisition, pattern recognition-based gesture discrimination, output equipment and its controller, is desirable as a development and validation platform that closely emulates real-world situations and thus promotes research in myoelectric control. This research aims to investigate feasible and effective EMG signal processing and pattern recognition methods for extracting the useful information contained in EMG signals, in order to establish an intelligent, compact and economical biosignal-based robotic control system. The work includes an in-depth study of existing pattern recognition-based methodologies, investigation of effective EMG signal capture and data processing, EMG-based control system development, and anthropomorphic robotic hand design.

    The contributions of this research fall mainly into three aspects. First, precision electronic surface EMG (sEMG) acquisition methods able to collect high-quality sEMG signals were developed. The first method was designed in a single-ended signalling manner using monolithic instrumentation amplifiers to determine and evaluate the analog sEMG signal processing chain architecture and circuit parameters. This method was then evolved into a fully differential analog sEMG detection and collection method that uses common commercial electronic components to implement all analog sEMG amplification and filtering stages in a fully differential way. The proposed fully differential method offers a higher signal-to-noise ratio in noisy environments than the single-ended method by making full use of the inherent common-mode noise rejection of balanced signalling; to the best of my knowledge, the literature does not report similar methods that implement the entire analog sEMG amplification and filtering chain in a fully differential way using common commercial electronic components. Second, a reliable EMG pattern recognition-based real-time gesture discrimination approach was investigated and developed. The necessary functional modules for real-time gesture discrimination were identified and implemented using appropriate algorithms. Special attention was paid to the investigation and comparison of representative features and classifiers for improving accuracy and robustness, and a novel EMG feature set was proposed to improve the performance of EMG pattern recognition. Third, an anthropomorphic robotic hand construction methodology was designed for validating myoelectric control on a physical platform similar to real-world situations. The natural anatomical structure of the human hand was imitated to kinematically model the robotic hand. The proposed robotic hand is a highly underactuated mechanism, featuring 14 degrees of freedom and three degrees of actuation.

    This research carried out an in-depth investigation into EMG data acquisition and EMG signal pattern recognition, and a series of experiments were conducted in EMG signal processing and system development. The final myoelectric-controlled robotic hand system and the system testing confirmed the effectiveness of the proposed methods for surface EMG acquisition and human hand gesture discrimination. To verify and demonstrate the proposed myoelectric control system, real-time tests were conducted on the anthropomorphic prototype robotic hand. Currently, the system is able to identify five patterns in real time: hand open, hand close, wrist flexion, wrist extension and the rest state. With more motion patterns added, the system has the potential to identify more hand movements. The research has generated several journal and international conference publications.
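
    For context on the feature extraction step, the classical time-domain EMG features that pattern recognition-based myoelectric control commonly builds on can be computed as in the sketch below (Python; the noise threshold and the per-channel concatenation are illustrative assumptions, and the thesis' novel feature set is not reproduced here).

        import numpy as np

        def td_features(window, thresh=0.01):
            """Classical time-domain EMG features: MAV, WL, ZC, SSC (Hudgins set)."""
            x = np.asarray(window, dtype=float)
            mav = np.mean(np.abs(x))                 # mean absolute value
            wl = np.sum(np.abs(np.diff(x)))          # waveform length
            # Zero crossings above a noise threshold.
            zc = np.sum((x[:-1] * x[1:] < 0) & (np.abs(x[:-1] - x[1:]) > thresh))
            # Slope sign changes above the same threshold.
            d = np.diff(x)
            ssc = np.sum((d[:-1] * d[1:] < 0) &
                         ((np.abs(d[:-1]) > thresh) | (np.abs(d[1:]) > thresh)))
            return np.array([mav, wl, zc, ssc])

        # Features from each channel (e.g. six sensors) are concatenated and fed to a
        # classifier to discriminate gestures such as hand open/close, wrist
        # flexion/extension, and rest.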

    Development and evaluation of human-machine interfaces for the control and actuation of upper limb prostheses (Desenvolvimento e avaliação de interfaces homem-máquina para o controle e atuação de próteses de membros superiores)

    Advisor: Eric Rohmer. Master's dissertation, Universidade Estadual de Campinas, Faculdade de Engenharia Elétrica e de Computação.

    The purpose of this dissertation is to develop and evaluate interfaces for the control and actuation of prosthetic hands. The developed interfaces combine electromyography (EMG) signals with radio-frequency identification (RFID Module), an inertial measurement unit (Motion Module) or computer vision techniques (Vision Module) to select the types of interaction. The EMG signals are responsible for triggering the system, while the other sensors are responsible for selecting the grasp so that the user can interact with the environment. The user's interactions with the prosthesis can be viewed in a simulation. The evaluation of the three interfaces was conducted using the NASA Task Load Index, which assesses the workload of users while they use the system to perform tasks. This evaluation rates Mental Demand, Physical Demand, Temporal Demand, Effort, Performance, and Frustration to calculate the overall workload of the tasks. The results show that the RFID Module is the interface that requires the least cognitive effort from the user, followed by the Vision Module and the Motion Module, respectively. Additionally, the fact that users of these interfaces do not need to perform repeated co-contractions, as happens in myoelectric systems, reduces their cognitive burden. A comparative table of the three interfaces emphasises the advantages and disadvantages of each interface in instrumented and non-instrumented environments.
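
    For illustration of the evaluation metric, the NASA Task Load Index combines the six subscale ratings into a single 0-100 workload score; in the weighted (original) protocol each rating is weighted by the number of times the participant selected that dimension in 15 pairwise comparisons. The sketch below (Python) uses made-up ratings and weights.

        DIMENSIONS = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

        def nasa_tlx(ratings, weights=None):
            """Overall workload: weighted mean of the six 0-100 subscale ratings.

            `weights` are the pairwise-comparison tallies (summing to 15); if omitted,
            the unweighted "raw TLX" average is returned instead.
            """
            if weights is None:
                return sum(ratings[d] for d in DIMENSIONS) / len(DIMENSIONS)
            total = sum(weights[d] for d in DIMENSIONS)   # 15 for the full protocol
            return sum(ratings[d] * weights[d] for d in DIMENSIONS) / total

        # Hypothetical ratings for one condition (e.g. the RFID Module).
        ratings = {"mental": 25, "physical": 10, "temporal": 20,
                   "performance": 15, "effort": 20, "frustration": 10}
        weights = {"mental": 4, "physical": 1, "temporal": 3,
                   "performance": 2, "effort": 3, "frustration": 2}
        print(nasa_tlx(ratings, weights))   # overall workload on a 0-100 scale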

    Data-driven Mechanical Design and Control Method of Dexterous Upper-Limb Prosthesis

    With an increasing number of people (320,000 per year) suffering from impaired upper limb function due to medical conditions such as stroke and blunt trauma, the demand for highly functional upper limb prostheses is increasing; however, rejection rates remain high due to factors such as lack of functionality, high cost, weight, and lack of sensory feedback. Modern robotics has led to the development of more affordable and dexterous upper limb prostheses, most with anthropomorphic designs. However, owing to the highly sophisticated ergonomics of anthropomorphic hands, most are economically prohibitive and suffer from control complexity due to the increased cognitive load on the user. This thesis therefore aims to design a prosthesis that relies on emulating the kinematics and contact forces involved in grasping tasks with healthy human hands, rather than on biomimicry, to reduce mechanical complexity and make use of technologically advanced engineering components. This is accomplished by 1) experimentally characterizing human grasp kinematics and kinetics as a basis for data-driven prosthesis design, and 2) using the grasp data to develop a data-driven design and control method for an upper limb prosthesis that provides the kinematics and kinetics required for healthy human grasps without adopting an anthropomorphic design. The thesis demonstrates an approach to narrowing the gap between the functionality of the human hand and robotic upper limb prostheses by introducing a method to optimize the design and control of an upper limb prosthesis. Grasp data are first collected from human subjects with a motion- and force-capture glove. The collected data are used to minimize control complexity by reducing the dimensionality of the device while fulfilling the kinematic and kinetic requirements of daily grasping tasks. Using these techniques, a task-oriented upper limb prosthesis is prototyped and tested in both simulation and a physical environment.
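
    One common way to realise the dimensionality reduction described above is principal component analysis over the recorded joint-angle data; the sketch below (Python; the function name and component count are hypothetical, and the thesis' actual reduction method may differ) extracts low-dimensional grasp synergies from a posture matrix.

        import numpy as np

        def grasp_synergies(joint_angles, n_components=2):
            """Extract low-dimensional grasp synergies from joint-angle recordings.

            joint_angles: (n_samples, n_joints) array of recorded hand postures.
            Returns the principal directions and the variance fraction they explain.
            """
            X = joint_angles - joint_angles.mean(axis=0)
            # SVD-based PCA: rows of Vt are the principal directions (synergies).
            U, S, Vt = np.linalg.svd(X, full_matrices=False)
            explained = (S ** 2) / np.sum(S ** 2)
            return Vt[:n_components], explained[:n_components]

        # A prosthesis driven by n_components actuators can then approximate most
        # daily-living grasps as weighted combinations of these synergies.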

    Novel Bidirectional Body - Machine Interface to Control Upper Limb Prosthesis

    Objective. The journey of a bionic prosthesis user is characterized by the opportunities and limitations involved in adopting a device (the prosthesis) that should enable activities of daily living (ADL). Within this context, experiencing a bionic hand as a functional (and, possibly, embodied) limb is the premise for mitigating the risk of its abandonment through continuous use of the device. To achieve such a result, different aspects must be considered in making the artificial limb an effective support for carrying out ADLs. Among them, intuitive and robust control is fundamental to improving the quality of life of amputees using upper limb prostheses. Still, as artificial proprioception is essential to perceiving the prosthesis' movement without constant visual attention, a good control framework may not be enough to restore practical functionality to the limb. To overcome this, bidirectional communication between the user and the prosthesis has recently been introduced and is a requirement of utmost importance in developing prosthetic hands. Indeed, closing the control loop between the user and the prosthesis by providing artificial sensory feedback is a fundamental step towards the complete restoration of the lost sensory-motor functions. Within my PhD work, I proposed the development of a more controllable and sensitive human-like hand prosthesis, i.e., the Hannes prosthetic hand, to improve its usability and effectiveness.

    Approach. To achieve the objectives of this thesis, I developed a modular and scalable software and firmware architecture to control the Hannes multi-degree-of-freedom (DoF) prosthetic system and to fit all users' needs (hand aperture, wrist rotation, and wrist flexion in different combinations). On top of this, I developed several pattern recognition (PR) algorithms to translate electromyographic (EMG) activity into complex movements. However, stability and repeatability were still unmet requirements in multi-DoF upper limb systems; hence, I started by investigating different strategies to produce a more robust control. To do this, EMG signals were collected from trans-radial amputees using an array of up to six sensors placed over the skin. Secondly, I developed a vibrotactile system to implement haptic feedback, restore proprioception, and create a bidirectional connection between the user and the prosthesis. Similarly, I implemented object stiffness detection to restore a tactile sensation able to connect the user with the external world. This closed loop between EMG control and vibration feedback is essential to implementing a Bidirectional Body-Machine Interface that can strongly impact amputees' daily lives. For each of these three activities, (i) implementation of robust pattern recognition control algorithms, (ii) restoration of proprioception, and (iii) restoration of the feeling of the grasped object's stiffness, I performed a study in which data from healthy subjects and amputees were collected in order to demonstrate the efficacy and usability of my implementations. In each study, I evaluated both the algorithms and the subjects' ability to use the prosthesis by means of the F1-score (offline) and the Target Achievement Control (TAC) test (online). With this test, I analyzed the error rate, path efficiency, and time efficiency in completing different tasks.

    Main results. Among the several tested pattern recognition methods, Non-Linear Logistic Regression (NLR) proved to be the best algorithm in terms of F1-score (99%, robustness), and the minimum number of electrodes needed for its functioning was determined to be four in the offline analyses. Further, I demonstrated that its low computational burden allowed its implementation and integration on a microcontroller running at a sampling frequency of 300 Hz (efficiency). Finally, the online implementation allowed the subject to simultaneously control the DoFs of the Hannes prosthesis in a bioinspired and human-like way. In addition, I performed further tests with the same NLR-based control endowed with closed-loop proprioceptive feedback. In this scenario, the TAC test yielded an error rate of 15% and a path efficiency of 60% in experiments where no other sources of information were available (no visual and no audio feedback). These results demonstrate an improvement in the controllability of the system with an impact on user experience.

    Significance. The obtained results confirm the hypothesis that the robustness and efficiency of prosthetic control improve thanks to the implemented closed-loop approach. Bidirectional communication between the user and the prosthesis is capable of restoring the lost sensory functionality, with promising implications for direct translation into clinical practice.
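
    As a hedged illustration of the reported classifier, non-linear logistic regression can be approximated by logistic regression over a nonlinear (here polynomial) expansion of the EMG features, with performance summarised by a macro-averaged F1-score; the sketch below (Python/scikit-learn) is an assumption-laden stand-in, not the Hannes implementation.

        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import PolynomialFeatures, StandardScaler
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import f1_score

        def make_nlr(degree=2):
            """Logistic regression on polynomially expanded features (a simple 'non-linear LR')."""
            return make_pipeline(StandardScaler(),
                                 PolynomialFeatures(degree=degree, include_bias=False),
                                 LogisticRegression(max_iter=1000))

        # X: (n_windows, n_features) EMG features from up to six sensors,
        # y: movement labels; both would come from the recording sessions.
        # clf = make_nlr().fit(X_train, y_train)
        # print(f1_score(y_test, clf.predict(X_test), average="macro"))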

    A brain-computer interface integrated with virtual reality and robotic exoskeletons for enhanced visual and kinaesthetic stimuli

    Brain-computer interfaces (BCI) allow the direct control of robotic devices for neurorehabilitation by measuring brain activity patterns that follow the user's intent. In the past two decades, the use of non-invasive techniques such as electroencephalography and motor imagery in BCI has gained traction. However, many of the mechanisms that drive human proficiency in eliciting discernible signals for BCI remain unestablished. The main objective of this thesis is to explore and assess what improvements can be made to an integrated BCI-robotic system for hand rehabilitation. Chapter 2 presents a systematic review of BCI-hand robot systems developed from 2010 to late 2019 in terms of their technical and clinical reports. Around 30 studies were identified as eligible for review and, among these, 19 were still in their prototype or pre-clinical stages of development. These systems were found to fall short in providing the necessary visual and kinaesthetic stimuli during motor imagery BCI training. Chapter 3 discusses the theoretical background leading to the hypothesis that an enhanced visual and kinaesthetic stimulus, delivered through a virtual reality (VR) game environment and a robotic hand exoskeleton, will improve motor imagery BCI performance in terms of online classification accuracy, class prediction probabilities, and electroencephalography signals. Chapters 4 and 5 focus on designing, developing, integrating, and testing a BCI-VR-robot prototype to address the research aims. Chapter 6 tests the hypothesis by performing a motor imagery BCI self-experiment with an enhanced visual and kinaesthetic stimulus against a control. A significant increase (p = 0.0422) in classification accuracy is reported for groups with enhanced visual stimulus through VR versus those without. Six out of eight sessions among the VR groups had a median class probability exceeding a pre-set threshold of 0.6. Finally, the thesis concludes in Chapter 7 with a general discussion of how these findings suggest a role for new and emerging technologies such as VR and robotics in advancing BCI-robotic systems, and how the contributions of this work may help improve the usability and accessibility of such systems, not only in rehabilitation but also in skills learning and education.
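
    The online decision rule implied by the 0.6 class-probability threshold can be sketched as follows (Python; the CSP + LDA decoder built with MNE-Python and scikit-learn is an assumed, standard motor imagery pipeline, not necessarily the one used in the thesis).

        import numpy as np
        from mne.decoding import CSP                      # common spatial patterns (assumption)
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.pipeline import make_pipeline

        def make_mi_decoder(n_csp=4):
            """CSP feature extraction followed by LDA, a standard motor imagery pipeline."""
            return make_pipeline(CSP(n_components=n_csp), LinearDiscriminantAnalysis())

        def issue_command(decoder, epoch, threshold=0.6):
            """Return the predicted class only if its probability clears the threshold."""
            proba = decoder.predict_proba(epoch[np.newaxis, ...])[0]
            best = int(np.argmax(proba))
            return best if proba[best] >= threshold else None   # None = withhold actuation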

    Humanoid Robots

    For many years, humans have been trying in every way to recreate the complex mechanisms that form the human body. This task is extremely complicated, and the results are not entirely satisfactory. However, with increasing technological advances based on theoretical and experimental research, we have managed, to some extent, to copy or imitate some systems of the human body. This research not only aims to create humanoid robots, a great part of them constituting autonomous systems, but also to offer deeper knowledge of the systems that form the human body, with possible applications in human rehabilitation technology, bringing together studies related not only to Robotics but also to Biomechanics, Biomimetics, and Cybernetics, among other areas. This book presents a series of studies inspired by this ideal, carried out by various researchers worldwide, seeking to analyze and discuss diverse subjects related to humanoid robots. The presented contributions explore aspects of robotic hands, learning, language, vision, and locomotion.