102 research outputs found

    Analysis of ANN and Fuzzy Logic Dynamic Modelling to Control the Wrist Exoskeleton

    Human intention has long been a primary focus of electromyography (EMG) research; with it, the movement of an exoskeleton hand can be predicted from the user's intent. The EMG is a nonlinear signal generated by muscle contractions as the human hand moves, and it easily picks up noise from its surroundings. This study therefore aims to estimate the desired wrist velocity from EMG signals using artificial neural network (ANN) and fuzzy logic (FL) mapping methods. The output was derived from EMG signals and wrist position, which were directly proportional to the desired wrist velocity used for control. Ten male subjects, aged 21 to 40, supplied the EMG data set used to estimate the output in single- and double-muscle experiments. To validate performance, a physical model of an exoskeleton hand was created with the SimMechanics tool. The ANN used Levenberg-Marquardt training with one hidden layer of 10 neurons, while the FL used triangular membership functions to represent muscle contraction amplitudes at different MVC levels for each wrist position. A PID controller was added to compensate for fluctuations in the mapping outputs, yielding a smoother signal and improving the estimation of the desired wrist velocity. In conclusion, the ANN handles complex nonlinear inputs when estimating the output, but it works best with large data sets; FL lets designers encode rules based on their knowledge, but the system struggles as the number of inputs grows. Based on the results achieved, FL showed a clearer separation of desired wrist velocities than the ANN on the same test data sets, owing to decision making driven by the rules set up by the designer.
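
    To make the fuzzy-logic half of this mapping concrete, the sketch below implements triangular membership functions over a normalized EMG amplitude (fraction of MVC) and a three-rule Mamdani-style mapping to a desired wrist velocity with centroid defuzzification. The set breakpoints, velocity range, and rule base are illustrative assumptions, not the values used in the study.

        # Minimal sketch (not the study's implementation): triangular membership
        # functions over normalized EMG amplitude mapped to a desired wrist
        # velocity via a small rule base and centroid defuzzification.
        import numpy as np

        def tri(x, a, b, c):
            """Triangular membership with feet at a, c and peak at b (a < b < c)."""
            return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

        # Input sets over EMG level (0..1 of MVC); feet extend past the physical
        # range so the edge sets behave like shoulders. Values are assumptions.
        emg_sets = {"low": (-0.2, 0.0, 0.4), "med": (0.2, 0.5, 0.8), "high": (0.6, 1.0, 1.2)}
        # Output sets over desired wrist velocity in deg/s (illustrative values).
        vel_sets = {"slow": (-20.0, 0.0, 20.0), "mid": (10.0, 30.0, 50.0), "fast": (40.0, 60.0, 80.0)}
        rules = {"low": "slow", "med": "mid", "high": "fast"}

        def desired_velocity(emg_mvc, resolution=200):
            """Map a normalized EMG level (0..1) to a crisp desired wrist velocity."""
            v = np.linspace(0.0, 60.0, resolution)
            aggregated = np.zeros_like(v)
            for in_label, out_label in rules.items():
                firing = tri(emg_mvc, *emg_sets[in_label])        # rule activation strength
                aggregated = np.maximum(aggregated, np.minimum(firing, tri(v, *vel_sets[out_label])))
            if aggregated.sum() == 0.0:
                return 0.0
            return float((v * aggregated).sum() / aggregated.sum())  # centroid defuzzification

        print(desired_velocity(0.15), desired_velocity(0.5), desired_velocity(0.9))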

    The Role of Machine Learning in Improved Functionality of Lower Limb Prostheses

    Lower-limb amputations create numerous obstacles that can reduce quality of life. By implementing machine learning techniques, advanced prosthetics can help ease the lives of people living with lower-limb amputations. Using the publicly available HuGaDB data set, the current study investigates several classification models (random forest, neural network, and Vowpal Wabbit) to predict the locomotive intentions of individuals using lower-limb prostheses. The results show that the neural network model yielded the highest accuracy, with precision and recall scores comparable to the other models. However, the Vowpal Wabbit model's speed advantage may make it the more practical choice in deployment. These findings provide insight into the advantages of specific classification models in predicting intended movements during locomotive transitions, and they offer direct comparisons of several machine learning methods, identifying the strengths and weaknesses of each classification model tested.
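
    As a rough illustration of the kind of comparison described, the sketch below trains a random forest and a small neural network on synthetic stand-in features with scikit-learn and reports accuracy and macro-F1. The feature dimensionality, class set, and omission of Vowpal Wabbit are assumptions made for brevity; the study's actual HuGaDB preprocessing and model configurations are not reproduced here.

        # Illustrative sketch only: synthetic data stands in for HuGaDB features.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import accuracy_score, f1_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(2000, 38))       # e.g. windowed IMU/EMG features (assumed shape)
        y = rng.integers(0, 6, size=2000)     # e.g. walking, sitting, standing, stairs, ...

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
        models = [("random forest", RandomForestClassifier(n_estimators=200, random_state=0)),
                  ("neural network", MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0))]
        for name, model in models:
            model.fit(X_tr, y_tr)
            pred = model.predict(X_te)
            print(f"{name}: acc={accuracy_score(y_te, pred):.3f} "
                  f"macro-F1={f1_score(y_te, pred, average='macro'):.3f}")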

    Investigating motor skill in closed-loop myoelectric hand prostheses: Through speed-accuracy trade-offs


    Variational Autoencoder and Sensor Fusion for Robust Myoelectric Controls

    Myoelectric control schemes aim to use surface electromyography (EMG) signals, the electric potentials measured directly from skeletal muscles, to control wearable robots such as exoskeletons and prostheses. The main challenge of myoelectric control is to increase and preserve signal quality by minimizing the effect of confounding factors such as muscle fatigue or electrode shift. Current myoelectric control schemes are developed to work in ideal laboratory conditions, but there is a persistent need for control schemes that are more robust and work in real-world environments. Following the manifold hypothesis, complexity in the high-dimensional world can be broken down into a lower-dimensional form or representation that explains how the higher-dimensional real world operates. On this premise, once the learned representation or manifold is discovered, biological actions and their relevant multimodal signals can be compressed into a form that remains pertinent in both laboratory and non-laboratory settings. This thesis outlines a method that uses a contrastive variational autoencoder with an integrated classifier on multimodal sensor data to create a compressed latent-space representation that can be used in future myoelectric control schemes.
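
    A minimal sketch of the general architecture described, assuming PyTorch and standard VAE losses: an encoder producing a latent mean and log-variance, a decoder reconstructing the multimodal feature vector, and a classifier head on the latent code trained jointly with reconstruction, KL, and cross-entropy terms. The contrastive component of the thesis's contrastive VAE is omitted, and all dimensions and data are placeholders.

        # Sketch of a VAE with an integrated classifier (assumed architecture).
        import torch
        import torch.nn as nn
        import torch.nn.functional as F

        class VAEClassifier(nn.Module):
            def __init__(self, in_dim=64, latent_dim=8, n_classes=6):
                super().__init__()
                self.enc = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU())
                self.mu = nn.Linear(128, latent_dim)
                self.logvar = nn.Linear(128, latent_dim)
                self.dec = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, in_dim))
                self.clf = nn.Linear(latent_dim, n_classes)

            def forward(self, x):
                h = self.enc(x)
                mu, logvar = self.mu(h), self.logvar(h)
                z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
                return self.dec(z), self.clf(z), mu, logvar

        def loss_fn(x, x_hat, logits, y, mu, logvar, beta=1.0):
            rec = F.mse_loss(x_hat, x)                                     # reconstruction term
            kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())  # KL divergence term
            ce = F.cross_entropy(logits, y)                                # integrated classifier term
            return rec + beta * kl + ce

        # Toy training loop on random stand-in data (multimodal feature windows, gesture labels).
        model = VAEClassifier()
        opt = torch.optim.Adam(model.parameters(), lr=1e-3)
        x, y = torch.randn(256, 64), torch.randint(0, 6, (256,))
        for _ in range(50):
            opt.zero_grad()
            x_hat, logits, mu, logvar = model(x)
            loss_fn(x, x_hat, logits, y, mu, logvar).backward()
            opt.step()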

    Cognitive vision system for control of dexterous prosthetic hands: Experimental evaluation

    Background: Dexterous prosthetic hands developed recently, such as SmartHand and i-LIMB, are highly sophisticated; they have individually controllable fingers and a thumb that can abduct/adduct. This flexibility allows many different grasping strategies, but it also requires new control algorithms that can exploit the many available degrees of freedom. The current study presents and tests a new control method for dexterous prosthetic hands. Methods: The central component of the proposed method is an autonomous controller comprising a vision system with rule-based reasoning mounted on a dexterous hand (CyberHand). The controller, termed the cognitive vision system (CVS), mimics biological control and generates commands for prehension. The CVS was integrated into a hierarchical control structure: 1) the user triggers the system and controls the orientation of the hand; 2) a high-level controller automatically selects the grasp type and size; and 3) an embedded hand controller implements the selected grasp using closed-loop position/force control. The operation of the control system was tested in 13 healthy subjects who used the CyberHand, attached to the forearm, to grasp and transport 18 objects placed at two different distances. Results: The system correctly estimated grasp type and size (nine commands in total) in about 84% of the trials. In an additional 6% of the trials, the grasp type and/or size differed from the optimal ones but were still good enough for the grasp to succeed. When the control task was simplified by decreasing the number of possible commands, the classification accuracy increased (e.g., 93% for guessing the grasp type only). Conclusions: The original outcome of this research is a novel controller empowered by vision and reasoning and capable of high-level analysis (i.e., determining object properties) and autonomous decision making (i.e., selecting the grasp type and size). The automatic control eases the burden on the user, who can therefore concentrate on what to do rather than on how to do it. The tests showed that the performance of the controller was satisfactory and that the users were able to operate the system with minimal prior training.
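
    The high-level controller's rule-based step can be pictured as a simple mapping from estimated object properties to a grasp command, as in the hypothetical sketch below; the thresholds, grasp taxonomy, and aperture heuristic are assumptions for illustration, not the CVS rules evaluated in the paper.

        # Hypothetical illustration of rule-based grasp selection from vision estimates.
        from dataclasses import dataclass

        @dataclass
        class ObjectEstimate:
            width_mm: float    # estimated graspable width
            height_mm: float   # estimated height
            elongated: bool    # True if the object is much longer than it is wide

        def select_grasp(obj: ObjectEstimate):
            """Return (grasp_type, aperture_mm) from simple if-then rules (assumed values)."""
            if obj.width_mm < 15:
                grasp = "lateral" if obj.elongated else "pinch"
            elif obj.elongated and obj.height_mm > 120:
                grasp = "cylindrical"
            else:
                grasp = "palmar"
            aperture = min(obj.width_mm * 1.2 + 10, 90)   # open slightly wider than the object
            return grasp, aperture

        print(select_grasp(ObjectEstimate(width_mm=60, height_mm=150, elongated=True)))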

    Novel Bidirectional Body - Machine Interface to Control Upper Limb Prosthesis

    Objective. The journey of a bionic prosthesis user is characterized by the opportunities and limitations involved in adopting a device (the prosthesis) that should enable activities of daily living (ADL). Within this context, experiencing a bionic hand as a functional (and, possibly, embodied) limb is the premise for mitigating the risk of abandonment through continuous use of the device. To achieve such a result, different aspects must be considered in making the artificial limb an effective support for carrying out ADLs. Among them, intuitive and robust control is fundamental to improving the quality of life of amputees using upper limb prostheses. Still, since artificial proprioception is essential to perceive the prosthesis movement without constant visual attention, a good control framework may not be enough to restore practical functionality to the limb. To overcome this, bidirectional communication between the user and the prosthesis has recently been introduced and is a requirement of the utmost importance in developing prosthetic hands. Indeed, closing the control loop between the user and the prosthesis by providing artificial sensory feedback is a fundamental step towards the complete restoration of the lost sensory-motor functions. Within my PhD work, I proposed the development of a more controllable and sensitive human-like hand prosthesis, i.e., the Hannes prosthetic hand, to improve its usability and effectiveness.
    Approach. To achieve the objectives of this thesis, I developed a modular and scalable software and firmware architecture to control the Hannes prosthetic multi-Degree-of-Freedom (DoF) system and to fit all users' needs (hand aperture, wrist rotation, and wrist flexion in different combinations). On top of this, I developed several Pattern Recognition (PR) algorithms to translate electromyographic (EMG) activity into complex movements. However, stability and repeatability were still unmet requirements in multi-DoF upper limb systems; hence, I started by investigating different strategies to produce a more robust control. To do this, EMG signals were collected from trans-radial amputees using an array of up to six sensors placed on the skin. Secondly, I developed a vibrotactile system implementing haptic feedback to restore proprioception and create a bidirectional connection between the user and the prosthesis. Similarly, I implemented object stiffness detection to restore a tactile sensation connecting the user with the external world. This closed loop between EMG and vibration feedback is essential to implementing a Bidirectional Body - Machine Interface with a strong impact on amputees' daily lives. For each of these three activities, (i) implementation of robust pattern recognition control algorithms, (ii) restoration of proprioception, and (iii) restoration of the feeling of the grasped object's stiffness, I performed a study in which data from healthy subjects and amputees were collected in order to demonstrate the efficacy and usability of my implementations. In each study, I evaluated both the algorithms and the subjects' ability to use the prosthesis by means of the F1 score (offline) and the Target Achievement Control (TAC) test (online). With this test, I analyzed the error rate, path efficiency, and time efficiency in completing different tasks.
    Main results. Among the several tested methods for pattern recognition, Non-Linear Logistic Regression (NLR) proved to be the best algorithm in terms of F1 score (99%, robustness), and the minimum number of electrodes needed for its functioning was determined to be four in the offline analyses. Further, I demonstrated that its low computational burden allowed its implementation and integration on a microcontroller running at a sampling frequency of 300 Hz (efficiency). Finally, the online implementation allowed the subject to simultaneously control the DoFs of the Hannes prosthesis in a bioinspired and human-like way. In addition, I performed further tests with the same NLR-based control by endowing it with closed-loop proprioceptive feedback. In this scenario, the TAC test yielded an error rate of 15% and a path efficiency of 60% in experiments where no other sources of information were available (no visual and no audio feedback). These results demonstrated an improvement in the controllability of the system with an impact on user experience.
    Significance. The obtained results confirmed the hypothesis that the robustness and efficiency of prosthetic control can be improved thanks to the implemented closed-loop approach. The bidirectional communication between the user and the prosthesis is capable of restoring the lost sensory functionality, with promising implications for direct translation into clinical practice.
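
    As a hedged stand-in for the NLR classifier described above, the sketch below builds a nonlinear logistic regression by expanding EMG features with polynomial terms before a multinomial logistic regression, and scores it with a macro F1. The feature set, class labels, and degree-2 expansion are assumptions; the thesis's exact NLR formulation and embedded implementation are not reproduced.

        # Rough sketch of a nonlinear logistic regression for EMG pattern recognition.
        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import PolynomialFeatures, StandardScaler
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import f1_score

        rng = np.random.default_rng(1)
        X = rng.normal(size=(1200, 6))        # e.g. mean absolute value from 6 EMG channels
        y = rng.integers(0, 4, size=1200)     # e.g. hand open/close, wrist pronation/supination

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
        nlr = make_pipeline(StandardScaler(), PolynomialFeatures(degree=2),
                            LogisticRegression(max_iter=1000))
        nlr.fit(X_tr, y_tr)
        print("F1 (macro):", f1_score(y_te, nlr.predict(X_te), average="macro"))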

    Research on Hand Action Pattern Recognition of Bionic Limb Based on Surface Electromyography

    Hands are important parts of the human body: they are not only the main tool with which people engage in productive labor but also an important means of communication. When the hand moves, the body produces surface electromyography (sEMG), an electrophysiological signal that accompanies muscle activity and contains rich information about movement intention. A bionic limb is driven by multi-degree-of-freedom control derived from the recognition results, which can meet the urgent need of people with disabilities for autonomous operation. A thorough study of hand action pattern recognition based on sEMG signals can give the bionic limb the ability to distinguish hand actions quickly and accurately. From the perspective of pattern recognition for the bionic limb, this paper discusses sEMG-based recognition of human hand action patterns. By analyzing and summarizing the current development of hand movement recognition, the author proposes a bionic limb scheme based on an artificial neural network and an improved DT-SVM hand action recognition system. According to the research results, it is necessary to expand the types and total number of recognized hand movements and gestures in order to meet the objective requirement of diverse hand action patterns in bionic limb applications.
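
    A simplified sketch of a DT-SVM style recognizer is shown below, assuming a two-stage structure in which a shallow decision tree assigns a coarse group and a per-group SVM resolves the final hand-action class; the grouping, features, and class counts are placeholders rather than the paper's system.

        # Simplified two-stage DT-SVM sketch on stand-in sEMG features.
        import numpy as np
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.svm import SVC

        rng = np.random.default_rng(2)
        X = rng.normal(size=(900, 8))                 # sEMG features (e.g. RMS per channel)
        y = rng.integers(0, 6, size=900)              # six hand-action classes (assumed)
        groups = y // 3                               # coarse grouping: classes {0,1,2} vs {3,4,5}

        tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, groups)
        svms = {g: SVC(kernel="rbf").fit(X[groups == g], y[groups == g]) for g in np.unique(groups)}

        def predict(samples):
            """Route each sample through the tree, then the SVM of its coarse group."""
            coarse = tree.predict(samples)
            return np.array([svms[g].predict(s.reshape(1, -1))[0] for g, s in zip(coarse, samples)])

        print(predict(X[:5]), y[:5])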

    MODIFICATION AND EVALUATION OF A BRAIN COMPUTER INTERFACE SYSTEM TO DETECT MOTOR INTENTION

    It is widely understood that neurons within the brain produce electrical activity, and electroencephalography (EEG), a technique that measures biopotentials with electrodes placed on the scalp, has been used to observe it. Today, scientists and engineers work to interface these electrical neural signals with computers and machines through the field of Brain-Computer Interfacing (BCI). BCI systems have the potential to greatly improve the quality of life of physically handicapped individuals by replacing or assisting missing or debilitated motor functions. This research thus aims to further improve the efficacy of BCI-based assistive technologies used to aid physically disabled individuals. This study deals with the testing and modification of a BCI system that uses the alpha and beta bands to detect motor intention by weighing online EEG output against a calibrated threshold.
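
    A hedged sketch of that detection idea, assuming a single EEG channel, a 250 Hz sampling rate, and a rest-calibrated power threshold: alpha (8-12 Hz) and beta (13-30 Hz) band power is estimated with Welch's method and compared against the threshold, with a drop in power taken as a sign of motor intention (event-related desynchronization). None of these parameters come from the study itself.

        # Band-power thresholding sketch on stand-in EEG data.
        import numpy as np
        from scipy.signal import welch

        FS = 250  # assumed sampling rate in Hz

        def band_power(eeg, fs, lo, hi):
            """Integrate the Welch power spectral density over [lo, hi] Hz."""
            f, pxx = welch(eeg, fs=fs, nperseg=fs)
            mask = (f >= lo) & (f <= hi)
            return np.trapz(pxx[mask], f[mask])

        def calibrate_threshold(rest_epochs, fs=FS, k=0.8):
            """Threshold below which combined alpha+beta power suggests desynchronization."""
            powers = [band_power(e, fs, 8, 12) + band_power(e, fs, 13, 30) for e in rest_epochs]
            return k * float(np.mean(powers))

        def motor_intention(epoch, threshold, fs=FS):
            return (band_power(epoch, fs, 8, 12) + band_power(epoch, fs, 13, 30)) < threshold

        rest = [np.random.randn(2 * FS) for _ in range(20)]   # stand-in rest EEG epochs
        thr = calibrate_threshold(rest)
        print(motor_intention(np.random.randn(2 * FS), thr))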