
    Real-Time Hand Gesture Recognition Using Temporal Muscle Activation Maps of Multi-Channel sEMG Signals

    Accurate and real-time hand gesture recognition is essential for controlling advanced hand prostheses. Surface Electromyography (sEMG) signals obtained from the forearm are widely used for this purpose. Here, we introduce a novel hand gesture representation called Temporal Muscle Activation (TMA) maps, which captures information about the activation patterns of muscles in the forearm. Based on these maps, we propose an algorithm that can recognize hand gestures in real time using a Convolutional Neural Network. The algorithm was tested on 8 healthy subjects with sEMG signals acquired from 8 electrodes placed along the circumference of the forearm. The average classification accuracy of the proposed method was 94%, which is comparable to state-of-the-art methods. The average computation time of a prediction was 5.5 ms, making the algorithm well suited for real-time gesture recognition applications.
    Comment: Paper accepted to the IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP) 202
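The abstract does not specify how a TMA map is built; as a minimal sketch, assuming a map is formed from per-channel RMS envelopes over sliding windows (window and step lengths are illustrative choices, not the paper's parameters), it could look like:

```python
import numpy as np

def tma_map(emg, win=40, step=20):
    """Sketch of a Temporal Muscle Activation map: per-channel RMS
    envelopes over sliding windows, stacked into a channels-by-time
    image that a CNN could classify. The exact construction used in
    the paper is not reproduced here."""
    n_ch, n_samp = emg.shape
    starts = range(0, n_samp - win + 1, step)
    rms = np.array([[np.sqrt(np.mean(emg[c, s:s + win] ** 2)) for s in starts]
                    for c in range(n_ch)])
    # normalise each channel to [0, 1] so the map encodes relative activation
    rms -= rms.min(axis=1, keepdims=True)
    peak = rms.max(axis=1, keepdims=True)
    return rms / np.where(peak > 0, peak, 1)

# 8-channel synthetic sEMG, 400 samples (matching the 8-electrode setup)
rng = np.random.default_rng(0)
emg = rng.standard_normal((8, 400))
m = tma_map(emg)
print(m.shape)  # (8, 19)
```

The resulting 2-D array can be fed to a small CNN exactly like a single-channel image.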

    Review on EMG Acquisition and Classification Techniques: Towards Zero Retraining in the Influence of User and Arm Position Independence

    The surface electromyogram (EMG) is widely studied and applied in machine control. Recent methods of classifying hand gestures have reported classification rates of over 95%. However, most of these studies were performed on a single user and focused solely on gesture classification. Such studies are restrictive in a practical sense, addressing only one of gesture classification, multi-user compatibility, or rotation independence. The variations in EMG signals due to these conditions present a challenge to the practical application of EMG devices, often requiring repeated training per application. To the best of our knowledge, there has been little comprehensive review of EMG classification under the combined influence of user independence, rotation, and hand exchange. Therefore, in this paper we present a review of works related to the practical issues of EMG, with a focus on electrode placement and recent acquisition and computing techniques to reduce training. First, we provide an overview of existing electrode placement schemes. Second, we compare the techniques and results of single-subject against multi-subject, multi-position settings. In conclusion, the study of EMG classification in this direction is relatively new. However, the results are encouraging and strongly indicate that EMG classification across a broad range of people, with tolerance towards arm orientation, is possible and can pave the way for more flexible EMG devices.

    Novel Bidirectional Body-Machine Interface to Control Upper Limb Prosthesis

    Objective. The journey of a bionic prosthetic user is characterized by the opportunities and limitations involved in adopting a device (the prosthesis) that should enable activities of daily living (ADL). Within this context, experiencing a bionic hand as a functional (and, possibly, embodied) limb constitutes the premise for mitigating the risk of its abandonment through the continuous use of the device. To achieve such a result, different aspects must be considered for making the artificial limb an effective support for carrying out ADLs. Among them, intuitive and robust control is fundamental to improving amputees’ quality of life using upper limb prostheses. Still, as artificial proprioception is essential to perceive the prosthesis movement without constant visual attention, a good control framework may not be enough to restore practical functionality to the limb. To overcome this, bidirectional communication between the user and the prosthesis has been recently introduced and is a requirement of utmost importance in developing prosthetic hands. Indeed, closing the control loop between the user and a prosthesis by providing artificial sensory feedback is a fundamental step towards the complete restoration of the lost sensory-motor functions. Within my PhD work, I proposed the development of a more controllable and sensitive human-like hand prosthesis, i.e., the Hannes prosthetic hand, to improve its usability and effectiveness. Approach. To achieve the objectives of this thesis work, I developed a modular and scalable software and firmware architecture to control the Hannes prosthetic multi-Degree of Freedom (DoF) system and to fit all users’ needs (hand aperture, wrist rotation, and wrist flexion in different combinations). On top of this, I developed several Pattern Recognition (PR) algorithms to translate electromyographic (EMG) activity into complex movements. 
However, stability and repeatability were still unmet requirements in multi-DoF upper limb systems; hence, I started by investigating different strategies to produce a more robust control. To do this, EMG signals were collected from trans-radial amputees using an array of up to six sensors placed over the skin. Secondly, I developed a vibrotactile system to implement haptic feedback, restoring proprioception and creating a bidirectional connection between the user and the prosthesis. Similarly, I implemented object stiffness detection to restore tactile sensation and connect the user with the external world. This closed-loop control between EMG and vibration feedback is essential to implementing a Bidirectional Body-Machine Interface that can strongly impact amputees' daily lives. For each of these three activities: (i) implementation of robust pattern recognition control algorithms, (ii) restoration of proprioception, and (iii) restoration of the feeling of the grasped object's stiffness, I performed a study in which data from healthy subjects and amputees were collected, in order to demonstrate the efficacy and usability of my implementations. In each study, I evaluated both the algorithms and the subjects' ability to use the prosthesis by means of the F1Score parameter (offline) and the Target Achievement Control (TAC) test (online). With this test, I analyzed the error rate, path efficiency, and time efficiency in completing different tasks. Main results. Among the several tested methods for Pattern Recognition, Non-Linear Logistic Regression (NLR) proved to be the best algorithm in terms of F1Score (99%, robustness), and the minimum number of electrodes needed for its functioning was determined to be 4 in the offline analyses. Further, I demonstrated that its low computational burden allowed its implementation and integration on a microcontroller running at a sampling frequency of 300 Hz (efficiency).
Finally, the online implementation allowed the subject to simultaneously control the Hannes prosthesis DoFs in a bioinspired and human-like way. In addition, I performed further tests with the same NLR-based control by endowing it with closed-loop proprioceptive feedback. In this scenario, the TAC test yielded an error rate of 15% and a path efficiency of 60% in experiments where no other sources of information were available (no visual and no audio feedback). These results demonstrated an improvement in the controllability of the system, with an impact on user experience. Significance. The obtained results confirmed the hypothesis that the robustness and efficiency of prosthetic control can be improved thanks to the implemented closed-loop approach. The bidirectional communication between the user and the prosthesis is capable of restoring lost sensory functionality, with promising implications for direct translation into clinical practice.
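As an illustrative sketch of the pattern-recognition idea (not the thesis implementation), a non-linear logistic regression can be obtained by fitting a plain logistic model on a quadratic feature expansion of per-channel EMG amplitudes; all data below are synthetic and the four "channels" are a stand-in for the four-electrode result:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic RMS features from 4 EMG channels for two gestures
n = 200
g0 = rng.normal(0.2, 0.05, size=(n, 4))   # gesture A: low activation
g1 = rng.normal(0.7, 0.05, size=(n, 4))   # gesture B: high activation
X = np.vstack([g0, g1])
y = np.concatenate([np.zeros(n), np.ones(n)])

def expand(X):
    # Quadratic feature expansion: one simple way to give logistic
    # regression a non-linear decision boundary
    return np.hstack([X, X ** 2])

Phi = expand(X)
w = np.zeros(Phi.shape[1])
b = 0.0
for _ in range(500):                       # plain gradient descent
    p = 1.0 / (1.0 + np.exp(-(Phi @ w + b)))
    w -= 0.5 * (Phi.T @ (p - y) / len(y))
    b -= 0.5 * np.mean(p - y)

pred = (1.0 / (1.0 + np.exp(-(Phi @ w + b))) > 0.5).astype(float)
tp = np.sum((pred == 1) & (y == 1))
fp = np.sum((pred == 1) & (y == 0))
fn = np.sum((pred == 0) & (y == 1))
f1 = 2 * tp / (2 * tp + fp + fn)           # offline F1 score, as in the thesis
print(f"F1 score: {f1:.2f}")
```

The thesis evaluates such classifiers offline by F1Score and online by the TAC test; this sketch reproduces only the offline scoring step.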

    Gaussian process autoregression for simultaneous proportional multi-modal prosthetic control with natural hand kinematics

    Matching the dexterity, versatility, and robustness of the human hand is still an unachieved goal in bionics, robotics, and neural engineering. A major limitation for hand prosthetics lies in the challenges of reliably decoding user intention from muscle signals when controlling complex robotic hands. Most of the commercially available prosthetic hands use muscle-related signals to decode a finite number of predefined motions, and some offer proportional control of open/close movements of the whole hand. Here, in contrast, we aim to offer users flexible control of individual joints of their artificial hand. We propose a novel framework for decoding neural information that enables a user to independently control 11 joints of the hand in a continuous manner, much like we control our natural hands. Toward this end, we instructed six able-bodied subjects to perform everyday object manipulation tasks combining both dynamic, free movements (e.g., grasping) and isometric force tasks (e.g., squeezing). We recorded the electromyographic and mechanomyographic activities of five extrinsic muscles of the hand in the forearm, while simultaneously monitoring 11 joints of the hand and fingers using a sensorized data glove. Instead of learning just a direct mapping from current muscle activity to intended hand movement, we formulated a novel autoregressive approach that combines the context of previous hand movements with instantaneous muscle activity to predict future hand movements. Specifically, we evaluated a linear vector autoregressive moving-average model with exogenous inputs and a novel Gaussian process (GP) autoregressive framework to learn the continuous mapping from hand joint dynamics and muscle activity to intended hand movement. Our GP approach achieves high levels of performance (RMSE of 8°/s and ρ = 0.79).
Crucially, we use a small set of sensors that allows us to control a larger set of independently actuated degrees of freedom of a hand. This novel undersensored control is enabled through a nonlinear autoregressive continuous mapping between muscle activity and joint angles: the system evaluates muscle signals in the context of previous natural hand movements, which resolves ambiguities in situations where the muscle signals alone cannot determine the correct action. GP autoregression is a particularly powerful approach because it not only makes a prediction based on the context but also represents the associated uncertainty of its predictions, thus enabling the novel notion of risk-based control in neuroprosthetics. Our results suggest that GP autoregressive approaches with exogenous inputs lend themselves to natural, intuitive, and continuous control in neurotechnology, with particular focus on prosthetic restoration of natural limb function, where high dexterity is required for complex movements.

    Dynamic Fusion of Electromyographic and Electroencephalographic Data towards Use in Robotic Prosthesis Control

    We demonstrate improved performance in the classification of bioelectric data for use in systems such as robotic prosthesis control, by data fusion using low-cost electromyography (EMG) and electroencephalography (EEG) devices. Prosthetic limbs are typically controlled through EMG, and whilst there is a wealth of research into the use of EEG as part of a brain-computer interface (BCI), the cost of EEG equipment commonly prevents this approach from being adopted outside the lab. This study demonstrates, as a proof of concept, that multimodal classification can be achieved to a high degree of accuracy by using low-cost EMG and EEG devices in tandem with statistical decision-level fusion. We present multiple fusion methods, including ones based on Jensen-Shannon divergence which had not previously been applied to this problem. We report accuracies of up to 99% when merging both signal modalities, improving on the best-case single-mode classification. We hence demonstrate the strengths of combining EMG and EEG in a multimodal classification system that could in future be leveraged as an alternative control mechanism for robotic prostheses.
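The paper's exact fusion rule is not given in the abstract; one plausible Jensen-Shannon-based scheme, sketched below under that assumption, weights each modality's class posterior by its divergence from the uniform distribution (a common confidence proxy):

```python
import numpy as np

def js_divergence(p, q):
    """Jensen-Shannon divergence (base 2) between two discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0
        return np.sum(a[mask] * np.log2(a[mask] / b[mask]))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def fuse(p_emg, p_eeg):
    """Illustrative decision-level fusion: each modality's posterior is
    weighted by its JS divergence from uniform, so a near-uniform (unsure)
    classifier contributes little. This is an assumption, not necessarily
    the rule used in the paper."""
    u = np.full(len(p_emg), 1.0 / len(p_emg))
    w_emg = js_divergence(p_emg, u)
    w_eeg = js_divergence(p_eeg, u)
    fused = w_emg * np.asarray(p_emg) + w_eeg * np.asarray(p_eeg)
    return fused / fused.sum()

# EMG is confident about class 0, EEG is nearly uniform: the fused
# decision follows the confident modality.
p = fuse([0.8, 0.1, 0.1], [0.34, 0.33, 0.33])
print(np.argmax(p), np.round(p, 3))
```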

    Optical myography system for hand gesture and posture recognition

    Advisor: Éric Fujiwara. Master's dissertation, Universidade Estadual de Campinas, Faculdade de Engenharia Mecânica.
In this work, an optical myography system is demonstrated as a promising alternative for monitoring the hand postures and gestures of the user. The technique is based on observing the muscular activities responsible for hand motion with an external camera, and relating the visual deformation observed on the forearm to the muscular contractions/relaxations of a given posture. Three sensor designs were proposed, studied, and evaluated. The first monitored muscular activity by analyzing the spatial-frequency variation of a uniform stripe pattern stamped on the skin, whereas the second counted visible skin pixels inside the region of interest. Both designs proved impracticable due to their low robustness and high demand for controlled experimental conditions. The third design retrieves the hand configuration by visually tracking the displacements of a series of color markers distributed over the forearm. With a 24 fps, 640 × 480 pixel webcam, this design was validated for eight different postures, exploring finger and thumb flexion/extension plus thumb adduction/abduction. The experimental data are acquired offline and then submitted to an image-processing routine that extracts the color and spatial information of the markers in each frame; the extracted data are subsequently used to track the same markers along all frames. To reduce the influence of the natural vibrations inherent to the human body, a local reference frame is also adopted within the region of interest. Finally, the frame-by-frame data, along with the ground-truth posture, are fed into a sequential artificial neural network responsible for supervised sensor calibration and subsequent posture classification. The system performance was evaluated on the eight-posture classification task via 10-fold cross-validation, with the camera monitoring either the underside or the back of the forearm. The sensor presented ~92.4% precision and ~97.9% accuracy for the former, and ~75.1% precision and ~92.5% accuracy for the latter, being comparable to other myographic techniques; this demonstrates that the project is feasible and offers prospects for human-robot interaction applications.
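The marker-extraction step can be sketched as simple colour thresholding plus centroid computation (a single, well-separated marker per colour is assumed here; a real implementation would segment connected blobs in every frame):

```python
import numpy as np

def marker_centroid(img, color, tol=30):
    """Sketch of the marker-tracking step: select pixels within `tol` of a
    reference RGB colour and return their centroid. With well-separated,
    distinctly coloured markers, one centroid per colour suffices."""
    mask = np.all(np.abs(img.astype(int) - np.array(color)) <= tol, axis=-1)
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None
    return (float(xs.mean()), float(ys.mean()))

# Synthetic 100 x 100 frame with one green "marker" centred at (x=30, y=40)
img = np.zeros((100, 100, 3), dtype=np.uint8)
img[35:46, 25:36] = (0, 200, 0)          # rows 35-45, columns 25-35
cx, cy = marker_centroid(img, color=(0, 200, 0))
print(cx, cy)  # 30.0 40.0
```

Repeating this per frame yields the marker trajectories that the dissertation feeds, after conversion to a local reference frame, into the neural-network classifier.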

    Visual tracking of the forearm to predict finger positions

    Prosthetic robotics is one of the most rapidly developing fields of robotics, providing solutions to many people around the world. Hand amputees can greatly benefit from prosthetic arms, performing many daily tasks that would otherwise be impossible. Despite the availability of commercial prosthetic arms today, their high cost makes them unattainable for many people, and the need for cost-effective solutions keeps growing. 3D printing technology has made it possible to obtain a prosthetic arm at lower cost. However, one of the main challenges in this application is the reconstruction of the intended motion of the fingers. A new approach has been developed to predict the intended motion using just a camera and a combination of image processing and machine learning techniques. However, that setup implies a fixed position of the arm, which is not practical. In this project, a more robust setup is designed and tested, as a proof of concept, to allow free motion of the arm. Instead of using the AR tag coordinates relative to the camera frame, the transformation of each tag relative to the other tags is used. LDA, Decision Trees, and SVM are used for classification, and their performance is compared.
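The camera-independence trick can be sketched directly: expressing one AR tag's pose in another tag's frame cancels any rigid camera motion (the poses below are toy 4 × 4 homogeneous transforms, not real tag detections):

```python
import numpy as np

def relative_transform(T_cam_a, T_cam_b):
    """Pose of tag B expressed in tag A's frame: T_ab = T_ca^{-1} @ T_cb.
    This removes the dependence on the camera position, which is the idea
    used to allow free motion of the arm."""
    return np.linalg.inv(T_cam_a) @ T_cam_b

def pose(tx, ty, tz, yaw=0.0):
    """Homogeneous transform from a translation and a yaw rotation."""
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0], [s, c, 0], [0, 0, 1]]
    T[:3, 3] = [tx, ty, tz]
    return T

# Two tags seen by the camera; the camera then moves, shifting both
# observations, but the relative transform stays the same.
T_ca, T_cb = pose(0.1, 0.0, 0.5), pose(0.3, 0.1, 0.5)
cam_shift = pose(0.2, -0.1, 0.0, yaw=0.4)        # rigid camera motion
rel_before = relative_transform(T_ca, T_cb)
rel_after = relative_transform(cam_shift @ T_ca, cam_shift @ T_cb)
print(np.allclose(rel_before, rel_after))  # True
```

Flattening such relative transforms into feature vectors is what the LDA, Decision Tree, and SVM classifiers would then consume.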

    Electrocutaneous stimulation to close the loop in myoelectric prosthesis control

    Current commercially available prosthetic systems still lack sensory feedback, and amputees are forced to maintain eye contact with the prosthesis when interacting with their environment. Electrocutaneous stimulation is a promising approach to convey sensory feedback via the skin. However, when discussed in the context of prosthetic applications, it is often dismissed due to its supposed incompatibility with myocontrol. This dissertation addresses electrocutaneous stimulation as a means of providing sensory feedback to prosthesis users, its implications for myoelectric control, its possible use for improved or accelerated mastering of prosthesis control through closing of the control loop, and its potential for aiding the embodiment of prosthetic components. First, different paradigms for encoding sensory feedback variables in electrocutaneous stimulation patterns were compared. For this, subjects' ability to employ spatially and intensity-coded electrocutaneous feedback in a simulated closed-loop control task was evaluated. The task was to stabilise an invisible virtual inverted pendulum under ideal feedforward control conditions (joystick). Pendulum inclination was either presented spatially (12 stimulation sites), encoded by stimulation strength (≥ 2 stimulation sites), or a combination of the two. The tests indicated that spatial encoding was perceived as more intuitive, but intensity encoding yielded better performance and lower energy expenditure. The second study investigated the detrimental influence of stimulation artefacts on myoelectric control of prostheses for a wide range of stimulation parameters and two prosthesis control approaches (pattern recognition of eight motion primitives, and direct proportional control). Artefact blanking is introduced and discussed as a practical approach to handle stimulation artefacts and restore control performance to the baseline.
This was shown with virtual and applied artefact blanking (pattern recognition on six electromyographic channels), as well as in a practical task-related test with a real prosthesis (proportional control). Another study investigated the information transfer of sensory feedback necessary to master a routine grasping task using electromyographic control of a prosthesis. Subjects controlled a real prosthesis to repeatedly grasp a dummy object, which simulated two different objects with previously unknown slip and fragility properties. Three feedback conditions (basic feedback on grasp success, visual grasp-force feedback, tactile grasp-force feedback) were compared with regard to their influence on subjects' task performance and variability in exerted grasp force. It was found that online force feedback via a visual or tactile channel did not add significant advantages, and that basic feedback was sufficient and was employed by subjects to improve both performance and force variability over time. Importantly, there was no adverse effect of the additional feedback either. This has important implications for other non-functional applications of sensory feedback, such as facilitation of embodiment of prosthetic devices. The final study investigated the impact of electrocutaneous stimulation on embodiment of an artificial limb. For this purpose, a sensor finger was employed in a rubber-hand-illusion-like experiment. Two independent groups (test, control) were compared with regard to two objective measures of embodiment: proprioceptive drift and change in skin temperature. Though the proprioceptive drift measures did not reveal differences between conditions, they indicated trends generally associated with a successful illusion. Additionally, significant differences in skin temperature between the test and control groups indicated that embodiment of the artificial digit could be induced by providing sensory-substitution feedback on the forearm.
In conclusion, it has been shown that humans can employ electrocutaneous stimulation feedback in challenging closed-loop control tasks. It was found that the transition from simple, intuitive encodings (spatial) to those providing better resolution (intensity) further improves feedback exploitation. Blanking and segmentation approaches facilitate the simultaneous application of electrocutaneous stimulation and electromyographic control of prostheses, using both pattern recognition and classic proportional approaches. While force feedback was found not to aid in the mastering of routine grasping, its presence was also found not to impede user performance. This is an important implication for the application of feedback for non-functional purposes, such as facilitation of embodiment. In this regard, it was shown that providing sensory feedback via electrocutaneous stimulation did indeed promote embodiment of an artificial finger, even when the feedback was applied to the forearm. Based on the results of this work, the next step should be the integration of sensory feedback into commercial devices, so that all amputees can benefit from its advantages. Electrocutaneous stimulation has been shown to be an ideal means of realising this; hitherto-existing concerns about the compatibility of electrocutaneous stimulation and myocontrol can be resolved by the presented methods for dealing with stimulation artefacts.
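Artefact blanking can be sketched as follows, assuming the stimulation pulse times are known (as they are when the stimulator and EMG acquisition are synchronised); the window length and signal values below are illustrative, not the dissertation's parameters:

```python
import numpy as np

def blank_artefacts(emg, stim_idx, blank_len=5):
    """Sketch of artefact blanking: samples inside a window after each
    known stimulation pulse are replaced by the last valid sample, so
    downstream EMG features are not corrupted by the artefact."""
    out = emg.copy()
    for s in stim_idx:
        lo, hi = s, min(s + blank_len, len(out))
        fill = out[lo - 1] if lo > 0 else 0.0
        out[lo:hi] = fill
    return out

rng = np.random.default_rng(3)
emg = 0.1 * rng.standard_normal(100)         # baseline EMG, RMS around 0.1
stim = [20, 50, 80]                          # known pulse onsets
corrupted = emg.copy()
for s in stim:
    corrupted[s:s + 5] += 5.0                # large stimulation artefacts

rms_corrupted = np.sqrt(np.mean(corrupted ** 2))
rms_blanked = np.sqrt(np.mean(blank_artefacts(corrupted, stim) ** 2))
print(f"{rms_corrupted:.2f} -> {rms_blanked:.2f}")
```

After blanking, amplitude features (RMS, mean absolute value) return to the uncorrupted baseline, which is what restores myocontrol performance.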