Estimating finger joint angles by surface EMG signal using feature extraction and transformer-based deep learning model

Abstract

Human-machine interfaces frequently rely on electromyography (EMG) signals. Previous work has shown that feature extraction strongly influences the performance of EMG pattern recognition. Deep learning methods, in turn, are expected to improve performance without depending on feature engineering; however, processing raw signals directly demands greater computational effort. This study proposes a new method that combines feature extraction with deep learning to address these issues while improving performance, reducing architecture size, and producing a more representative output. The proposed architecture employs the Transformer model as its backbone to capture correlations between elements and focus on the information most relevant for estimating the flexion-extension angles of the finger joints. This study uses experiment three of the NinaPro (Non-Invasive Adaptive Hand Prosthetics) DB5 dataset. Each experiment provides 16 surface EMG data streams recorded with two Myo Armband devices, with 22 finger joint angles as the output targets. This study compares the windowing process, feature extraction, execution time, and results against previous studies. The results show that the proposed model outperforms the previous study, improving the R-squared score from 0.957 to 0.970. This result is obtained with a window length of 100 data points and Median Frequency as the best-performing feature extraction method.
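To make the pipeline described in the abstract concrete, the sketch below shows one plausible way to compute a Median Frequency feature over 100-sample windows of 16-channel sEMG and feed the per-channel features to a small Transformer encoder that regresses 22 finger joint angles. This is an illustrative sketch, not the authors' code: the sampling rate, model dimensions, and layer counts are assumptions, and only the window length (100 data points), channel count (16), and output size (22 joint angles) come from the abstract.

```python
# Minimal sketch (assumed, not the paper's implementation): Median Frequency
# features over 100-sample windows + a small Transformer encoder regressing
# 22 finger joint angles from 16 sEMG channels.
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import periodogram

FS = 200          # assumed Myo armband sEMG sampling rate (Hz)
WIN = 100         # window length from the abstract (100 data points)
N_CH = 16         # 16 sEMG channels from two Myo Armbands
N_JOINTS = 22     # 22 finger joint angles to estimate


def median_frequency(window: np.ndarray, fs: int = FS) -> float:
    """Frequency that splits the power spectrum into two halves of equal power."""
    freqs, psd = periodogram(window, fs=fs)
    cumulative = np.cumsum(psd)
    return freqs[np.searchsorted(cumulative, cumulative[-1] / 2.0)]


def extract_features(emg: np.ndarray) -> np.ndarray:
    """emg: (n_samples, N_CH) -> (n_windows, N_CH) Median Frequency features."""
    n_windows = emg.shape[0] // WIN
    feats = np.empty((n_windows, N_CH), dtype=np.float32)
    for w in range(n_windows):
        seg = emg[w * WIN:(w + 1) * WIN]
        for ch in range(N_CH):
            feats[w, ch] = median_frequency(seg[:, ch])
    return feats


class JointAngleTransformer(nn.Module):
    """Toy Transformer encoder: each channel's feature is treated as one token."""

    def __init__(self, d_model: int = 64, n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        self.embed = nn.Linear(1, d_model)                  # scalar feature -> token
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model * N_CH, N_JOINTS)     # regress 22 joint angles

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, N_CH) Median Frequency features, one per channel
        tokens = self.embed(x.unsqueeze(-1))                # (batch, N_CH, d_model)
        encoded = self.encoder(tokens)                      # self-attention over channels
        return self.head(encoded.flatten(1))                # (batch, N_JOINTS)


if __name__ == "__main__":
    emg = np.random.randn(1000, N_CH)                       # placeholder sEMG stream
    feats = torch.from_numpy(extract_features(emg))
    angles = JointAngleTransformer()(feats)                 # (n_windows, 22)
    print(angles.shape)
```

Feeding compact per-window features, rather than raw samples, keeps the token sequence short (16 tokens here), which is consistent with the abstract's motivation of reducing computation and architecture size compared with raw-signal models.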

This paper was published in repository civitas UGM.

Having an issue?

Is data on this page outdated, violates copyrights or anything else? Report the problem now and we will take corresponding actions after reviewing your request.