696 research outputs found

    Assessment of a Wearable Force- and Electromyography Device and Comparison of the Related Signals for Myocontrol

    Get PDF
    In the frame of assistive robotics, multi-fingered prosthetic hands/wrists have recently appeared, offering an increasing level of dexterity; however, in practice their control is limited to a few hand grips and is still unreliable, with the effect that pattern recognition has not yet appeared in the clinical environment. According to the scientific community, one of the keys to improving the situation is multi-modal sensing, i.e., using diverse sensor modalities to interpret the subject’s intent and improve the reliability and safety of the control system in daily-life activities. In this work, we first describe and test a novel wireless, wearable force- and electromyography device; through an experiment conducted on ten intact subjects, we then compare the obtained signals both qualitatively and quantitatively, highlighting their advantages and disadvantages. Our results indicate that force-myography yields signals that are more stable over time whenever a pattern is held than those obtained by electromyography. We speculate that fusion of the two modalities might be advantageous for improving the reliability of myocontrol in the near future.
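    The following is an illustrative sketch (not the authors' analysis) of one way such a stability comparison could be quantified: the coefficient of variation of a windowed RMS envelope while a grasp pattern is held, where lower values indicate a steadier signal. The channel count, sampling rate, and window length are assumptions.

# Illustrative sketch: quantify signal stability during a held grasp as the
# coefficient of variation (CoV) of windowed RMS features. All parameters are
# assumptions, not values from the paper above.
import numpy as np

def windowed_rms(signal, fs, win_s=0.1):
    """Split a (samples, channels) recording into windows and return per-window RMS."""
    win = int(win_s * fs)
    n_windows = signal.shape[0] // win
    trimmed = signal[: n_windows * win].reshape(n_windows, win, -1)
    return np.sqrt((trimmed ** 2).mean(axis=1))        # (n_windows, channels)

def stability(signal, fs):
    """Mean coefficient of variation of the RMS envelope across channels (lower = steadier)."""
    rms = windowed_rms(signal, fs)
    return float((rms.std(axis=0) / (rms.mean(axis=0) + 1e-12)).mean())

# Synthetic example: a steady FMG-like pressure signal vs. a noisier sEMG-like signal
rng = np.random.default_rng(0)
fs = 1000                                              # assumed sampling rate, Hz
t = np.arange(5 * fs) / fs
fmg = 1.0 + 0.02 * rng.standard_normal((t.size, 8))
emg = 0.5 * rng.standard_normal((t.size, 8)) * (1 + 0.3 * np.sin(2 * np.pi * 1.5 * t))[:, None]

print("FMG stability (CoV):", stability(fmg, fs))
print("sEMG stability (CoV):", stability(emg, fs))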

    ViT-MDHGR: Cross-day Reliability and Agility in Dynamic Hand Gesture Prediction via HD-sEMG Signal Decoding

    Full text link
    Surface electromyography (sEMG) and high-density sEMG (HD-sEMG) biosignals have been extensively investigated for myoelectric control of prosthetic devices, neurorobotics, and more recently human-computer interfaces because of their capability for hand gesture recognition/prediction in a wearable and non-invasive manner. High intraday (same-day) performance has been reported. However, interday performance (with training and testing on separate days) is substantially degraded due to the poor generalizability of conventional approaches over time, hindering the application of such techniques in real-life practice. There are few recent studies on the feasibility of multi-day hand gesture recognition, and the existing ones face a major challenge: the need for long sEMG epochs makes the corresponding neural interfaces impractical due to the induced delay in myoelectric control. This paper proposes a compact ViT-based network for multi-day dynamic hand gesture prediction. We tackle this challenge by designing the model to rely only on very short HD-sEMG signal windows (i.e., 50 ms, one-sixth of the conventional length for real-time myoelectric implementation), boosting agility and responsiveness. The proposed model can predict 11 dynamic gestures for 20 subjects with an average accuracy of over 71% on the testing day, 3-25 days after training. Moreover, when calibrated on just a small portion of data from the testing day, it can achieve over 92% accuracy by retraining less than 10% of the parameters, keeping calibration computationally efficient.
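    As a hedged sketch of the idea (not the authors' released model), the code below builds a small transformer classifier over a 50 ms HD-sEMG window and shows a calibration step that retrains only a small fraction of the parameters. The electrode count, sampling rate, and layer sizes are assumptions.

# Minimal sketch of a ViT-style classifier on 50 ms HD-sEMG windows, with a
# partial-retraining calibration step. Sizes and sampling rate are assumptions.
import torch
import torch.nn as nn

FS = 2048                      # assumed HD-sEMG sampling rate, Hz
WIN = int(0.050 * FS)          # 50 ms window -> ~102 samples
N_CH = 128                     # assumed 8x16 electrode grid
N_GESTURES = 11

class TinyViTEMG(nn.Module):
    def __init__(self, d_model=64, n_layers=2, n_heads=4):
        super().__init__()
        # Treat each time sample (a 128-channel "frame") as one token.
        self.embed = nn.Linear(N_CH, d_model)
        self.cls = nn.Parameter(torch.zeros(1, 1, d_model))
        self.pos = nn.Parameter(torch.zeros(1, WIN + 1, d_model))
        layer = nn.TransformerEncoderLayer(d_model, n_heads, dim_feedforward=128,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, N_GESTURES)

    def forward(self, x):                      # x: (batch, WIN, N_CH)
        tok = self.embed(x)
        cls = self.cls.expand(x.size(0), -1, -1)
        z = self.encoder(torch.cat([cls, tok], dim=1) + self.pos)
        return self.head(z[:, 0])              # classify from the [CLS] token

model = TinyViTEMG()

# Cross-day calibration sketch: freeze the encoder and retrain only the head
# (a small share of the parameters) on a little data from the new day.
for p in model.parameters():
    p.requires_grad = False
for p in model.head.parameters():
    p.requires_grad = True

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"retraining {trainable / total:.1%} of parameters")

dummy = torch.randn(4, WIN, N_CH)              # four fake 50 ms windows
print(model(dummy).shape)                      # -> torch.Size([4, 11])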

    A Review of Non-Invasive Haptic Feedback Stimulation Techniques for Upper Extremity Prostheses

    Get PDF
    A sense of touch is essential for amputees to reintegrate into their social and work lives. The next generation of prostheses will be designed to effectively convey tactile information between the amputee and the artificial limb. This work reviews non-invasive haptic feedback stimulation techniques for conveying tactile information from the prosthetic hand to the amputee’s brain. The various types of actuators that have been used in previous studies to stimulate the patient’s residual limb for different types of artificial prostheses are reviewed in terms of functionality, effectiveness, wearability and comfort. Non-invasive hybrid feedback stimulation systems were found to yield better stimulus identification rates among users of haptic prostheses. It can be concluded that integrating a hybrid haptic feedback stimulation system with an upper-limb prosthesis improves its acceptance among users.

    A transferable adaptive domain adversarial neural network for virtual reality augmented EMG-based gesture recognition

    Get PDF
    Within the field of electromyography-based (EMG) gesture recognition, disparities exist between the offline accuracy reported in the literature and the real-time usability of a classifier. This gap mainly stems from two factors: 1) the absence of a controller, making the collected data dissimilar to actual control; 2) the difficulty of including the four main dynamic factors (gesture intensity, limb position, electrode shift, and transient changes in the signal), as including their permutations drastically increases the amount of data to be recorded. Conversely, online datasets are limited to the exact EMG-based controller used to record them, necessitating the recording of a new dataset for each control method or variant to be tested. Consequently, this paper proposes a new type of dataset to serve as an intermediate between offline and online datasets, by recording the data using a real-time experimental protocol. The protocol, performed in virtual reality, includes the four main dynamic factors and uses an EMG-independent controller to guide movements. This EMG-independent feedback ensures that the user is in the loop during recording, while enabling the resulting dynamic dataset to be used as an EMG-based benchmark. The dataset comprises 20 able-bodied participants completing three to four sessions over a period of 14 to 21 days. The ability of the dynamic dataset to serve as a benchmark is leveraged to evaluate the impact of different recalibration techniques for long-term (across-day) gesture recognition, including a novel algorithm, named TADANN. TADANN consistently and significantly (p < 0.05) outperforms fine-tuning as the recalibration technique.
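    For context, the sketch below illustrates the general domain-adversarial idea behind such across-day recalibration (a DANN-style gradient reversal layer encouraging day-invariant EMG features); TADANN's specifics are given in the paper, and the feature extractor, sizes, and loss weighting here are illustrative assumptions only.

# Hedged sketch of domain-adversarial recalibration for across-day EMG features.
# Not TADANN itself; all architecture choices below are assumptions.
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; flips (and scales) gradients on the way back."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None

class DannEMG(nn.Module):
    def __init__(self, n_features=32, n_gestures=11, n_days=2):
        super().__init__()
        self.features = nn.Sequential(nn.Linear(n_features, 64), nn.ReLU(),
                                      nn.Linear(64, 64), nn.ReLU())
        self.gesture_head = nn.Linear(64, n_gestures)
        self.day_head = nn.Linear(64, n_days)        # adversarial "which day?" head

    def forward(self, x, lam=1.0):
        z = self.features(x)
        return self.gesture_head(z), self.day_head(GradReverse.apply(z, lam))

model = DannEMG()
ce = nn.CrossEntropyLoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Fake batch: EMG feature vectors from the original day (gesture-labelled) and a
# new day (unlabelled for gestures, but labelled with its day index).
x = torch.randn(16, 32)
gesture_y = torch.randint(0, 11, (8,))               # labels only for the source half
day_y = torch.cat([torch.zeros(8), torch.ones(8)]).long()

gesture_logits, day_logits = model(x, lam=0.5)
loss = ce(gesture_logits[:8], gesture_y) + ce(day_logits, day_y)
loss.backward()                                       # day gradients are reversed into features
opt.step()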