14 research outputs found
Real-Time Hand Gesture Recognition Using Temporal Muscle Activation Maps of Multi-Channel sEMG Signals
Accurate and real-time hand gesture recognition is essential for controlling
advanced hand prostheses. Surface Electromyography (sEMG) signals obtained from
the forearm are widely used for this purpose. Here, we introduce a novel hand
gesture representation called Temporal Muscle Activation (TMA) maps which
captures information about the activation patterns of muscles in the forearm.
Based on these maps, we propose an algorithm that can recognize hand gestures
in real time using a Convolutional Neural Network. The algorithm was tested on 8
healthy subjects with sEMG signals acquired from 8 electrodes placed along the
circumference of the forearm. The average classification accuracy of the
proposed method was 94%, which is comparable to state-of-the-art methods. The
average computation time of a prediction was 5.5 ms, making the algorithm ideal
for real-time gesture recognition applications.
Comment: Paper accepted to IEEE International Conference on Acoustics, Speech,
and Signal Processing (ICASSP) 202
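The TMA-map idea described above can be sketched as follows: rectify each sEMG channel and average it over sliding windows, yielding a channels-by-time activation image that a CNN could classify. This is a minimal sketch under assumed parameters; the window length, step size, and envelope method are illustrative guesses (the abstract does not specify them), and the CNN classifier itself is omitted.

```python
import numpy as np

def tma_map(emg, win=32, step=8):
    """Build a Temporal Muscle Activation (TMA)-style map from
    multi-channel sEMG. Each channel is full-wave rectified, then
    averaged over sliding windows, producing a channels x windows
    activation image. `win` and `step` are illustrative assumptions,
    not the paper's exact values."""
    rect = np.abs(emg)                      # full-wave rectification
    n_ch, n_samp = rect.shape
    starts = range(0, n_samp - win + 1, step)
    # One mean-activation column per window position
    return np.stack([rect[:, s:s + win].mean(axis=1) for s in starts], axis=1)

# Synthetic example: 8 electrodes, 256 samples of noise-like sEMG
rng = np.random.default_rng(0)
emg = rng.normal(size=(8, 256))
m = tma_map(emg)
print(m.shape)  # (8, 29): 8 channels x 29 window positions
```

In a full pipeline, each such map would be fed to a small image classifier (the paper uses a CNN); the map's fixed 2D shape is what makes that step straightforward.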
Face Mediated Human-Robot Interaction for Remote Medical Examination: Associated Data and Code
Repository for experiment data and code
Estimation of Forearm Supination/Pronation Motion Based on EEG Signals to Control an Artificial Arm
Soft Control Interface for Highly Dexterous Unilateral Remote Palpation
Achieving telepresence is the biggest challenge preventing healthcare robotics from permeating the field of primary care and performing tasks such as remote palpation. To do so, the user interface is of fundamental importance: ideally, it would reproduce conditions similar to the in-person task, resulting in intuitive and natural control. Many studies have focused on high-fidelity haptic and tactile feedback, overlooking the importance of the control interface. On the other hand, state-of-the-art soft control interfaces showcase high telepresence but low dexterity, limiting the user's range of motion. In this paper, a soft control interface for highly dexterous remote palpation, with up to 5 degrees of freedom, is presented. The aim of this work is to use morphological computation to intrinsically encode hand trajectories; 3 different designs are showcased to test and optimize the interface's embodied intelligence. The performance of the proposed interface is then analyzed as the controller of a unilateral teleoperated system. The results show that the device is able to correctly encode the hand's motion and reproduce complex trajectories with up to 5 DoF. Compared with keyboard control, it replicates 2D trajectories in a similar fashion while maintaining the intuitiveness and naturalness of palpation.
This work was supported by the SMART project, European Union's Horizon 2020 research and innovation programme under the Marie Sklodowska-Curie grant (ID 860108), and the EPSRC RoboPatient project, EP/T00519X/1 and EP/T00603X/1 (Imperial College London).
Vocal pain expression augmentation for a robopatient
Peer reviewed: True
Abdominal palpation is one of the basic but important physical examination methods used by physicians. Visual, auditory, and haptic feedback from the patient are known to be the main sources of information used in diagnosis. However, learning to interpret this feedback and make accurate diagnoses requires several years of training. Many abdominal palpation training simulators have been proposed to date, but very limited attempts have been reported to integrate vocal pain expressions into physical abdominal palpation simulators. Here, we present vocal pain expression augmentation for a robopatient. The proposed robopatient is capable of providing real-time facial and vocal pain expressions based on the exerted palpation force and position on its abdominal phantom. A pilot study was conducted to test the proposed system, and we show the potential of integrating vocal pain expressions into the robopatient. The platform has also been tested by two clinical experts with prior experience in abdominal palpation; their evaluations of its functionality and suggestions for improvements are presented. We highlight the advantages of the proposed robopatient with real-time vocal and facial pain expressions as a controllable simulator platform for abdominal palpation training studies. Finally, we discuss the limitations of the proposed approach and suggest several future directions for improvement.
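The force-to-expression coupling described in this abstract can be illustrated with a minimal sketch: a palpation force reading is mapped to a discrete pain level that would drive facial and vocal playback. The threshold values, level names, and function shape below are purely hypothetical assumptions for illustration; the robopatient's actual mapping is not specified in the abstract.

```python
def pain_level(force_n, thresholds=(2.0, 5.0, 8.0)):
    """Map a palpation force reading (in newtons) to a discrete pain
    level that could select a facial/vocal expression clip.
    The thresholds and labels are illustrative assumptions only."""
    # Count how many thresholds the applied force meets or exceeds
    level = sum(force_n >= t for t in thresholds)
    labels = ["none", "mild", "moderate", "severe"]
    return labels[level]

print(pain_level(1.0))   # light touch -> "none"
print(pain_level(6.5))   # firm press -> "moderate"
```

A real system would additionally condition the mapping on palpation position (e.g., over a simulated tender region) and smooth the output over time to avoid abrupt expression changes.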
Adaptive Foot in Lower-Limb Prostheses
The human foot consists of complex sets of joints, and its adaptive nature enables it to remain stable on uneven surfaces. Such adaptive capabilities are important in an artificial prosthesis to achieve most of the essential movements for lower-limb amputees; however, many existing lower-limb prostheses lack them. This paper reviews adaptive foot prostheses for the lower limb. To explain their design concepts, the biomechanics of the human foot are first described, and the requirements and design challenges are investigated and presented. In this review, adaptive foot prostheses are classified according to actuation method, and the merits and demerits of present-day devices are presented based on their hardware construction. The hardware configurations of recent adaptive foot prostheses are analyzed and compared. Finally, potential future developments are highlighted.
Differences in pain inference from animated facial expressions based on observers’ individual characteristics
Supporting materials for "Differences in pain inference from animated facial expressions based on observers’ individual characteristics"