
    An EMG Gesture Recognition System with Flexible High-Density Sensors and Brain-Inspired High-Dimensional Classifier

    EMG-based gesture recognition shows promise for human-machine interaction, but systems are often afflicted by signal and electrode variability that degrades performance over time. We present an end-to-end system that combats this variability using a large-area, high-density sensor array and a robust classification algorithm. EMG electrodes are fabricated on a flexible substrate and interfaced to a custom wireless device for 64-channel signal acquisition and streaming. We use brain-inspired high-dimensional (HD) computing to process EMG features with one-shot learning. The HD algorithm is tolerant to noise and electrode misplacement and can quickly learn from a few gestures without gradient descent or back-propagation. We achieve an average classification accuracy of 96.64% for five gestures, with only 7% degradation when training and testing across different days. Our system maintains this accuracy when trained with only three trials of gestures; it also demonstrates accuracy comparable to the state of the art when trained with a single trial.
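    As a rough illustration of how such an HD classifier can work (not the authors' implementation: the dimensionality, random-projection encoding, and feature shapes below are assumptions), class prototypes can be learned in one shot by bundling encoded hypervectors and queries classified by similarity:

```python
# Illustrative hyperdimensional (HD) classifier: random-projection encoding,
# one-shot class prototypes by bundling, and cosine-similarity classification.
# Dimensionality, encoding scheme, and feature shapes are assumptions.
import numpy as np

D = 10_000                                   # hypervector dimensionality
rng = np.random.default_rng(0)

def encode(features, proj):
    """Map a real-valued feature vector to a bipolar hypervector."""
    return np.sign(proj @ features)

def train(X, y, proj):
    """Bundle (sum and binarize) the encoded training examples of each class."""
    return {c: np.sign(sum(encode(x, proj) for x in X[y == c]))
            for c in np.unique(y)}

def classify(x, prototypes, proj):
    """Return the class whose prototype is most similar (cosine) to the query."""
    q = encode(x, proj)
    return max(prototypes,
               key=lambda c: q @ prototypes[c]
               / (np.linalg.norm(q) * np.linalg.norm(prototypes[c]) + 1e-9))

# Toy usage: 5 gestures x 5 trials of 64-channel feature vectors
n_channels = 64
proj = rng.standard_normal((D, n_channels))  # fixed random projection
X = rng.standard_normal((25, n_channels))
y = np.repeat(np.arange(5), 5)
prototypes = train(X, y, proj)
print(classify(X[0], prototypes, proj))
```

    Because training reduces to summing hypervectors, adding a new gesture or retraining after electrode shift only requires re-bundling a few examples, which is what makes this style of classifier attractive for one-shot learning.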

    Intimate interfaces in action: assessing the usability and subtlety of EMG-based motionless gestures

    Mobile communication devices, such as mobile phones and networked personal digital assistants (PDAs), allow users to be constantly connected and to communicate anywhere and at any time, often resulting in personal and private communication taking place in public spaces. This private–public contrast can be problematic. As a remedy, we promote intimate interfaces: interfaces that allow subtle and minimal mobile interaction without disrupting the surrounding environment. In particular, motionless gestures sensed through the electromyographic (EMG) signal have been proposed as a solution for subtle input in a mobile context. In this paper we present an expansion of the work on EMG-based motionless gestures, including (1) a novel study of their usability in a mobile context for controlling a realistic, multimodal interface and (2) a formal assessment of how noticeable they are to informed observers. Experimental results confirm that subtle gestures can be profitably used within a multimodal interface and that it is difficult for observers to guess when someone is performing a gesture, confirming the hypothesis of subtlety.

    Low-cost wearable multichannel surface EMG acquisition for prosthetic hand control

    Prosthetic hand control based on the acquisition and processing of surface electromyography (sEMG) signals is a well-established method that makes use of the electric potentials evoked by the physiological contraction of one or more muscles. Furthermore, intelligent mobile medical devices are on the brink of introducing safe and highly sophisticated systems that help a broad patient community regain a considerable amount of quality of life. The major challenges inherent in the design of such an integrated system are obtaining a compact system with long mobile autonomy, meeting the signal requirements for EMG-based prosthetic control with up to 32 simultaneous acquisition channels, and, with an eye on possible future exploitation as a medical device, keeping the system low priced. According to these requirements, we present a wireless, mobile platform for the acquisition and communication of sEMG signals, embedded in a complete mobile control system. This environment further includes a portable device, such as a laptop, that provides the necessary computational power for the control, and a commercially available robotic hand prosthesis. Communication among these devices is based on the Bluetooth standard. We show that the developed low-cost mobile device can be used for proper prosthesis control and that it can operate continuously throughout a patient's usual daily use.
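    For illustration, a minimal host-side sketch of streaming multichannel sEMG frames over a Bluetooth serial (SPP) link might look as follows; the port name, baud rate, sync byte, and frame layout are assumptions, not the platform's actual protocol:

```python
# Host-side sketch: read 32-channel sEMG frames from a wireless acquisition board
# over a Bluetooth serial (SPP) link using pyserial. The port name, baud rate,
# sync byte (0xA0) and little-endian int16 frame layout are assumptions.
import struct
import serial  # pyserial

N_CHANNELS = 32

def read_frame(port):
    """Resynchronize on the sync byte, then return one frame of int16 samples."""
    while port.read(1) != b"\xA0":
        pass
    payload = port.read(2 * N_CHANNELS)
    if len(payload) != 2 * N_CHANNELS:
        return None                         # timed out mid-frame
    return struct.unpack("<" + "h" * N_CHANNELS, payload)

if __name__ == "__main__":
    # A paired Bluetooth SPP device typically appears as a virtual serial port.
    with serial.Serial("/dev/rfcomm0", baudrate=115200, timeout=1.0) as port:
        for _ in range(1000):
            frame = read_frame(port)
            if frame is not None:
                print(frame[:4])            # first four channels
```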

    Hand Pattern Recognition Using Smart Band

    The importance of gesture recognition has spread widely around the world. Many research strategies have been proposed to study and recognize gestures, especially facial and hand gestures. Distinguishing and recognizing hand gestures is vital in hotspot fields such as bionic parts, powered exoskeletons, and the diagnosis of muscle disorders. Recognizing such gesture patterns can also create a stress-free and appealing user interface for mobile phones, gaming consoles and similar devices. The objective is to design a simple yet efficient wearable hand gesture recognition system. This thesis also shows that taking both EMG and accelerometer data into account improves the system's ability to recognize more patterns with higher accuracy. For this, a hand band embedded with a triple-axis accelerometer and three surface EMG electrodes is employed as the system's signal source. The non-invasive surface EMG electrodes sense muscle activity while the accelerometer senses hand motion. The EMG signal is passed through an analog front-end module for noise filtering and signal amplification. An ARM Cortex processor digitizes the analog EMG and accelerometer signals and transmits them to a PC over Bluetooth. On the receiver side, the raw EMG and acceleration data are processed and decomposed offline using MATLAB tools to extract features such as root mean square, waveform length, threshold crossings, variance and mean. The extracted features are then fed into a multi-class SVM (Support Vector Machine) for pattern recognition. The chapters below discuss the pattern recognition technique and the other modules in greater detail.
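    A minimal sketch of the described feature extraction and multi-class SVM stage is shown below using Python and scikit-learn; the window length, crossing threshold, and array shapes are assumptions, and the original pipeline was implemented offline in MATLAB:

```python
# Sketch of window-level feature extraction (RMS, waveform length, threshold
# crossings, variance, mean) followed by a multi-class SVM, as described above.
# Window length, crossing threshold, and data shapes are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC

def window_features(x, threshold=0.01):
    """Compute the five features for one single-channel signal window."""
    rms = np.sqrt(np.mean(x ** 2))
    waveform_length = np.sum(np.abs(np.diff(x)))
    crossings = np.sum((x[:-1] - threshold) * (x[1:] - threshold) < 0)
    return np.array([rms, waveform_length, crossings, np.var(x), np.mean(x)])

def extract(windows):
    """windows: array of shape (n_windows, n_channels, n_samples)."""
    return np.array([
        np.concatenate([window_features(ch) for ch in w]) for w in windows
    ])

# Toy data: 3 EMG channels + 3 accelerometer axes, 200-sample windows, 4 gestures
rng = np.random.default_rng(0)
X_raw = rng.standard_normal((80, 6, 200))
y = rng.integers(0, 4, size=80)

X = extract(X_raw)
clf = SVC(kernel="rbf", decision_function_shape="ovo")  # multi-class SVM
clf.fit(X, y)
print(clf.predict(X[:5]))
```

    Concatenating the per-channel features gives one fixed-length vector per window, which is the form a multi-class SVM expects.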

    Novel Muscle Monitoring by Radiomyography (RMG) and Application to Hand Gesture Recognition

    Conventional electromyography (EMG) measures the continuous neural activity during muscle contraction but lacks explicit quantification of the actual contraction. Mechanomyography (MMG) and accelerometers only measure body surface motion, while ultrasound, CT scans and MRI are restricted to in-clinic snapshots. Here we propose radiomyography (RMG), a novel method for continuous muscle actuation sensing that can be wearable and touchless, capturing both superficial and deep muscle groups. We verified RMG experimentally with a forearm-worn sensor for detailed hand gesture recognition. We first converted the radio sensing outputs to time-frequency spectrograms and then employed a vision transformer (ViT) deep learning network as the classification model, which recognizes 23 gestures with an average accuracy of up to 99% on 8 subjects. With transfer learning, high adaptability to user differences and sensor variation was achieved, at an average accuracy of up to 97%. We further demonstrated RMG on eye and leg muscles and achieved high accuracy for eye movement and body posture tracking. RMG can be used together with synchronous EMG to derive stimulation-actuation waveforms for many future applications in kinesiology, physiotherapy, rehabilitation, and human-machine interfaces.
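    As an illustrative sketch of the described pipeline (radio sensing output, time-frequency spectrogram, then ViT classification), the following uses scipy and torchvision; the sampling rate, spectrogram parameters, and the ViT-B/16 backbone are assumptions rather than the authors' exact configuration:

```python
# Sketch: convert a radio-sensing time series to a time-frequency spectrogram
# and classify it with a vision transformer (ViT). Sampling rate, spectrogram
# settings, and the torchvision ViT-B/16 backbone are illustrative assumptions.
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import spectrogram
from torchvision.models import vit_b_16

def to_spectrogram_image(signal, fs=2000):
    """Return a (3, 224, 224) tensor suitable for a ViT backbone."""
    _, _, Sxx = spectrogram(signal, fs=fs, nperseg=128, noverlap=96)
    Sxx = np.log1p(Sxx)                                   # compress dynamic range
    img = torch.tensor(Sxx, dtype=torch.float32)[None, None]
    img = nn.functional.interpolate(img, size=(224, 224), mode="bilinear")
    return img[0].repeat(3, 1, 1)                         # replicate to 3 channels

model = vit_b_16(weights=None)                            # or pretrained weights
model.heads.head = nn.Linear(model.heads.head.in_features, 23)  # 23 gesture classes

# Toy forward pass on one synthetic sensing window
x = to_spectrogram_image(np.random.randn(4000)).unsqueeze(0)    # batch of 1
logits = model(x)
print(logits.shape)                                       # torch.Size([1, 23])
```

    Fine-tuning only the classification head on a new user's data is one simple way to realize the transfer-learning adaptation mentioned above.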