An EMG Gesture Recognition System with Flexible High-Density Sensors and Brain-Inspired High-Dimensional Classifier
EMG-based gesture recognition shows promise for human-machine interaction.
Such systems, however, are often afflicted by signal and electrode variability,
which degrades performance over time. We present an end-to-end system that combats this
variability using a large-area, high-density sensor array and a robust
classification algorithm. EMG electrodes are fabricated on a flexible substrate
and interfaced to a custom wireless device for 64-channel signal acquisition
and streaming. We use brain-inspired high-dimensional (HD) computing for
processing EMG features in one-shot learning. The HD algorithm is tolerant to
noise and electrode misplacement and can quickly learn from few gestures
without gradient descent or back-propagation. We achieve an average
classification accuracy of 96.64% for five gestures, with only 7% degradation
when training and testing across different days. Our system maintains this
accuracy when trained with only three trials per gesture; it also achieves
accuracy comparable to the state of the art when trained with a single trial.
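The abstract's key mechanism can be illustrated compactly. Below is a minimal sketch of hyperdimensional (HD) classification with bipolar hypervectors: channel and level item memories, channel-level binding, bundling into class prototypes, and similarity-based inference. The encoding scheme, dimensionality, and quantization here are illustrative assumptions, not the paper's exact pipeline.

```python
# Hedged sketch of brain-inspired HD classification (assumed encoding,
# not the paper's exact one): bind each channel with its quantized
# feature level, bundle across channels, and compare by dot product.
import numpy as np

rng = np.random.default_rng(0)
D = 10000          # hypervector dimensionality (typical HD choice)
N_CH = 64          # channels, matching the 64-channel array
LEVELS = 16        # quantization levels for a feature value (assumption)

# Random bipolar "item memory": one hypervector per channel and per level.
channel_hv = rng.choice([-1, 1], size=(N_CH, D))
level_hv = rng.choice([-1, 1], size=(LEVELS, D))

def encode(sample):
    """Encode one 64-channel feature vector (values in [0, 1])."""
    idx = np.clip((sample * LEVELS).astype(int), 0, LEVELS - 1)
    # Bind each channel with its level HV, then bundle (sum + sign).
    bound = channel_hv * level_hv[idx]
    return np.sign(bound.sum(axis=0))

def train(samples_by_class):
    """Few-shot learning: a class prototype is the bundled encodings;
    no gradient descent or back-propagation is involved."""
    return {c: np.sign(sum(encode(s) for s in ss))
            for c, ss in samples_by_class.items()}

def classify(prototypes, sample):
    """Return the class whose prototype is most similar to the query."""
    q = encode(sample)
    return max(prototypes, key=lambda c: prototypes[c] @ q)
```

Because learning is just bundling, a single trial per gesture already yields a usable prototype, which is the property the abstract highlights.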
Methods for enhanced learning using wearable technologies. A study of the maritime sector
Maritime safety is a critical concern: navigation errors or unsafe acts can have serious consequences for the crew, passengers, environment, and assets. Traditional training methods face challenges in the rapidly evolving maritime industry, and innovative training methods are being explored. This study explores the use of wearable sensors with biosignal data collection to improve training performance in the maritime sector. Three experiments were conducted progressively to investigate (1) the relationship between navigators' experience levels and their biosignal data, (2) the effects of different training methods on cognitive workload, trainees' stress levels, and their decision-making skills, and (3) the classification of scenario complexity from the biosignal data obtained from the trainees. The collected data comprised questionnaire data on stress levels, workload, and user satisfaction with auxiliary training equipment; performance evaluation data on navigational abilities, decision-making skills, and ship-handling abilities; and biosignal data, including electrodermal activity (EDA), body temperature, blood volume pulse (BVP), inter-beat interval (IBI), and heart rate (HR). Several statistical methods and machine-learning algorithms were used in the data analysis.
The present dissertation contributes to the advancement of the field of maritime education and training by exploring methods for enhancing learning in complex situations. The use of biosignal data provides insights into the interplay between stress levels and training outcomes in the maritime industry. The proposed conceptual training model underscores the relationship between trainees' stress and safety factors and offers a framework for the development and evaluation of advanced biosignal-data-based training systems.
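The third experiment, classifying scenario complexity from trainees' biosignals, can be pictured with a toy baseline. The dissertation does not name its algorithms in this abstract, so a nearest-centroid classifier over z-scored per-trial summary features (mean EDA, mean HR, mean IBI) stands in here as an assumed, illustrative choice.

```python
# Hedged sketch: classify scenario complexity ("low"/"high") from
# per-trial biosignal summaries. The feature set and the nearest-centroid
# classifier are assumptions; the dissertation's actual methods may differ.
import numpy as np

def zscore(X, mean, std):
    return (X - mean) / std

def fit(X, y):
    """X: (n_trials, n_features) biosignal summaries; y: complexity labels."""
    mean, std = X.mean(axis=0), X.std(axis=0) + 1e-9
    Xz = zscore(X, mean, std)
    # One centroid per complexity class in normalized feature space.
    centroids = {c: Xz[y == c].mean(axis=0) for c in np.unique(y)}
    return mean, std, centroids

def predict(model, x):
    mean, std, centroids = model
    xz = zscore(x, mean, std)
    return min(centroids, key=lambda c: np.linalg.norm(centroids[c] - xz))
```

Z-scoring matters here because EDA (microsiemens), HR (beats per minute), and IBI (milliseconds) live on very different scales; without it, the distance would be dominated by IBI.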
Affective Man-Machine Interface: Unveiling human emotions through biosignals
As has been known for centuries, humans exhibit an electrical profile. This profile is altered through various psychological and physiological processes, which can be measured through biosignals, e.g., electromyography (EMG) and electrodermal activity (EDA). These biosignals can reveal our emotions and, as such, can serve as an advanced man-machine interface (MMI) for empathic consumer products. However, such an MMI requires the correct classification of biosignals into emotion classes. This chapter starts with an introduction to biosignals for emotion detection. Next, a state-of-the-art review of automatic emotion classification is presented, along with guidelines for affective MMI. Subsequently, a study is presented that explores the use of EDA and three facial EMG signals to determine neutral, positive, negative, and mixed emotions, using recordings of 21 people. A range of techniques is tested, resulting in a generic framework for automated emotion classification with up to 61.31% correct classification of the four emotion classes, without the need for personal profiles. Among various other directives for future research, the results emphasize the need for parallel processing of multiple biosignals.
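The parallel processing the chapter's results call for is essentially feature-level fusion: each biosignal is summarized independently and the summaries are concatenated into one feature vector. The sketch below assumes three common facial EMG sites (frontalis, corrugator, zygomaticus), a simple per-channel feature set, and a 1-nearest-neighbor classifier; none of these specifics come from the chapter itself.

```python
# Hedged sketch of feature-level fusion across parallel biosignals.
# Muscle sites, features, and the 1-NN classifier are illustrative
# assumptions, not the chapter's exact framework.
import numpy as np

def summarize(signal):
    """Per-channel summary features over one recording window:
    mean level, variability, and mean absolute first difference."""
    return np.array([signal.mean(), signal.std(), np.abs(np.diff(signal)).mean()])

def fuse(eda, emg_frontalis, emg_corrugator, emg_zygomaticus):
    """Process each biosignal in parallel, then concatenate the features."""
    return np.concatenate([summarize(s) for s in
                           (eda, emg_frontalis, emg_corrugator, emg_zygomaticus)])

def predict_1nn(train_X, train_y, x):
    """Assign the emotion label of the nearest fused training vector."""
    d = np.linalg.norm(train_X - x, axis=1)
    return train_y[int(np.argmin(d))]
```

The design choice being illustrated is that no single channel decides the label: the classifier sees EDA and all EMG channels jointly, which is what "parallel processing of multiple biosignals" buys over per-signal voting.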
- …