
    A first approach to a taxonomy-based classification framework for hand grasps

    Many solutions have been proposed to help amputees regain lost functionality. In order to interact with the outer world and the objects that populate it, it is crucial for these subjects to be able to perform essential grasps. In this paper we propose a preliminary solution for the online classification of 8 basic hand grasps from physiological signals, namely surface electromyography (sEMG), exploiting a quantitative taxonomy of the considered movements. The hierarchical organization of the taxonomy allows the classification phase to be decomposed into decisions between pairs of movement groups. The idea is that the closer a decision is to the root, the harder the classification, but at the same time a misclassification error is less problematic, since the two movements are close to each other. The proposed solution is subject-independent: signals from many different subjects are used by the probabilistic framework to model the input signals. The information is modeled offline using a Gaussian Mixture Model (GMM) and then tested online on an unseen subject using a Gaussian-based classification. To process the signal online, an accurate preprocessing phase is needed; in particular, we apply the Wavelet Transform to the electromyography (EMG) signal. Thanks to this approach we are able to develop a robust and general solution, which can adapt quickly to new subjects with no need for a long and draining training phase. In this preliminary study we reached a mean accuracy of 76.5%, reaching up to 97.29% at the higher levels of the taxonomy.
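
    As an illustration of the kind of pipeline described above, the following is a minimal sketch, assuming wavelet sub-band energies as features and one GMM per grasp class; the wavelet family, decomposition level, number of mixture components, and the flat (non-hierarchical) decision are illustrative assumptions, not the paper's actual configuration.

```python
# Minimal sketch (not the paper's implementation): wavelet-based sEMG features
# classified by per-class Gaussian Mixture Models.
import numpy as np
import pywt
from sklearn.mixture import GaussianMixture

def wavelet_features(window, wavelet="db4", level=3):
    """Energy of each wavelet sub-band of one single-channel sEMG window."""
    coeffs = pywt.wavedec(window, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])

def fit_grasp_models(windows_by_class, n_components=3):
    """Offline phase: fit one GMM per grasp class on data from training subjects."""
    models = {}
    for label, windows in windows_by_class.items():
        feats = np.vstack([wavelet_features(w) for w in windows])
        models[label] = GaussianMixture(n_components=n_components,
                                        covariance_type="full").fit(feats)
    return models

def classify_window(window, models):
    """Online phase: assign the grasp whose GMM yields the highest log-likelihood."""
    feat = wavelet_features(window).reshape(1, -1)
    return max(models, key=lambda lbl: models[lbl].score(feat))
```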

    Bio-signal based control in assistive robots: a survey

    Recently, bio-signal based control has been gradually deployed in biomedical devices and assistive robots to improve the quality of life of disabled and elderly people; among these, electromyography (EMG) and electroencephalography (EEG) bio-signals are the most widely used. This paper reviews the deployment of these bio-signals in state-of-the-art control systems. The main aim of this paper is to describe the techniques used for (i) collecting EMG and EEG signals and dividing them into segments (data acquisition and data segmentation stage), (ii) extracting the important information and removing redundant data from the EMG and EEG segments (feature extraction stage), and (iii) identifying categories from the relevant data obtained in the previous stage (classification stage). Furthermore, this paper presents a summary of applications controlled through these two bio-signals and some research challenges in the creation of such control systems. Finally, a brief conclusion is provided.
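
    To make the data segmentation stage mentioned above concrete, here is a minimal sketch of sliding-window segmentation of a multichannel recording; the 200 ms window, 50% overlap, and 1 kHz sampling rate are common choices in the EMG literature used purely as assumptions.

```python
# Minimal sketch of the data segmentation stage: split a continuous multichannel
# recording into overlapping analysis windows. Window length, overlap and sampling
# rate below are illustrative assumptions.
import numpy as np

def segment(signal, fs, win_ms=200, overlap=0.5):
    """signal: (n_samples, n_channels) array; returns (n_windows, win_len, n_channels)."""
    win_len = int(fs * win_ms / 1000)
    step = int(win_len * (1 - overlap))
    starts = range(0, signal.shape[0] - win_len + 1, step)
    return np.stack([signal[s:s + win_len] for s in starts])

# Example: 10 s of 8-channel EMG sampled at 1 kHz -> 200-sample windows, 100-sample step
emg = np.random.randn(10_000, 8)
windows = segment(emg, fs=1000)
print(windows.shape)  # (99, 200, 8)
```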

    Feature Analysis for Classification of Physical Actions using surface EMG Data

    Based on recent health statistics, several thousands of people have limb disabilities and gait disorders that require medical assistance. Robot-assisted rehabilitation therapy can help them recover and return to a normal life. In this scenario, a successful methodology is to use EMG-signal-based information to control the supporting robotics. For this mechanism to function properly, the EMG signal from the muscles has to be sensed, the biological motor intention has to be decoded, and the resulting information has to be communicated to the controller of the robot. Accurate detection of the motor intention requires pattern-recognition-based categorical identification. Hence, in this paper we propose an improved classification framework by identifying the relevant features that drive the pattern recognition algorithm. Major contributions include a set of modified spectral-moment-based features and a relevant inter-channel correlation feature that contribute to improved classification performance. Next, we conduct a sensitivity analysis of the classification algorithm with respect to the different EMG channels. Finally, the classifier performance is compared to that of other state-of-the-art algorithms.
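
    The exact modified spectral moments are not given in the abstract, so the following is only a sketch of the general idea, combining classical spectral moments per channel with pairwise inter-channel correlations; the moment orders and feature layout are assumptions.

```python
# Minimal sketch, not the paper's exact feature set: classical spectral moments of
# each EMG channel plus pairwise inter-channel correlation as candidate features.
import numpy as np

def spectral_moments(window, fs, orders=(0, 1, 2)):
    """Moments of the power spectrum of one single-channel window."""
    psd = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1 / fs)
    return np.array([np.sum((freqs ** k) * psd) for k in orders])

def interchannel_correlation(window_mc):
    """Upper-triangular Pearson correlations between channels of one window."""
    corr = np.corrcoef(window_mc.T)              # (n_channels, n_channels)
    iu = np.triu_indices_from(corr, k=1)
    return corr[iu]

def feature_vector(window_mc, fs):
    """Concatenate per-channel spectral moments with inter-channel correlations."""
    moments = np.concatenate([spectral_moments(ch, fs) for ch in window_mc.T])
    return np.concatenate([moments, interchannel_correlation(window_mc)])
```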

    CES-513 Stages for Developing Control Systems using EMG and EEG Signals: A survey

    Bio-signals such as EMG (electromyography), EEG (electroencephalography), EOG (electrooculography), and ECG (electrocardiography) have recently been deployed to develop control systems that improve the quality of life of disabled and elderly people. This technical report reviews the current deployment of these state-of-the-art control systems and explains some challenging issues. In particular, the stages for developing EMG- and EEG-based control systems are categorized, namely data acquisition, data segmentation, feature extraction, classification, and controller. Some related bio-control applications are outlined. Finally, a brief conclusion is provided.
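
    As an illustration of the feature extraction and classification stages listed above, here is a minimal sketch using common time-domain EMG features and an off-the-shelf linear discriminant classifier; the feature set and classifier are illustrative assumptions, not choices prescribed by the report.

```python
# Minimal sketch of the feature extraction and classification stages; the
# time-domain features and LDA classifier are illustrative assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def td_features(window):
    """Mean absolute value, waveform length and zero-crossing count per channel.
    window: (win_len, n_channels) array."""
    mav = np.mean(np.abs(window), axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    zc = np.sum((window[:-1] * window[1:]) < 0, axis=0)
    return np.concatenate([mav, wl, zc])

def train_classifier(windows, labels):
    """windows: (n_windows, win_len, n_channels); labels: (n_windows,)."""
    X = np.vstack([td_features(w) for w in windows])
    return LinearDiscriminantAnalysis().fit(X, labels)
```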

    Analysis of Human Gait Using Hybrid EEG-fNIRS-Based BCI System: A Review

    Human gait is a complex activity that requires high coordination between the central nervous system, the limbs, and the musculoskeletal system. More research is needed to understand this coordination's complexity in order to design better and more effective rehabilitation strategies for gait disorders. Electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS) are among the most used technologies for monitoring brain activity owing to their portability, non-invasiveness, and relatively low cost compared to others. Fusing EEG and fNIRS is a well-established methodology proven to enhance brain–computer interface (BCI) performance in terms of classification accuracy, number of control commands, and response time. Although there has been significant research exploring hybrid BCIs (hBCIs) involving both EEG and fNIRS for different types of tasks and human activities, human gait remains underinvestigated. In this article, we aim to shed light on recent developments in the analysis of human gait using a hybrid EEG-fNIRS-based BCI system. This review followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines during the data collection and selection phase. We put a particular focus on the commonly used signal processing and machine learning algorithms, and survey the potential applications of gait analysis. We distill some of the critical findings of this survey as follows. First, hardware specifications and experimental paradigms should be carefully considered because of their direct impact on the quality of gait assessment. Second, since both modalities, EEG and fNIRS, are sensitive to motion artifacts and to instrumental and physiological noise, there is a need for more robust and sophisticated signal processing algorithms. Third, hybrid temporal and spatial features, obtained by fusing EEG and fNIRS and associated with cortical activation, can help better identify the correlation between brain activation and gait. In conclusion, the hBCI (EEG + fNIRS) approach is not yet much explored for the lower limb because of its complexity compared to the upper limb. Existing BCI systems for gait monitoring tend to focus on only one modality. We foresee a vast potential in adopting hBCIs in gait analysis. Imminent technical breakthroughs are expected in using hybrid EEG-fNIRS-based BCIs for gait to control assistive devices and to monitor neuroplasticity in neurorehabilitation. However, although these hybrid systems perform well in a controlled experimental environment, there is still a long way to go before they can be adopted as certified medical devices in real-life clinical applications.
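
    As a concrete illustration of the hybrid temporal and spatial feature fusion discussed above, the following is a minimal sketch that concatenates EEG band power with simple fNIRS HbO statistics; the frequency band, the statistics, and the classifier-agnostic layout are assumptions rather than the review's recommendations.

```python
# Minimal sketch of a hybrid feature-fusion step: EEG band power concatenated with
# fNIRS HbO mean and slope per channel. Band limits and statistics are assumptions.
import numpy as np
from scipy.signal import welch

def eeg_band_power(eeg_win, fs, band=(8, 30)):
    """Average power in a frequency band for each EEG channel.
    eeg_win: (n_samples, n_channels) array."""
    freqs, psd = welch(eeg_win, fs=fs, axis=0, nperseg=min(256, len(eeg_win)))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean(axis=0)

def fnirs_features(hbo_win):
    """Mean and linear slope of oxygenated haemoglobin for each fNIRS channel."""
    t = np.arange(len(hbo_win))
    slope = np.polyfit(t, hbo_win, 1)[0]          # slope per channel
    return np.concatenate([hbo_win.mean(axis=0), slope])

def fuse(eeg_win, hbo_win, fs_eeg):
    """Hybrid feature vector for one synchronized EEG/fNIRS window."""
    return np.concatenate([eeg_band_power(eeg_win, fs_eeg), fnirs_features(hbo_win)])
```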

    Fast human motion prediction for human-robot collaboration with wearable interfaces

    In this paper, we aim at improving human motion prediction during human-robot collaboration in industrial facilities by exploiting contributions from both physical and physiological signals. Improved human-machine collaboration could prove useful in several areas, and it is crucial for interacting robots to understand human movement as soon as possible to avoid accidents and injuries. In this perspective, we propose a novel human-robot interface capable of anticipating the user's intention while performing reaching movements on a working bench in order to plan the action of a collaborative robot. The proposed interface can find many applications in the Industry 4.0 framework, where autonomous and collaborative robots will be an essential part of innovative facilities. Motion intention prediction and motion direction prediction levels have been developed to improve detection speed and accuracy. A Gaussian Mixture Model (GMM) has been trained with IMU and EMG data following an evidence accumulation approach to predict the reaching direction. Novel dynamic stopping criteria have been proposed to flexibly adjust the trade-off between early anticipation and accuracy according to the application. The outputs of the two predictors are used as external inputs to a Finite State Machine (FSM) that controls the behaviour of a physical robot according to the user's action or inaction. Results show that our system outperforms previous methods, achieving a real-time classification accuracy of 94.3 ± 2.9% within 160.0 ± 80.0 ms from movement onset.
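
    The abstract does not detail the stopping rule, so the following is only a minimal sketch of the evidence accumulation idea: per-window GMM log-likelihoods are summed per direction and a decision is emitted once the leading hypothesis exceeds a confidence threshold. The softmax posterior and the 0.95 threshold are assumptions, not the paper's criteria.

```python
# Minimal sketch of evidence accumulation with a simple dynamic stopping rule.
import numpy as np

class EvidenceAccumulator:
    def __init__(self, models, threshold=0.95):
        self.models = models            # dict: direction label -> fitted GaussianMixture
        self.threshold = threshold      # assumed confidence level for stopping
        self.loglik = {d: 0.0 for d in models}

    def update(self, feature_window):
        """Add one window of evidence; return a direction once confident, else None."""
        x = feature_window.reshape(1, -1)
        for d, gmm in self.models.items():
            self.loglik[d] += gmm.score(x)          # accumulate log-likelihood
        vals = np.array(list(self.loglik.values()))
        post = np.exp(vals - vals.max())            # softmax approximates a posterior
        post /= post.sum()
        best = int(np.argmax(post))
        if post[best] >= self.threshold:
            return list(self.loglik.keys())[best]   # early decision
        return None                                 # keep accumulating evidence
```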

    BCI controlled robotic arm as assistance to the rehabilitation of neurologically disabled patients

    This presentation summarises the development of a portable and cost-efficient BCI-controlled assistive technology using a non-invasive BCI headset, 'OpenBCI', and an open-source robotic arm, U-Arm, to accomplish tasks related to rehabilitation, such as access to resources, adaptability and home use. The resulting system used a combination of EEG and EMG sensor readings to control the arm, which could perform a number of different tasks such as picking/placing objects or assisting users in eating.
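
    The presentation does not specify how the EEG and EMG readings are combined, so the following is a purely hypothetical sketch of one possible rule: an EEG classifier proposes a high-level command and an EMG envelope threshold acts as a confirmation trigger. The command names and the threshold value are invented for illustration only.

```python
# Hypothetical sketch of a hybrid EEG+EMG control rule (not the described system).
import numpy as np

COMMANDS = ["pick_object", "place_object", "assist_feeding"]  # hypothetical command set

def emg_envelope(emg_win):
    """Rectified-and-averaged EMG used as a simple muscle-activation measure."""
    return float(np.mean(np.abs(emg_win)))

def arm_command(eeg_class_idx, emg_win, threshold=0.3):
    """Issue the EEG-selected command only when EMG activation exceeds the threshold."""
    if emg_envelope(emg_win) >= threshold:
        return COMMANDS[eeg_class_idx]
    return None  # stay idle until the EMG confirmation gesture is detected
```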

    Support vector machines to detect physiological patterns for EEG and EMG-based human-computer interaction: a review

    Support vector machines (SVMs) are widely used classifiers for detecting physiological patterns in human-computer interaction (HCI). Their success is due to their versatility, robustness and the wide availability of free dedicated toolboxes. Frequently in the literature, insufficient details about the SVM implementation and/or parameter selection are reported, making it impossible to reproduce the study's analysis and results. In order to perform an optimized classification and report a proper description of the results, a comprehensive critical overview of the applications of SVMs is necessary. The aim of this paper is to provide a review of the usage of SVMs in the determination of brain and muscle patterns for HCI, focusing on electroencephalography (EEG) and electromyography (EMG) techniques. In particular, an overview of the basic principles of SVM theory is outlined, together with a description of several relevant literature implementations. Furthermore, details concerning the reviewed papers are listed in tables, and statistics on SVM use in the literature are presented. The suitability of SVMs for HCI is discussed and critical comparisons with other classifiers are reported.
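
    Since the review highlights that SVM implementation details and parameter selection are often under-reported, here is a minimal sketch of a reproducible setup whose chosen parameters can be stated explicitly alongside the results; the RBF kernel and the grid values are illustrative assumptions.

```python
# Minimal sketch of a reportable SVM setup for EEG/EMG feature vectors: kernel, C and
# gamma are selected by cross-validation so they can be stated explicitly in a paper.
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def train_reportable_svm(X, y):
    """X: (n_windows, n_features); y: (n_windows,). Returns the fitted grid search."""
    pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    grid = {"svc__C": [0.1, 1, 10, 100], "svc__gamma": ["scale", 0.01, 0.001]}
    search = GridSearchCV(pipe, grid, cv=5)
    search.fit(X, y)
    print("best parameters:", search.best_params_)  # report these with the results
    return search
```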

    Biosignal‐based human–machine interfaces for assistance and rehabilitation: a survey

    As a definition, a Human–Machine Interface (HMI) enables a person to interact with a device. Starting from elementary equipment, the recent development of novel techniques and unobtrusive devices for biosignal monitoring paved the way for a new class of HMIs, which take such biosignals as inputs to control various applications. The current survey aims to review the large literature of the last two decades regarding biosignal‐based HMIs for assistance and rehabilitation, to outline the state of the art, and to identify emerging technologies and potential future research trends. PubMed and other databases were surveyed using specific keywords. The retrieved studies were further screened at three levels (title, abstract, full text), and eventually 144 journal papers and 37 conference papers were included. Four macrocategories were considered to classify the different biosignals used for HMI control: biopotential, muscle mechanical motion, body motion, and their combinations (hybrid systems). The HMIs were also classified according to their target application by considering six categories: prosthetic control, robotic control, virtual reality control, gesture recognition, communication, and smart environment control. An ever-growing number of publications has been observed over recent years. Most of the studies (about 67%) pertain to the assistive field, while 20% relate to rehabilitation and 13% to both assistance and rehabilitation. A moderate increase can be observed in studies focusing on robotic control, prosthetic control, and gesture recognition in the last decade. In contrast, studies on the other targets experienced only a small increase. Biopotentials are no longer the leading control signals, and the use of muscle mechanical motion signals has experienced a considerable rise, especially in prosthetic control. Hybrid technologies are promising, as they could lead to higher performances. However, they also increase HMIs' complexity, so their usefulness should be carefully evaluated for the specific application.