
    Early Predictability of Grasping Movements by Neurofunctional Representations: A Feasibility Study

    Human grasping is a relatively fast process, and control signals for upper limb prosthetics cannot be generated and processed in a sufficiently timely manner. The aim of this study was to examine whether discriminating between different grasping movements at a cortical level can provide information prior to the actual grasping process, allowing for more intuitive prosthetic control. EEG datasets were captured from 13 healthy subjects who repeatedly performed 16 activities of daily living. Common classifiers were trained on features extracted from the waking-state frequency and total-frequency time domains. Different training scenarios were used to investigate whether classifiers can be pre-trained as base networks and then fine-tuned with data from a target person. A support vector machine algorithm with spatial covariance matrices as EEG signal descriptors based on Riemannian geometry showed the highest balanced accuracy (0.91 ± 0.05 SD) in discriminating five grasping categories according to the Cutkosky taxonomy in an interval from 1.0 s before to 0.5 s after the initial movement. Fine-tuning did not improve any classifier. No significant accuracy differences between the two frequency domains were apparent (p > 0.07). Neurofunctional representations enabled highly accurate discrimination of five different grasping movements. Our results indicate that, for upper limb prosthetics, it is possible to use them in a sufficiently timely manner and to predict the respective grasping task as a discrete category so as to kinematically prepare the prosthetic hand.
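
The abstract does not spell out the Riemannian pipeline it reports; the following is a minimal sketch of that kind of classifier using pyriemann, assuming epoched EEG of shape (trials, channels, samples). The data, epoch window, and hyperparameters are placeholders, not values from the study.

```python
# Sketch of a Riemannian-geometry grasp classifier: spatial covariance matrices
# as EEG descriptors, projected to the tangent space and fed to an SVM.
import numpy as np
from pyriemann.estimation import Covariances
from pyriemann.tangentspace import TangentSpace
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Hypothetical epoched EEG: (n_trials, n_channels, n_samples),
# e.g. 1.0 s before to 0.5 s after movement onset at 256 Hz.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 32, 384))
y = rng.integers(0, 5, size=200)          # five Cutkosky grasp categories

clf = make_pipeline(
    Covariances(estimator="oas"),         # spatial covariance per trial
    TangentSpace(metric="riemann"),       # map SPD matrices to the tangent space
    SVC(kernel="linear", C=1.0),
)

scores = cross_val_score(clf, X, y, cv=5, scoring="balanced_accuracy")
print(f"balanced accuracy: {scores.mean():.2f} ± {scores.std():.2f}")
```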

    Upper Limb Movement Recognition utilising EEG and EMG Signals for Rehabilitative Robotics

    Upper limb movement classification, which maps input signals to target activities, is a key building block in the control of rehabilitative robotics. Classifiers are trained so that the rehabilitative system can infer the intentions of patients whose upper limbs do not function properly. Electromyography (EMG) and electroencephalography (EEG) signals are widely used for upper limb movement classification. By analysing the classification results of real-time EEG and EMG signals, the system can understand the intention of the user, predict the action they would like to carry out, and provide external help accordingly. However, noise introduced during real-time EEG and EMG data collection degrades the quality of the data, which undermines classification performance. Moreover, not all patients produce strong EMG signals, owing to muscle damage or neuromuscular disorders. To address these issues, this paper explores different feature extraction techniques together with machine learning and deep learning models for EEG and EMG signal classification, and proposes a novel decision-level multisensor fusion technique to integrate EEG with EMG signals. The system retrieves useful information from both sources to understand and predict the intention of the user and thus provide assistance. By testing the proposed technique on the publicly available WAY-EEG-GAL dataset, which contains simultaneously recorded EEG and EMG signals, we demonstrate the feasibility and effectiveness of the novel system. Comment: 20 pages, 11 figures, 2 tables; Thesis for Undergraduate Research Project in Computing, NUS; Accepted by Future of Information and Communication Conference 2023, San Francisco.
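
The abstract does not specify the fusion rule, so the sketch below shows generic decision-level fusion only: one classifier per modality, with class probabilities combined by a weighted average. The classifiers, weights, and feature matrices are illustrative assumptions, not the paper's method.

```python
# Minimal sketch of decision-level EEG/EMG fusion.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

def fuse_decisions(p_eeg, p_emg, w_eeg=0.5):
    """Weighted average of per-modality class probabilities."""
    return w_eeg * p_eeg + (1.0 - w_eeg) * p_emg

# Hypothetical feature matrices extracted from synchronized EEG and EMG windows.
rng = np.random.default_rng(0)
X_eeg, X_emg = rng.standard_normal((300, 64)), rng.standard_normal((300, 16))
y = rng.integers(0, 4, size=300)

eeg_clf = LogisticRegression(max_iter=1000).fit(X_eeg[:250], y[:250])
emg_clf = RandomForestClassifier(n_estimators=200).fit(X_emg[:250], y[:250])

p = fuse_decisions(eeg_clf.predict_proba(X_eeg[250:]),
                   emg_clf.predict_proba(X_emg[250:]),
                   w_eeg=0.6)              # weight EEG more when EMG is weak
y_pred = p.argmax(axis=1)
print("fused accuracy:", (y_pred == y[250:]).mean())
```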

    Intelligent Biosignal Processing in Wearable and Implantable Sensors

    This reprint provides a collection of papers illustrating the state of the art of smart processing of data coming from wearable, implantable or portable sensors. Each paper presents the design, databases used, methodological background, obtained results, and their interpretation for biomedical applications. Revealing examples are brain–machine interfaces for medical rehabilitation, the evaluation of sympathetic nerve activity, a novel automated diagnostic tool based on ECG data to diagnose COVID-19, machine learning-based hypertension risk assessment by means of photoplethysmography and electrocardiography signals, Parkinsonian gait assessment using machine learning tools, thorough analysis of compressive sensing of ECG signals, development of a nanotechnology application for decoding vagus-nerve activity, detection of liver dysfunction using a wearable electronic nose system, prosthetic hand control using surface electromyography, epileptic seizure detection using a CNN, and premature ventricular contraction detection using deep metric learning. Thus, this reprint presents significant clinical applications as well as valuable new research issues, providing current illustrations of this new field of research by addressing the promises, challenges, and hurdles associated with the synergy of biosignal processing and AI through 16 different pertinent studies. Covering a wide range of research and application areas, this book is an excellent resource for researchers, physicians, academics, and PhD or Master's students working on (bio)signal and image processing, AI, biomaterials, biomechanics, and biotechnology with applications in medicine.

    Implementing physiologically-based approaches to improve Brain-Computer Interfaces usability in post-stroke motor rehabilitation

    Stroke is one of the leading causes of long-term motor disability and, as such, directly impacts daily living activities. Identifying new strategies to recover motor function is a central goal of clinical research. In recent years the approach to post-stroke functional restoration has moved from physical rehabilitation towards evidence-based neurological rehabilitation. Brain-Computer Interface (BCI) technology offers the possibility to detect, monitor and eventually modulate brain activity. The potential of guiding altered brain activity back to a physiological condition through BCI, together with the assumption that this recovery of brain activity leads to the restoration of behaviour, is the key element for the use of BCI systems for therapeutic purposes. Bridging the gap between research-oriented methodology in BCI design and the usability of a system in the clinical realm requires efforts towards BCI signal processing procedures that optimize the balance between system accuracy and usability. The thesis focused on this issue and aimed to propose new algorithms and signal processing procedures that, by combining physiological and engineering approaches, would provide the basis for designing more usable BCI systems to support post-stroke motor recovery. Results showed that introducing new physiologically-driven approaches to the pre-processing of BCI data, methods to support professional end-users in selecting BCI control parameters according to evidence-based rehabilitation principles, and algorithms for adapting those parameters over time makes the BCI technology more affordable, more efficient, and more usable and, therefore, transferable to the clinical realm.

    A brain-machine interface for assistive robotic control

    Brain-machine interfaces (BMIs) are the only currently viable means of communication for many individuals suffering from locked-in syndrome (LIS) – profound paralysis that results in severely limited or total loss of voluntary motor control. By inferring user intent from task-modulated neurological signals and then translating those intentions into actions, BMIs can give LIS patients increased autonomy. Significant effort has been devoted to developing BMIs over the last three decades, but only recently have the combined advances in hardware, software, and methodology provided a setting in which this research can be translated from the lab into practical, real-world applications. Non-invasive methods, such as those based on the electroencephalogram (EEG), offer the only feasible solution for practical use at the moment, but suffer from limited communication rates and susceptibility to environmental noise. Maximizing the efficacy of each decoded intention is therefore critical. This thesis addresses the challenge of implementing a BMI intended for practical use, with a focus on an autonomous assistive robot application. First, an adaptive EEG-based BMI strategy is developed that relies upon code-modulated visual evoked potentials (c-VEPs) to infer user intent. As voluntary gaze control is typically not available to LIS patients, c-VEP decoding methods under both gaze-dependent and gaze-independent scenarios are explored. Adaptive decoding strategies in both offline and online task conditions are evaluated, and a novel approach to assess ongoing online BMI performance is introduced. Next, an adaptive neural network-based system for assistive robot control is presented that employs exploratory learning to achieve the coordinated motor planning needed to navigate toward, reach for, and grasp distant objects. Exploratory learning, or “learning by doing,” is an unsupervised method in which the robot builds an internal model for motor planning and coordination based on real-time sensory inputs received during exploration. Finally, a software platform intended for practical BMI application use is developed and evaluated. Using online c-VEP methods, users control a simple 2D cursor control game, a basic augmentative and alternative communication tool, and an assistive robot, both manually and via high-level goal-oriented commands.
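
As a rough illustration of the c-VEP decoding idea underlying this thesis (not its adaptive, gaze-independent decoders), the sketch below assigns an epoch to the circular shift of a code template it correlates with best. The code length, lag between targets, and data are hypothetical.

```python
# Baseline c-VEP decoding: each target flickers with a circularly shifted copy
# of one pseudo-random code, so an epoch is classified by the shift whose
# averaged template it correlates with most strongly.
import numpy as np

def build_templates(epochs, labels, n_targets, lag):
    """Average epochs of target 0, then circularly shift it for the other targets."""
    base = epochs[labels == 0].mean(axis=0)
    return np.stack([np.roll(base, k * lag) for k in range(n_targets)])

def decode(epoch, templates):
    """Pick the template with the highest Pearson correlation."""
    corrs = [np.corrcoef(epoch, t)[0, 1] for t in templates]
    return int(np.argmax(corrs))

# Hypothetical single-channel epochs (n_trials, n_samples), time-locked to the code cycle.
rng = np.random.default_rng(0)
n_targets, lag, n_samples = 8, 30, 240
code = rng.choice([-1.0, 1.0], size=n_samples)
labels = rng.integers(0, n_targets, size=100)
epochs = np.stack([np.roll(code, k * lag) for k in labels])
epochs += 0.5 * rng.standard_normal((100, n_samples))

templates = build_templates(epochs, labels, n_targets, lag)
preds = np.array([decode(e, templates) for e in epochs])
print("decoding accuracy:", (preds == labels).mean())
```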

    Bio-signal based control in assistive robots: a survey

    Recently, bio-signal based control has been gradually deployed in biomedical devices and assistive robots to improve the quality of life of disabled and elderly people; among these signals, electromyography (EMG) and electroencephalography (EEG) are the most widely used. This paper reviews the deployment of these bio-signals in the state of the art of control systems. The main aim of this paper is to describe the techniques used for (i) collecting EMG and EEG signals and dividing them into segments (data acquisition and data segmentation stage), (ii) extracting the important data and removing redundant data from the EMG and EEG segments (feature extraction stage), and (iii) identifying categories from the relevant data obtained in the previous stage (classification stage). Furthermore, this paper presents a summary of applications controlled through these two bio-signals and some research challenges in the creation of these control systems. Finally, a brief conclusion is provided.
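
To make the three stages concrete, here is a minimal sketch assuming sliding-window segmentation, common time-domain EMG features, and an LDA classifier. The window length, feature set, labels, and classifier are illustrative choices, not prescriptions from the survey.

```python
# Segmentation -> feature extraction -> classification, on a synthetic EMG recording.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def segment(signal, win, step):
    """Slice a (n_channels, n_samples) signal into overlapping windows."""
    starts = range(0, signal.shape[1] - win + 1, step)
    return np.stack([signal[:, s:s + win] for s in starts])

def extract_features(windows):
    """Per channel: mean absolute value, RMS, and waveform length."""
    mav = np.abs(windows).mean(axis=2)
    rms = np.sqrt((windows ** 2).mean(axis=2))
    wl = np.abs(np.diff(windows, axis=2)).sum(axis=2)
    return np.concatenate([mav, rms, wl], axis=1)

# Hypothetical 8-channel EMG recording at 1 kHz with per-window labels.
rng = np.random.default_rng(0)
emg = rng.standard_normal((8, 60_000))
windows = segment(emg, win=200, step=100)           # 200 ms windows, 50% overlap
X = extract_features(windows)
y = rng.integers(0, 3, size=len(X))                 # placeholder movement labels

clf = LinearDiscriminantAnalysis().fit(X[:400], y[:400])
print("held-out accuracy:", clf.score(X[400:], y[400:]))
```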

    EEG and ECoG features for Brain Computer Interface in Stroke Rehabilitation

    The ability of non-invasive Brain-Computer Interfaces (BCIs) to control an exoskeleton has been used for motor rehabilitation in stroke patients or as an assistive device for the paralyzed. However, there is still a need for a more reliable BCI that can control several degrees of freedom (DoFs) and thus improve rehabilitation results. Decoding different movements from the same limb, high accuracy, and reliability are some of the main difficulties of conventional EEG-based BCIs and the challenges we tackled in this thesis. In this PhD thesis, we showed that the classification of several functional hand reaching movements from the same limb using EEG is possible with acceptable accuracy. Moreover, we investigated how recalibration affects the classification results. For this reason, we tested multi-class decoding within sessions, between sessions with recalibration, and between sessions without recalibration. The results showed the great influence of recalibrating the generated classifier with data from the current session on the stability and reliability of the decoding. Moreover, we used a multiclass extension of Filter Bank Common Spatial Patterns (FBCSP) to improve the feature-based decoding accuracy and compared it to our previous study using CSP. Sensorimotor-rhythm-based BCI systems have been used within the same frequency ranges as a way to influence brain plasticity or to control external devices. However, neural oscillations synchronize activity according to motor and cognitive functions, and cross-frequency interactions couple oscillations of different frequencies within neural networks. In this PhD, we investigated for the first time the existence of cross-frequency coupling during rest and movement using ECoG in chronic stroke patients. We found an exaggerated phase-amplitude coupling between the phase of the alpha frequency and the amplitude of the gamma frequency, which can be used as a feature or target for neurofeedback interventions using BCIs. This coupling has also been reported in other neurological disorders affecting motor function (Parkinson's disease and dystonia) but, to date, had not been investigated in stroke patients. This finding might change the future design of assistive or therapeutic BCI systems for motor restoration in stroke patients.
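
The abstract does not state which PAC measure was used; the sketch below quantifies alpha-gamma phase-amplitude coupling with a mean-vector-length index computed from band-pass-filtered, Hilbert-transformed signals, assuming an 8-12 Hz phase band and a 60-90 Hz amplitude band on synthetic data.

```python
# Mean-vector-length phase-amplitude coupling between alpha phase and gamma amplitude.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x, lo, hi, fs, order=4):
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def pac_mvl(x, fs, phase_band=(8, 12), amp_band=(60, 90)):
    """PAC between low-frequency phase and high-frequency amplitude."""
    phase = np.angle(hilbert(bandpass(x, *phase_band, fs)))
    amp = np.abs(hilbert(bandpass(x, *amp_band, fs)))
    return np.abs(np.mean(amp * np.exp(1j * phase)))

# Hypothetical single-channel signal: gamma amplitude modulated by the alpha cycle.
fs, t = 1000, np.arange(0, 10, 1 / 1000)
alpha = np.sin(2 * np.pi * 10 * t)
gamma = (1 + alpha) * np.sin(2 * np.pi * 75 * t)
x = alpha + 0.3 * gamma + 0.1 * np.random.default_rng(0).standard_normal(t.size)

print("PAC (coupled signal):   ", pac_mvl(x, fs))
print("PAC (shuffled surrogate):", pac_mvl(np.random.default_rng(1).permutation(x), fs))
```

Comparing against a shuffled surrogate, as in the last line, is one common way to judge whether a coupling value is above chance.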

    NeuroClean: multipurpose neural data preprocessing pipeline

    Bachelor's Degree Final Project in Computer Engineering, Facultat de Matemàtiques, Universitat de Barcelona, 2023. Supervisors: Ignasi Cos Aguilera and Michael DePass. Electroencephalography (EEG) and local field potentials (LFP) are two commonly used measures of electrical activity in the brain. These signals are used extensively in both industry and research and have many real-world applications. Before any analyses can be performed on EEG/LFP data, however, the data must first be cleaned. The main objective of this project was to create an unsupervised, multipurpose EEG/LFP preprocessing pipeline. Its unsupervised nature helps alleviate problems of reproducibility and of biases that arise from human intervention. Moreover, manual signal cleaning is time and labor intensive, so the adoption of an automated workflow saves researchers valuable time and resources. A secondary goal was to allow the pipeline to fit several use cases, thus standardizing the cleaning methods used in neuroscience. We designed an automated EEG/LFP preprocessing pipeline, NeuroClean, which consists of five steps: bandpass filtering, line-noise filtering, bad-channel rejection, independent component analysis, and automatic component rejection based on a clustering algorithm. Machine learning classifiers were used to ensure that task-relevant signals were preserved after each step of the cleaning process. We used an LFP dataset recorded from a cynomolgus macaque to validate the pipeline. Data were recorded while the monkey performed a reach-to-grasp task, and three sections of the movement were used for classification. NeuroClean appeared to remove several common types of artifacts from the signal. Moreover, after cleaning, an optimized multinomial logistic regression model reached over 97% accuracy (chance level is 33.3%), compared to 74% accuracy on the raw data. These results show that NeuroClean is a promising pipeline and workflow to be explored further in the future.
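
A rough sketch of such a five-step workflow in MNE-Python is shown below. NeuroClean's clustering-based component rejection and its own bad-channel criterion are not reproduced here; they are replaced by a variance outlier rule and an EOG-correlation rule, and the file name, channel name, frequency bands, and thresholds are all assumptions.

```python
# NeuroClean-style preprocessing sketch: band-pass, notch, bad-channel rejection,
# ICA, and automatic component rejection (simplified stand-ins for the real steps).
import numpy as np
import mne

raw = mne.io.read_raw_fif("recording_raw.fif", preload=True)  # hypothetical file

# 1-2: band-pass and line-noise (notch) filtering
raw.filter(l_freq=1.0, h_freq=100.0)
raw.notch_filter(freqs=[50, 100])

# 3: reject channels whose variance is an outlier (simple stand-in criterion)
var = raw.get_data().var(axis=1)
z = (var - var.mean()) / var.std()
raw.drop_channels([raw.ch_names[i] for i in np.where(np.abs(z) > 3)[0]])

# 4-5: ICA with automatic component rejection (EOG-correlation stand-in,
# using a frontal channel as an EOG proxy)
ica = mne.preprocessing.ICA(n_components=20, random_state=0)
ica.fit(raw)
eog_inds, _ = ica.find_bads_eog(raw, ch_name="Fp1")
ica.exclude = eog_inds
clean = ica.apply(raw.copy())
```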

    The evolution of AI approaches for motor imagery EEG-based BCIs

    Motor Imagery (MI) electroencephalography (EEG)-based Brain-Computer Interfaces (BCIs) allow direct communication between humans and machines by exploiting the neural pathways connected to motor imagination. These systems therefore open the possibility of developing applications that span from the medical field to the entertainment industry. In this context, Artificial Intelligence (AI) approaches are of fundamental importance, especially for providing correct and coherent feedback to BCI users. Moreover, publicly available datasets in the field of MI EEG-based BCIs have been widely exploited to test new techniques from the AI domain. In this work, AI approaches applied to datasets collected in different years and with different devices, but with coherent experimental paradigms, are investigated with the aim of providing a concise yet sufficiently comprehensive survey of the evolution and influence of AI techniques on MI EEG-based BCI data. Comment: Submitted to the Italian Workshop on Artificial Intelligence for Human Machine Interaction (AIxHMI 2022), December 02, 2022, Udine, Italy.
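
For context, a minimal sketch of the classic MI baseline that surveys of this kind typically take as a starting point (CSP features plus LDA) is shown below, assuming band-pass-filtered epochs of shape (trials, channels, samples). The data and hyperparameters are placeholders, not taken from any cited dataset.

```python
# Classic motor-imagery baseline: Common Spatial Patterns + LDA.
import numpy as np
from mne.decoding import CSP
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Hypothetical mu/beta band-filtered epochs: (n_trials, n_channels, n_samples)
rng = np.random.default_rng(0)
X = rng.standard_normal((120, 22, 500))
y = rng.integers(0, 2, size=120)          # left- vs right-hand imagery

clf = make_pipeline(CSP(n_components=6, log=True), LinearDiscriminantAnalysis())
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```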