
    Combining brain-computer interfaces and assistive technologies: state-of-the-art and challenges

    In recent years, new research has brought the field of EEG-based Brain-Computer Interfacing (BCI) out of its infancy and into a phase of relative maturity through many demonstrated prototypes such as brain-controlled wheelchairs, keyboards, and computer games. With this proof-of-concept phase in the past, the time is now ripe to focus on the development of practical BCI technologies that can be brought out of the lab and into real-world applications. In particular, we focus on the prospect of improving the lives of countless disabled individuals through a combination of BCI technology with existing assistive technologies (AT). In pursuit of more practical BCIs for use outside of the lab, in this paper we identify four application areas where disabled individuals could greatly benefit from advancements in BCI technology, namely "Communication and Control", "Motor Substitution", "Entertainment", and "Motor Recovery". We review the current state of the art and possible future developments, while discussing the main research issues in these four areas. In particular, we expect the most progress in the development of technologies such as hybrid BCI architectures, user-machine adaptation algorithms, the exploitation of users' mental states for BCI reliability and confidence measures, the incorporation of principles of human-computer interaction (HCI) to improve BCI usability, and the development of novel BCI technology, including better EEG devices.

    Electroencephalogram Signal Processing For Hybrid Brain Computer Interface Systems

    The goal of this research was to evaluate and compare three types of brain-computer interface (BCI) systems, P300, steady-state visually evoked potentials (SSVEP), and hybrid, as virtual spelling paradigms. The hybrid BCI is an innovative approach that combines P300 and SSVEP. However, it is challenging to process the resulting hybrid signals so as to extract both types of information simultaneously and effectively. A major step toward a modern BCI system was to move the BCI techniques from a traditional LED system to an electronic LCD monitor. Such a transition allows not only the development of the graphics of interest but also the generation of objects flickering at different frequencies. Pilot experiments were performed to design and tune the parameters of the spelling paradigms, including peak detection for different ranges of SSVEP frequencies, placement of objects on the LCD monitor, design of the spelling keyboard, and the time window for SSVEP peak detection. All of the experiments were devised to evaluate performance in terms of spelling accuracy, region error, and adjacency error across all of the paradigms: P300, SSVEP, and hybrid. Due to the different nature of P300 and SSVEP, designing a hybrid P300-SSVEP signal processing scheme demands a significant amount of research. Two critical questions in hybrid BCI are: (1) which signal processing strategy can best measure the user's intent, and (2) what paradigm is suitable for fusing the two techniques in a simple but effective way. To answer these questions, this project focused mainly on developing signal processing and classification techniques for the hybrid BCI. The hybrid BCI was implemented by extracting the specific information from brain signals, selecting optimal features that contain maximum discriminative information about the speller characters of interest, and efficiently classifying the hybrid signals. The designed spellers were developed with the aim of improving the quality of life of patients with disabilities by utilizing visually controlled BCI paradigms. The paradigms consist of electrodes to record the electroencephalogram (EEG) signal during stimulation, software to analyze the collected data, and a computing device where the subject's EEG is the input used to estimate the spelled character. The signal processing phase included preprocessing, feature extraction, and feature selection. Captured EEG data are usually a superposition of the signals of interest with unwanted signals from muscles and from non-biological artifacts. The accuracy of each trial and the average accuracy for each subject were computed. Overall, the average accuracies of the P300 and SSVEP spelling paradigms were 84% and 68.5%, respectively. The P300 spelling paradigm had better accuracy than both the SSVEP and hybrid paradigms. The hybrid paradigm had an average accuracy of 79%. However, the hybrid system was faster and more comfortable to look at than the other paradigms. This work is significant because it has great potential for improving BCI research in the design and application of clinically suitable speller paradigms.
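
    As a rough illustration of the SSVEP peak-detection and hybrid-fusion steps mentioned above, the sketch below scores candidate flicker frequencies by their Welch PSD power and fuses them with per-target P300 classifier scores; the sampling rate, frequency list, and equal-weight fusion rule are illustrative assumptions, not the parameters used in this work.

```python
# Minimal sketch of SSVEP target-frequency scoring and a simple hybrid fusion
# with P300 classifier scores. Frequencies, window length, and the z-score
# fusion rule are illustrative assumptions only.
import numpy as np
from scipy.signal import welch

FS = 256                                   # sampling rate (Hz), assumed
SSVEP_FREQS = [6.0, 7.5, 8.57, 10.0]       # candidate flicker frequencies, assumed

def ssvep_scores(eeg_window, fs=FS, freqs=SSVEP_FREQS, bw=0.5):
    """Score each candidate frequency by its mean PSD power around the fundamental."""
    f, pxx = welch(eeg_window, fs=fs, nperseg=fs * 2, axis=-1)
    pxx = pxx.mean(axis=0)                 # average PSD over channels
    scores = []
    for target in freqs:
        band = (f >= target - bw) & (f <= target + bw)
        scores.append(pxx[band].mean())
    return np.asarray(scores)

def fuse_hybrid(p300_scores, ssvep_power):
    """Fuse per-target P300 and SSVEP evidence after z-scoring each modality."""
    z = lambda x: (x - x.mean()) / (x.std() + 1e-12)
    return z(np.asarray(p300_scores)) + z(ssvep_power)

# Example with synthetic data: 8 channels, 2 s of EEG, 4 candidate targets.
eeg = np.random.randn(8, 2 * FS)
p300 = np.array([0.1, 0.7, 0.2, 0.3])      # e.g. LDA decision values per target
combined = fuse_hybrid(p300, ssvep_scores(eeg))
print("selected target:", int(np.argmax(combined)))
```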

    Brain-Switches for Asynchronous Brain−Computer Interfaces: A Systematic Review

    A brain-computer interface (BCI) has been extensively studied to develop a novel communication system for disabled people using their brain activities. An asynchronous BCI system is more realistic and practical than a synchronous BCI system in that BCI commands can be generated whenever the user wants. However, the relatively low performance of asynchronous BCI systems is problematic because redundant BCI commands are required to correct false-positive operations. To significantly reduce the number of false-positive operations of an asynchronous BCI system, a two-step approach has been proposed in which a brain-switch first determines whether the user wants to use the asynchronous BCI system before that system is operated. This study presents a systematic review of state-of-the-art brain-switch techniques and future research directions. To this end, we reviewed brain-switch research articles published from 2000 to 2019 in terms of their (a) neuroimaging modality, (b) paradigm, (c) operation algorithm, and (d) performance.
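
    A minimal sketch of the two-step scheme described above, assuming scikit-learn-style classifiers: a brain-switch model first gates whether the user is in an intentional-control state, and only then does the command decoder produce an output. The feature dimensionality, dummy models, and thresholds are placeholders rather than any specific method from the reviewed articles.

```python
# Minimal sketch of a two-step asynchronous BCI: a brain-switch gate followed
# by a command decoder. Models and data are placeholders for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))                                       # dummy EEG feature vectors
switch_model = LogisticRegression().fit(X, rng.integers(0, 2, 200))  # intentional control vs idle
command_model = LogisticRegression().fit(X, rng.integers(0, 4, 200)) # four commands

def brain_switch_on(features, threshold=0.8):
    """Step 1: detect an 'intentional control' state; a high threshold limits false positives."""
    return switch_model.predict_proba(features.reshape(1, -1))[0, 1] >= threshold

def asynchronous_step(features):
    """Step 2: decode a command only when the brain-switch says the user wants control."""
    if not brain_switch_on(features):
        return None                                                  # idle: emit nothing
    return int(command_model.predict(features.reshape(1, -1))[0])

print(asynchronous_step(X[0]))
```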

    Development and applications of a smartphone-based mobile electroencephalography (EEG) system

    Electroencephalography (EEG) is a clinical and research technique used to non-invasively acquire brain activity. EEG is typically performed using static systems in specialist laboratories where participant mobility is constrained. It is desirable to have EEG systems that enable the acquisition of brain activity outside such settings. Mobile systems seek to reduce the constraints on EEG device and participant mobility so as to enable recordings in various environments, but have had limited success due to various factors, including low system specification. The main aim of this thesis was to design, build, test, and validate a novel smartphone-based mobile EEG system.

    A literature review found that the term 'mobile EEG' has an ambiguous meaning, as researchers have used it to describe many differing degrees of participant and device mobility. A novel categorisation of mobile EEG (CoME) scheme was derived from thirty published EEG studies, defining scores for participant and device mobility and for system specification. The CoME scheme was subsequently applied to generate a specification for the proposed mobile EEG system, which had 24 channels sampled at 24-bit resolution at a rate of 250 Hz. Unique aspects of the EEG system were the introduction of a smartphone into the specification, along with the use of Wi-Fi for communications. The smartphone's processing power was used to remotely control the EEG device so as to enable EEG data capture and storage, as well as electrode impedance checking, via the app. This was achieved by using the Unity game engine to code an app, which provided the flexibility for future development with its multi-platform support.

    The prototype smartphone-based, waist-mounted mobile EEG system (termed 'io:bio') was validated against a commercial, FDA clinically approved mobile system (Micromed). The power spectral frequency, amplitude, and area of alpha-frequency waves were determined in participants with their eyes closed in various postures: lying, sitting, standing, and standing with arms raised. Since a correlation analysis to compare two systems has interpretability problems, Bland-Altman plots were utilised with a priori justified limits of agreement to statistically assess the agreement between the two EEG systems. Overall, the results found similar agreement between the io:bio and Micromed systems, indicating that the systems could be used interchangeably. Utilising the io:bio and Micromed systems in a walking configuration led to contamination of EEG channels with artifacts thought to arise from movement- and muscle-related sources and from electrode displacement.

    To enable an event-related potential (ERP) capability of the EEG system, additional coding of the smartphone app was undertaken to provide stimulus delivery and associated data marking. Using the waist-mounted io:bio system, an auditory oddball paradigm was also coded into the app, and delivery of auditory tones (standard and deviant) to the participant (sitting posture) was achieved via headphones connected to the smartphone. N100, N200, and P300 ERP components were recorded in seated participants, and larger amplitudes were found for the deviant tones compared to the standard ones. In addition, when the paradigm was tested in individual participants during walking, movement-related artifacts impacted negatively upon the quality of the ERP components, although the components were discernible in the grand mean ERP.

    The io:bio system was redesigned into a head-mounted configuration in an attempt to reduce EEG artifacts during participant walking. The initial approach taken to redesign the system, which involved electronic components populated onto a flexible PCB, proved to be non-robust. Instead, the rigid PCB form of the circuitry was taken from the waist-mounted io:bio system and placed onto the rear head section of the electrode cap via a bespoke cradle. Using this head-mounted system in a preliminary auditory oddball paradigm study, ERP responses were obtained in participants whilst walking. Initial results indicate that artifacts are reduced in this head-mounted configuration, and that N100, N200, and P300 components are clearly identifiable in some channels.
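
    A minimal sketch of the Bland-Altman agreement statistics (mean bias and 95% limits of agreement) used to compare the two systems; the paired alpha-power values are synthetic placeholders and the thesis's a priori justified limits are not reproduced here.

```python
# Minimal sketch of a Bland-Altman agreement analysis between two EEG systems.
# The alpha-power values below are synthetic placeholders, not study data.
import numpy as np

def bland_altman(measure_a, measure_b):
    """Return mean bias and 95% limits of agreement between paired measurements."""
    a, b = np.asarray(measure_a, float), np.asarray(measure_b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Example: paired alpha-band power estimates (arbitrary units) from the two systems.
io_bio   = np.array([11.2, 9.8, 13.1, 10.4, 12.0])
micromed = np.array([10.9, 10.1, 12.7, 10.8, 11.6])
bias, (lo, hi) = bland_altman(io_bio, micromed)
print(f"bias={bias:.2f}, 95% limits of agreement=({lo:.2f}, {hi:.2f})")
```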

    Review of real brain-controlled wheelchairs

    This paper presents a review of the state of the art regarding wheelchairs driven by a brain-computer interface (BCI). Using a brain-controlled wheelchair (BCW), disabled users can operate a wheelchair through their brain activity, granting them the autonomy to move through an experimental environment. A classification is established based on the characteristics of the BCW, such as the type of electroencephalographic (EEG) signal used, the navigation system employed by the wheelchair, the task set for the participants, and the metrics used to evaluate performance. Furthermore, these factors are compared according to the type of signal used, in order to clarify the differences among them. Finally, the trend of current research in this field is discussed, as well as the challenges that remain to be solved in the future.

    Cross-Platform Implementation of an SSVEP-Based BCI for the Control of a 6-DOF Robotic Arm

    Robotics has been successfully applied in the design of collaborative robots for assistance to people with motor disabilities. However, man-machine interaction is difficult for those who suffer from severe motor disabilities. The aim of this study was to test the feasibility of a low-cost robotic arm control system with an EEG-based brain-computer interface (BCI). The BCI system relies on the Steady-State Visually Evoked Potential (SSVEP) paradigm. A cross-platform application was developed in C++. This C++ platform, together with the open-source software OpenViBE, was used to control a Staubli TX60 robot arm. Communication between OpenViBE and the robot was carried out through the Virtual Reality Peripheral Network (VRPN) protocol. EEG signals were acquired with the 8-channel Enobio amplifier from Neuroelectrics. For the processing of the EEG signals, Common Spatial Pattern (CSP) filters and a Linear Discriminant Analysis (LDA) classifier were used. Five healthy subjects tried the BCI. This work allowed the communication and integration of a well-known BCI development platform such as OpenViBE with the specific control software of a robot arm such as the Staubli TX60 using the VRPN protocol. It can be concluded from this study that it is possible to control the robotic arm with an SSVEP-based BCI with a reduced number of dry electrodes, facilitating the use of the system. Funding for open access charge: Universitat Politecnica de Valencia. Quiles Cucarella, E.; Dadone, J.; Chio, N.; García Moreno, E. (2022). Cross-Platform Implementation of an SSVEP-Based BCI for the Control of a 6-DOF Robotic Arm. Sensors, 22(13):1-26. https://doi.org/10.3390/s22135000
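
    A minimal sketch of the CSP + LDA decoding stage, here expressed with MNE-Python and scikit-learn rather than the OpenViBE scenario actually used in the study; the epoch dimensions, labels, and CSP settings are illustrative assumptions.

```python
# Minimal sketch of a CSP + LDA pipeline for two-class EEG epochs.
# Data are synthetic; shapes mimic an 8-channel Enobio-like setup at 250 Hz.
import numpy as np
from mne.decoding import CSP
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 8, 500))                 # 80 epochs, 8 channels, 2 s at 250 Hz
y = np.r_[np.zeros(40, int), np.ones(40, int)]    # two stimulation classes (placeholder)

clf = make_pipeline(CSP(n_components=4, log=True), LinearDiscriminantAnalysis())
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy: %.2f" % scores.mean())
```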

    Validation of Low-cost Wireless EEG System for Measuring Event-related Potentials

    This study used the traditional P300 speller paradigm to compare a medical-grade electroencephalography (EEG) system, the G.Tec, with a consumer-grade EEG system, the Emotiv, in the detection of P300 components within event-related potential (ERP) signals. The experiment focused on four electrodes known to produce optically induced visual evoked potentials. A successful comparison of the two approaches was made, and it was shown that both systems could measure an ERP. The paper concludes with a discussion comparing the low-cost wireless EEG system with the medical-grade EEG system.
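
    A minimal sketch of the stimulus-locked epoch averaging underlying this kind of ERP comparison; the sampling rate, epoch window, and synthetic data are assumptions for illustration only.

```python
# Minimal sketch of stimulus-locked epoch averaging to estimate an ERP and a
# mean amplitude in a P300-like window. All values are synthetic placeholders.
import numpy as np

FS = 128                                    # consumer-grade sampling rate (Hz), assumed

def epoch_average(eeg, events, fs=FS, tmin=-0.1, tmax=0.6):
    """Cut [tmin, tmax] s epochs around event samples, baseline-correct, and average."""
    pre, post = int(-tmin * fs), int(tmax * fs)
    epochs = []
    for ev in events:
        if ev - pre < 0 or ev + post > eeg.shape[-1]:
            continue                        # skip events too close to the recording edges
        ep = eeg[:, ev - pre:ev + post].copy()
        ep -= ep[:, :pre].mean(axis=1, keepdims=True)   # baseline correction
        epochs.append(ep)
    return np.mean(epochs, axis=0)          # channels x samples ERP

# Example with synthetic data: 4 channels, 60 s recording, 100 stimulus onsets.
eeg = np.random.randn(4, 60 * FS)
events = np.sort(np.random.randint(2 * FS, 58 * FS, size=100))
erp = epoch_average(eeg, events)
p300_window = slice(int(0.1 * FS) + int(0.3 * FS), int(0.1 * FS) + int(0.5 * FS))
print("mean 300-500 ms amplitude per channel:", erp[:, p300_window].mean(axis=1))
```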

    Robust Brain-computer interface for virtual Keyboard (RoBIK): project results

    Special issue: ANR TECSAN: Technologies for Health and Autonomy. A Brain-Computer Interface (BCI) is a technology that translates brain electrical activity into a command for a device such as a robotic arm, a wheelchair, or a spelling device. BCIs have long been described as an assistive technology for severely disabled patients because they completely bypass the need for muscular activity. The clinical reality is, however, dramatically different, and most patients who use BCIs today do so as part of constraining clinical trials. To achieve the technological transfer from bench to bedside, BCIs must gain ease of use and robustness of both the measure (electroencephalography [EEG]) and the interface (signal processing and applications). The Robust Brain-computer Interface for virtual Keyboard (RoBIK) project aimed at the development of a BCI system for communication that could be used on a daily basis by patients without the help of a trained team of researchers. To guide further developments, clinicians first assessed patients' needs. The prototype subsequently developed consisted of a 14 felt-pad electrode EEG headset sampled at 256 Hz by an electronic component capable of transmitting the signals wirelessly. The application was a virtual keyboard generating a novel stimulation paradigm to elicit P300 event-related potentials (ERPs) for communication. Raw EEG signals were processed with the OpenViBE open-source software, including novel signal processing and stimulation techniques.
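
    For context, the sketch below shows how a P300 virtual keyboard typically turns per-flash classifier scores into a spelled character using the classic row/column scheme; the RoBIK project used its own novel stimulation paradigm, whose details are not reproduced here, and the keyboard layout and scores below are placeholders.

```python
# Minimal sketch of the classic row/column P300 speller selection rule:
# accumulate classifier scores per row and per column, then pick the
# intersection. Layout and scores are illustrative placeholders.
import numpy as np

KEYBOARD = np.array([list("ABCDEF"), list("GHIJKL"), list("MNOPQR"),
                     list("STUVWX"), list("YZ1234"), list("56789_")])

def select_character(flash_labels, flash_scores, n_rows=6, n_cols=6):
    """flash_labels: ('row', i) or ('col', j) per flash; flash_scores: P300 classifier outputs."""
    row_score = np.zeros(n_rows)
    col_score = np.zeros(n_cols)
    for (kind, idx), score in zip(flash_labels, flash_scores):
        (row_score if kind == "row" else col_score)[idx] += score
    return KEYBOARD[np.argmax(row_score), np.argmax(col_score)]

# Example: random scores, except row 2 / column 4 flashes carry stronger P300 evidence.
rng = np.random.default_rng(1)
labels = [("row", i) for i in range(6)] * 5 + [("col", j) for j in range(6)] * 5
scores = [rng.normal(2.0 if idx == (2 if kind == "row" else 4) else 0.0, 1.0)
          for kind, idx in labels]
print("spelled:", select_character(labels, scores))   # most likely 'Q'
```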

    Study of soft materials, flexible electronics, and machine learning for fully portable and wireless brain-machine interfaces

    Over 300,000 individuals in the United States are afflicted with some form of limited motor function from brainstem- or spinal cord-related injury resulting in quadriplegia or some form of locked-in syndrome. Conventional brain-machine interfaces used to allow for communication or movement require heavy, rigid components, uncomfortable headgear, excessive numbers of electrodes, and bulky electronics with long wires that result in greater data artifacts and generally inadequate performance. Wireless, wearable electroencephalograms, along with dry non-invasive electrodes, can be utilized to record brain activity from a mobile subject, allowing unrestricted movement. Additionally, multilayer microfabricated flexible circuits, when combined with a soft-materials platform, allow for imperceptible wearable data-acquisition electronics for long-term recording. This dissertation aims to introduce new electronics and training paradigms for brain-machine interfaces to provide remedies, in the form of communication and movement, for these individuals. Here, training is optimized by generating a virtual environment in which a subject can achieve immersion using a VR headset in order to train and familiarize themselves with the system. Advances in hardware and the implementation of convolutional neural networks allow for rapid classification and low-latency target control. The integration of materials, mechanics, circuit, and electrode design results in an optimized brain-machine interface allowing for rehabilitation and overall improved quality of life.
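
    A minimal PyTorch sketch of a small convolutional classifier for short EEG windows, illustrating the kind of CNN decoder described above; the layer sizes, channel count, and class count are assumptions rather than the dissertation's architecture.

```python
# Minimal sketch of a compact CNN for classifying short EEG windows.
# Architecture dimensions are illustrative assumptions only.
import torch
import torch.nn as nn

class TinyEEGNet(nn.Module):
    def __init__(self, n_channels=8, n_samples=250, n_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=(1, 32), padding=(0, 16)),   # temporal filters
            nn.BatchNorm2d(8),
            nn.ELU(),
            nn.Conv2d(8, 16, kernel_size=(n_channels, 1)),           # spatial filters
            nn.BatchNorm2d(16),
            nn.ELU(),
            nn.AvgPool2d((1, 8)),
            nn.Flatten(),
        )
        with torch.no_grad():                                        # infer flattened size
            n_feat = self.features(torch.zeros(1, 1, n_channels, n_samples)).shape[1]
        self.classifier = nn.Linear(n_feat, n_classes)

    def forward(self, x):                  # x: (batch, 1, channels, samples)
        return self.classifier(self.features(x))

model = TinyEEGNet()
logits = model(torch.randn(2, 1, 8, 250))  # two 1-s windows at 250 Hz
print(logits.shape)                        # torch.Size([2, 4])
```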