133 research outputs found

    A Wireless Multifunctional SSVEP-Based Brain Computer Interface Assistive System

    Several kinds of brain-computer interface (BCI) systems have been proposed to compensate for the lack of medical technology for assisting patients who have lost the motor functions needed to communicate with the outside world. However, most of the proposed systems are limited by their non-portability, impracticality and inconvenience because they adopt wired or invasive electroencephalography (EEG) acquisition devices. Another common limitation is the small number of functions provided, owing to the difficulty of integrating multiple functions into one BCI system. In this study, we propose a wireless, non-invasive and multifunctional assistive system which integrates a steady-state visually evoked potential (SSVEP)-based BCI and a robotic arm to assist patients in feeding themselves. Patients are able to control the robotic arm via the BCI to serve themselves food. Three further functions are also integrated: video entertainment, video calling, and active interaction. This is achieved by designing a functional menu and integrating multiple subsystems. A refinement decision-making mechanism is incorporated to ensure the accuracy and applicability of the system. Fifteen participants were recruited to validate the usability and performance of the system. The average accuracy and information transfer rate (ITR) achieved are 90.91% and 24.94 bits per minute, respectively. The feedback from the participants demonstrates that this assistive system is able to significantly improve the quality of daily life.
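    The information transfer rate quoted above is conventionally computed with the Wolpaw formula from the number of selectable targets, the classification accuracy, and the average time per selection. A minimal sketch in Python follows; the 8-target menu size and 6-second selection time are illustrative assumptions, not values reported in the paper.

        import math

        def wolpaw_itr(num_targets: int, accuracy: float, selection_time_s: float) -> float:
            """Information transfer rate in bits per minute (Wolpaw formula)."""
            n, p = num_targets, accuracy
            assert n >= 2 and 0.0 < p <= 1.0
            bits = math.log2(n)
            if p < 1.0:
                bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
            return bits * 60.0 / selection_time_s

        # Illustrative only: the paper's 90.91% accuracy with an assumed menu size and timing.
        print(f"{wolpaw_itr(8, 0.9091, 6.0):.2f} bits/min")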

    Review of real brain-controlled wheelchairs

    This paper presents a review of the state of the art regarding wheelchairs driven by a brain-computer interface (BCI). Using a brain-controlled wheelchair (BCW), disabled users can steer a wheelchair through their brain activity, gaining the autonomy to move through an experimental environment. A classification is established based on the characteristics of the BCW, such as the type of electroencephalographic (EEG) signal used, the navigation system employed by the wheelchair, the task set for the participants, or the metrics used to evaluate performance. Furthermore, these factors are compared according to the type of signal used, in order to clarify the differences among them. Finally, the trend of current research in this field is discussed, as well as the challenges that should be solved in the future.

    Human-machine interfaces based on EMG and EEG applied to robotic systems

    Background: Two different Human-Machine Interfaces (HMIs) were developed, both based on electro-biological signals: one based on the EMG signal and the other on the EEG signal. Two major features of such interfaces are their relatively simple data acquisition and processing systems, which require few hardware and software resources, making them computationally and financially low-cost solutions. Both interfaces were applied to robotic systems, and their performances are analyzed here. The EMG-based HMI was tested on a mobile robot, while the EEG-based HMI was tested on both a mobile robot and a robotic manipulator.
    Results: Experiments using the EMG-based HMI were carried out by eight individuals, who were asked to perform ten eye blinks with each eye in order to test the eye-blink detection algorithm. An average success rate of about 95%, reached by individuals able to blink both eyes, allowed us to conclude that the system could be used to command devices. Experiments with EEG consisted of inviting 25 people (some of whom had suffered from meningitis or epilepsy) to test the system. All of them managed to operate the HMI in only one training session, and most of them learnt how to use it in less than 15 minutes. The minimum and maximum training times observed were 3 and 50 minutes, respectively.
    Conclusion: This work is the initial part of a system to help people with neuromotor diseases, including those with severe dysfunctions. The next steps are to convert a commercial wheelchair into an autonomous mobile vehicle, to implement the HMI onboard that wheelchair to assist people with motor diseases, and to explore the potential of EEG signals, making the EEG-based HMI more robust and faster, aiming to help individuals with severe motor dysfunctions.
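    The abstract does not detail the eye-blink detection algorithm; a common approach for EMG-based blink commands is an amplitude threshold on the rectified, smoothed signal envelope. The sketch below illustrates that idea in Python; the window length, threshold, and refractory gap are illustrative assumptions rather than the authors' parameters.

        import numpy as np

        def detect_blinks(emg, fs, threshold, min_gap_s=0.3):
            """Return sample indices where the smoothed EMG envelope rises above a threshold."""
            envelope = np.abs(np.asarray(emg, dtype=float))   # full-wave rectification
            win = max(1, int(0.05 * fs))                      # ~50 ms moving-average window
            envelope = np.convolve(envelope, np.ones(win) / win, mode="same")
            above = envelope > threshold
            onsets = np.flatnonzero(above[1:] & ~above[:-1]) + 1
            # Enforce a refractory gap so a single blink is not counted twice.
            min_gap = int(min_gap_s * fs)
            kept, last = [], -min_gap
            for idx in onsets:
                if idx - last >= min_gap:
                    kept.append(int(idx))
                    last = idx
            return kept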

    Support vector machines to detect physiological patterns for EEG and EMG-based human-computer interaction: a review

    Support vector machines (SVMs) are widely used classifiers for detecting physiological patterns in human-computer interaction (HCI). Their success is due to their versatility, robustness and the wide availability of free dedicated toolboxes. Frequently in the literature, insufficient details about the SVM implementation and/or parameter selection are reported, making it impossible to reproduce study analyses and results. In order to perform an optimized classification and report a proper description of the results, it is necessary to have a comprehensive critical overview of the applications of SVMs. The aim of this paper is to provide a review of the usage of SVMs in the determination of brain and muscle patterns for HCI, focusing on electroencephalography (EEG) and electromyography (EMG) techniques. In particular, an overview of the basic principles of SVM theory is outlined, together with a description of several relevant literature implementations. Furthermore, details concerning the reviewed papers are listed in tables, and statistics of SVM use in the literature are presented. The suitability of SVMs for HCI is discussed and critical comparisons with other classifiers are reported.
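    As a concrete illustration of the kind of pipeline the review surveys, and of the implementation details it argues should be reported, the sketch below trains an RBF-kernel SVM on feature vectors with scikit-learn. The random placeholder features and the parameter grid are illustrative choices, not a prescription from the paper.

        import numpy as np
        from sklearn.model_selection import GridSearchCV
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        # X: (n_trials, n_features) matrix of, e.g., EEG band-power or EMG amplitude
        # features; y: class labels. Random placeholders stand in for real features.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 16))
        y = rng.integers(0, 2, size=200)

        # Reporting the kernel, C, and gamma explicitly addresses the reproducibility
        # gap the review highlights.
        search = GridSearchCV(
            make_pipeline(StandardScaler(), SVC(kernel="rbf")),
            param_grid={"svc__C": [0.1, 1, 10], "svc__gamma": ["scale", 0.01, 0.1]},
            cv=5,
        )
        search.fit(X, y)
        print(search.best_params_, round(search.best_score_, 3))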

    A hybrid brain-computer interface based on motor intention and visual working memory

    Non-invasive electroencephalography (EEG) based brain-computer interfaces (BCIs) can provide an alternative means for people with disabilities to communicate with and control external assistive devices. A hybrid BCI was designed and developed for two types of system: control and monitoring. Our first goal is to create a signal decoding strategy that allows people with limited motor control to have more command over potential prosthetic devices. Eight healthy subjects were recruited to perform visually cued reaching tasks. Eye and motion artifacts were identified and removed to ensure that the subjects' visual fixation on the target locations would have little or no impact on the final result. We applied Fisher Linear Discriminant (FLD) analysis for single-trial classification of the EEG to decode the intended arm movement in the left, right, and forward directions (before the onset of actual movement). The mean EEG signal amplitude near the PPC region 271-310 ms after visual stimulation was found to be the dominant feature for the best classification results. A signal scaling factor was developed and found to improve the classification accuracy from 60.11% to 93.91% in the two-class (left versus right) scenario. This result demonstrates great promise for BCI neuroprosthetic applications, as motor intention decoding can serve as a prelude to the classification of imagined motor movement in the rehabilitation of motor disabilities, such as prosthetic limb or wheelchair control.
    The second goal is to develop adaptive training for patients with low visual working memory (VWM) capacity, to improve their cognitive abilities, and for healthy individuals who seek to enhance their intellectual performance. VWM plays a critical role in preserving and processing information. It is associated with attention, perception and reasoning, and its capacity can be used as a predictor of cognitive abilities. Recent evidence has suggested that, with training, one can enhance VWM capacity and attention over time. Not only can these studies reveal the characteristics of VWM load and the influence of training, they may also provide effective rehabilitative means for patients with low VWM capacity. However, few studies have investigated VWM over a long period of time, beyond 5 weeks. In this study, a combined behavioral and EEG approach was used to investigate VWM load, gain, and transfer. The results reveal that VWM capacity is directly correlated with reaction time and contralateral delay activity (CDA). The approximate "magic number 4" was observed in the event-related potential (ERP) waveforms, with an average capacity of 2.8 items across 15 participants. In addition, the findings indicate that VWM capacity can be improved through adaptive training. Furthermore, after the training exercises, participants from the training group were able to improve their performance accuracies dramatically compared to the control group. Adaptive training gains on non-trained tasks could also be observed 12 weeks after training. Therefore, we conclude that all participants can benefit from training gains, and that augmented VWM capacity can be sustained over a long period of time. Our results suggest that this form of training can significantly improve cognitive function and may be useful for enhancing user performance on neuroprosthetic devices.
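    For the motor-intention decoding described above, a Fisher Linear Discriminant can be applied to single-trial amplitude features with standard tooling. The sketch below uses scikit-learn's LinearDiscriminantAnalysis; the placeholder feature matrix and the fixed scaling constant are illustrative assumptions, since the paper's actual features and scaling factor are not specified here.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        # Hypothetical single-trial features: mean EEG amplitude over posterior parietal
        # channels in a 271-310 ms post-stimulus window, one row per trial.
        rng = np.random.default_rng(1)
        X = rng.normal(size=(120, 8))        # placeholder: trials x channels
        y = rng.integers(0, 2, size=120)     # placeholder: 0 = left, 1 = right

        scaling = 1.0                        # stand-in for the paper's signal scaling factor
        fld = LinearDiscriminantAnalysis()
        acc = cross_val_score(fld, scaling * X, y, cv=5).mean()
        print(f"two-class cross-validated accuracy: {acc:.2%}")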

    Human Computer Interactions for Amyotrophic Lateral Sclerosis Patients


    A survey on bio-signal analysis for human-robot interaction

    The use of bio-signal analysis in human-robot interaction is rapidly increasing, and there is an urgent demand for it in various applications, including health care, rehabilitation, research, technology, and manufacturing. Despite several state-of-the-art bio-signal analyses in human-robot interaction (HRI) research, it is unclear which approach is best. This paper discusses, first, why robotic systems should be given priority in the rehabilitation and aid of amputees and disabled people, and second, the feature extraction approaches now in use, which fall into three main domains (time, frequency, and time-frequency). Each domain is described along with its benefits and drawbacks, and finally a new strategy for robotic systems is recommended.
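    As a minimal illustration of the three feature-extraction domains mentioned above, the Python sketch below computes one representative feature from each: root mean square (time), Welch band power (frequency), and short-time Fourier transform energy (time-frequency). The sampling rate, band limits, and window sizes are illustrative assumptions.

        import numpy as np
        from scipy import signal

        fs = 1000.0                                               # assumed sampling rate (Hz)
        x = np.random.default_rng(2).normal(size=int(2 * fs))     # 2 s of placeholder signal

        # Time domain: root mean square and mean absolute value.
        rms = np.sqrt(np.mean(x ** 2))
        mav = np.mean(np.abs(x))

        # Frequency domain: mean power in an 8-30 Hz band from Welch's PSD.
        f, psd = signal.welch(x, fs=fs, nperseg=256)
        band_power = psd[(f >= 8) & (f <= 30)].mean()

        # Time-frequency domain: average short-time Fourier transform magnitude.
        _, _, zxx = signal.stft(x, fs=fs, nperseg=256)
        tf_energy = np.abs(zxx).mean()

        print(rms, mav, band_power, tf_energy)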

    A Python-based Brain-Computer Interface Package for Neural Data Analysis

    Anowar, Md Hasan. A Python-based Brain-Computer Interface Package for Neural Data Analysis. Master of Science (MS), December 2020, 70 pp., 4 tables, 23 figures, 74 references. Although a growing amount of research has been dedicated to neural engineering, only a handful of software packages are available for brain-signal processing. Popular brain-computer interface packages depend on commercial software products such as MATLAB. Moreover, almost every brain-computer interface package is designed for a specific neuro-biological signal; there is no single Python-based package that supports motor imagery, sleep, and stimulated brain-signal analysis. The need for a brain-computer interface package that can serve as a free alternative to commercial software motivated me to develop a toolbox on the Python platform. In this thesis, the structure of MEDUSA, a brain-computer interface toolbox, is presented, and its features are demonstrated with publicly available data sources. The MEDUSA toolbox provides a valuable tool for biomedical engineers and computational neuroscience researchers.

    Chapter 15 Matching Brain–Machine Interface Performance to Space Applications

    A brain-machine interface (BMI) is a particular class of human-machine interface (HMI). BMIs have so far been studied mostly as a communication means for people who have little or no voluntary control of muscle activity. For able-bodied users, such as astronauts, a BMI would only be practical if conceived as an augmenting interface. A method is presented for identifying effective combinations of HMIs and applications of robotics and automation in space. Latency and throughput are selected as performance measures for a hybrid bionic system (HBS), that is, the combination of a user, a device, and an HMI. We classify and briefly describe HMIs and space applications, and then compare the performance of classes of interfaces with the requirements of classes of applications, in terms of both latency and throughput. Regions of overlap correspond to effective combinations. Devices requiring simpler control, such as a rover, a robotic camera, or environmental controls, are suitable to be driven by means of BMI technology. Free flyers and other devices with six degrees of freedom can be controlled, but only at low interactivity levels. More demanding applications require conventional interfaces, although they could be controlled by BMIs once the same levels of performance as currently recorded in animal experiments are attained. Robotic arms and manipulators could be the next frontier for noninvasive BMIs. Integrating smart controllers into HBSs could improve interactivity and boost the use of BMI technology in space applications.