
    Review of real brain-controlled wheelchairs

    This paper presents a review of the state of the art regarding wheelchairs driven by a brain-computer interface (BCI). Using a brain-controlled wheelchair (BCW), disabled users can steer a wheelchair through their brain activity, granting them the autonomy to move through an experimental environment. A classification is established based on the characteristics of the BCW, such as the type of electroencephalographic (EEG) signal used, the navigation system employed by the wheelchair, the task given to the participants, and the metrics used to evaluate performance. Furthermore, these factors are compared according to the type of signal used, in order to clarify the differences among them. Finally, the trend of current research in this field is discussed, as well as the challenges that should be solved in the future.
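
    The review compares performance metrics across systems; one metric that recurs throughout the BCI literature is Wolpaw's information transfer rate. A minimal sketch of that standard formula (the review's own metric definitions are not reproduced here):

```python
import math

def information_transfer_rate(n_classes: int, accuracy: float,
                              selections_per_min: float) -> float:
    """Wolpaw's information transfer rate in bits/min.

    n_classes: number of possible commands (e.g. forward/left/right/stop).
    accuracy:  classification accuracy P, with 1/n_classes < P <= 1.
    selections_per_min: how many commands the system issues per minute.
    """
    n, p = n_classes, accuracy
    if p >= 1.0:
        bits = math.log2(n)  # perfect accuracy: log2(N) bits per selection
    else:
        bits = (math.log2(n) + p * math.log2(p)
                + (1 - p) * math.log2((1 - p) / (n - 1)))
    return bits * selections_per_min

# Example: 4 commands, 85% accuracy, 10 selections per minute.
print(f"{information_transfer_rate(4, 0.85, 10):.2f} bits/min")
```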

    Biosignal-based human–machine interfaces for assistance and rehabilitation: a survey

    By definition, a Human-Machine Interface (HMI) enables a person to interact with a device. Starting from elementary equipment, the recent development of novel techniques and unobtrusive devices for biosignal monitoring has paved the way for a new class of HMIs, which take such biosignals as inputs to control various applications. The current survey aims to review the large literature of the last two decades regarding biosignal-based HMIs for assistance and rehabilitation, to outline the state of the art, and to identify emerging technologies and potential future research trends. PubMed and other databases were surveyed using specific keywords. The retrieved studies were then screened at three levels (title, abstract, full text), and eventually 144 journal papers and 37 conference papers were included. Four macrocategories were considered to classify the different biosignals used for HMI control: biopotential, muscle mechanical motion, body motion, and their combinations (hybrid systems). The HMIs were also classified according to their target application by considering six categories: prosthetic control, robotic control, virtual reality control, gesture recognition, communication, and smart environment control. An ever-growing number of publications has been observed in recent years. Most of the studies (about 67%) pertain to the assistive field, while 20% relate to rehabilitation and 13% to both assistance and rehabilitation. A moderate increase can be observed in studies focusing on robotic control, prosthetic control, and gesture recognition in the last decade, whereas studies on the other targets experienced only a small increase. Biopotentials are no longer the leading control signals, and the use of muscle mechanical motion signals has risen considerably, especially in prosthetic control. Hybrid technologies are promising, as they could lead to higher performance; however, they also increase HMI complexity, so their usefulness should be carefully evaluated for the specific application.

    Defining brain–machine interface applications by matching interface performance with device requirements

    Interaction with machines is mediated by human-machine interfaces (HMIs). Brain-machine interfaces (BMIs) are a particular class of HMIs and have so far been studied as a communication means for people who have little or no voluntary control of muscle activity. In this context, even low-performing interfaces can be considered for prosthetic applications. For able-bodied users, on the other hand, a BMI would only be practical if conceived as an augmenting interface. In this paper, a method is introduced for identifying effective combinations of interfaces and devices for creating real-world applications. First, devices for domotics, rehabilitation, and assistive robotics, and their requirements in terms of throughput and latency, are described. Second, HMIs are classified and their performance is described, again in terms of throughput and latency. Device requirements are then matched against the performance of available interfaces. Simple rehabilitation and domotics devices can easily be controlled by means of BMI technology. Prosthetic hands and wheelchairs are suitable applications but do not attain optimal interactivity. Regarding humanoid robotics, the head and the trunk can be controlled by means of BMIs, while other parts require too much throughput. Robotic arms, which have been controlled by means of invasive cortical interfaces in animal studies, could be the next frontier for non-invasive BMIs. Combining smart controllers with BMIs could improve interactivity and boost BMI applications.
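
    The matching idea, pairing each device's throughput and latency requirements against the measured performance of candidate interfaces, amounts to a feasibility check. A minimal sketch; all names and numbers below are illustrative placeholders, not values from the paper:

```python
from dataclasses import dataclass

@dataclass
class Interface:
    name: str
    throughput_bps: float   # bits per second the interface can deliver
    latency_s: float        # delay between intent and issued command

@dataclass
class Device:
    name: str
    min_throughput_bps: float  # throughput the device needs to be usable
    max_latency_s: float       # largest tolerable command delay

def feasible_pairs(interfaces, devices):
    """Return (interface, device) pairs where performance meets requirements."""
    return [(i.name, d.name) for i in interfaces for d in devices
            if i.throughput_bps >= d.min_throughput_bps
            and i.latency_s <= d.max_latency_s]

# Illustrative values only: a noninvasive BCI vs. a domotics switch and a robotic arm.
interfaces = [Interface("P300 BCI", 0.4, 4.0), Interface("joystick", 20.0, 0.05)]
devices = [Device("light switch", 0.1, 10.0), Device("robotic arm", 5.0, 0.5)]
print(feasible_pairs(interfaces, devices))  # the BCI qualifies only for the switch
```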

    A low cost EEG based BCI prosthetic using motor imagery

    Brain-Computer Interfaces (BCIs) provide the opportunity to control external devices using brain ElectroEncephaloGram (EEG) signals. In this paper we propose two software frameworks for controlling a 5-degree-of-freedom robotic and prosthetic hand. Results are first presented where the Emotiv Cognitive Suite (the first framework), combined with an embedded software system (an open-source Arduino board), controls the hand through character input associated with the trained actions of the suite. This system provides evidence of the feasibility of brain signals as a viable approach to controlling the chosen prosthetic. Results are then presented for the second framework, which allows the training and classification of EEG signals for motor imagery tasks. When analysing the system, clear visual representations of performance and accuracy are presented using a confusion matrix, an accuracy measurement, and a feedback bar signifying signal strength. Experiments with various acquisition datasets were carried out, and a critical evaluation of the results is given. Finally, depending on the classification of the brain signal, a Python script outputs the driving command to the Arduino to control the prosthetic. The proposed architecture achieves overall good results for the design and implementation of an economically convenient BCI and prosthesis.
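
    The final step described, a Python script forwarding the classifier's decision to the Arduino, would typically run over a serial link. A minimal sketch using pySerial; the port name, baud rate, class labels, and single-byte command protocol are assumptions rather than details from the paper:

```python
import serial  # pySerial: pip install pyserial

# Hypothetical mapping from motor-imagery class labels to one-byte commands
# that the Arduino firmware would interpret as hand actions.
COMMANDS = {"open": b"O", "close": b"C", "rest": b"R"}

def send_command(port: serial.Serial, label: str) -> None:
    """Write the command byte for a classified motor-imagery label."""
    port.write(COMMANDS[label])

if __name__ == "__main__":
    # Port name and baud rate depend on the actual setup.
    with serial.Serial("/dev/ttyACM0", 9600, timeout=1) as arduino:
        send_command(arduino, "close")  # e.g. classifier decided "close hand"
```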

    Implementation of robotic arm control with Emotiv EPOC

    Brain-Computer Interfaces (BCIs) have opened up new hope for people suffering from severe motor disabilities, having no physical ability to act due to disease or injury of the central or peripheral nervous system. A BCI-based robotic arm movement control system is designed and implemented. The proposed system acquires data from the scalp of subjects through a group of sensors. The Emotiv EPOC, a commercially available EEG headset, is used to analyze the acquired EEG signals in real time. The signals are processed, and commands for different movements are issued accordingly, based on the characteristic patterns of various facial expressions, human emotions, and cognitive actions. The idea is to combine the user's intent with a robotic arm to achieve user-initiated motor movements.
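
    The control loop described, mapping detected facial expressions, emotions, or cognitive actions to arm movements, reduces to an event-to-command dispatch. A minimal sketch; the event labels and command names below are hypothetical and do not come from the Emotiv SDK:

```python
# Hypothetical event-to-command dispatch for the robotic arm. The event
# labels ("wink_left", "push", ...) stand in for whatever the headset's
# detection suite reports; they are not the Emotiv SDK's real identifiers.
ARM_COMMANDS = {
    "wink_left":  "rotate_base_left",
    "wink_right": "rotate_base_right",
    "push":       "extend_arm",      # cognitive action
    "pull":       "retract_arm",
    "smile":      "close_gripper",
}

def dispatch(event: str) -> str | None:
    """Translate a detected headset event into a robotic-arm command."""
    command = ARM_COMMANDS.get(event)
    if command is None:
        return None  # ignore unrecognized or low-confidence events
    return command

print(dispatch("push"))  # -> "extend_arm"
```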

    Empowering and assisting natural human mobility: the Simbiosis Walker

    This paper presents the complete development of the Simbiosis Smart Walker. The device is equipped with a set of sensor subsystems to acquire user-machine interaction forces and the temporal evolution of the user's feet during gait. The authors present an adaptive filtering technique used for the identification and separation of the different components found in the human-machine interaction forces. This technique allows isolating the components related to the navigational commands and developing a fuzzy logic controller to guide the device. The Smart Walker was clinically validated at the Spinal Cord Injury Hospital of Toledo, Spain, and was well accepted by spinal cord injury patients and clinical staff.
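
    The guidance step, a fuzzy logic controller fed by the navigational force components isolated by the adaptive filter, can be illustrated with a toy rule base. The membership functions, force range, and output turn rates below are generic placeholders, not the controller validated in the paper:

```python
def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function with feet at a, c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def steering_command(force_diff: float) -> float:
    """Map the left/right handle-force difference (N) to a turn rate (rad/s).

    Toy Mamdani-style controller: three fuzzy sets on the force difference
    (left, straight, right) and weighted-average defuzzification.
    """
    memberships = {
        "left":     tri(force_diff, -20.0, -10.0,  0.0),
        "straight": tri(force_diff, -10.0,   0.0, 10.0),
        "right":    tri(force_diff,   0.0,  10.0, 20.0),
    }
    turn_rates = {"left": -1.0, "straight": 0.0, "right": 1.0}
    total = sum(memberships.values())
    if total == 0.0:
        return 0.0  # outside the universe of discourse: no turn
    return sum(memberships[k] * turn_rates[k] for k in memberships) / total

print(steering_command(5.0))  # -> 0.5, a gentle right turn
```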

    EMG-based eye gesture recognition for hands-free interfacing

    This study investigates the use of an electromyography (EMG) based device to recognize five eye gestures and classify them for hands-free interaction with different applications. The eye gestures proposed in this work include long blinks, rapid blinks, wink right, wink left, and squints or frowns. The MUSE headband, originally a Brain-Computer Interface (BCI) device that measures electroencephalography (EEG) signals, is used in our study to record EMG signals from behind the earlobes via two smart rubber sensors and at the forehead via two other electrodes. The signals are treated as EMG because they reflect physical muscular activity, which other studies consider artifacts contaminating the EEG brain signals. The experiment was conducted on 15 participants (12 males and 3 females), recruited at random as no specific groups were targeted, and each session was videotaped for re-evaluation. The experiment starts with a calibration phase that records each gesture three times per participant through a voice-narration program developed to unify the test conditions and time intervals across all subjects. In this study, a dynamic sliding window with segmented packets is designed to process and analyze the data faster, as well as to provide more flexibility in classifying the gestures regardless of their duration from one user to another. Additionally, a thresholding algorithm is used to extract the features of all the gestures. The rapid blinks and the squints achieved high F1 scores of 80.77% and 85.71% with the trained thresholds, and 87.18% and 82.12% with the default or manually adjusted thresholds. The accuracies of the long blinks, rapid blinks, and wink left were relatively higher with the manually adjusted thresholds, while the squints and the wink right were better with the trained thresholds. Further improvements were proposed, and some were tested, especially after monitoring the participants' actions in the video recordings, to enhance the classifier. Most of the common irregularities encountered are discussed in this study so as to pave the way for similar future studies to address them before conducting experiments. Several applications need minimal physical or hand interaction, and this study was originally part of a project at the HCI Lab, University of Stuttgart, to enable hands-free switching between the RGB, thermal, and depth cameras integrated into an Augmented Reality device designed to increase firefighters' visual capabilities in the field.
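
    The detection core described here, a sliding window whose rectified amplitude is compared against per-gesture thresholds, can be sketched as follows. The window size, step, and threshold values are illustrative stand-ins, not the paper's trained thresholds:

```python
import numpy as np

def detect_gestures(emg: np.ndarray, fs: int,
                    thresholds: dict[str, float],
                    win_s: float = 0.25, step_s: float = 0.05):
    """Scan an EMG trace with a sliding window and flag threshold crossings.

    emg:        1-D EMG samples.
    fs:         sampling rate in Hz.
    thresholds: per-gesture amplitude thresholds (illustrative values).
    Returns a list of (time_s, gesture) detections.
    """
    win, step = int(win_s * fs), int(step_s * fs)
    detections = []
    for start in range(0, len(emg) - win, step):
        amplitude = np.abs(emg[start:start + win]).mean()  # mean rectified amplitude
        for gesture, thr in sorted(thresholds.items(), key=lambda kv: -kv[1]):
            if amplitude >= thr:
                detections.append((start / fs, gesture))
                break  # report only the strongest matching gesture per window
    return detections

# Illustrative thresholds in arbitrary units, not the paper's trained values.
thr = {"squint": 80.0, "long_blink": 50.0, "wink": 30.0}
signal = np.random.randn(2560) * 5          # stand-in for a 10 s trace at 256 Hz
print(detect_gestures(signal, fs=256, thresholds=thr))
```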

    Human Computer Interactions for Amyotrophic Lateral Sclerosis Patients


    Brain-computer interface for robot control with eye artifacts for assistive applications

    Human-robot interaction is a rapidly developing field, and robots are taking ever more active roles in our daily lives. Patient care is one of the fields in which robots are becoming more present, especially for people with disabilities. People with neurodegenerative disorders might not consciously or voluntarily produce movements other than those involving the eyes or eyelids. In this context, Brain-Computer Interface (BCI) systems present an alternative way to communicate or interact with the external world. To improve the lives of people with disabilities, this paper presents a novel BCI to control an assistive robot with the user's eye artifacts. In this study, eye artifacts that contaminate the electroencephalogram (EEG) signal are considered a valuable source of information thanks to their high signal-to-noise ratio and intentional generation. The proposed methodology detects eye artifacts from EEG signals through the characteristic shapes that occur during these events. Lateral eye movements are distinguished by their ordered peak-and-valley formation and the opposite phase of the signals measured at the F7 and F8 channels. To the best of the authors' knowledge, this is the first method to use this behavior to detect lateral eye movements. For blink detection, a double-thresholding method is proposed to catch weak blinks as well as regular ones, differentiating it from other algorithms in the literature that normally use only one threshold. Events detected in real time, with their virtual time stamps, are fed into a second algorithm to further distinguish double and quadruple blinks from single blinks by occurrence frequency. After testing the algorithm offline and in real time, it was implemented on the device. The resulting BCI was used to control an assistive robot through a graphical user interface. Validation experiments with 5 participants show that the developed BCI is able to control the robot.
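
    One plausible reading of the double-thresholding idea: a high threshold catches regular blinks, while a lower one recovers weak blinks that a single-threshold detector would miss. A minimal single-channel sketch; the threshold values, peak picking, and refractory period are our assumptions, not the paper's exact algorithm:

```python
import numpy as np

def detect_blinks(eeg: np.ndarray, low_thr: float, high_thr: float,
                  refractory: int = 64):
    """Classify blink peaks with two thresholds (illustrative logic).

    Peaks above high_thr count as regular blinks; peaks between low_thr
    and high_thr count as weak blinks. A refractory period (in samples)
    avoids counting one blink twice.
    """
    events, last = [], -refractory
    for i in range(1, len(eeg) - 1):
        is_peak = eeg[i] > eeg[i - 1] and eeg[i] >= eeg[i + 1]
        if not is_peak or eeg[i] < low_thr or i - last < refractory:
            continue
        kind = "regular" if eeg[i] >= high_thr else "weak"
        events.append((i, kind))
        last = i
    return events

# A crude synthetic trace: one strong and one weak blink-like bump.
t = np.arange(512)
trace = 120 * np.exp(-((t - 100) / 10.0) ** 2) + 50 * np.exp(-((t - 350) / 10.0) ** 2)
print(detect_blinks(trace, low_thr=30.0, high_thr=80.0))
# -> [(100, 'regular'), (350, 'weak')]
```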

    Neuroengineering Tools/Applications for Bidirectional Interfaces, Brain–Computer Interfaces, and Neuroprosthetic Implants – A Review of Recent Progress

    The main focus of this review is to provide a holistic, amalgamated overview of the most recent human in vivo techniques for implementing brain–computer interfaces (BCIs), bidirectional interfaces, and neuroprosthetics. Neuroengineering is providing new methods for tackling current difficulties; however, neuroprosthetics have been studied for decades. Recent progress permits the design of better systems with higher accuracy, repeatability, and robustness. Bidirectional interfaces integrate the recording and relaying of information from and to the brain for the development of BCIs. The concepts of non-invasive and invasive recording of brain activity are introduced, including classical and innovative techniques like electroencephalography and near-infrared spectroscopy. The problem of gliosis and solutions for (semi-)permanent implant biocompatibility, such as innovative implant coatings, materials, and shapes, are then discussed. Implant powering and data transmission through implanted pulse generators and wireless telemetry are considered. How sensation can be relayed back to the brain to improve integration of the neuroengineered systems with the body, by methods such as micro-stimulation and transcranial magnetic stimulation, is then addressed. The neuroprosthetics section discusses some of the various types and how they operate. Visual prosthetics are discussed, and the three types, classified by implant location, are examined. Auditory prosthetics, whether cochlear or cortical, are then addressed, followed by replacement hand and limb prosthetics. The review concludes with sections on the control of wheelchairs, computers, and robots directly from brain activity as recorded by non-invasive and invasive techniques.