
    Brain–computer interface and assist-as-needed model for upper limb robotic arm

    https://journals.sagepub.com/doi/10.1177/1687814019875537
    Post-stroke paralysis, whereby subjects lose voluntary control over muscle actuation, is one of the main causes of disability. Repetitive physical therapy can restore lost motion and strength through neuroplasticity. However, manually delivered therapies are becoming ineffective due to a scarcity of therapists, subjectivity in treatment, and a lack of patient motivation. Robot-assisted physical therapy is being researched to deliver evidence-based, systematic treatment. Recently, intelligent controllers and brain–computer interfaces have been proposed for rehabilitation robots to encourage patient participation, which is key to quick recovery. In the present work, a brain–computer interface and an assist-as-needed training paradigm are proposed for an upper limb rehabilitation robot. The brain–computer interface is implemented with an electroencephalography (EEG) sensor, and backdrivability in the actuator is achieved through the assist-as-needed control approach, which allows subjects to move the robot actively using their limited motion and strength. The robot assists only over the remaining portion of the trajectory that subjects cannot perform themselves. The robot's intervention point is obtained from the patient's intent, captured through the brain–computer interface. Problems encountered during the practical implementation of the brain–computer interface and the achievement of backdrivability in the actuator are discussed and resolved.
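    The assist-as-needed idea above can be sketched as a simple assistance law. This is a toy illustration, not the paper's controller; the gain and deadband values are invented. The robot applies torque only when the subject's tracking error exceeds a tolerance, so a subject who follows the trajectory unaided feels a backdrivable, unassisted robot.

```python
# Toy assist-as-needed sketch (gain and deadband are assumed values):
# assistance is proportional to the tracking error beyond a no-assist band.
def assist_torque(target, actual, k=5.0, deadband=0.05):
    """Proportional assistance outside a no-assist deadband (rad in, N*m out)."""
    error = target - actual
    if abs(error) <= deadband:
        return 0.0                       # subject is close enough: no assist
    sign = 1.0 if error > 0 else -1.0
    return k * (abs(error) - deadband) * sign

print(assist_torque(0.50, 0.48))   # within tolerance: robot stays passive
print(assist_torque(0.50, 0.30))   # subject falls behind: robot assists
```

    In a real controller the deadband would typically shrink as the patient improves, gradually withdrawing assistance.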

    Robot Motion Control Using the Emotiv EPOC EEG System

    Brain-computer interfaces have been explored for years with the intent of using human thoughts to control mechanical systems. By capturing signals directly from the human brain via electroencephalography (EEG), human thoughts can be translated into motion commands for a robot. This paper presents a prototype for an EEG-based, brain-actuated robot control system using mental commands. In this study, Linear Discriminant Analysis (LDA) and Support Vector Machine (SVM) methods were combined to establish the best model. Datasets containing features of EEG signals were obtained from the subject non-invasively using an Emotiv EPOC headset. The best model was then used by the Brain-Computer Interface (BCI) to classify the EEG signals into motion commands that control the robot directly. The classification achieved an average accuracy of 69.06%.
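    One common way to combine LDA and SVM, which may or may not match the paper's exact setup, is to use LDA for supervised dimensionality reduction and an SVM for the final decision. The sketch below runs on synthetic "EEG feature" vectors standing in for Emotiv EPOC recordings; the dimensions and class separation are invented.

```python
# Illustrative sketch (not the paper's code): LDA projection followed by an
# SVM classifier, evaluated with cross-validation on synthetic features.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_per_class, n_features = 60, 14 * 4     # e.g. 14 channels x 4 band powers
# Two synthetic mental-command classes with slightly shifted means.
X = np.vstack([rng.normal(0.0, 1.0, (n_per_class, n_features)),
               rng.normal(0.6, 1.0, (n_per_class, n_features))])
y = np.array([0] * n_per_class + [1] * n_per_class)

model = make_pipeline(LinearDiscriminantAnalysis(n_components=1),
                      SVC(kernel="rbf"))
scores = cross_val_score(model, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

    With two classes, LDA can project onto at most one discriminant axis, so `n_components=1` is the natural choice here.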

    Brain-Computer Interface meets ROS: A robotic approach to mentally drive telepresence robots

    This paper presents and evaluates a novel approach to integrating a non-invasive Brain-Computer Interface (BCI) with the Robot Operating System (ROS) to mentally drive a telepresence robot. Controlling a mobile device using human brain signals might improve the quality of life of people with severe physical disabilities or elderly people who can no longer move. The BCI user can thus actively interact with relatives and friends located in different rooms thanks to a video streaming connection to the robot. To facilitate control of the robot via BCI, we explore new ROS-based algorithms for navigation and obstacle avoidance, making the system safer and more reliable. The robot exploits two maps of the environment, one for localization and one for navigation, and both can also be used by the BCI user to monitor the robot's position while it is moving. As demonstrated by the experimental results, the user's cognitive workload is reduced, decreasing the number of commands necessary to complete the task and helping them maintain attention for longer periods of time. Comment: Accepted in the Proceedings of the 2018 IEEE International Conference on Robotics and Automation.
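    A minimal sketch of the BCI-to-ROS bridge idea, assuming nothing beyond the abstract: discrete decoded commands are mapped to velocity set-points. The command names and speed values are invented; in an actual ROS node the dictionary would be a `geometry_msgs/Twist` message published on a topic such as `/cmd_vel`.

```python
# Hypothetical mapping (not the authors' code) from decoded BCI commands
# to ROS-style Twist fields; plain dicts stand in for Twist messages.
TWIST_FOR = {
    "forward": {"linear_x": 0.3, "angular_z": 0.0},
    "left":    {"linear_x": 0.0, "angular_z": 0.5},
    "right":   {"linear_x": 0.0, "angular_z": -0.5},
    "stop":    {"linear_x": 0.0, "angular_z": 0.0},
}

def bci_to_twist(command):
    """Translate a decoded BCI command into velocity set-points."""
    # Unknown or misclassified commands fall back to a safe stop.
    return TWIST_FOR.get(command, TWIST_FOR["stop"])

print(bci_to_twist("left"))
```

    Defaulting to "stop" on unrecognized input is one simple way to keep a shared-control telepresence robot safe when the classifier is uncertain.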

    Brain Computer Interface for Gesture Control of a Social Robot: an Offline Study

    A brain-computer interface (BCI) offers promising applications in neuroprosthesis and neurorehabilitation by controlling computers and robotic devices based on the patient's intentions. Here, we have developed a novel BCI platform that controls a personalized social robot using noninvasively acquired brain signals. Scalp electroencephalogram (EEG) signals are collected from a user in real time during tasks of imagined movement. The imagined body kinematics are decoded using a regression model to calculate the user-intended velocity. The decoded kinematic information is then mapped to control the gestures of a social robot. The platform may be combined with neurofeedback mechanisms into a human-robot-interaction framework to enhance the cognitive capability of persons with dementia. Comment: Presented in: 25th Iranian Conference on Electrical Engineering (ICEE).
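    The decoding step described above, regressing EEG features onto an intended velocity and then mapping velocity to a gesture, can be sketched as follows. All data, dimensions, thresholds, and gesture names are invented for illustration; the paper's actual regression model may differ.

```python
# Sketch of imagined-kinematics decoding (assumptions, not the paper's code):
# ridge regression maps EEG feature vectors to a 2-D intended velocity.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
n_samples, n_features = 200, 32
W_true = rng.normal(size=(n_features, 2))            # simulated mixing matrix
X = rng.normal(size=(n_samples, n_features))         # EEG features per step
V = X @ W_true + 0.1 * rng.normal(size=(n_samples, 2))  # (vx, vy) targets

decoder = Ridge(alpha=1.0).fit(X, V)
v = decoder.predict(X[:1])[0]                        # decoded velocity

def to_gesture(vx, vy, thresh=0.5):
    """Map a decoded velocity to a coarse social-robot gesture (toy mapping)."""
    if abs(vx) < thresh and abs(vy) < thresh:
        return "rest"
    return "wave" if abs(vx) >= abs(vy) else "nod"

print(to_gesture(*v))
```

    In a closed-loop system this prediction would run on each new EEG window, with the gesture command streamed to the robot.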

    Brain Computer Interface for Micro-controller Driven Robot Based on Emotiv Sensors

    A Brain Computer Interface (BCI) is developed to navigate a micro-controller based robot using Emotiv sensors. The BCI system has a pipeline of five stages: signal acquisition, pre-processing, feature extraction, classification, and CUDA interfacing. It serves as a prototype to aid the physical movement of neurological patients who are unable to control their muscular movements. All stages of the pipeline are designed to process bodily actions, such as eye blinks, into navigation commands for the robot. The prototype relies on feature-learning and classification techniques using a support vector machine. The suggested pipeline ensures successful navigation of the robot in four directions in real time with an accuracy of 93 percent.
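    The first four pipeline stages can be illustrated on synthetic data. Eye blinks show up in frontal EEG as large-amplitude deflections, so a peak-amplitude feature plus an SVM is enough to separate blink from non-blink windows in this toy version; the sampling rate, amplitudes, and command mapping are all invented, and the CUDA-interfacing stage and real Emotiv acquisition are omitted.

```python
# Toy 4-stage sketch (acquire -> preprocess -> features -> classify);
# not the paper's implementation.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(2)

def make_window(blink):
    """Stage 1 stand-in: one second of synthetic EEG at 128 Hz."""
    sig = rng.normal(0, 1, 128)
    if blink:
        sig[40:60] += 8.0                    # blink artifact: large deflection
    return sig

def preprocess(sig):
    """Stage 2: remove the DC offset."""
    return sig - sig.mean()

def features(sig):
    """Stage 3: peak amplitude and variance of the window."""
    return [np.max(np.abs(sig)), np.var(sig)]

X, y = [], []
for label in (0, 1):
    for _ in range(50):
        X.append(features(preprocess(make_window(bool(label)))))
        y.append(label)
clf = SVC().fit(X, y)                        # Stage 4: SVM classifier

COMMANDS = {0: "stop", 1: "forward"}         # toy blink -> command mapping
print(COMMANDS[int(clf.predict([features(preprocess(make_window(True)))])[0])])
```

    A real four-direction system would need a richer action vocabulary (e.g. blink counts or additional facial actions) rather than this binary mapping.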

    Brain-Controlled Multi-Robot at Servo-Control Level Based on Nonlinear Model Predictive Control

    Using a brain-computer interface (BCI) rather than limbs to control multiple robots (i.e., brain-controlled multi-robots) can better assist people with disabilities in daily life than a single brain-controlled robot. For example, a person with disabilities can move using a brain-controlled wheelchair (the leader robot) while follower robots simultaneously transport objects. In this paper, we explore for the first time how to control the direction, speed, and formation of a brain-controlled multi-robot system (consisting of leader and follower robots) and propose a novel multi-robot predictive control framework (MRPCF) that can track users' control intents while ensuring the safety of multiple robots. The MRPCF consists of a leader controller, a follower controller, and a formation planner. We build a complete brain-controlled multi-robot physical system for the first time and test the proposed system through human-in-the-loop experiments. The experimental results indicate that the proposed system can track users' direction, speed, and formation control intents while guaranteeing the safety of multiple robots. This paper can promote the study of brain-controlled robots and multi-robot systems and offer new perspectives on human-machine collaboration and integration.
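    The follower-controller idea, predictive control that holds a formation offset behind a moving leader, can be sketched in a heavily simplified form. This is not the MRPCF: the robots are single integrators rather than nonlinear vehicle models, the horizon and weights are invented, and safety constraints are omitted.

```python
# Minimal receding-horizon sketch: at each step the follower optimizes its
# next H velocity inputs to minimize predicted formation error.
import numpy as np
from scipy.optimize import minimize

DT, H = 0.1, 5                       # time step and prediction horizon
OFFSET = np.array([-1.0, 0.0])       # desired follower offset behind leader

def mpc_step(follower, leader, leader_v):
    """Return the follower velocity minimizing predicted formation error."""
    def cost(u_flat):
        u = u_flat.reshape(H, 2)
        f, l, c = follower.copy(), leader.copy(), 0.0
        for k in range(H):
            f = f + DT * u[k]                  # follower: single integrator
            l = l + DT * leader_v              # assume constant leader velocity
            c += np.sum((f - (l + OFFSET))**2) + 0.01 * np.sum(u[k]**2)
        return c
    res = minimize(cost, np.zeros(2 * H), method="L-BFGS-B")
    return res.x[:2]                           # apply only the first input

follower = np.array([-2.0, 0.5])
leader, leader_v = np.array([0.0, 0.0]), np.array([0.4, 0.0])
for _ in range(40):                            # closed-loop simulation
    follower = follower + DT * mpc_step(follower, leader, leader_v)
    leader = leader + DT * leader_v
print(np.round(follower - (leader + OFFSET), 2))  # residual formation error
```

    Applying only the first optimized input and re-solving at every step is the standard receding-horizon pattern that the MRPCF's nonlinear model predictive controllers also follow.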

    Acceptability Study of A3-K3 Robotic Architecture for a Neurorobotics Painting

    In this paper, the authors present a novel architecture for controlling an industrial robot via a Brain Computer Interface. The robot used is a Series 2000 KR 210-2. The robotic arm was fitted with DI drawing devices that clamp, hold, and manipulate various artistic media such as brushes, pencils, and pens. The user selected a high-level task, for instance a shape or movement, using a human-machine interface, and the translation into robot movement was entirely delegated to the Robot Control Architecture, which defined a plan to accomplish the user's task. The architecture comprised a Human Machine Interface based on a P300 Brain Computer Interface and a robotic architecture composed of a deliberative layer and a reactive layer that translate the user's high-level command into a stream of movements for the robot's joints. To create a real-case scenario, the architecture was presented at the Ars Electronica Festival, where the A3-K3 architecture was used for painting. Visitors completed a survey addressing four self-assessed dimensions related to human-robot interaction: technology knowledge, personal attitude, innovativeness, and satisfaction. The obtained results led to further exploration of the boundaries of human-robot interaction, highlighting the possibilities of human expression in the process of interacting with a machine to create art.
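    The P300-based selection underlying the Human Machine Interface can be sketched on synthetic signals: each candidate task is flashed repeatedly, epochs are averaged, and the candidate whose average shows the largest positive deflection around 300 ms is selected. Task names, sampling rate, and amplitudes below are invented.

```python
# Hedged sketch of P300 target selection by epoch averaging; synthetic data.
import numpy as np

rng = np.random.default_rng(3)
FS, EPOCH = 100, 60                 # 100 Hz, 600 ms epochs
TASKS = ["circle", "line", "spiral"]
TARGET = 2                          # user attends to "spiral"

def epoch(is_target):
    """One post-flash epoch; targets get a positive bump near 300 ms."""
    sig = rng.normal(0, 1, EPOCH)
    if is_target:
        sig[25:35] += 3.0           # P300 component at 250-350 ms
    return sig

# Average 20 flashes per candidate and score the 250-350 ms window.
scores = []
for i in range(len(TASKS)):
    avg = np.mean([epoch(i == TARGET) for _ in range(20)], axis=0)
    scores.append(avg[25:35].mean())
selected = TASKS[int(np.argmax(scores))]
print(selected)
```

    Averaging is what makes P300 interfaces robust: single-trial noise (standard deviation 1 here) shrinks with the square root of the number of flashes, while the evoked bump does not.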

    Brain-Computer Interface meets ROS: A robotic approach to mentally drive telepresence robots

    This paper presents and evaluates a novel approach to integrating a non-invasive Brain-Computer Interface (BCI) with the Robot Operating System (ROS) to mentally drive a telepresence robot. Controlling a mobile device using human brain signals might improve the quality of life of people with severe physical disabilities or elderly people who can no longer move. The BCI user can thus actively interact with relatives and friends located in different rooms thanks to a video streaming connection to the robot. To facilitate control of the robot via BCI, we explore new ROS-based algorithms for navigation and obstacle avoidance that make the system safer and more reliable. The robot exploits two maps of the environment, one for localization and one for navigation, and both are used as additional visual feedback for the BCI user to control the robot's position. Experimental results show a decrease in the number of commands needed to complete the navigation task, suggesting a reduction in the user's cognitive workload. The novelty of this work is to provide first evidence of an integration between BCI and ROS that can simplify and foster the development of software for BCI-driven robotic devices.

    Brain-Computer Interface: comparison of two control modes to drive a virtual robot

    Get PDF
    A Brain-Computer Interface (BCI) is a system that enables communication and control based not on muscular movements but on brain activity. Some of these systems are based on the discrimination of different mental tasks; usually they match the number of mental tasks to the number of control commands. Previous research at the University of Málaga (UMA-BCI) has proposed a BCI system to freely control an external device, letting subjects choose among several navigation commands using only one active mental task (versus any other mental activity). Although the navigation paradigm proposed in this system has proved useful for continuous movements, if the user wants to move a medium or large distance, he/she needs to sustain the effort of the motor imagery (MI) task in order to maintain the command. The aim of this work was therefore to test a navigation paradigm based on a brain-switch mode for the 'forward' command. In this mode, subjects used the mental task to toggle their movement state: they stopped if they were moving forward, and started moving forward if they were stopped. Initially, twelve healthy and untrained subjects participated in this study, but due to a lack of control in previous sessions, only four subjects took part in the experiment, in which they had to control a virtual robot using two paradigms: one based on the continuous mode and another based on the switch mode. Preliminary results show that both paradigms can be used to navigate through virtual environments, although with the first one the times needed to complete a path were notably lower.
    Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech
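    The contrast between the two control modes can be sketched as two tiny controllers driven by a stream of per-window MI detections. This is a toy model with invented detection values: in continuous mode the robot moves only while MI is detected; in switch mode each MI onset toggles the forward state, so the user can relax during long movements.

```python
# Toy sketch of the two navigation paradigms compared in the study.
def continuous_mode(mi_active):
    """Continuous mode: move forward only while the MI task is sustained."""
    return "forward" if mi_active else "stop"

class SwitchMode:
    """Brain-switch mode: each MI onset toggles the forward state."""
    def __init__(self):
        self.moving = False
        self.prev = False
    def step(self, mi_active):
        if mi_active and not self.prev:   # rising edge of an MI detection
            self.moving = not self.moving
        self.prev = mi_active
        return "forward" if self.moving else "stop"

sw = SwitchMode()
stream = [False, True, True, False, False, True, False]   # MI detections
print([continuous_mode(s) for s in stream])
print([sw.step(s) for s in stream])
```

    On the same detection stream the continuous controller stops whenever MI lapses, while the switch controller keeps moving between the two MI onsets, which is exactly the trade-off the study evaluates against completion time.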