
    Biosignal‐based human–machine interfaces for assistance and rehabilitation : a survey

    As a definition, a Human–Machine Interface (HMI) enables a person to interact with a device. Starting from elementary equipment, the recent development of novel techniques and unobtrusive devices for biosignal monitoring has paved the way for a new class of HMIs, which take such biosignals as inputs to control various applications. The current survey reviews the large literature of the last two decades on biosignal‐based HMIs for assistance and rehabilitation, to outline the state of the art and identify emerging technologies and potential future research trends. PubMed and other databases were surveyed using specific keywords. The retrieved studies were screened at three levels (title, abstract, full text), and eventually 144 journal papers and 37 conference papers were included. Four macrocategories were used to classify the biosignals employed for HMI control: biopotentials, muscle mechanical motion, body motion, and their combinations (hybrid systems). The HMIs were also classified by target application into six categories: prosthetic control, robotic control, virtual reality control, gesture recognition, communication, and smart environment control. A steadily growing number of publications has been observed in recent years. Most of the studies (about 67%) pertain to the assistive field, while 20% relate to rehabilitation and 13% to both assistance and rehabilitation. A moderate increase can be observed in studies on robotic control, prosthetic control, and gesture recognition over the last decade, whereas studies on the other targets saw only a small increase. Biopotentials are no longer the leading control signals, and the use of muscle mechanical motion signals has risen considerably, especially in prosthetic control. Hybrid technologies are promising, as they could lead to higher performance; however, they also increase an HMI's complexity, so their usefulness should be carefully evaluated for the specific application.

    Human-machine interfaces based on EMG and EEG applied to robotic systems

    Background: Two different Human-Machine Interfaces (HMIs) were developed, both based on electro-biological signals: one based on the EMG signal and the other on the EEG signal. Two major features of such interfaces are their relatively simple data acquisition and processing systems, which require only modest hardware and software resources, making them computationally and financially low-cost solutions. Both interfaces were applied to robotic systems, and their performances are analyzed here. The EMG-based HMI was tested on a mobile robot, while the EEG-based HMI was tested on both a mobile robot and a robotic manipulator. Results: Experiments with the EMG-based HMI were carried out by eight individuals, each asked to perform ten blinks with each eye, in order to test the eye-blink detection algorithm. An average correct-detection rate of about 95% for individuals able to blink both eyes supports the conclusion that the system can be used to command devices. Experiments with EEG involved 25 people (some of whom had suffered from meningitis or epilepsy). All of them managed to operate the HMI after only one training session, and most learned to use it in less than 15 minutes; the minimum and maximum training times observed were 3 and 50 minutes, respectively. Conclusion: These works are the initial parts of a system to help people with neuromotor diseases, including those with severe dysfunctions. The next steps are to convert a commercial wheelchair into an autonomous mobile vehicle, to implement the HMI onboard that wheelchair to assist people with motor diseases, and to explore the potential of EEG signals, making the EEG-based HMI more robust and faster, with the aim of helping individuals with severe motor dysfunctions.
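
    As a rough illustration of the eye-blink detection step tested above, the sketch below implements a generic EMG blink detector (bandpass filtering, rectification, envelope smoothing, threshold crossing). The sampling rate, filter band, threshold, and refractory period are illustrative assumptions; the abstract does not specify the authors' actual processing chain.

        # Minimal EMG eye-blink detector sketch: bandpass -> rectify ->
        # envelope -> threshold. All parameters are assumed, not taken
        # from the paper.
        import numpy as np
        from scipy.signal import butter, filtfilt

        def detect_blinks(emg, fs=1000.0, band=(10.0, 100.0),
                          threshold=4.0, refractory_s=0.3):
            """Return sample indices where eye-blink bursts start."""
            # Bandpass to isolate the EMG burst produced by a blink.
            b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)],
                          btype="band")
            filtered = filtfilt(b, a, np.asarray(emg, dtype=float))
            # Full-wave rectification and a 50 ms moving-average envelope.
            win = int(0.05 * fs)
            envelope = np.convolve(np.abs(filtered),
                                   np.ones(win) / win, mode="same")
            # A burst is the envelope exceeding a multiple of the baseline.
            above = envelope > threshold * np.median(envelope)
            # Report only the first crossing of each burst.
            blinks, last = [], -np.inf
            for i in np.flatnonzero(above):
                if i - last > refractory_s * fs:
                    blinks.append(int(i))
                last = i
            return blinks

    A detector along these lines would run on the EMG channel of each eye, with the resulting blink events mapped to discrete robot commands.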

    Tongue Control of Upper-Limb Exoskeletons For Individuals With Tetraplegia


    Unconventional Cognitive Intelligent Robotic Control: Quantum Soft Computing Approach in Human Being Emotion Estimation -- QCOptKB Toolkit Application

    A strategy for intelligent cognitive control systems based on quantum and soft computing is presented. A synergetic effect, in which a self-organizing quantum knowledge base is extracted from the imperfect knowledge bases of intelligent fuzzy controllers, is described. This technology improves the robustness of intelligent cognitive control systems in hazardous control situations and is illustrated with a cognitive neuro-interface and different types of robot cooperation. Examples demonstrate quantum fuzzy inference gate design as a ready-made programmable algorithmic solution for on-board embedded control systems. The possibility of a neuro-interface application, based on a cognitive helmet with a quantum fuzzy controller, for driving a vehicle is shown.

    Autonomous Grasping of 3-D Objects by a Vision-Actuated Robot Arm using Brain-Computer Interface

    A major drawback of Brain–Computer Interface-based robotic manipulation is the complex trajectory planning of the robot arm that the user must carry out to reach and grasp an object. The present paper proposes an intelligent solution to this problem by incorporating a novel Convolutional Neural Network (CNN)-based grasp detection network that enables the robot to reach and grasp the desired object (including overlapping objects) autonomously using an RGB-D camera. This network performs simultaneous object and grasp detection to affiliate each estimated grasp with its corresponding object. The subject uses motor imagery brain signals to control the pan and tilt angles of an RGB-D camera mounted on a robot link, bringing the desired object into its field of view as presented on a display screen, while the objects appearing on the screen are selected using the P300 brain pattern. The robot uses inverse kinematics, together with the RGB-D camera information, to autonomously reach the selected object, which is then grasped using the proposed grasping strategy. The overall BCI system significantly outperforms comparable systems that involve manual trajectory planning. The overall accuracy, steady-state error, and settling time of the proposed system are 93.4%, 0.05%, and 15.92 s, respectively. The system also shows a significant reduction in operator workload compared with approaches that rely on manual trajectory planning for reaching and grasping.
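
    The camera-pointing stage lends itself to a short control-loop sketch: each decoded motor-imagery class nudges the pan and tilt angles of the camera until the desired object enters the field of view. The class labels, angular step, and joint limits below are illustrative assumptions; the abstract does not detail the decoder's outputs or the robot interface.

        # Sketch of the motor-imagery camera-pointing loop. One decoded MI
        # class updates the pan/tilt pose by a fixed increment, clamped to
        # assumed joint limits. Labels, step, and limits are hypothetical.
        from dataclasses import dataclass

        @dataclass
        class PanTilt:
            pan: float = 0.0   # degrees
            tilt: float = 0.0  # degrees

        STEP = 2.0  # assumed angular increment per decoded command
        PAN_LIMITS, TILT_LIMITS = (-90.0, 90.0), (-45.0, 45.0)

        def apply_command(pose: PanTilt, mi_class: str) -> PanTilt:
            """Apply one decoded motor-imagery class to the camera pose."""
            if mi_class == "left":
                pose.pan = max(PAN_LIMITS[0], pose.pan - STEP)
            elif mi_class == "right":
                pose.pan = min(PAN_LIMITS[1], pose.pan + STEP)
            elif mi_class == "up":
                pose.tilt = min(TILT_LIMITS[1], pose.tilt + STEP)
            elif mi_class == "down":
                pose.tilt = max(TILT_LIMITS[0], pose.tilt - STEP)
            # Any other class (e.g. rest) leaves the pose unchanged.
            return pose

    Once the object is visible on the display, selection switches to the P300 paradigm, and the reach-and-grasp itself runs autonomously from the CNN grasp detection and inverse kinematics.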

    Rehabilitation Technologies: Biomechatronics Point of View


    Human Machine Interfaces for Teleoperators and Virtual Environments

    In March 1990, a meeting was held on the general theme of teleoperation research into virtual environment display technology. This is a collection of conference-related fragments that gives a glimpse of the following fields and how they interplay: sensorimotor performance; human-machine interfaces; teleoperation; virtual environments; performance measurement and evaluation methods; and design principles and predictive models.