7 research outputs found

    Biosignal‐based human–machine interfaces for assistance and rehabilitation : a survey

    As a definition, a Human–Machine Interface (HMI) enables a person to interact with a device. Starting from elementary equipment, the recent development of novel techniques and unobtrusive devices for biosignal monitoring paved the way for a new class of HMIs, which take such biosignals as inputs to control various applications. The current survey reviews the large literature of the last two decades on biosignal-based HMIs for assistance and rehabilitation, in order to outline the state of the art and identify emerging technologies and potential future research trends. PubMed and other databases were searched using specific keywords. The retrieved studies were further screened at three levels (title, abstract, full text), and eventually 144 journal papers and 37 conference papers were included. Four macrocategories were considered to classify the different biosignals used for HMI control: biopotentials, muscle mechanical motion, body motion, and their combinations (hybrid systems). The HMIs were also classified according to their target application, considering six categories: prosthetic control, robotic control, virtual reality control, gesture recognition, communication, and smart environment control. An ever-growing number of publications has been observed over recent years. Most of the studies (about 67%) pertain to the assistive field, while 20% relate to rehabilitation and 13% to both assistance and rehabilitation. A moderate increase can be observed in studies focusing on robotic control, prosthetic control, and gesture recognition in the last decade, whereas studies on the other targets experienced only a small increase. Biopotentials are no longer the leading control signals, and the use of muscle mechanical motion signals has risen considerably, especially in prosthetic control. Hybrid technologies are promising, as they could lead to higher performance.
However, they also increase HMIs' complexity, so their usefulness should be carefully evaluated for the specific application.

    Development of a Non-Invasive Brain-Computer Interface Based on Steady-State Visual Evoked Potentials Applied to Alternative Communication and a Telepresence Robot

    A portion of the population consists of people affected by diseases or by severe accidents that leave them unable to interact and communicate. New technologies have emerged to provide these people with an alternative communication channel through brain signals. These systems are known as Brain-Computer Interfaces (BCIs). This work describes the development of a BCI based on the Steady-State Visual Evoked Potential (SSVEP) paradigm, applied to alternative communication and a telepresence robot. The interface was built around four selection commands triggered by visual stimuli, implemented in software with the OpenGL graphics library and flickering at distinct frequencies (5.6 Hz, 6.4 Hz, 6.9 Hz, and 8.0 Hz). All volunteers evaluated in the online tests completed the proposed tasks with a mean accuracy of 88.3% ± 5.4%, a classification time of 5.6 s ± 0.5 s, and a mean ITR of 14.2 bits/min ± 3.5 bits/min, requiring no training and using only one channel for electroencephalographic signal acquisition. The results demonstrated the feasibility of building a BCI that can be used in future assistive-technology projects developed at the Laboratório de Automação Inteligente of the Universidade Federal do Espírito Santo (LAI-UFES)
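The reported figures (4 targets, 88.3% accuracy, 5.6 s per selection) can be checked with the standard Wolpaw information transfer rate formula; a minimal sketch, assuming the paper uses this common definition rather than a variant:

```python
import math

def wolpaw_itr(n_targets: int, accuracy: float, selection_time_s: float) -> float:
    """Wolpaw ITR in bits/min.

    bits/selection = log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1))
    """
    p, n = accuracy, n_targets
    bits = (math.log2(n)
            + p * math.log2(p)
            + (1 - p) * math.log2((1 - p) / (n - 1)))
    return bits * 60.0 / selection_time_s

# Figures reported in the abstract above
print(round(wolpaw_itr(4, 0.883, 5.6), 1))  # → 13.9
```

The result (~13.9 bits/min) is consistent with the reported mean of 14.2 bits/min, given the stated standard deviations.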

    Investigation of a Hybrid Brain-Computer Interface (BCI) for Optimal Design as a Human-Machine Interface

    People with partial or complete paralysis are usually dependent on nursing staff to cope with their daily activities. Alternatively, an assistive robotic system can substitute for the lost motor skills and improve quality of life. With the help of a human-machine interface, the user can operate the system and initiate autonomous processes. If the system includes a robot arm, the disabled person is able to grab and move objects. In the case of an error caused by the system, the user should be able to control the robot arm directly. With the interface developed in this work, a human can open or close the robot gripper as well as move it into any position and orientation. Since the target group of the system is people with very limited body movements, a Brain-Computer Interface (BCI) is used for communication. It establishes a communication channel between the human brain and the controlled robot arm. A special helmet is used for recording the brain signals, which, in addition to the measuring electrodes, carries a stimulator in the peripheral field of vision of the user. This stimulator provides four stimuli that evoke event-related and visually evoked potentials. By detecting Steady State Visual Evoked Potentials (SSVEPs) and P300 potentials, the system can determine which stimulus the user has focused on. This gives the user four discrete commands to control the robot, while motor imagery is used to switch the stimulator on and off. A state machine with a control group for each translational and rotational movement is used to control the robot. Two modes have been implemented: a stepwise control, in which the gripper performs discrete steps, and a speed-based control, in which the movement of the gripper is started and stopped. Switching between groups is performed by sending two commands in sequence.
In each group, the user can perform a movement in the positive and negative direction of the selected axis, change the step width or speed, and leave the group. Stopping at the target position or in dangerous situations is triggered by closing the eyes, exploiting the resulting artifacts and alpha waves in the signals. Both control concepts were tested in a study and evaluated with objective and subjective criteria. The experiments showed that BCI-based robot arm control is feasible and useful.
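The group-based stepwise control described above can be sketched as a small state machine. This is an illustrative sketch only: the abstract command names (CMD1–CMD4), the axis ordering, the step widths, and using a double-send of the fourth command to switch groups are assumptions, not the thesis' exact mapping.

```python
# Illustrative state machine for stepwise gripper control with four
# abstract BCI commands. Axis names, step sizes, and the double-send
# group switch are assumptions for demonstration purposes.

GROUPS = ["x", "y", "z", "roll", "pitch", "yaw"]

class GripperControl:
    def __init__(self):
        self.group = 0                       # index of the active axis group
        self.step = 1.0                      # step width (cm or degrees)
        self.pose = {axis: 0.0 for axis in GROUPS}
        self._pending_switch = False         # first half of a double-send

    def command(self, cmd: str) -> None:
        axis = GROUPS[self.group]
        if cmd == "CMD1":                    # move in positive direction
            self.pose[axis] += self.step
        elif cmd == "CMD2":                  # move in negative direction
            self.pose[axis] -= self.step
        elif cmd == "CMD3":                  # toggle step width
            self.step = 0.5 if self.step == 1.0 else 1.0
        elif cmd == "CMD4":                  # sent twice in a row: next group
            if self._pending_switch:
                self.group = (self.group + 1) % len(GROUPS)
                self._pending_switch = False
            else:
                self._pending_switch = True
            return
        self._pending_switch = False         # any other command cancels switch

ctrl = GripperControl()
for c in ["CMD1", "CMD1", "CMD4", "CMD4", "CMD2"]:
    ctrl.command(c)
print(ctrl.pose["x"], GROUPS[ctrl.group], ctrl.pose["y"])  # → 2.0 y -1.0
```

Requiring two sequential commands to leave a group mirrors the sequential-send switching in the abstract and guards against a single misclassified command changing the control mode.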

    sBCI-Headset—Wearable and Modular Device for Hybrid Brain-Computer Interface

    Severely disabled people, such as completely paralyzed persons with tetraplegia or similar disabilities who cannot use their arms and hands, are often considered a user group of Brain-Computer Interfaces (BCI). In order to achieve high acceptance of the BCI by this user group and their supporters, the BCI system has to be integrated into their support infrastructure. Critical disadvantages of a BCI are the time-consuming preparation of the user for the electroencephalography (EEG) measurements and the low information transfer rate of EEG-based BCIs. These disadvantages become apparent when a BCI is used to control complex devices. In this paper, a hybrid BCI is described that enables research toward a Human-Machine Interface (HMI) optimally adapted to the requirements of the user and the tasks to be carried out. The solution is based on the integration of a steady-state visual evoked potential (SSVEP) BCI, an event-related (de)synchronization (ERD/ERS) BCI, an eye tracker, an environmental observation camera, and a new EEG head cap designed for wearing comfort and easy preparation. The design of the new fast multimodal BCI (called sBCI) system is described, and first test results, obtained in experiments with six healthy subjects, are presented. The sBCI concept may also become useful for healthy people in cases where hands-free handling of devices is necessary.
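A core building block of the SSVEP-BCI component described above is deciding which flickering stimulus the user attends to. A minimal sketch of one common approach, comparing spectral power at the candidate stimulus frequencies: the sampling rate, the frequency set, and the single-channel setup are assumptions for the demo, not details taken from the sBCI paper.

```python
import numpy as np

FS = 250.0                           # sampling rate in Hz (assumed)
STIM_FREQS = [6.0, 7.5, 10.0, 12.0]  # candidate stimulus frequencies (assumed)

def classify_ssvep(eeg: np.ndarray) -> float:
    """Return the stimulus frequency with the largest spectral power
    in a single-channel EEG segment."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / FS)
    powers = [spectrum[np.argmin(np.abs(freqs - f))]  # nearest FFT bin
              for f in STIM_FREQS]
    return STIM_FREQS[int(np.argmax(powers))]

# Synthetic 4-second segment dominated by a 7.5 Hz response plus noise
t = np.arange(0, 4.0, 1.0 / FS)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 7.5 * t) + 0.5 * rng.standard_normal(t.size)
print(classify_ssvep(eeg))  # → 7.5
```

In practice, hybrid systems like the one described combine such a frequency decision with other evidence (ERD/ERS, gaze position) before issuing a command, precisely to compensate for the low single-channel information transfer rate mentioned above.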