70 research outputs found

    Biosignal‐based human–machine interfaces for assistance and rehabilitation: a survey

    By definition, a Human–Machine Interface (HMI) enables a person to interact with a device. Starting from elementary equipment, the recent development of novel techniques and unobtrusive devices for biosignal monitoring has paved the way for a new class of HMIs, which take such biosignals as inputs to control various applications. This survey reviews the large literature of the last two decades on biosignal‐based HMIs for assistance and rehabilitation, to outline the state of the art and identify emerging technologies and potential future research trends. PubMed and other databases were searched using specific keywords. The retrieved studies were screened at three levels (title, abstract, full text), and 144 journal papers and 37 conference papers were eventually included. Four macrocategories were used to classify the biosignals employed for HMI control: biopotentials, muscle mechanical motion, body motion, and their combinations (hybrid systems). The HMIs were also classified by target application into six categories: prosthetic control, robotic control, virtual reality control, gesture recognition, communication, and smart environment control. An ever‐growing number of publications has been observed in recent years. Most of the studies (about 67%) pertain to the assistive field, while 20% relate to rehabilitation and 13% to both assistance and rehabilitation. Studies on robotic control, prosthetic control, and gesture recognition have increased moderately over the last decade, whereas studies on the other targets have grown only slightly. Biopotentials are no longer the leading control signals, and the use of muscle mechanical motion signals has risen considerably, especially in prosthetic control. Hybrid technologies are promising, as they could lead to higher performance; however, they also increase an HMI's complexity, so their usefulness should be carefully evaluated for each specific application.

    Brain-computer interface for robot control with eye artifacts for assistive applications

    Human-robot interaction is a rapidly developing field, and robots are taking increasingly active roles in our daily lives. Patient care is one of the fields in which robots are becoming more present, especially for people with disabilities. People with neurodegenerative disorders may not be able to consciously or voluntarily produce movements other than those involving the eyes or eyelids. In this context, Brain-Computer Interface (BCI) systems offer an alternative way to communicate or interact with the external world. To improve the lives of people with disabilities, this paper presents a novel BCI that controls an assistive robot with the user's eye artifacts. In this study, the eye artifacts that contaminate electroencephalogram (EEG) signals are treated as a valuable source of information, thanks to their high signal-to-noise ratio and intentional generation. The proposed methodology detects eye artifacts in EEG signals through the characteristic shapes that occur during these events. Lateral eye movements are distinguished by their ordered peak-and-valley formation and the opposite phase of the signals measured at the F7 and F8 channels; to the best of the authors' knowledge, this is the first method to use this behavior to detect lateral eye movements. For blink detection, the authors propose a double-thresholding method that captures weak blinks as well as regular ones, unlike other algorithms in the literature that normally use a single threshold. Events detected in real time, together with their virtual time stamps, are fed into a second algorithm that distinguishes double and quadruple blinks from single blinks by their occurrence frequency. After offline and real-time testing, the algorithm was implemented on the device, and the resulting BCI was used to control an assistive robot through a graphical user interface. Validation experiments with five participants demonstrate that the developed BCI can control the robot.
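    A minimal sketch of the blink-detection part of this idea, assuming a single frontal EEG channel; the thresholds, sampling rate, refractory period, and grouping gap below are illustrative assumptions, not the authors' parameters, and the F7/F8 lateral-movement logic is not covered:

        # Double-threshold blink detection, then grouping of nearby blinks into
        # multi-blink events by occurrence frequency. All numeric values are
        # illustrative assumptions.
        import numpy as np
        from scipy.signal import find_peaks

        def detect_blinks(eeg, fs, low_thresh=40e-6, high_thresh=80e-6, refractory=0.25):
            """Return (sample_index, label) pairs for blinks in a frontal EEG channel."""
            min_dist = max(1, int(refractory * fs))
            # Search above the *lower* threshold so weak blinks are not lost.
            peaks, _ = find_peaks(eeg, height=low_thresh, distance=min_dist)
            return [(p, "regular" if eeg[p] >= high_thresh else "weak") for p in peaks]

        def group_blinks(samples, fs, max_gap=0.5):
            """Group blinks closer than max_gap seconds: a group of length 2 is a
            double blink, length 4 a quadruple blink, and so on."""
            groups, current = [], []
            for s in samples:
                if current and (s - current[-1]) > max_gap * fs:
                    groups.append(current)
                    current = []
                current.append(s)
            if current:
                groups.append(current)
            return groups

        # Usage with synthetic data:
        fs = 250
        eeg = np.zeros(10 * fs)
        eeg[[500, 575, 1500]] = [90e-6, 85e-6, 50e-6]    # a double blink, then a weak one
        blinks = detect_blinks(eeg, fs)
        print(group_blinks([p for p, _ in blinks], fs))  # [[500, 575], [1500]]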

    EEG-based brain-computer interfaces using motor-imagery: techniques and challenges

    Electroencephalography (EEG)-based brain-computer interfaces (BCIs), particularly those using motor-imagery (MI) data, have the potential to become groundbreaking technologies in both clinical and entertainment settings. MI data are generated when a subject imagines the movement of a limb. This paper reviews state-of-the-art signal processing techniques for MI EEG-based BCIs, with a particular focus on the feature extraction, feature selection, and classification techniques used. It also summarizes the main applications of EEG-based BCIs, particularly those based on MI data, and concludes with a detailed discussion of the most prevalent challenges impeding the development and commercialization of EEG-based BCIs.
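    To make the pipeline stages concrete, here is a hedged sketch of the classic MI processing chain the review covers: band-power feature extraction followed by a linear classifier. The band limits, filter order, and choice of linear discriminant analysis are common defaults from the MI literature, not prescriptions from the paper:

        # Band-pass filter in the mu (8-12 Hz) and beta (13-30 Hz) bands, take
        # log-variance as a band-power feature, and classify with LDA.
        import numpy as np
        from scipy.signal import butter, filtfilt
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        def bandpower_features(epochs, fs, bands=((8, 12), (13, 30))):
            """epochs: (n_trials, n_channels, n_samples) -> (n_trials, n_channels * n_bands)."""
            feats = []
            for lo, hi in bands:
                b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
                filtered = filtfilt(b, a, epochs, axis=-1)
                feats.append(np.log(np.var(filtered, axis=-1) + 1e-12))
            return np.concatenate(feats, axis=-1)

        # Synthetic stand-in for MI epochs: 40 trials, 8 channels, 2 s at 250 Hz.
        rng = np.random.default_rng(0)
        X = bandpower_features(rng.standard_normal((40, 8, 500)), fs=250)
        y = np.repeat([0, 1], 20)               # e.g. left- vs right-hand imagery
        clf = LinearDiscriminantAnalysis().fit(X, y)
        print(clf.score(X, y))                  # real evaluation needs cross-validation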

    The SSSA-MyHand: a dexterous lightweight myoelectric hand prosthesis

    The replacement of a missing hand by a prosthesis is one of the most fascinating challenges in rehabilitation engineering. State-of-the-art prostheses still fall short of the physical features of the human hand, suffering from limited functionality and excessive weight. Here we present a new multi-grasp hand aimed at overcoming these limitations. The SSSA-MyHand builds around a novel transmission mechanism that implements semi-independent actuation of the abduction/adduction of the thumb and the flexion/extension of the index finger by means of a single actuator. Thus, with only three electric motors, the hand can perform most of the grasps and gestures useful in activities of daily living, akin to commercial prostheses with up to six actuators, while remaining as lightweight as conventional 1-degree-of-freedom prostheses. The hand integrates position and force sensors and an embedded controller that implements automatic grasps and allows interoperability with different human-machine interfaces. We present the requirements, the design rationale of the first prototype, and the evaluation of its performance. The weight (478 g), force (31 N maximum at the thumb fingertip), and speed (closing time < 370 ms) make this new design an interesting alternative to clinically available multi-grasp prostheses.
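    Purely as an illustration of the interoperability idea, the sketch below shows how an embedded controller might expose a small grasp-command set so that different human-machine interfaces (EMG, BCI, buttons) can drive the same three-motor hand. The grasp names, motor assignments, and setpoints are invented for the example; the abstract does not describe the SSSA-MyHand firmware:

        # Hypothetical command layer between an HMI and a multi-grasp hand.
        from dataclasses import dataclass

        @dataclass
        class MotorSetpoint:
            motor_id: int    # one of the hand's three actuators (assignment invented here)
            position: float  # normalized joint position, 0.0 (open) .. 1.0 (closed)

        # A grasp is just a list of setpoints the controller drives the motors to.
        GRASPS = {
            "open":    [MotorSetpoint(0, 0.0), MotorSetpoint(1, 0.0), MotorSetpoint(2, 0.0)],
            "power":   [MotorSetpoint(0, 0.9), MotorSetpoint(1, 0.9), MotorSetpoint(2, 0.9)],
            "pinch":   [MotorSetpoint(0, 0.7), MotorSetpoint(1, 0.8), MotorSetpoint(2, 0.1)],
            "lateral": [MotorSetpoint(0, 0.2), MotorSetpoint(1, 0.8), MotorSetpoint(2, 0.6)],
        }

        class HandController:
            """Accepts grasp labels from any HMI and stops on force feedback."""
            def __init__(self, force_limit=31.0):  # N, the abstract's max thumb-tip force
                self.force_limit = force_limit

            def command(self, grasp, fingertip_force=0.0):
                if fingertip_force >= self.force_limit:
                    return []                      # force sensor says stop: hold position
                return GRASPS.get(grasp, GRASPS["open"])

        # Any interface that can emit a grasp label can drive the hand:
        ctrl = HandController()
        print(ctrl.command("pinch", fingertip_force=2.5))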

    Enhancement of Robot-Assisted Rehabilitation Outcomes of Post-Stroke Patients Using Movement-Related Cortical Potential

    Post-stroke rehabilitation is essential for stroke survivors to help them regain independence and improve their quality of life. Among the various rehabilitation strategies, robot-assisted rehabilitation is an efficient method that is used more and more in clinical practice for the motor recovery of post-stroke patients. However, excessive assistance from robotic devices during rehabilitation sessions can make patients train passively, with minimal outcome. To develop an efficient rehabilitation strategy, it is necessary to ensure the active participation of subjects during training sessions. This thesis uses the electroencephalography (EEG) signal to extract the Movement-Related Cortical Potential (MRCP) pattern as an indicator of the active engagement of stroke patients during rehabilitation training sessions. The MRCP pattern is also used to design an adaptive training strategy that maximizes patients' engagement. The project focuses on the hand motor recovery of post-stroke patients using the AMADEO rehabilitation device (Tyromotion GmbH, Austria), which is specifically developed for patients with finger and hand motor deficits. Variations in brain activity are analyzed by extracting the MRCP pattern from the EEG data acquired during training sessions. Physical improvement in hand motor abilities is assessed by two methods: clinical tests, namely the Fugl-Meyer Assessment (FMA) and the Motor Assessment Scale (MAS), including the FMA-wrist, FMA-hand, MAS-hand movements, and MAS-advanced hand movements tests; and the measurement of hand kinematic parameters with the AMADEO assessment tool, which records hand strength during flexion (force-flexion) and extension (force-extension) as well as the Hand Range of Movement (HROM).
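    As a hedged sketch of how an MRCP pattern is typically obtained from EEG in the literature: band-pass the signal in the very low frequencies where the MRCP lives, cut epochs around movement onset, and average across trials. The filter band, epoch window, and channel choice below are common literature values, not necessarily the thesis's exact settings:

        # Trial-averaged MRCP extraction around movement onset.
        import numpy as np
        from scipy.signal import butter, filtfilt

        def extract_mrcp(eeg, onsets, fs, band=(0.05, 5.0), window=(-2.0, 1.0)):
            """eeg: 1-D signal (e.g. channel Cz); onsets: movement-onset sample indices.
            Returns the trial-averaged MRCP over `window` seconds around onset."""
            b, a = butter(2, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
            slow = filtfilt(b, a, eeg)
            pre, post = int(window[0] * fs), int(window[1] * fs)
            epochs = [slow[t + pre: t + post] for t in onsets
                      if t + pre >= 0 and t + post <= len(slow)]
            return np.mean(epochs, axis=0)  # negative deflection before onset = MRCP

        # Synthetic example: 60 s of noise at 500 Hz with an onset every 6 s.
        fs = 500
        eeg = np.random.default_rng(1).standard_normal(60 * fs)
        onsets = np.arange(3 * fs, 60 * fs, 6 * fs)
        print(extract_mrcp(eeg, onsets, fs).shape)  # (1500,) -> 3 s window at 500 Hz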

    The electromyography, electronics and sensory system for a mechatronics-integrated Touch Hand 3

    Master of Science in Mechatronics Engineering, University of KwaZulu-Natal, Durban, 2015. This research presents the EMG, electronics, and sensory system for the mechatronics-integrated Touch Hand 3. This myoelectric hand is driven by EMG signals captured at the surface of the skin in order to achieve a more robust grasp. The research examined different configurations and types of electrodes for an EMG device, with an analysis of the different candidates for controlling the Touch Hand 3. In addition, EMG experiments comparing contact and non-contact electrodes were carried out to find a correlation between the EMG electrodes and an antenna; these results determined the number of layers the EMG sensor would need to obtain the best reading with a patch-Yagi antenna. Stick-on electrodes were used to monitor two muscle groups in the arm at the same time. Contact and non-contact electrode tests were conducted using a combination of embroidered electrodes and stick-on electrodes. The flexion-extension muscles were tested by having each volunteer lift a 2.5 kg weight. The electronics and sensor system was researched, designed, developed, and optimized to allow successful integration into the Touch Hand 3. The artificial arm is fitted with palpable sensors to read object temperature, force, and vibration. A number of constraints were considered in designing the system, including modularity, cost, weight, and grip strength in comparison with the Touch Hand 2. The modular electrical system was designed to accommodate full control and integration with the mechanical system, forming a myoelectric mechatronic prosthetic system that amputees can use more effectively and with a quicker response.
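    For concreteness, here is an illustrative sketch of the usual surface-EMG conditioning chain behind this kind of myoelectric control: band-pass, rectify, low-pass to get an envelope, then a simple two-channel threshold decides open/close. The cutoffs and threshold are common textbook values assumed for the example, not the Touch Hand 3 firmware:

        # Surface-EMG envelope extraction and a threshold-based grasp command.
        import numpy as np
        from scipy.signal import butter, filtfilt

        def emg_envelope(raw, fs):
            """Band-pass 20-450 Hz, full-wave rectify, low-pass 5 Hz."""
            b, a = butter(4, [20 / (fs / 2), 450 / (fs / 2)], btype="band")
            band = filtfilt(b, a, raw)
            rect = np.abs(band)
            b, a = butter(2, 5 / (fs / 2), btype="low")
            return filtfilt(b, a, rect)

        def grasp_command(env_flexor, env_extensor, thresh=0.1):
            """Two-channel logic: flexor activity closes the hand, extensor opens it."""
            if env_flexor > thresh and env_flexor > env_extensor:
                return "close"
            if env_extensor > thresh:
                return "open"
            return "hold"

        # Synthetic flexor burst after t = 1 s triggers a close command.
        fs = 1000
        t = np.arange(0, 2, 1 / fs)
        raw = 0.2 * np.sin(2 * np.pi * 80 * t) * (t > 1)
        env = emg_envelope(raw, fs)
        print(grasp_command(env[-1], 0.0))  # -> "close"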

    Gaze-tracking-based interface for robotic chair guidance

    This research focuses on solutions to enhance the quality of life of wheelchair users, specifically by applying a gaze-tracking-based interface to the guidance of a robotized wheelchair. The interface was applied in two different approaches to the wheelchair control system. The first was an assisted control in which the user was continuously involved in steering the wheelchair through the environment and adjusting the inclination of the different parts of the seat, using the gaze position and eye blinks obtained by the interface. The second approach took the first steps towards autonomous wheelchair control, in which the wheelchair moves autonomously, avoiding collisions, towards a position defined by the user. To this end, this project developed the basis for obtaining the gaze position relative to the wheelchair and for object detection, so that the optimal route for the wheelchair can be computed in the future. The integration of a robotic arm into the wheelchair to manipulate objects was also considered: this work identifies, among the detected objects, the object of interest indicated by the user's gaze, so that in the future the robotic arm can select and pick up the object the user wants to manipulate. In addition to the two approaches, an attempt was made to estimate the user's gaze without the software interface, obtaining the gaze from pupil-detection libraries, a calibration procedure, and a mathematical model that relates pupil positions to gaze. The results of these implementations are analyzed in this work, including some limitations encountered, and future improvements are proposed with the aim of increasing the independence of wheelchair users.
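    A minimal sketch of the calibration step mentioned above: fit a second-order polynomial that maps detected pupil coordinates to gaze coordinates from a handful of calibration targets. The polynomial order and least-squares fit are a common choice in gaze estimation, assumed here rather than taken from the project:

        # Pupil-to-gaze mapping via polynomial least-squares calibration.
        import numpy as np

        def design_matrix(px, py):
            """Second-order polynomial terms of the pupil position."""
            return np.column_stack([np.ones_like(px), px, py, px * py, px**2, py**2])

        def calibrate(pupil_xy, gaze_xy):
            """pupil_xy, gaze_xy: (n_points, 2) arrays from the calibration routine.
            Returns a (6, 2) coefficient matrix, one column per gaze coordinate."""
            A = design_matrix(pupil_xy[:, 0], pupil_xy[:, 1])
            coef, *_ = np.linalg.lstsq(A, gaze_xy, rcond=None)
            return coef

        def pupil_to_gaze(coef, pupil_xy):
            return design_matrix(pupil_xy[:, 0], pupil_xy[:, 1]) @ coef

        # Nine-point calibration grid with synthetic pupil measurements.
        gaze_targets = np.array([[x, y] for x in (0.1, 0.5, 0.9) for y in (0.1, 0.5, 0.9)])
        pupil = gaze_targets * 0.3 + 0.2 + np.random.default_rng(2).normal(0, 0.005, gaze_targets.shape)
        coef = calibrate(pupil, gaze_targets)
        print(np.round(pupil_to_gaze(coef, pupil) - gaze_targets, 3))  # residuals near zero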

    Training and assessment of hand-eye coordination with electroencephalography

    Ph.D., Doctor of Philosophy