
    EEG classifier cross-task transfer to avoid training sessions in robot-assisted rehabilitation

    Background: For individualized support of patients during rehabilitation, individual machine learning models must be learned from the human electroencephalogram (EEG). Our approach allows labeled training data to be recorded without the need for a specific training session. For this, the planned exoskeleton-assisted rehabilitation enables bilateral mirror therapy, in which movement intentions can be inferred from the activity of the unaffected arm. During this therapy, labeled EEG data can be collected to enable movement predictions for the affected arm alone. Methods: A study was conducted with 8 healthy subjects to evaluate the performance of the classifier transfer approach. Each subject performed 3 runs of 40 self-intended unilateral and bilateral reaching movements toward a target while EEG data was recorded from 64 channels. A support vector machine (SVM) classifier was trained under both movement conditions to make predictions for the same type of movement. Furthermore, the classifier was evaluated on predicting unilateral movements after being trained only on data from the bilateral movement condition. Results: The results show that the performance of a classifier trained on selected EEG channels evoked by bilateral movement intentions is not significantly reduced compared to a classifier trained directly on EEG data including unilateral movement intentions. Moreover, the results show that our approach also works with only 8 or even 4 channels. Conclusion: The proposed classifier transfer approach enables motion prediction without explicit collection of training data. Since the approach can be applied even with a small number of EEG channels, this speaks for its feasibility in real therapy sessions with patients and motivates further investigations with stroke patients. Comment: 11 pages, 6 figures, 1 table
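
    A minimal sketch of the cross-condition transfer idea, assuming pre-extracted per-trial EEG feature vectors and scikit-learn's SVM; the study's actual preprocessing, feature extraction, and channel selection are not reproduced here, and synthetic data stands in for the recorded trials.

```python
# Hedged sketch: train on the bilateral (mirror-therapy) condition,
# evaluate transfer to the unilateral condition. Data is synthetic.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials, n_features = 120, 8  # e.g., features from 8 selected EEG channels

# Hypothetical feature matrices: rows are trials, labels are
# "movement intention" (1) vs. "rest" (0).
X_bilateral = rng.normal(size=(n_trials, n_features))
y_bilateral = rng.integers(0, 2, size=n_trials)
X_unilateral = rng.normal(size=(n_trials, n_features))
y_unilateral = rng.integers(0, 2, size=n_trials)

# Train only on data from the bilateral movement condition ...
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
clf.fit(X_bilateral, y_bilateral)

# ... and evaluate the transfer on unilateral movements.
transfer_accuracy = clf.score(X_unilateral, y_unilateral)
print(f"cross-condition transfer accuracy: {transfer_accuracy:.2f}")
```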

    Continuous ErrP detections during multimodal human-robot interaction

    Human-in-the-loop approaches are of great importance for robot applications. In the presented study, we implemented a multimodal human-robot interaction (HRI) scenario in which a simulated robot communicates with its human partner through speech and gestures. The robot announces its intention verbally and selects the appropriate action using pointing gestures. The human partner, in turn, evaluates whether the robot's verbal announcement (intention) matches the action (pointing gesture) chosen by the robot. For cases in which the robot's verbal announcement does not match its chosen action, we expect error-related potentials (ErrPs) in the human electroencephalogram (EEG). These intrinsic evaluations of robot actions by humans, evident in the EEG, were recorded in real time, continuously segmented online, and classified asynchronously. For feature selection, we propose an approach that allows combinations of forward and backward sliding windows to be used to train a classifier. We achieved an average classification performance of 91% across 9 subjects. As expected, we also observed relatively high variability between subjects. In the future, the proposed feature selection approach will be extended to allow for customization of feature selection: the best combinations of forward and backward sliding windows will be selected automatically to account for inter-subject variability in classification performance. In addition, we plan to use the intrinsic human error evaluation, evident in the error case by the ErrP, in interactive reinforcement learning to improve multimodal human-robot interaction.
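
    One way the combination of forward and backward sliding windows could look is sketched below; window widths, steps, counts, and the mean-amplitude features are illustrative assumptions, not the authors' exact parameters.

```python
# Hedged sketch: build one feature vector per epoch from windows sliding
# forward from the event onset and windows sliding backward from it.
import numpy as np

def window_features(epoch, onset, width, step, n_windows, direction):
    """Mean amplitude per channel in windows sliding forward or backward
    relative to the event onset (sample index). epoch: (channels, samples)."""
    feats = []
    for k in range(n_windows):
        if direction == "forward":
            start = onset + k * step
        else:  # backward: windows ending at the onset and moving earlier
            start = onset - width - k * step
        feats.append(epoch[:, start:start + width].mean(axis=1))
    return np.concatenate(feats)

rng = np.random.default_rng(1)
epoch = rng.normal(size=(64, 1000))  # 64 EEG channels, 1000 samples
onset = 500

fwd = window_features(epoch, onset, width=50, step=25, n_windows=4,
                      direction="forward")
bwd = window_features(epoch, onset, width=50, step=25, n_windows=4,
                      direction="backward")
x = np.concatenate([fwd, bwd])  # one feature vector per segmented epoch
print(x.shape)                  # 64 channels * (4 + 4) windows -> (512,)
```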

    EEG and EMG dataset for the detection of errors introduced by an active orthosis device

    This paper presents a dataset containing recordings of the electroencephalogram (EEG) and the electromyogram (EMG) from eight subjects who were assisted in moving their right arm by an active orthosis device. The supported movements were elbow joint movements, i.e., flexion and extension of the right arm. While the orthosis was actively moving the subject's arm, errors were deliberately introduced for short durations, during which the orthosis moved in the opposite direction. In this paper, we explain the experimental setup and present behavioral analyses across all subjects. Additionally, we present an average event-related potential analysis for one subject to offer insights into the data quality and the EEG activity caused by the error introduction. The dataset described herein is openly accessible. The aim of this study was to provide a dataset to the research community, particularly for the development of new methods for the asynchronous detection of erroneous events from the EEG. We are especially interested in the tactile- and haptic-mediated recognition of errors, which has not yet been sufficiently investigated in the literature. We hope that the detailed description of the orthosis and the experiment will enable its reproduction and facilitate a systematic investigation of the factors influencing the detection of erroneous behavior of assistive systems by a large community. Comment: Revised references to our datasets, general corrections of typos, and LaTeX template format changes; overall content unchanged.
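
    The average event-related potential (ERP) analysis mentioned above could, in outline, look like the following sketch: epoch the continuous EEG around the error-introduction events, baseline-correct, and average across trials. Channel count, sampling rate, and event times are illustrative stand-ins for the dataset.

```python
# Hedged sketch of an average ERP computation around error events.
import numpy as np

fs = 256                                           # assumed sampling rate (Hz)
rng = np.random.default_rng(2)
eeg = rng.normal(size=(64, 60 * fs))               # 64 channels, 60 s of data
error_onsets = np.array([5, 15, 25, 35, 45]) * fs  # event sample indices

pre, post = int(0.2 * fs), int(0.8 * fs)           # -200 ms to +800 ms window
epochs = np.stack([eeg[:, t - pre:t + post] for t in error_onsets])

# Baseline-correct each epoch against its pre-event interval, then average.
baseline = epochs[:, :, :pre].mean(axis=2, keepdims=True)
erp = (epochs - baseline).mean(axis=0)             # (channels, samples)
print(erp.shape)
```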

    Feel-Good Requirements: Neurophysiological and Psychological Design Criteria of Affective Touch for (Assistive) Robots

    Previous research has shown the value of the sense of embodiment, i.e., being able to integrate objects into one's bodily self-representation, and its connection to (assistive) robotics. Especially, tactile interfaces seem essential to integrate assistive robots into one's body model. Beyond functional feedback, such as tactile force sensing, the human sense of touch comprises specialized nerves for affective signals, which transmit positive sensations during slow and low-force tactile stimulations. Since these signals are extremely relevant for body experience as well as social and emotional contacts but scarcely considered in recent assistive devices, this review provides a requirement analysis to consider affective touch in engineering design. By analyzing quantitative and qualitative information from engineering, cognitive psychology, and neuroscientific research, requirements are gathered and structured. The resulting requirements comprise technical data such as desired motion or force/torque patterns and an evaluation of potential stimulation modalities as well as their relations to overall user experience, e.g., pleasantness and realism of the sensations. This review systematically considers the very specific characteristics of affective touch and the corresponding parts of the neural system to define design goals and criteria. Based on the analysis, design recommendations for interfaces mediating affective touch are derived. This includes a consideration of biological principles and human perception thresholds, complemented by an analysis of technical possibilities. Finally, we outline which psychological factors can be satisfied by the mediation of affective touch to increase acceptance of assistive devices and outline demands for further research and development.
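
    As a toy illustration of turning such requirements into a design check, the sketch below tests a stroking stimulus against the commonly cited CT-afferent optimum of roughly 1-10 cm/s; the gentle-force bound is a hypothetical placeholder, not a value taken from this review.

```python
# Hedged sketch: check stimulation parameters against assumed
# affective-touch design criteria.
def is_ct_optimal(velocity_cm_s: float, force_n: float,
                  v_range=(1.0, 10.0), max_force_n=0.4) -> bool:
    """True if a stroking stimulus falls in the assumed CT-optimal regime.
    The velocity range follows the commonly cited 1-10 cm/s optimum; the
    force bound is an illustrative placeholder."""
    return v_range[0] <= velocity_cm_s <= v_range[1] and force_n <= max_force_n

print(is_ct_optimal(3.0, 0.3))   # slow, gentle stroke -> True
print(is_ct_optimal(30.0, 0.3))  # too fast for CT afferents -> False
```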

    Feel-good robotics: requirements on touch for embodiment in assistive robotics

    The feeling of embodiment, i.e., experiencing the body as belonging to oneself and being able to integrate objects into one's bodily self-representation, is a key aspect of human self-consciousness and has been shown to importantly shape human cognition. An extension of such feelings toward robots has been argued as being crucial for assistive technologies aiming at restoring, extending, or simulating sensorimotor functions. Empirical and theoretical work illustrates the importance of sensory feedback for the feeling of embodiment and also immersion; we focus on the perceptual level of touch and the role of tactile feedback in various assistive robotic devices. We critically review how different facets of tactile perception in humans, i.e., affective, social, and self-touch, might influence embodiment. This is particularly important as current assistive robotic devices, such as prostheses, orthoses, exoskeletons, and devices for teleoperation, often limit touch to low-density and spatially constrained haptic feedback, i.e., the mere touch sensation linked to an action. Here, we analyze, discuss, and propose how and to what degree tactile feedback might increase the embodiment of certain robotic devices, e.g., prostheses, and the feeling of immersion in human-robot interaction, e.g., in teleoperation. Based on recent findings from cognitive psychology on interactive processes between touch and embodiment, we discuss technical solutions for specific applications, which might be used to enhance embodiment and facilitate the study of how embodiment might alter human-robot interactions. We postulate that high-density and large-surface sensing and stimulation are required to foster embodiment of such assistive devices.

    Deep and Surface Sensor Modalities for Myo-intent Detection

    Electromyography is the gold standard among sensors for prosthetic control. However, stable and reliable myocontrol remains an unsolved problem in the community. Among the improvements currently under investigation, one focuses on alternative or complementary sensors. In this study, we compare different techniques for recording surface and deep muscle activity. Ten subjects were involved in an experiment in which three different modalities were attached to their forearm: force myography, electrical impedance tomography, and ultrasound. They were asked to perform wrist and grasp movements. For the first time, we evaluate and compare these three modalities in an offline analysis of recordings of several hand gestures.
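
    The offline comparison could, in outline, be set up as below: classify the same gesture labels from each modality's features with a common model and compare cross-validated accuracy. Feature dimensions and data are synthetic placeholders; the study's actual pipeline is not reproduced.

```python
# Hedged sketch: per-modality gesture classification with a shared model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n_trials, n_gestures = 200, 6
labels = rng.integers(0, n_gestures, size=n_trials)

# Hypothetical per-trial feature matrices, one per sensor modality.
modalities = {
    "force myography": rng.normal(size=(n_trials, 10)),
    "electrical impedance tomography": rng.normal(size=(n_trials, 32)),
    "ultrasound": rng.normal(size=(n_trials, 64)),
}

for name, X in modalities.items():
    scores = cross_val_score(LogisticRegression(max_iter=1000), X, labels, cv=5)
    print(f"{name}: {scores.mean():.2f} +/- {scores.std():.2f}")
```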

    Exoskeleton Technology in Rehabilitation: Towards an EMG-Based Orthosis System for Upper Limb Neuromotor Rehabilitation

    The rehabilitation of patients should not be limited to the first phases of intense hospital care; support and therapy should also be guaranteed in later stages, especially during daily life activities if the patient's state requires it. However, aid should only be given to the patient when needed and only as much as required. To allow this, automatic self-initiated movement support and patient-cooperative control strategies have to be developed and integrated into assistive systems. In this work, we first give an overview of different kinds of neuromuscular diseases, review different forms of therapy, and explain possible fields of rehabilitation and the benefits of robot-aided rehabilitation. Next, the mechanical design and control scheme of an upper limb orthosis for rehabilitation are presented. Two control models for the orthosis are explained, which compute the triggering function and the level of assistance provided by the device. As input, the models use fused sensor data from the orthosis and physiological data in the form of electromyography (EMG) signals.
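
    A minimal sketch of the two control ideas named above, under illustrative assumptions: a triggering function that starts support when a smoothed EMG envelope crosses a threshold, and an assistance level scaled inversely to the residual muscle activity ("as much as required"). Cutoffs, thresholds, and data are placeholders.

```python
# Hedged sketch: EMG envelope extraction, support trigger, assistance level.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000                                     # assumed EMG sampling rate (Hz)
rng = np.random.default_rng(4)
# Synthetic EMG: low activity for 2 s, then stronger activity for 3 s.
emg = rng.normal(size=5 * fs) * np.r_[np.full(2 * fs, 0.1), np.full(3 * fs, 1.0)]

# Envelope: rectify, then low-pass filter at an assumed 4 Hz cutoff.
b, a = butter(2, 4 / (fs / 2), btype="low")
envelope = filtfilt(b, a, np.abs(emg))

threshold = 0.2                               # hypothetical trigger level
triggered = envelope > threshold              # movement-support trigger

# Assistance level: full support at zero activity, less as EMG grows.
assistance = np.clip(1.0 - envelope / envelope.max(), 0.0, 1.0)
print(f"support triggered in {triggered.mean():.0%} of samples")
```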

    'Embedded Brain Reading'

    Current autonomous robots and interfaces are far from exhibiting the adaptability of biological beings to changes in their environment or during interaction. They are not always able to provide humans with optimal, situation-specific support. Giving the robot or its interface insight into the human mind can open up new possibilities for integrating human cognitive resources into robots and interfaces, i.e., into their intelligent control systems, and can particularly improve human-machine interaction. In this thesis, embedded Brain Reading (eBR) is developed. It empowers a human-machine interface (HMI), which can be a robotic system, to infer the human's intention, and hence her/his upcoming interaction behavior, based on the context of the interaction and the human's brain state. Enabling eBR requires automatic context recognition or generation as well as online, single-trial brain signal decoding, i.e., Brain Reading (BR), for the detection of specific brain states. The human's electroencephalogram (EEG), recorded from the head's surface, is used in this work as a measure of brain activity. Experiments are conducted in controlled setups in which subjects execute simple and dual-task behavior of varying complexity and demand, as performed during human-machine interaction. Using these experiments, the applicability and reliability of BR are confirmed and training procedures for BR are improved. Furthermore, a formal model for eBR is developed and shown to be applicable to different implementations of eBR. The formal model is the first step toward checking implementations of eBR for correctness and completeness. By means of robotic applications for telemanipulation and rehabilitation, it is further shown that eBR can be applied either to adapt or to drive HMIs, i.e., can be used to implement predictive HMIs for passive or active support. When eBR is applied for passive support, it is shown that malfunction of the whole system can be avoided. When eBR is applied for active support, i.e., to actively drive an HMI, it is shown that individual adaptation of the support to the requirements of different users can be facilitated by utilizing multi-modal signal analysis in eBR. Finally, it is shown that even in the case of passive support, eBR can measurably improve human-machine interaction.
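
    A highly simplified sketch of the eBR control loop described above: an interface combines a context label with an online single-trial brain-state prediction to decide whether to adapt or drive its support. All names and the decision rule are illustrative and do not reproduce the thesis's formal model.

```python
# Hedged sketch of an eBR-style predictive HMI loop with stand-in components.
import random

def decode_brain_state(eeg_segment) -> str:
    """Stand-in for online, single-trial Brain Reading (BR) decoding."""
    return random.choice(["movement_intention", "idle"])

def recognize_context() -> str:
    """Stand-in for automatic context recognition/generation."""
    return random.choice(["reaching_phase", "rest_phase"])

def hmi_step(eeg_segment) -> str:
    context, state = recognize_context(), decode_brain_state(eeg_segment)
    if context == "reaching_phase" and state == "movement_intention":
        return "activate support"      # active support: drive the HMI
    if context == "rest_phase" and state == "movement_intention":
        return "adapt interface"       # passive support: adjust, avoid misuse
    return "no action"

random.seed(0)
for _ in range(3):
    print(hmi_step(eeg_segment=None))
```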