
    Brain computer interface based robotic rehabilitation with online modification of task speed

    We present a systematic approach that enables online modification/adaptation of robot-assisted rehabilitation exercises by continuously monitoring patients' intention levels with an electroencephalogram (EEG) based Brain-Computer Interface (BCI). In particular, we use Linear Discriminant Analysis (LDA) to classify event-related synchronization (ERS) and desynchronization (ERD) patterns associated with motor imagery; however, instead of providing a binary classification output, we utilize the posterior probabilities extracted from the LDA classifier as continuous-valued outputs to control a rehabilitation robot. Passive velocity field control (PVFC) is used as the underlying robot controller to map instantaneous levels of motor imagery during the movement to the speed of contour-following tasks. In other words, PVFC changes the speed of the contour-following task according to the intention level decoded from motor imagery. PVFC also decouples the task from the speed of the task and ensures coupled stability of the overall robot-patient system. The proposed framework is implemented on AssistOn-Mobile, a holonomic mobile platform with series elastic actuation, and feasibility studies with healthy volunteers have been conducted to test the effectiveness of the proposed approach. By giving patients online control over the speed of the task, the proposed approach ensures active involvement of patients throughout exercise routines and has the potential to increase the efficacy of robot-assisted therapies.
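
    A minimal sketch of the decoding-to-speed mapping described above, under assumed details: alpha-band log power features, a two-class rest versus motor-imagery LDA, and a linear map from the motor-imagery posterior to a reference speed for the velocity field controller. The function names, frequency band, and speed range are illustrative placeholders, not the authors' implementation.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        def bandpower_features(epochs, fs, band=(8.0, 12.0)):
            """Log band power per channel for EEG epochs shaped (n_epochs, n_channels, n_samples)."""
            spectrum = np.abs(np.fft.rfft(epochs, axis=-1)) ** 2
            freqs = np.fft.rfftfreq(epochs.shape[-1], d=1.0 / fs)
            mask = (freqs >= band[0]) & (freqs <= band[1])
            return np.log(spectrum[..., mask].mean(axis=-1))

        # Placeholder calibration data: 40 epochs, 8 channels, 2 s at 250 Hz.
        rng = np.random.default_rng(0)
        fs = 250.0
        calib_epochs = rng.standard_normal((40, 8, 500))
        calib_labels = np.repeat([0, 1], 20)            # 0 = rest, 1 = motor imagery
        lda = LinearDiscriminantAnalysis().fit(bandpower_features(calib_epochs, fs), calib_labels)

        def task_speed(epoch, v_min=0.02, v_max=0.10):
            """Map the motor-imagery posterior of one EEG window to a contour-following speed (m/s)."""
            p_mi = lda.predict_proba(bandpower_features(epoch[np.newaxis], fs))[0, 1]
            return v_min + p_mi * (v_max - v_min)       # reference speed handed to the PVFC layer

        print(task_speed(rng.standard_normal((8, 500))))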

    Autonomy Infused Teleoperation with Application to BCI Manipulation

    Robot teleoperation systems face a common set of challenges including latency, low-dimensional user commands, and asymmetric control inputs. User control with Brain-Computer Interfaces (BCIs) exacerbates these problems through especially noisy and erratic low-dimensional motion commands due to the difficulty in decoding neural activity. We introduce a general framework to address these challenges through a combination of computer vision, user intent inference, and arbitration between the human input and autonomous control schemes. Adjustable levels of assistance allow the system to balance the operator's capabilities and feelings of comfort and control while compensating for a task's difficulty. We present experimental results demonstrating significant performance improvement using the shared-control assistance framework on adapted rehabilitation benchmarks with two subjects implanted with intracortical brain-computer interfaces controlling a seven degree-of-freedom robotic manipulator as a prosthetic. Our results further indicate that shared assistance mitigates perceived user difficulty and even enables successful performance on previously infeasible tasks. We showcase the extensibility of our architecture with applications to quality-of-life tasks such as opening a door, pouring liquids from containers, and manipulation with novel objects in densely cluttered environments.
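
    A minimal sketch of the arbitration idea summarized above, under simplifying assumptions: the inferred goal is the candidate most aligned with the user's commanded direction, and the final command is a linear blend weighted by an assistance level alpha. The goal-inference rule, the linear blend, and all parameter values are illustrative placeholders rather than the paper's actual policy-blending scheme.

        import numpy as np

        def infer_goal(robot_pos, user_vel, goals):
            """Pick the candidate goal whose direction best matches the user's commanded velocity."""
            directions = goals - robot_pos
            directions /= np.linalg.norm(directions, axis=1, keepdims=True) + 1e-9
            u = user_vel / (np.linalg.norm(user_vel) + 1e-9)
            return goals[np.argmax(directions @ u)]

        def arbitrate(robot_pos, user_vel, goals, alpha=0.6, gain=1.0):
            """Blend the user command with an autonomous go-to-goal command; alpha sets the assistance level."""
            goal = infer_goal(robot_pos, user_vel, goals)
            auto_vel = gain * (goal - robot_pos)                 # simple proportional go-to-goal policy
            return (1.0 - alpha) * user_vel + alpha * auto_vel   # arbitrated end-effector velocity

        # Example: two candidate objects; the noisy user command points roughly at the first one.
        goals = np.array([[0.5, 0.2, 0.1], [0.1, 0.6, 0.1]])
        cmd = arbitrate(np.zeros(3), np.array([0.3, 0.05, 0.0]), goals, alpha=0.5)
        print(cmd)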

    Hand and Arm Gesture-based Human-Robot Interaction: A Review

    The study of Human-Robot Interaction (HRI) aims to create close and friendly communication between humans and robots. In human-centered HRI, an essential aspect of implementing a successful and effective interaction is building a natural and intuitive interface, covering both verbal and nonverbal communication. As a prevalent nonverbal communication approach, hand and arm gestures occur ubiquitously in our daily life. A considerable amount of work on gesture-based HRI is scattered across various research domains; however, a systematic understanding of this work is still lacking. This paper provides a comprehensive review of gesture-based HRI and focuses on advanced findings in this area. Following the stimulus-organism-response framework, the review consists of: (i) generation of human gestures (stimulus); (ii) robot recognition of human gestures (organism); and (iii) robot reaction to human gestures (response). In addition, the review summarizes the research status of each element in the framework and analyzes the advantages and disadvantages of related work. In the last part, the paper discusses current research challenges in gesture-based HRI and provides possible future directions.
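
    As a complement to the stimulus-organism-response framing used in the review, the toy sketch below spells out the three stages as code: a recognized gesture (stimulus) is interpreted by the robot (organism) and mapped to a reaction (response). The gesture labels and action names are hypothetical placeholders, not taken from the surveyed systems.

        from enum import Enum

        class Gesture(Enum):
            WAVE = "wave"
            POINT = "point"
            STOP = "stop"

        # Response policy: which robot behavior each recognized gesture triggers.
        GESTURE_TO_ACTION = {
            Gesture.WAVE: "greet_user",
            Gesture.POINT: "navigate_to_pointed_target",
            Gesture.STOP: "halt_motion",
        }

        def react_to(gesture: Gesture) -> str:
            """Return the robot action for a recognized gesture, asking for clarification otherwise."""
            return GESTURE_TO_ACTION.get(gesture, "request_clarification")

        print(react_to(Gesture.POINT))   # -> navigate_to_pointed_target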

    Smart Camera Robotic Assistant for Laparoscopic Surgery

    In the last decades, laparoscopic surgery has become a daily practice in operating rooms worldwide, and its evolution is tending towards less invasive techniques. In this scenario, robotics has found a wide field of application, from slave robotic systems that replicate the movements of the surgeon to autonomous robots able to assist the surgeon in certain maneuvers or to perform autonomous surgical tasks. However, these systems require the direct supervision of the surgeon, and their capacity to make decisions and adapt to dynamic environments is very limited. This PhD dissertation presents the design and implementation of a smart camera robotic assistant to collaborate with the surgeon in a real surgical environment. First, it presents the design of a novel camera robotic assistant able to augment the capabilities of current vision systems. This robotic assistant is based on an intra-abdominal camera robot, which is completely inserted into the patient's abdomen and can be freely moved along the abdominal cavity by means of magnetic interaction with an external magnet. To provide the camera with autonomy of motion, the external magnet is coupled to the end effector of a robotic arm, which controls the displacement of the camera robot along the abdominal wall. In this way, the robotic assistant proposed in this dissertation has six degrees of freedom, which provide a wider field of view compared to traditional vision systems as well as different perspectives of the operating area. On the other hand, the intelligence of the system is based on a cognitive architecture specially designed for autonomous collaboration with the surgeon in real surgical environments. The proposed architecture simulates the behavior of a human assistant, with a natural and intuitive human-robot interface for communication between the robot and the surgeon. The cognitive architecture also includes learning mechanisms to adapt the behavior of the robot to the different working styles of surgeons and to improve the robot's behavior through experience, in a similar way as a human assistant would do. The theoretical concepts of this dissertation have been validated both through in-vitro experimentation in the medical robotics labs of the University of Malaga and through in-vivo experimentation with pigs at the IACE Center (Instituto Andaluz de Cirugía Experimental), performed by expert surgeons.
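
    A rough geometric sketch of the magnetic guidance idea described above, under the simplest possible assumptions: to drag the internal camera to a target point, the external magnet on the arm's end effector is positioned at the matching point just outside the abdominal wall, offset along the local surface normal. The wall-thickness offset and the flat-wall assumption are illustrative, not the dissertation's actual control scheme.

        import numpy as np

        def external_magnet_target(camera_goal, wall_normal, wall_thickness=0.02):
            """Position for the external magnet (m): the camera goal pushed outward through the wall."""
            n = wall_normal / np.linalg.norm(wall_normal)
            return camera_goal + wall_thickness * n

        # Example: steer the camera to a point 5 cm to the side; the arm's end effector
        # tracks the matching external point so the coupled camera follows by magnetic drag.
        goal_inside = np.array([0.05, 0.00, 0.10])
        print(external_magnet_target(goal_inside, wall_normal=np.array([0.0, 0.0, 1.0])))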

    Proceedings of the 1st Standardized Knowledge Representation and Ontologies for Robotics and Automation Workshop

    Welcome to the IEEE-ORA (Ontologies for Robotics and Automation) IROS workshop. This is the 1st edition of the workshop on Standardized Knowledge Representation and Ontologies for Robotics and Automation. The IEEE-ORA 2014 workshop was held on 18 September 2014 in Chicago, Illinois, USA. In the IEEE-ORA IROS workshop, 10 contributions were presented from 7 countries in North and South America, Asia, and Europe. The presentations took place in the afternoon, from 1:30 PM to 5:00 PM. The first session was dedicated to “Standards for Knowledge Representation in Robotics”, with presentations from the IEEE working group on standards for robotics and automation and from ISO TC 184/SC2/WH7. The second session was dedicated to “Core and Application Ontologies”, with presentations on core robotics ontologies as well as industrial and robot-assisted surgery ontologies. Three posters were presented on emergent applications of ontologies in robotics. We would like to express our thanks to all participants: first of all to the authors, whose quality work is the essence of this workshop, and next to all the members of the international program committee, who helped us with their expertise and valuable time. We would also like to deeply thank the IEEE-IROS 2014 organizers for hosting this workshop. Our deep gratitude goes to the IEEE Robotics and Automation Society, which sponsors the IEEE-ORA group activities, and also to the scientific organizations that kindly agreed to sponsor the work of all workshop authors.

    Are preferences useful for better assistance? A physically assistive robotics user study

    © 2021 Copyright held by the owner/author(s). Assistive robots have an inherent need to adapt to the user they are assisting. This is crucial for the correct execution of the task, user safety, and comfort. However, adaptation can be performed in several manners, and we believe user preferences are key to this adaptation. In this paper, we evaluate the use of preferences for Physically Assistive Robotics tasks in a Human-Robot Interaction user evaluation. Three assistive tasks have been implemented, consisting of assisted feeding, shoe fitting, and jacket dressing, where the robot performs each task in a different manner based on user preferences. We assess the ability of the users to determine which execution of the task used their chosen preferences (if any). The obtained results show that most users were able to successfully identify the cases where their preferences were used, even when they had not seen the task before. We also observe that their satisfaction with the task increases when the chosen preferences are employed. Finally, we analyze the users' opinions regarding assistive tasks and preferences, showing promising expectations as to the benefits of adapting the robot behavior to the user through preferences. This work has been supported by the ERC project Clothilde (ERC-2016-ADG-741930), the HuMoUR project (Spanish Ministry of Science and Innovation TIN2017-90086-R), and by the Spanish State Research Agency through the María de Maeztu Seal of Excellence to IRI (MDM-2016-0656). Gerard Canal has also been supported by the Spanish Ministry of Education, Culture and Sport through the FPU15/00504 doctoral grant and the CHIST-ERA project COHERENT (EPSRC EP/V062506/1).
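
    A small illustrative sketch of what preference-conditioned execution could look like for the feeding task, assuming a toy preference profile: the declared preferences select concrete motion parameters before the task runs. The preference names and parameter values are hypothetical, not the preference set used in the study.

        from dataclasses import dataclass

        @dataclass
        class FeedingPreferences:
            speed: str = "slow"          # "slow" | "fast"
            approach: str = "frontal"    # "frontal" | "lateral"

        def feeding_parameters(prefs: FeedingPreferences) -> dict:
            """Translate declared preferences into concrete motion parameters for the feeding task."""
            return {
                "ee_speed_mps": 0.05 if prefs.speed == "slow" else 0.12,
                "approach_angle_deg": 0 if prefs.approach == "frontal" else 35,
            }

        print(feeding_parameters(FeedingPreferences(speed="fast")))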

    Recent Advancements in Augmented Reality for Robotic Applications: A Survey

    Robots are expanding from industrial applications to daily life, in areas such as medical robotics, rehabilitative robotics, social robotics, and mobile/aerial robotics systems. In recent years, augmented reality (AR) has been integrated into many robotic applications, including medical, industrial, human–robot interaction, and collaboration scenarios. In this work, AR for both medical and industrial robot applications is reviewed and summarized. For medical robot applications, we investigated the integration of AR in (1) preoperative and surgical task planning; (2) image-guided robotic surgery; (3) surgical training and simulation; and (4) telesurgery. AR for industrial scenarios is reviewed in (1) human–robot interactions and collaborations; (2) path planning and task allocation; (3) training and simulation; and (4) teleoperation control/assistance. In addition, the limitations and challenges are discussed. Overall, this article serves as a valuable resource for those working in the field of AR and robotic research, offering insights into the recent state of the art and prospects for improvement.

    Enhancement of Robot-Assisted Rehabilitation Outcomes of Post-Stroke Patients Using Movement-Related Cortical Potential

    Post-stroke rehabilitation is essential for stroke survivors to help them regain independence and improve their quality of life. Among various rehabilitation strategies, robot-assisted rehabilitation is an efficient method that is used more and more in clinical practice for the motor recovery of post-stroke patients. However, excessive assistance from robotic devices during rehabilitation sessions can make patients perform motor training passively, with minimal outcome. Towards the development of an efficient rehabilitation strategy, it is necessary to ensure the active participation of subjects during training sessions. This thesis uses the electroencephalography (EEG) signal to extract the Movement-Related Cortical Potential (MRCP) pattern as an indicator of the active engagement of stroke patients during rehabilitation training sessions. The MRCP pattern is also utilized in designing an adaptive rehabilitation training strategy that maximizes patients' engagement. This project focuses on the hand motor recovery of post-stroke patients using the AMADEO rehabilitation device (Tyromotion GmbH, Austria), which is specifically developed for patients with finger and hand motor deficits. The variations in brain activity are analyzed by extracting the MRCP pattern from the EEG data acquired during training sessions. Physical improvement in hand motor abilities, in turn, is determined by two methods. The first is clinical tests, namely the Fugl-Meyer Assessment (FMA) and the Motor Assessment Scale (MAS), which include the FMA-wrist, FMA-hand, MAS-hand movements, and MAS-advanced hand movements tests. The second is the measurement of hand kinematic parameters using the AMADEO assessment tool, which provides hand strength measurements during flexion (force-flexion) and extension (force-extension), and the Hand Range of Movement (HROM).
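
    A hedged sketch of one common way to obtain an MRCP estimate from EEG, which may approximate the processing described above: low-pass filter the signal, epoch it around movement onsets, baseline-correct, and average. The filter cutoff, epoch window, and baseline interval are assumptions for illustration, not the thesis' exact parameters.

        import numpy as np
        from scipy.signal import butter, filtfilt

        def extract_mrcp(eeg, onsets, fs, window=(-2.0, 1.0), cutoff=3.0):
            """Average low-frequency EEG epochs around movement onsets.

            eeg: (n_channels, n_samples) array; onsets: onset sample indices.
            Returns the averaged, baseline-corrected epoch (n_channels, n_epoch_samples)."""
            b, a = butter(4, cutoff / (fs / 2.0), btype="low")   # keep the slow cortical components
            slow = filtfilt(b, a, eeg, axis=-1)
            pre, post = int(window[0] * fs), int(window[1] * fs)
            epochs = np.stack([slow[:, t + pre:t + post] for t in onsets
                               if t + pre >= 0 and t + post <= eeg.shape[-1]])
            baseline = epochs[:, :, : int(0.5 * fs)].mean(axis=-1, keepdims=True)
            return (epochs - baseline).mean(axis=0)

        # Placeholder data: one minute of random "EEG" with two movement onsets.
        fs = 256.0
        eeg = np.random.randn(8, int(60 * fs))
        mrcp = extract_mrcp(eeg, onsets=[int(10 * fs), int(30 * fs)], fs=fs)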

    Design and analysis of a brain-computer interface-based robotic rehabilitation system

    In this thesis, we have investigated the effect of brain-computer interfaces (BCI), which enable direct communication between a brain and a computer, on increasing the patient's active involvement in his/her task during robotic rehabilitation therapy. We have designed several experimental paradigms using electroencephalography (EEG) based BCIs which can be used to extract information about arm movement imagery in the context of robotic rehabilitation experiments. In particular, we propose a protocol that extracts and uses information about the level of intention of the subject to control the robot continuously throughout a rehabilitation experiment. In this context, we have developed and implemented EEG signal processing, learning, and classification algorithms for offline and online decision-making. We have used different types of control methods for the robotic system and examined the potential impact of BCI on rehabilitation, the effect of robotic haptic feedback on BCI, and the information contained in EEG about the rehabilitation process. Our results verify that the use of haptic feedback through robotic movement improves BCI performance. We also observe that using BCI continuously in the experiment, rather than only to trigger robotic movement, may be preferable. Finally, our results indicate stronger motor imagery activity in BCI-based experiments than in conventional experiments in which the movement is performed by the robot without the subject's involvement.
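
    A brief sketch contrasting the two usage modes compared above, under assumed details: a trigger mode that starts a pre-programmed robot motion once the intention posterior crosses a threshold, versus a continuous mode that smooths the posterior over successive EEG windows and uses it as an ongoing assistance gain. The threshold and smoothing factor are illustrative, not tuned values from the experiments.

        def trigger_mode(posterior, threshold=0.7):
            """Trigger mode: start the pre-programmed robot motion once intention crosses a threshold."""
            return posterior > threshold

        class ContinuousMode:
            """Continuous mode: scale robot assistance by an exponentially smoothed intention estimate."""
            def __init__(self, alpha=0.1):
                self.alpha = alpha
                self.level = 0.5                      # start from an uninformative prior

            def update(self, posterior):
                self.level = (1 - self.alpha) * self.level + self.alpha * posterior
                return self.level                     # e.g. used as a speed or assistance gain

        mode = ContinuousMode()
        for p in (0.55, 0.62, 0.80, 0.75):            # posteriors from successive EEG windows
            gain = mode.update(p)
        print(trigger_mode(0.8), gain)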