
    Collaborative Gaze Channelling for Improved Cooperation During Robotic Assisted Surgery

    The use of multiple robots for performing complex tasks is becoming common practice in many robot applications. When different operators are involved, effective cooperation with anticipated manoeuvres is important for seamless, synergistic control of all the end-effectors. In this paper, the concept of Collaborative Gaze Channelling (CGC) is presented for improved control of surgical robots during a shared task. Through eye tracking, the fixations of each operator are monitored and presented in a shared surgical workspace. CGC permits remote or physically separated collaborators to share their intention by visualising the eye gaze of their counterparts, and thus recovers, to a certain extent, the information on mutual intent that operators rely upon in a face-to-face working setting. In this study, the efficiency of surgical manipulation with and without CGC for controlling a pair of bimanual surgical robots is evaluated by analysing the level of coordination of two independent operators. Fitts' law is used to compare the quality of movement with and without CGC. A total of 40 subjects were recruited for this study, and the results show that the proposed CGC framework yields significant improvement (p < 0.05) in all the motion indices used for quality assessment. This study demonstrates that visual guidance is an implicit yet effective means of communication during collaborative tasks in robotic surgery. Detailed experimental validation results demonstrate the potential clinical value of the proposed CGC framework. © 2012 Biomedical Engineering Society.
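
    The abstract's use of Fitts' law to compare movement quality can be made concrete with a short sketch. The Shannon formulation below is one standard form of the law, not necessarily the exact index used in the paper; the distance, target width, and movement time are illustrative values only.

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1.0)

def throughput(distance: float, width: float, movement_time: float) -> float:
    """Throughput in bits/s: index of difficulty divided by movement time."""
    return index_of_difficulty(distance, width) / movement_time

# Illustrative reach: 120 mm to a 10 mm target, completed in 0.8 s.
id_bits = index_of_difficulty(120.0, 10.0)   # ~3.70 bits
tp = throughput(120.0, 10.0, 0.8)            # ~4.63 bits/s
print(f"ID = {id_bits:.2f} bits, throughput = {tp:.2f} bits/s")
```

    Comparing throughput, or the slope and intercept of a movement-time versus difficulty regression, with and without gaze channelling is one way such an analysis could be run.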

    Robotic and clinical evaluation of upper limb motor performance in patients with Friedreich's Ataxia: an observational study

    Background: Friedreich's ataxia (FRDA) is the most common hereditary autosomal recessive form of ataxia. The disease manifests early with gait ataxia and with dysmetria of the arms and legs, which impairs daily activities that require fine manual dexterity. To date there is no cure for this disease, although several novel therapeutic approaches are at different stages of clinical trials. Development of sensitive outcome measures is crucial to prove therapeutic effectiveness. The aim of the study was to assess the reliability and sensitivity of quantitative, objective measures of upper limb performance computed by a robotic device, and to evaluate their correlation with clinical and functional markers of disease severity. Methods: Here we assess upper limb performance by means of the InMotion Arm Robot, a robot designed for clinical neurological applications, in a cohort of 14 children and young adults affected by FRDA, matched for age and gender with 18 healthy subjects. We focused on the analysis of kinematics, accuracy, smoothness, and submovements of the upper limb during reaching movements. The robotic evaluation consisted of planar reaching movements performed with the robotic system; the motors of the robot were turned off, so that the device worked purely as a measurement tool. The status of the disease was scored using the Scale for the Assessment and Rating of Ataxia (SARA). Relationships between robotic indices and a range of clinical and disease characteristics were examined. Results: All but two of our robotic indices differed significantly between the two cohorts and reliably discriminated between healthy subjects and subjects with FRDA. In particular, subjects with FRDA exhibited slower movements as well as the loss of accuracy and smoothness typical of the disease. Duration of Movement, Normalized Jerk, and Number of Submovements were the best discriminative indices, as they were directly and easily measurable and correlated with the status of the disease as measured by SARA. Conclusions: Our results suggest that outcome measures obtained by means of robotic devices can improve the sensitivity of clinical evaluations of patients' dexterity and can accurately and efficiently quantify changes over time in clinical trials, particularly when functional scales appear to be no longer sensitive.
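
    As an illustration of the kind of indices described above, the sketch below computes a dimensionless normalized-jerk score and a peak-count proxy for the number of submovements from a sampled 2-D reaching trajectory. It uses one common normalization (squared jerk scaled by duration and path length) and is not necessarily the exact definition used in the study; the array layout and sampling interval are assumptions.

```python
import numpy as np

def normalized_jerk(positions: np.ndarray, dt: float) -> float:
    """Dimensionless jerk for an (N, 2) trajectory sampled at interval dt.

    One common normalization: sqrt(0.5 * integral(jerk^2) * T^5 / L^2),
    where T is movement duration and L is path length.
    """
    vel = np.gradient(positions, dt, axis=0)
    acc = np.gradient(vel, dt, axis=0)
    jerk = np.gradient(acc, dt, axis=0)
    duration = dt * (len(positions) - 1)
    path_length = np.sum(np.linalg.norm(np.diff(positions, axis=0), axis=1))
    jerk_sq_integral = np.sum(jerk ** 2) * dt
    return float(np.sqrt(0.5 * jerk_sq_integral * duration ** 5 / path_length ** 2))

def count_submovements(positions: np.ndarray, dt: float) -> int:
    """Count local maxima in the speed profile as a proxy for submovements."""
    speed = np.linalg.norm(np.gradient(positions, dt, axis=0), axis=1)
    peaks = (speed[1:-1] > speed[:-2]) & (speed[1:-1] > speed[2:])
    return int(np.count_nonzero(peaks))

if __name__ == "__main__":
    # Smooth single-peaked reach of 0.15 m along x (minimum-jerk profile):
    # expect a low jerk score and a single submovement.
    t = np.linspace(0.0, 1.0, 101)
    s = 10 * t**3 - 15 * t**4 + 6 * t**5
    traj = np.column_stack((0.15 * s, np.zeros_like(s)))
    print(normalized_jerk(traj, dt=0.01), count_submovements(traj, dt=0.01))
```

    Lower normalized jerk indicates a smoother reach, while additional peaks in the speed profile suggest corrective submovements of the kind that discriminate FRDA subjects from controls.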

    Sensory Manipulation as a Countermeasure to Robot Teleoperation Delays: System and Evidence

    In the field of robotics, robot teleoperation for remote or hazardous environments has become increasingly vital. A major challenge is the lag between command and action, which degrades operator awareness and performance and increases mental strain. Even with advanced technology, mitigating these delays, especially in long-distance operations, remains challenging. Current solutions largely focus on machine-based adjustments, yet there is a gap in using human perception to improve the teleoperation experience. This paper presents a unique method of sensory manipulation to help humans adapt to such delays. Drawing on motor learning principles, it suggests that modifying sensory stimuli can lessen the perception of these delays. Instead of introducing new skills, the approach builds on existing motor coordination knowledge, aiming to minimize the need for extensive training or complex automation. A study with 41 participants explored the effects of altered haptic cues in delayed teleoperation. These cues were sourced from advanced physics engines and robot sensors. Results highlighted benefits such as reduced task completion time and improved perception of visual delays, and real-time haptic feedback significantly contributed to reduced mental strain and increased confidence. This research emphasizes human adaptation as a key element in robot teleoperation, advocating for improved teleoperation efficiency via swift human adaptation rather than solely optimizing robots for delay adjustment.
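
    A toy simulation of the core idea, immediate haptic cues from a local model while commands traverse a delayed channel, is sketched below. The fixed delay line, the virtual-wall contact model, and all gains are illustrative stand-ins, not the authors' physics-engine or sensor pipeline.

```python
from collections import deque

# Minimal sketch, not the authors' system: operator commands reach the remote
# site only after a fixed delay, while a haptic cue is rendered immediately
# from a local contact model. All numbers are illustrative.

DELAY_STEPS = 25          # e.g. 250 ms at a 100 Hz control rate
WALL_POSITION = 0.10      # local model: a virtual obstacle at x = 0.10 m
STIFFNESS = 500.0         # N/m, local spring used for the haptic cue

def local_haptic_force(commanded_x: float) -> float:
    """Immediate cue from the local model, independent of the delayed channel."""
    penetration = commanded_x - WALL_POSITION
    return -STIFFNESS * penetration if penetration > 0.0 else 0.0

def simulate(commands):
    channel = deque([0.0] * DELAY_STEPS, maxlen=DELAY_STEPS)
    for x_cmd in commands:
        force_now = local_haptic_force(x_cmd)   # rendered without waiting
        x_remote = channel[0]                   # what the remote robot executes
        channel.append(x_cmd)                   # command enters the delay line
        yield x_cmd, x_remote, force_now

if __name__ == "__main__":
    trajectory = [i * 0.005 for i in range(40)]  # slow approach to the obstacle
    for x_cmd, x_remote, f in simulate(trajectory):
        print(f"cmd={x_cmd:.3f} m  remote={x_remote:.3f} m  cue={f:.1f} N")
```

    The point of the sketch is only that the haptic cue arrives at command time, while the executed motion lags by the full round-trip delay.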

    An Overview of Self-Adaptive Technologies Within Virtual Reality Training

    This overview presents the current state of the art of self-adaptive technologies within virtual reality (VR) training. Virtual reality training and assessment are increasingly used in five key areas: medical, industrial & commercial training, serious games, rehabilitation, and remote training such as Massive Open Online Courses (MOOCs). Adaptation can be applied to five core technologies of VR: haptic devices, stereo graphics, adaptive content, assessment, and autonomous agents. Automation of VR training can contribute to the automation of actual procedures, including remote and robot-assisted surgery, which reduces injury and improves the accuracy of the procedure. Automated haptic interaction can enable telepresence and tactile interaction with virtual artefacts in either remote or simulated environments. Automation, machine learning, and data-driven features play an important role in providing trainee-specific, individually adaptive training content. Data from trainee assessment can form an input to autonomous systems for customised training and automated difficulty levels matched to individual requirements. Self-adaptive technology has previously been developed within individual VR training technologies; one conclusion of this overview is that an enhanced, portable framework does not yet exist, and that it would be beneficial to combine the automation of these core technologies into a reusable automation framework for VR training.
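
    The data-driven adaptation the overview describes can be illustrated with a minimal difficulty controller: recent assessment scores nudge the difficulty of the next session toward a target success rate. The function name, the 0-1 scales, the target, and the gain are hypothetical, not drawn from any specific framework in the survey.

```python
def next_difficulty(current: float, recent_scores: list[float],
                    target: float = 0.75, gain: float = 0.5) -> float:
    """Nudge difficulty so the trainee's success rate tracks a target level."""
    if not recent_scores:
        return current
    success_rate = sum(recent_scores) / len(recent_scores)
    adjusted = current + gain * (success_rate - target)
    return min(1.0, max(0.0, adjusted))   # clamp to a normalized 0-1 range

# Example: a trainee succeeding 80% of the time gets a slightly harder session.
print(next_difficulty(0.5, [1, 1, 1, 0, 1]))  # -> 0.525
```

    A reusable framework of the kind the overview calls for would wrap controllers like this around each core technology (haptics, content, agents) behind a common assessment interface.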

    The New Jersey Institute of Technology Robot-Assisted Virtual Rehabilitation (NJIT-RAVR) system for children with cerebral palsy: a feasibility study

    Background: We hypothesize that the integration of virtual reality (VR) with robot-assisted rehabilitation could be successful if applied to children with hemiparetic cerebral palsy (CP). The combined benefits of the increased attention provided by VR and the larger training stimulus afforded by adaptive robotics may increase the beneficial effects of these two approaches synergistically. This paper describes the NJIT-RAVR system, which combines adaptive robotics with complex VR simulations for the rehabilitation of upper extremity impairments and function in children with CP, and examines the feasibility of this system in the context of a two-subject training study. Methods: The NJIT-RAVR system consists of the Haptic Master, a six-degree-of-freedom, admittance-controlled robot, and a suite of rehabilitation simulations that provide adaptive algorithms for the Haptic Master, allowing the user to interact with rich virtual environments. Two children, a ten-year-old boy and a seven-year-old girl, both with spastic hemiplegia secondary to cerebral palsy, were recruited from the outpatient center of a comprehensive pediatric rehabilitation facility. Subjects performed a battery of clinical tests and kinematic measurements of reaching collected by the NJIT-RAVR system. Subjects trained with the NJIT-RAVR system for one hour, three days a week, for three weeks. Each subject played a combination of four or five simulations depending on their therapeutic goals, tolerances, and preferences. Games were modified to increase difficulty in order to challenge the subjects as their performance improved. The testing battery was repeated following the training period. Results: Both participants completed 9 hours of training in 3 weeks. No untoward events occurred, and no adverse responses to treatment or complaints of cybersickness were reported. One participant showed improvements in overall performance on the functional aspects of the testing battery. The second subject made improvements in upper extremity active range of motion and in kinematic measures of reaching movements. Conclusion: We feel that this study establishes the feasibility of integrating robotics and rich virtual environments to address functional limitations and decreased motor performance in children with mild to moderate cerebral palsy.
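
    For readers unfamiliar with admittance control, the style of interaction used by a device like the Haptic Master can be sketched as follows: a measured interaction force drives a simulated mass-damper, and the robot tracks the resulting motion. The virtual mass and damping values below are illustrative defaults, not the device's actual settings.

```python
def admittance_step(force: float, velocity: float, dt: float,
                    virtual_mass: float = 3.0, virtual_damping: float = 10.0) -> float:
    """One Euler step of a virtual mass-damper driven by the measured force.

    Returns the new commanded velocity that the robot should track.
    """
    acceleration = (force - virtual_damping * velocity) / virtual_mass
    return velocity + acceleration * dt

if __name__ == "__main__":
    v, dt = 0.0, 0.01
    for _ in range(100):              # 1 s of a constant 5 N push from the child
        v = admittance_step(5.0, v, dt)
    print(f"commanded velocity after 1 s: {v:.3f} m/s")  # approaches 5/10 = 0.5
```

    Raising the virtual damping makes the arm feel heavier and slower, which is one way such a system can scale assistance or resistance to a child's ability.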

    Haptic Bimanual System for Teleoperation of Time-Delayed Tasks


    A survey of haptics in serious gaming

    Serious gaming often requires a high level of realism for training and learning purposes. Haptic technology has proved useful in many applications by adding a perception modality complementary to audio and vision, and it provides a novel user experience, enhancing the immersion of virtual reality with a physical control layer. This survey focuses on haptic technology and its applications in serious gaming. Several categories of related applications are listed and discussed in detail, focusing primarily on haptics as a cognitive aid and as a main component of serious game design. We categorize haptic devices into tactile, force-feedback, and hybrid types to suit different haptic interfaces, followed by a description of common haptic gadgets in gaming. Haptic modeling methods, in particular the SDKs and libraries available for commercial or academic use, are summarized. We also analyze existing research difficulties and technology bottlenecks in haptics and discuss future research directions.
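
    As a concrete flavour of the force-feedback category discussed above, the sketch below shows the classic penalty-based virtual wall used in many haptic rendering loops. The stiffness and damping gains and the 1 kHz update-rate assumption are common textbook values, not figures from the survey.

```python
def wall_force(x: float, v: float, wall_x: float = 0.0,
               k: float = 800.0, b: float = 2.0) -> float:
    """Penalty-based virtual wall (1-D): a spring pushes the probe out of the
    wall occupying x < wall_x, with damping on the approach velocity."""
    penetration = wall_x - x              # positive once the probe is inside the wall
    if penetration <= 0.0:
        return 0.0
    force = k * penetration - b * v       # damping adds resistance while moving inward
    return max(force, 0.0)                # avoid a "sticky" pull back into the wall

# Illustrative sample at one ~1 kHz servo tick: 2 mm penetration, moving
# inward at 5 cm/s -> 800*0.002 - 2*(-0.05) = 1.7 N pushing the probe out.
print(wall_force(-0.002, -0.05))
```

    Tactile-only devices would instead map the same contact event to a vibration pattern, which is the distinction the survey's device categories capture.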
