    Implicit Gaze-Assisted Adaptive Motion Scaling for Highly Articulated Instrument Manipulation

    Traditional robotic surgical systems rely entirely on robotic arms to triangulate articulated instruments inside the human anatomy. This configuration can be ill-suited for working in tight spaces or during single-access approaches, where little to no triangulation between the instrument shafts is possible. The control of these instruments is further hindered by ergonomic issues: motion scaling imposes clutching mechanics to work around the workspace limitations of the master devices, forcing the user to choose between slow, precise movements and fast, less accurate ones. This paper presents a bi-manual system using novel self-triangulating 6-degrees-of-freedom (DoF) tools with a flexible elbow, mounted on robotic arms. The control scheme for the resulting 9-DoF system is detailed, with particular emphasis on retaining maximum dexterity close to joint limits. Furthermore, this paper introduces the concept of gaze-assisted adaptive motion scaling. By combining eye tracking with hand motion and instrument information, the system infers the user's destination and modifies the motion scaling accordingly. This safe, novel approach allows the user to quickly reach distant locations while retaining full precision for delicate manoeuvres. The performance and usability of this adaptive motion scaling are evaluated in a user study, showing a clear improvement in task completion speed and a clear reduction in the need for clutching.
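    The core idea of distance-dependent motion scaling can be sketched as follows. This is a minimal illustration, not the paper's actual control law: the function names, the exponential blending curve, and all parameter values (`s_min`, `s_max`, `d_ref`) are assumptions chosen only to show how a gaze-inferred destination could modulate the master-to-slave scaling factor.

    ```python
    import math

    def adaptive_scale(gaze_target, tool_tip, s_min=0.2, s_max=1.0, d_ref=0.05):
        """Map the distance between the gaze-inferred destination and the
        instrument tip to a motion-scaling factor.

        Far from the target, the scale approaches s_max (fast, coarse motion);
        near the target, it approaches s_min (slow, precise motion). The
        exponential blend and all parameters are illustrative assumptions.
        """
        d = math.dist(gaze_target, tool_tip)      # Euclidean distance in metres
        blend = 1.0 - math.exp(-d / d_ref)        # 0 at the target, -> 1 far away
        return s_min + (s_max - s_min) * blend

    # At the target: scaling collapses to the precise minimum.
    print(adaptive_scale((0.0, 0.0, 0.0), (0.0, 0.0, 0.0)))
    # Far from the target: scaling saturates near the fast maximum.
    print(adaptive_scale((0.0, 0.0, 0.0), (1.0, 0.0, 0.0)))
    ```

    A continuous mapping like this avoids abrupt scaling jumps, which matters for stability when the gaze estimate is noisy; a real system would additionally low-pass filter the gaze signal before using it.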