    Control and Estimation Methods Towards Safe Robot-assisted Eye Surgery

Vitreoretinal surgery is among the most delicate surgical tasks, in which physiological hand tremor may severely diminish surgeon performance and put the eye at high risk of injury. Unerring targeting accuracy is required to perform precise operations on micro-scale tissues. Tool-tip-to-tissue interaction forces are usually below the threshold of human tactile perception, which may result in the exertion of excessive force on the retinal tissue, leading to irreversible damage. Notable challenges during retinal surgery lend themselves to robotic assistance, which has proven beneficial in providing safe, steady-hand manipulation. Effective robotic assistance relies heavily on accurate sensing of important surgical states and situations, together with intelligent control algorithms (e.g., measuring the instrument tip position and controlling interaction forces). This dissertation provides novel control and state estimation methods to improve safety during robot-assisted eye surgery. The integration of robotics into retinal microsurgery reduces the surgeon's perception of tool-to-tissue forces at the sclera. This blunting of human tactile sensory input, which is due to the robot's stiffness and inertia, is a potential iatrogenic risk during robotic eye surgery. To address this issue, a sensorized surgical instrument equipped with Fiber Bragg Grating (FBG) sensors, capable of measuring the scleral forces and the instrument's insertion depth into the eye, is integrated with the Steady-Hand Eye Robot (SHER). An adaptive control scheme is then customized and implemented on the robot, intended to autonomously mitigate the risk of unsafe scleral forces and excessive instrument insertion. Various preliminary and multi-user clinician studies are then conducted to evaluate the effectiveness of the control method during mock retinal surgery procedures.
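One common way to realize such force-aware safety behavior is a variable-admittance law that attenuates the robot's response as the measured scleral force approaches a limit. The sketch below illustrates this general idea only; the threshold, gain values, and function names are hypothetical assumptions, not the dissertation's actual adaptive controller.

```python
# Illustrative variable-admittance safety barrier for scleral force.
# F_SAFE and K_ADMITTANCE are made-up example values, not clinical limits.
F_SAFE = 0.12        # assumed maximum allowable scleral force [N]
K_ADMITTANCE = 0.02  # assumed nominal admittance gain [m/s per N]

def admittance_gain(f_sclera):
    """Scale the admittance down as the measured scleral force
    approaches the safety threshold, reaching zero at F_SAFE."""
    scale = max(0.0, 1.0 - abs(f_sclera) / F_SAFE)
    return K_ADMITTANCE * scale

def commanded_velocity(f_handle, f_sclera):
    """Velocity command derived from the surgeon's handle force,
    attenuated by the scleral-force safety barrier."""
    return admittance_gain(f_sclera) * f_handle
```

With zero scleral force the surgeon's input passes through at the nominal gain; as the FBG-measured force approaches the threshold, the commanded motion smoothly vanishes, so the robot cannot be driven to exceed the limit.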
In addition, motivated by the inherent flexibility of eye surgical instruments, the resulting tip deflection, and the need for targeting accuracy, we have developed a method to enhance the estimation of the deflected instrument tip position. Using an iterative method and microscope data, we develop a calibration- and registration-independent (RI) framework that provides online estimates of the instrument stiffness via least-squares and adaptive estimators. These estimates are then combined with a state-space model of the tip position evolution, obtained from the forward kinematics (FWK) of the robot and the FBG sensor measurements. This is accomplished using a Kalman Filtering (KF) approach to improve instrument tip position estimation during robotic surgery. The entire framework is independent of camera-to-robot coordinate frame registration and is evaluated in various phantom experiments to demonstrate its effectiveness.
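The fusion step described above can be sketched as a minimal scalar Kalman filter: the prediction propagates the tip position using forward-kinematics increments, and the update corrects it with an FBG-derived tip measurement. All variable names, noise covariances, and the one-dimensional simplification are illustrative assumptions, not the dissertation's actual model.

```python
def kf_tip_estimate(z_fbg, x0, u_fwk, q=1e-6, r=1e-4):
    """Scalar Kalman filter sketch for tip position estimation.

    z_fbg : sequence of FBG-derived tip position measurements
    x0    : initial tip position estimate
    u_fwk : sequence of per-step position increments from robot FWK
    q, r  : assumed process and measurement noise variances
    """
    x, p = x0, 1.0  # state estimate and its variance
    estimates = []
    for u, z in zip(u_fwk, z_fbg):
        # Predict: propagate the tip by the forward-kinematics increment.
        x = x + u
        p = p + q
        # Update: fuse the FBG measurement of the deflected tip.
        k = p / (p + r)          # Kalman gain
        x = x + k * (z - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates
```

When the FBG measurements disagree with the rigid-body FWK prediction (e.g., because the shaft deflects), the gain `k` pulls the estimate toward the sensed tip position, which is the essence of the KF-based correction described in the abstract.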