14 research outputs found

    Control and Estimation Methods Towards Safe Robot-assisted Eye Surgery

    Vitreoretinal surgery is among the most delicate surgical tasks, in which physiological hand tremor may severely diminish surgeon performance and put the eye at high risk of injury. Unerring targeting accuracy is required to perform precise operations on micro-scale tissues. Tool-tip-to-tissue interaction forces are usually below the threshold of human tactile perception, which may result in the exertion of excessive force on the retinal tissue, leading to irreversible damage. These notable challenges of retinal surgery lend themselves to robotic assistance, which has proven beneficial in providing safe, steady-hand manipulation. Efficient robotic assistance relies heavily on accurate sensing and intelligent control of important surgical states and situations (e.g., measurement of the instrument tip position and control of interaction forces). This dissertation provides novel control and state-estimation methods to improve safety during robot-assisted eye surgery. The integration of robotics into retinal microsurgery reduces the surgeon's perception of tool-to-tissue forces at the sclera. This blunting of human tactile sensory input, caused by the inflexible inertia of the robot, is a potential iatrogenic risk during robotic eye surgery. To address this issue, a sensorized surgical instrument equipped with Fiber Bragg Grating (FBG) sensors, capable of measuring scleral forces and instrument insertion depth into the eye, is integrated into the Steady-Hand Eye Robot (SHER). An adaptive control scheme is then customized and implemented on the robot to autonomously mitigate the risk of unsafe scleral forces and excessive instrument insertion. Preliminary and multi-user clinician studies are then conducted to evaluate the effectiveness of the control method during mock retinal surgery procedures.
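The abstract does not specify the adaptive control law itself. As a purely illustrative stand-in (not the dissertation's controller), the safety idea can be sketched as an admittance-style gate that attenuates the operator's commanded velocity as the FBG-measured scleral force approaches a safe limit; the limit `F_SAFE` and gain `ALPHA` below are hypothetical values chosen for the sketch:

```python
import math

# Hypothetical parameters, not taken from the dissertation.
F_SAFE = 120.0   # assumed scleral force limit [mN]
ALPHA = 4.0      # assumed sharpness of the attenuation

def safe_velocity(v_cmd, f_sclera):
    """Smoothly scale the commanded tool velocity toward zero
    as the measured scleral force |f_sclera| nears F_SAFE."""
    ratio = min(abs(f_sclera) / F_SAFE, 1.0)
    # Logistic gate centered at 80% of the force limit.
    gain = 1.0 / (1.0 + math.exp(ALPHA * 10.0 * (ratio - 0.8)))
    return gain * v_cmd

print(safe_velocity(1.0, 10.0))   # far below the limit: near full velocity
print(safe_velocity(1.0, 119.0))  # near the limit: strongly attenuated
```

A smooth gate rather than a hard cutoff avoids abrupt velocity discontinuities at the sclera, which is the qualitative behavior any force-limiting assist scheme would need.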
In addition, because of the inherent flexibility and resulting deflection of eye-surgical instruments, together with the need for targeting accuracy, we have developed a method to enhance estimation of the deflected instrument tip position. Using an iterative method and microscope data, we develop a calibration- and registration-independent (RI) framework that provides online estimates of the instrument stiffness (least-squares and adaptive). These estimates are then combined with a state-space model of tip-position evolution, derived from the forward kinematics (FWK) of the robot and FBG sensor measurements, using a Kalman filtering (KF) approach to improve instrument tip position estimation during robotic surgery. The entire framework is independent of camera-to-robot coordinate-frame registration and is evaluated in phantom experiments to demonstrate its effectiveness.
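The fusion idea in this abstract can be sketched in one dimension: a recursive least-squares (RLS) estimator tracks the instrument compliance online from microscope observations, and a Kalman filter fuses the compliance-model prediction with the noisy measurement. All numbers below (compliance, noise levels) are assumed for illustration and are not from the dissertation:

```python
import numpy as np

rng = np.random.default_rng(0)
c_true = 2e-3            # "true" compliance [mm/mN], assumed for the sketch
q, r = 1e-6, 1e-4        # assumed process / measurement noise variances

x, p = 0.0, 1.0          # KF state: tip-deflection estimate and its variance
c_hat, P_ls = 1e-3, 1.0  # RLS compliance estimate and its covariance

for t in range(200):
    f = 5.0 + rng.normal(0.0, 0.2)               # FBG shaft force [mN]
    z = c_true * f + rng.normal(0.0, np.sqrt(r)) # microscope deflection obs.

    # Recursive least squares: refine the compliance estimate online.
    k_ls = P_ls * f / (1.0 + f * P_ls * f)
    c_hat += k_ls * (z - c_hat * f)
    P_ls *= (1.0 - k_ls * f)

    # Kalman predict: model-based deflection from the current compliance.
    x_pred, p_pred = c_hat * f, p + q
    # Kalman update with the microscope measurement.
    k = p_pred / (p_pred + r)
    x = x_pred + k * (z - x_pred)
    p = (1.0 - k) * p_pred

print(c_hat, x)
```

The actual framework is multidimensional and fuses FWK with FBG measurements, but the structure is the same: an online parameter estimator feeding the KF process model.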

    Augmentation Of Human Skill In Microsurgery

    Surgeons performing highly skilled microsurgery tasks can benefit from information and manual assistance that overcome technological and physiological limitations, making surgery safer, more efficient, and more successful. Vitreoretinal surgery is particularly difficult due to the inherent micro-scale and fragility of human eye anatomy. Additionally, surgeons are challenged by physiological hand tremor, poor visualization, lack of force sensing, and significant cognitive load while executing high-risk procedures inside the eye, such as epiretinal membrane peeling. This dissertation presents the architecture and design principles for a surgical augmentation environment used to develop innovative functionality that addresses the fundamental limitations of vitreoretinal surgery. It is an inherently information-driven modular system incorporating robotics, sensors, and multimedia components. The integrated nature of the system is leveraged to create intuitive and relevant human-machine interfaces and to generate system behaviors that provide active physical assistance and present relevant sensory information to the surgeon. These include basic manipulation assistance, audio-visual and haptic feedback, intraoperative imaging, and force sensing. The resulting functionality, and the proposed architecture and design methods, generalize to other microsurgical procedures. The system's performance is demonstrated and evaluated using phantom and in vivo experiments.

    Augmented reality in a virtual clinical decision-making simulator

    The goal of this study was to determine whether Augmented Reality (AR) enhances the development of clinical decision-making skills in the diagnosis and treatment of chronic wounds, and whether it increases student motivation and the usability of the e-FER virtual simulator. e-FER is an online clinical decision-making simulator used in the initial training of nurses, allowing students to simulate the diagnosis and treatment of virtual clinical cases of chronic wounds. In this study, an AR component with new clinical cases was added in order to investigate its effects on motivation, usability, and the development of wound diagnosis and treatment skills. A quasi-experimental study was conducted with a sample of 54 first-year nursing students. A comparative analysis of the performance of the experimental group (which used the traditional e-FER and then the AR version) and the control group (which used only the traditional e-FER) was made using data extracted from the e-FER virtual simulator. Data on student motivation and system usability were collected through a questionnaire. The results show that AR enhanced student performance, particularly on wound-diagnosis parameters, with highly statistically significant differences (p < 0.001) in the Mann-Whitney U and Wilcoxon tests, along with high levels of motivation and simulator usability, even with the introduction of an additional device into the activity.
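The between-group comparison described here uses the Mann-Whitney U test (a non-parametric alternative to the t-test, appropriate for ordinal scores). A minimal sketch with synthetic scores, not the study's data, shows the shape of such an analysis:

```python
from scipy.stats import mannwhitneyu

# Hypothetical wound-diagnosis scores; the study's actual data are not public.
exp = [18, 19, 17, 20, 19, 18, 20, 19]   # AR (experimental) group
ctrl = [14, 15, 13, 16, 15, 14, 15, 13]  # traditional e-FER (control) group

# Two-sided test: is there any difference between the group distributions?
u, p = mannwhitneyu(exp, ctrl, alternative="two-sided")
print(u, p)
```

With every experimental score above every control score, U equals n1*n2 (complete separation) and the p-value is far below 0.05; the study's paired pre/post comparison would use `scipy.stats.wilcoxon` in the same way.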