Medical image computing and computer-aided medical interventions applied to soft tissues. Work in progress in urology
Until recently, Computer-Aided Medical Interventions (CAMI) and medical robotics focused on rigid, non-deformable anatomical structures. Today, special attention is paid to soft tissues, which raise complex issues because of their mobility and deformation. Minimally invasive digestive surgery was probably one of the first fields in which soft tissues were handled, through the development of simulators, tracking of anatomical structures, and dedicated assistance robots. Other clinical domains, however, are also concerned, urology for instance. Indeed, laparoscopic surgery, new tumour-destruction techniques (e.g. HIFU, radiofrequency, or cryoablation), increasingly early detection of cancer, and the use of interventional and diagnostic imaging modalities have recently opened new challenges for urologists and the scientists involved in CAMI. Over the last five years, this has led to a very significant increase in research and development of computer-aided urology systems. In this paper, we describe the main problems related to computer-aided diagnosis and therapy of soft tissues and survey the different types of assistance offered to the urologist: robotization, image fusion, and surgical navigation. Both research projects and operational industrial systems are discussed.
Tactile Sensing System for Lung Tumour Localization during Minimally Invasive Surgery
Video-assisted thoracoscopic surgery (VATS) is becoming a prevalent method for lung cancer treatment. However, VATS suffers from the inability to accurately relay haptic information to the surgeon, often making tumour localization difficult. This limitation was addressed by the design of a tactile sensing system (TSS) consisting of a probe with a tactile sensor and interfacing visualization software. In this thesis, TSS performance was tested to determine the feasibility of implementing the system in VATS. This was accomplished through a series of ex vivo experiments: the tactile sensor was calibrated, the visualization software was modified to present haptic information visually to the user, and TSS performance was compared against human and robot palpation methods and conventional VATS instruments. It was concluded that the device offers the possibility of restoring to the surgeon the haptic information lost during surgery, thereby mitigating one of the current limitations of VATS.
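The pipeline described above, calibrating raw tactile readings into contact forces and then locating a stiff inclusion as a local force peak, can be sketched as follows. The readings, applied loads, and linear sensor model are illustrative assumptions, not the thesis's actual data:

```python
import numpy as np

# Hypothetical bench-test calibration data: raw taxel readings (counts)
# recorded under known applied loads (N) -- values are illustrative only.
raw_counts = np.array([120.0, 260.0, 410.0, 545.0, 690.0])
applied_n = np.array([0.5, 1.0, 1.5, 2.0, 2.5])

# Least-squares linear calibration: force = gain * counts + offset.
gain, offset = np.polyfit(raw_counts, applied_n, deg=1)

def counts_to_force(counts):
    """Map a raw taxel reading to an estimated contact force in newtons."""
    return gain * counts + offset

# A stiffer inclusion (tumour) under the probe shows up as a force peak
# in the calibrated tactile frame.
frame = np.array([[130.0, 150.0, 140.0],
                  [155.0, 620.0, 160.0],
                  [135.0, 150.0, 145.0]])
force_map = counts_to_force(frame)
peak = np.unravel_index(np.argmax(force_map), force_map.shape)
```

The visualization software would then render `force_map` as a colour map so the surgeon can see the peak that palpation can no longer feel.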
An Open-Source 7-Axis, Robotic Platform to Enable Dexterous Procedures within CT Scanners
This paper describes the design, manufacture, and performance of a highly dexterous, low-profile, 7 degree-of-freedom (DOF) robotic arm for CT-guided percutaneous needle biopsy. Direct CT guidance allows physicians to localize tumours quickly; however, needle insertion is still performed by hand. The system is mounted to a fully active gantry superior to the patient's head and teleoperated by a radiologist. Unlike other similar robots, this robot's fully serial-link approach uses a unique combination of belt and cable drives for high transparency and minimal backlash, allowing an expansive working area and numerous approach angles to targets, all while maintaining a small in-bore cross-section of less than . Simulations verified the system's expansive collision-free workspace and its ability to hit targets across the entire chest, as required for lung cancer biopsy. Targeting error is on average on a teleoperated accuracy task, illustrating that the system's accuracy is sufficient to perform biopsy procedures. The system is designed for lung biopsies because of the large working volume required to reach peripheral lung lesions; with its large working volume and small in-bore cross-sectional area, however, the robotic system is effectively a general-purpose CT-compatible manipulation device for percutaneous procedures. Finally, given the considerable development time invested in designing a precise and flexible system, and wishing to reduce the burden on other researchers developing algorithms for image-guided surgery, we provide this system as open access; to the best of our knowledge, it is the first open-hardware image-guided biopsy robot of its kind.
Comment: 8 pages, 9 figures, final submission to IROS 201
Temporal Segmentation of Surgical Sub-tasks through Deep Learning with Multiple Data Sources
Many tasks in robot-assisted surgeries (RAS) can be represented by finite-state machines (FSMs), where each state represents either an action (such as picking up a needle) or an observation (such as bleeding). A crucial step towards the automation of such surgical tasks is the temporal perception of the current surgical scene, which requires real-time estimation of the states in the FSMs. The objective of this work is to estimate the current state of the surgical task based on the actions performed or the events that have occurred as the task progresses. We propose Fusion-KVE, a unified surgical state estimation model that incorporates multiple data sources, including Kinematics, Vision, and system Events. Additionally, we examine the strengths and weaknesses of different state estimation models in segmenting states with different representative features or levels of granularity. We evaluate our model on the JHU-ISI Gesture and Skill Assessment Working Set (JIGSAWS), as well as on a more complex dataset involving robotic intra-operative ultrasound (RIOUS) imaging, created using the da Vinci® Xi surgical system. Our model achieves a superior frame-wise state estimation accuracy of up to 89.4%, improving on state-of-the-art surgical state estimation models on both the JIGSAWS suturing dataset and our RIOUS dataset.
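The multi-source idea can be illustrated with a toy late-fusion step: each source produces a probability vector over FSM states for the current frame, and the fused estimate is their weighted combination. The real Fusion-KVE learns this fusion with deep networks; the state list, per-source probabilities, and fixed weights below are stand-in assumptions:

```python
import numpy as np

STATES = ["reach", "grasp", "insert", "withdraw"]  # hypothetical FSM states

def fuse_frame(kin_probs, vis_probs, evt_probs, weights=(0.4, 0.4, 0.2)):
    """Late fusion of per-source state likelihoods for a single frame.
    Each *_probs argument is a probability vector over STATES."""
    stacked = np.vstack([kin_probs, vis_probs, evt_probs])
    fused = np.average(stacked, axis=0, weights=weights)  # weighted mean
    return STATES[int(np.argmax(fused))]

# Kinematics strongly suggests a grasp, vision is unsure, events are flat:
state = fuse_frame([0.10, 0.70, 0.10, 0.10],
                   [0.25, 0.30, 0.25, 0.20],
                   [0.25, 0.25, 0.25, 0.25])
```

Running the fused argmax frame by frame yields exactly the kind of temporal segmentation the paper evaluates against frame-wise ground truth.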
User-centred design and evaluation of a tele-operated echocardiography robot
We present the collected findings of a user-centred approach to developing a tele-operated robot for remote echocardiography examinations. During the three-year development of the robot, we involved users in all development stages to increase the usability of the system for the doctors. For requirement compilation, we conducted a literature review, observed two traditional examinations, arranged focus groups with doctors and patients, and conducted two online surveys. During the development of the robot, we regularly involved doctors in usability tests to receive feedback on the user interface and on the robot’s hardware. For evaluation of the robot, we conducted two eye-tracking studies. In the first study, doctors executed a traditional echocardiography examination; in the second, they conducted a remote examination with our robot. The results of the studies show that all doctors were able to successfully complete a correct ultrasonography examination with the tele-operated robot. Compared with a traditional examination, the doctors on average needed only a small amount of additional time to successfully examine a patient when using our remote echocardiography robot. The results also show that the doctors fixated considerably more often, but with shorter fixation times, on the USG screen in the traditional examination compared with the remote examination. We further found that some of the user-centred design methods we applied had to be adjusted to the clinical context and the hectic schedule of the doctors. Overall, our experience and results suggest that user-centred design methodology is well suited for developing medical robots and leads to a usable product that meets the end users’ needs.
Implementation of safe human robot collaboration for ultrasound guided radiation therapy
This thesis shows that safe human-robot interaction and collaboration are possible for ultrasound (US) guided radiotherapy. With the chosen methodology, all components (US imaging, optical room monitoring, and the robot) were linked, integrated, and realized in a realistic clinical workflow.
US-guided radiotherapy offers a complement and an alternative to existing image-guided therapy approaches. The real-time capability of US and its high soft-tissue contrast allow target structures to be tracked and radiation delivery to be modulated. However, ultrasound-guided radiation therapy (USgRT) is not yet clinically established and is still under development, as reliable and safe methods of image acquisition are not yet available. In particular, loss of contact between the US probe and the patient surface poses a problem during patient movements such as breathing.
To this end, a breathing and motion compensation (BaMC) method was developed in this work, which, together with the safe control of a lightweight robot, represents a new development for USgRT. The BaMC can be used to control the US probe while it remains in contact with the patient. The conducted experiments confirmed that the developed methodology ensures steady contact with the patient surface and thus continuous image acquisition. In addition, the image position in space can be maintained with submillimetre accuracy.
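The contact-holding behaviour of such a compensation scheme can be sketched as a simple proportional admittance law pressing against a linear-spring tissue model. The target force, gain, control period, and stiffness below are illustrative assumptions, not the thesis's actual controller:

```python
def admittance_step(f_measured, f_target=5.0, gain=0.0004, dt=0.02):
    """One control tick: probe displacement (m) along the contact normal
    that drives the measured contact force toward the target force."""
    return gain * (f_target - f_measured) * dt  # positive = press further in

# Simulated contact: tissue as a linear spring (force = stiffness * indentation).
stiffness = 2000.0  # N/m, hypothetical tissue stiffness
z = 0.0             # probe indentation in metres
for _ in range(2000):
    force = stiffness * z
    z += admittance_step(force)
# z settles near f_target / stiffness, where contact force equals the target.
```

A real implementation would superimpose this correction on the tracked motion of the patient surface, so the probe follows breathing while holding a constant contact force.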
The BaMC integrates seamlessly into a developed clinical workflow. The graphical user interfaces developed for this purpose, as well as direct haptic control of the robot, provide the clinical user with an easy means of interaction. The autonomous positioning of the transducer is a good example of the feasibility of the approach: with the help of the user interface, an acoustic plane can be defined and approached autonomously by the robot in a time-efficient and precise manner. The tests carried out show that this methodology is suitable for a wide range of transducer positions.
Safety in a human-robot interaction task is essential and requires individually customized concepts. In this work, adequate monitoring mechanisms were found to ensure the safety of both patient and staff. Collision tests showed that the implemented detection measures work and that the robot moves into a safe parking position. The forces acting on the patient could thus be kept well below the limits required by the standard.
This work has demonstrated the first important steps towards safe robot-assisted ultrasound imaging, which is applicable beyond USgRT. The developed interfaces provide the basis for further investigations in this field, especially in the area of image recognition, for example to determine the position of the target structure. With the safety of the developed system demonstrated, first studies in humans can now follow.
Computer- and robot-assisted Medical Intervention
Medical robotics includes assistive devices used by the physician to make his or her diagnostic or therapeutic practice easier and more efficient. This chapter focuses on such systems. It introduces the general field of Computer-Assisted Medical Interventions, its aims and its different components, and describes the place of robots in that context. The evolution of general design and control paradigms in the development of medical robots is presented, and issues specific to this application domain are discussed. A view of existing systems, ongoing developments, and future trends is given, and a case study is detailed. Other types of robotic help in the medical environment (such as assisting a handicapped person, rehabilitating a patient, or replacing damaged or lost limbs or organs) are outside the scope of this chapter.
Comment: Handbook of Automation, Shimon Nof (Ed.) (2009) 000-00