
    Implementation of safe human robot collaboration for ultrasound guided radiation therapy

    This thesis shows that safe human-robot interaction and collaboration are possible for ultrasound (US) guided radiotherapy. With the chosen methodology, all components (US, optical room monitoring, and robot) could be linked, integrated, and realized in a realistic clinical workflow. US-guided radiotherapy offers a complement and alternative to existing image-guided therapy approaches: the real-time capability of US and its high soft-tissue contrast allow target structures to be tracked and radiation delivery to be modulated. However, ultrasound-guided radiation therapy (USgRT) is not yet clinically established and is still under development, as reliable and safe methods of image acquisition are not yet available. In particular, loss of contact between the US probe and the patient surface poses a problem during patient movements such as breathing. To address this, a breathing and motion compensation (BaMC) method was developed in this work, which, together with the safe control of a lightweight robot, represents a new development for USgRT. The BaMC can be used to control the US probe while maintaining contact with the patient. The conducted experiments confirmed that the developed methodology ensures steady contact with the patient surface and thus continuous image acquisition; in addition, the image position in space can be maintained with submillimeter accuracy. The BaMC integrates seamlessly into a developed clinical workflow. The graphical user interfaces developed for this purpose, as well as direct haptic control of the robot, provide an easy interaction option for the clinical user. The developed autonomous positioning of the transducer is a good example of the feasibility of the approach: with the help of the user interface, an acoustic plane can be defined and approached autonomously by the robot in a time-efficient and precise manner. The tests carried out show that this methodology is suitable for a wide range of transducer positions. Safety in a human-robot interaction task is essential and requires individually customized concepts. In this work, adequate monitoring mechanisms were found to ensure both patient and staff safety. Collision tests showed that the implemented detection measures work and that the robot moves into a safe parking position; the forces acting on the patient could thus be kept well below the limits required by the standard. This work has demonstrated the first important steps toward safe robot-assisted ultrasound imaging, which is applicable beyond USgRT. The developed interfaces provide the basis for further investigations in this field, especially in the area of image recognition, for example to determine the position of the target structure. With the safety of the developed system demonstrated, first studies in humans can now follow.
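    As background to how such a compensation scheme can work, the sketch below shows a bare proportional force-regulation loop that keeps a robot-held probe pressed against a moving surface. It is a minimal illustration only: the interface names (read_contact_force, move_along_normal, retract_to_safe_pose) and all gains are assumed placeholders, not the thesis's actual BaMC controller.

```python
# Minimal sketch of a contact-force regulation loop for a robot-held US probe,
# in the spirit of the breathing and motion compensation (BaMC) described above.
# All robot methods and numeric values are hypothetical placeholders.

import time

TARGET_FORCE_N = 5.0      # desired probe contact force (assumed value)
GAIN_M_PER_N = 0.0005     # proportional gain: metres of correction per newton of error
FORCE_LIMIT_N = 20.0      # hard safety limit; exceeding it aborts the loop

def regulate_contact(robot, cycles=1000, dt=0.01):
    """Keep the probe pressed against a moving surface with a roughly constant force."""
    for _ in range(cycles):
        f = robot.read_contact_force()          # force along the probe axis [N]
        if f > FORCE_LIMIT_N:
            robot.retract_to_safe_pose()        # safety reaction on excessive force
            break
        error = TARGET_FORCE_N - f
        # Push in (positive) when contact is too light, back off when too firm.
        robot.move_along_normal(GAIN_M_PER_N * error)
        time.sleep(dt)
```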

    InterNAV3D: A Navigation Tool for Robot-Assisted Needle-Based Intervention for the Lung

    Lung cancer is one of the leading causes of cancer deaths in North America. Recent advances in cancer treatment can treat cancerous tumors but require a real-time imaging modality to provide intraoperative assistive feedback. Ultrasound (US) imaging is one such modality. Its application to the lungs has been limited by the deterioration of US image quality due to the presence of air in the lungs; however, recent work has shown that appropriate lung deflation can improve image quality sufficiently to enable intraoperative, US-guided, robotics-assisted techniques. The work described in this thesis focuses on this approach. The thesis describes a project undertaken at Canadian Surgical Technologies and Advanced Robotics (CSTAR) that utilizes image processing techniques to further enhance US images and implements an advanced 3D virtual visualization software approach. The application considered is minimally invasive lung cancer treatment using procedures such as brachytherapy and microwave ablation, taking advantage of the accuracy and teleoperation capabilities of surgical robots to gain higher dexterity and more precise control over the therapy tools (needles and probes). A number of modules and widgets are developed and explained that improve the visibility of the physical features of interest in the treatment and help the clinician achieve more reliable and accurate control of the treatment. Finally, the developed tools are validated through extensive experimental evaluation, and future developments are suggested to broaden the scope of the applications.
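    The abstract does not detail the enhancement pipeline; purely as a generic illustration of US image enhancement, the sketch below applies a median filter, a common baseline for speckle reduction in B-mode frames, followed by a simple contrast stretch.

```python
# A generic speckle-reduction baseline for B-mode ultrasound frames.
# Shown only as a minimal example of US image enhancement; it is not
# the specific processing developed in the thesis.

import numpy as np
from scipy.ndimage import median_filter

def enhance_frame(frame: np.ndarray, kernel: int = 5) -> np.ndarray:
    """Suppress speckle noise in a grayscale US frame while keeping edges."""
    smoothed = median_filter(frame, size=kernel)
    # Stretch contrast to the full 8-bit range for display.
    lo, hi = smoothed.min(), smoothed.max()
    return ((smoothed - lo) / max(hi - lo, 1) * 255).astype(np.uint8)
```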

    Intraoperative Navigation Systems for Image-Guided Surgery

    Recent technological advancements in medical imaging equipment have resulted in a dramatic improvement in image accuracy; imaging is now capable of providing useful information previously unavailable to clinicians. In the surgical context, intraoperative imaging is of crucial value for the success of the operation. Many nontrivial scientific and technical problems need to be addressed in order to efficiently exploit the different information sources available in advanced operating rooms today. In particular, it is necessary to provide: (i) accurate tracking of surgical instruments, (ii) real-time matching of images from different modalities, and (iii) reliable guidance toward the surgical target. Satisfying all of these requisites is needed to realize effective intraoperative navigation systems for image-guided surgery. Various solutions have been proposed and successfully tested in the field of image navigation systems in the last ten years; nevertheless, several problems still arise in most applications regarding the precision, usability, and capabilities of existing systems. Identifying and solving these issues represents an urgent scientific challenge. This thesis investigates the current state of the art in intraoperative navigation systems, focusing in particular on the challenges related to efficient and effective use of ultrasound imaging during surgery. The main contributions of this thesis to the state of the art are: (i) techniques for automatic motion compensation and therapy monitoring applied to a novel ultrasound-guided surgical robotic platform in the context of abdominal tumor thermoablation; and (ii) novel image-fusion-based navigation systems for ultrasound-guided neurosurgery in the context of brain tumor resection, highlighting their applicability as off-line surgical training instruments. The proposed systems, designed and developed in the framework of two international research projects, have been tested in real or simulated surgical scenarios, showing promising results toward their application in clinical practice.
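    A standard building block behind requisites (i) and (ii) is rigid point-set registration. The sketch below shows the classic SVD-based (Arun/Kabsch) closed-form solution, given purely as generic background rather than as the specific method used in this thesis.

```python
# Minimal rigid point-set registration (Arun/Kabsch SVD method): find the
# rotation R and translation t aligning corresponding 3D points, as used in
# instrument tracking and multimodal image alignment. Generic background only.

import numpy as np

def rigid_register(src: np.ndarray, dst: np.ndarray):
    """Given Nx3 corresponding points, minimize ||R @ src_i + t - dst_i||."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so R is a proper rotation (det = +1).
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t
```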

    Patient-specific simulation environment for surgical planning and preoperative rehearsal

    Surgical simulation is common practice in the fields of surgical education and training. Numerous surgical simulators are available from commercial and academic organisations for the generic modelling of surgical tasks. However, a simulation platform is yet to be found that fulfils the key requirements expected of patient-specific surgical simulation of soft tissue, with an effective translation into clinical practice. Patient-specific modelling is possible, but to date has been time-consuming, and consequently costly, because data preparation can be technically demanding. This motivated the research developed herein, which addresses the main challenges of biomechanical modelling for patient-specific surgical simulation. A novel implementation of soft tissue deformation and estimation of the patient-specific intraoperative environment is achieved using a position-based dynamics approach. This modelling approach overcomes the limitations of traditional physically-based approaches by providing a patient-specific simulation with visual and physical accuracy, stability, and real-time interaction. As the method is geometrically based, a calibration of the simulation parameters is performed, and the simulation framework is successfully validated through experimental studies. The capabilities of the simulation platform are demonstrated by the integration of different surgical planning applications relevant to kidney cancer surgery. The simulation of pneumoperitoneum facilitates trocar placement planning and intraoperative surgical navigation. The implementation of deformable ultrasound simulation can assist surgeons in improving their scanning technique and defining an optimal procedural strategy. Furthermore, the simulation framework has the potential to support the development and assessment of hypotheses that cannot be tested in vivo. Specifically, the evaluation of feedback modalities, as a response to user-model interaction, demonstrates improved performance and justifies the need to integrate a feedback framework into the robot-assisted surgical setting.
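    For readers unfamiliar with the approach, the sketch below shows the core of a position-based dynamics step with distance constraints solved Gauss-Seidel style; the structure and all parameter values are illustrative and not taken from the thesis.

```python
# Bare-bones position-based dynamics (PBD) step with distance constraints,
# the general technique underlying the soft-tissue deformation above.
# Equal particle masses are assumed for simplicity.

import numpy as np

def pbd_step(pos, vel, edges, rest_len, dt=0.016, iters=10, stiffness=0.9):
    """Advance Nx3 particle positions one frame under distance constraints."""
    pred = pos + vel * dt                        # unconstrained prediction
    for _ in range(iters):                       # Gauss-Seidel constraint solve
        for (i, j), L in zip(edges, rest_len):
            d = pred[j] - pred[i]
            dist = np.linalg.norm(d)
            if dist < 1e-9:
                continue                         # degenerate edge, skip
            corr = stiffness * 0.5 * (dist - L) * d / dist
            pred[i] += corr                      # pull endpoints toward rest length
            pred[j] -= corr
    vel = (pred - pos) / dt                      # derive velocities from positions
    return pred, vel
```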

    Computer- and robot-assisted Medical Intervention

    Medical robotics includes assistive devices used by the physician to make his or her diagnostic or therapeutic practice easier and more efficient. This chapter focuses on such systems. It introduces the general field of Computer-Assisted Medical Interventions, its aims and its different components, and describes the place of robots in that context. The evolution of general design and control paradigms in the development of medical robots is presented, and issues specific to this application domain are discussed. A view of existing systems, ongoing developments, and future trends is given, and a case study is detailed. Other types of robotic help in the medical environment (such as assisting a handicapped person, rehabilitating a patient, or replacing damaged or lost limbs or organs) are outside the scope of this chapter. (Comment: Handbook of Automation, Shimon Nof (Ed.), 2009, 000-00)

    Image guided robotic assistance for the diagnosis and treatment of tumor

    The aim of this thesis is to demonstrate the feasibility and potential of introducing robotics and image guidance into the overall oncologic workflow, from diagnosis to treatment. The popularity of robotics in the operating room has grown in recent years. Currently the most popular system is the da Vinci telemanipulator (Intuitive Surgical), a master-slave system for minimally invasive surgery used in several surgical fields such as urology, general surgery, gynecology, and cardiothoracic surgery. An accurate study of this system from a technological point of view has been conducted, addressing its drawbacks and advantages. The da Vinci system creates an immersive operating environment for the surgeon by providing both high-quality stereo visualization and a human-machine interface that directly connects the surgeon's hands to the motion of the surgical tool tips inside the patient's body. It has undoubted advantages for the surgeon's work and the patient's health, at least for some interventions, while its very high cost leaves many doubts about its cost-benefit ratio. In the robotic surgery field many researchers are working on optimizing and miniaturizing robot mechanics, while others are trying to realize robotic systems with smart functionalities that, "knowing" the patient's anatomy from radiological images, can assist the surgeon in an active way. Regarding the second point, image-guided systems can be useful to plan and control medical robot motion and to provide the surgeon with pre-operative and intra-operative images with augmented reality visualization, enhancing his or her perceptual capacities and, as a consequence, improving the quality of treatment. To demonstrate this thesis, several prototypes have been designed, implemented, and tested. The development of image-guided medical devices comprising augmented reality, virtual navigation, and robotic surgical features requires several problems to be addressed. The first are the choice of the robotic platform and of the image source to employ. An industrial anthropomorphic arm has been used as the testing platform; the idea of integrating industrial robot components into the clinical workflow is supported by the da Vinci technical analysis. The algorithms and methods developed, particularly for robot calibration, are based on established theory and on easy integration into the clinical scenario, and can be adapted to any anthropomorphic arm. In this way this work can be integrated with lightweight robots, for industrial or clinical use, able to work in close contact with humans, which will become numerous in the near future. Regarding the medical image source, it was decided to work with ultrasound imaging. Two-dimensional ultrasound imaging is widely used in clinical practice because it is not dangerous for the patient, inexpensive, compact, and highly flexible, allowing users to study many anatomic structures. It is routinely used for diagnosis and as guidance in percutaneous treatments. However, 2D ultrasound imaging has some disadvantages that demand great skill from the user: the clinician must mentally integrate many images to reconstruct a complete idea of the anatomy in 3D, and the freehand control of the probe makes it difficult to identify anatomic positions and orientations and to reposition the probe to reach a particular location.
    To overcome these problems, an image-guided system was developed that fuses real-time 2D US images with routine CT or MRI 3D images, previously acquired from the patient, to enhance clinician orientation and probe guidance. The implemented algorithms for robot calibration and US image guidance have been used to realize two applications responding to specific clinical needs: the first to speed up the execution of routine and frequently recurring procedures such as percutaneous biopsy or ablation, the second to improve a completely non-invasive type of treatment for solid tumors, HIFU (High Intensity Focused Ultrasound). An ultrasound-guided robotic system has been developed to assist the clinician in executing complicated biopsies, or percutaneous ablations, particularly in deep abdominal organs. The integrated system provides the clinician with two types of assistance: a mixed-reality visualization allows accurate and easy planning of the needle trajectory and verification of target reaching, while the robot arm, equipped with a six-degree-of-freedom force sensor, allows precise positioning of the needle holder and lets the clinician adjust the planned trajectory by means of cooperative control to overcome needle deflection and target motion. The second application consists of an augmented reality navigation system for HIFU treatment. HIFU is a completely non-invasive method for treating solid tumors, hemostasis, and other vascular features in human tissues. The technology for HIFU treatment is still evolving, and the systems available on the market have limitations and drawbacks. A disadvantage arising from our experience with the machinery available in our hospital (JC200 therapeutic system Haifu (HIFU) by Tech Co., Ltd, Chongqing), which is similar to other analogous machines, is the long time required to perform the procedure, owing to the difficulty of finding the target using the remote motion of an ultrasound probe under the patient. This problem has been addressed by developing an augmented reality navigation system that enhances US guidance during HIFU treatments and allows easy target localization. The system was implemented using an additional freehand ultrasound probe coupled with a localizer and CT-fused imaging, offering a simple and economical solution for easy HIFU target localization. This thesis demonstrates the utility and usability of robots for the diagnosis and treatment of tumors; in particular, the combination of automatic positioning and cooperative control allows the surgeon and the robot to work in synergy. Furthermore, the work demonstrates the feasibility and potential of a mixed-reality navigation system to facilitate target localization and consequently reduce session times, increase the number of possible diagnoses/treatments, and decrease the risk of potential errors. The proposed solutions for integrating robotics and image guidance into the overall oncologic workflow take into account currently available technologies, traditional clinical procedures, and cost minimization.
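    The cooperative control mentioned above is commonly realized as admittance control, where measured hand forces are mapped to commanded tool velocities. The sketch below illustrates that general idea with assumed gains and a hypothetical interface; it is not the system's actual controller.

```python
# Minimal admittance (cooperative) control sketch: the robot yields to forces
# the clinician applies at the tool, allowing hand-guided fine-tuning of a
# planned needle trajectory. All names and numeric values are assumptions.

import numpy as np

ADMITTANCE = 0.002      # commanded m/s per newton of applied force (assumed tuning)
DEADBAND_N = 1.0        # ignore forces below this to reject sensor noise
V_MAX = 0.02            # velocity clamp for safety [m/s]

def cooperative_velocity(hand_force: np.ndarray) -> np.ndarray:
    """Map the measured hand force (3-vector, N) to a commanded tool velocity."""
    if np.linalg.norm(hand_force) < DEADBAND_N:
        return np.zeros(3)                       # hold position when untouched
    v = ADMITTANCE * hand_force
    scale = min(1.0, V_MAX / np.linalg.norm(v))  # clamp speed, keep direction
    return v * scale
```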

    Modular framework for a breast biopsy smart navigation system

    Master's dissertation in Informatics Engineering. Breast cancer is currently one of the most commonly diagnosed cancers and the fifth leading cause of cancer-related deaths. Its treatment has a higher survival rate when the disease is diagnosed in its early stages. The screening procedure uses medical imaging techniques, such as mammography or ultrasound, to discover possible lesions. When a physician finds a lesion that is likely to be malignant, a biopsy is performed to obtain a sample and determine its characteristics. Currently, real-time ultrasound is the preferred medical imaging modality for this procedure. The breast biopsy procedure is highly reliant on the operator's skill and experience, owing to the difficulty of interpreting ultrasound images and correctly aiming the needle. Robotic solutions, and the use of automatic lesion segmentation in ultrasound imaging along with advanced visualization techniques such as augmented reality, can potentially make this process simpler, safer, and faster. The OncoNavigator project, of which this dissertation is part, aims to improve the precision of current breast cancer interventions. To accomplish this objective, various medical training and robotic biopsy aids were developed. An augmented reality ultrasound training solution was created, and the device's tracking capabilities were validated by comparison with an electromagnetic tracking device. Another solution, for ultrasound-guided breast biopsy assisted with augmented reality, was developed; it displays real-time ultrasound video, automatic lesion segmentation, and the biopsy needle trajectory in the user's field of view, and was validated by comparing its usability with the traditional procedure. A modular software framework was also developed, focusing on the integration of a collaborative medical robot with real-time ultrasound imaging and automatic lesion segmentation. Overall, the developed solutions offered good results: the tracking of the augmented reality glasses proved as capable as the electromagnetic system, and the augmented-reality-assisted breast biopsy made the procedure more accurate and precise than the traditional approach.
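    As an illustration of what such a modular integration can look like, the sketch below wires a frame source and a lesion segmenter together behind small interfaces and reports the lesion centroid per frame. All class and method names are hypothetical, not the framework's actual API.

```python
# Sketch of a modular pipeline of the kind the dissertation describes:
# independent stages (image source, lesion segmentation) behind minimal
# interfaces, so each module can be swapped or tested in isolation.

from typing import Protocol
import numpy as np

class FrameSource(Protocol):
    def next_frame(self) -> np.ndarray: ...               # grayscale US frame

class Segmenter(Protocol):
    def segment(self, frame: np.ndarray) -> np.ndarray: ...  # binary lesion mask

class Pipeline:
    """Pull a frame, segment it, and return the lesion centroid in pixels."""
    def __init__(self, source: FrameSource, segmenter: Segmenter):
        self.source, self.segmenter = source, segmenter

    def tick(self):
        frame = self.source.next_frame()
        mask = self.segmenter.segment(frame)
        ys, xs = np.nonzero(mask)
        if len(xs) == 0:
            return None                                   # no lesion this frame
        return float(xs.mean()), float(ys.mean())         # (x, y) centroid
```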