2,404 research outputs found

    Virtual reality training and assessment in laparoscopic rectum surgery

    Background: Virtual-reality (VR) based simulation techniques offer an efficient and low-cost alternative to conventional surgical training. This article describes a VR training and assessment system for laparoscopic rectum surgery. Methods: To render the interaction between membrane tissue and surgical tools realistically, a generalized-cylinder-based collision detection scheme and a multi-layer mass-spring model are presented. A dynamic assessment model is also designed for hierarchical training evaluation. Results: With this simulator, trainees can operate on the virtual rectum with simultaneous visual and haptic feedback. The system also gives surgeons real-time instructions when improper manipulation occurs. The simulator has been tested and evaluated by ten subjects. Conclusions: This prototype system has been verified by colorectal surgeons through a pilot study. They consider the visual performance and the tactile feedback realistic. The system shows potential to effectively improve the surgical skills of trainee surgeons and significantly shorten their learning curve. © 2014 John Wiley & Sons, Ltd
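
    The methods combine generalized-cylinder collision detection with a multi-layer mass-spring model for the membrane tissue. Purely as a rough illustration of the mass-spring idea (not the authors' implementation), the sketch below performs one explicit-Euler update of spring-coupled point masses; the stiffness, damping and time-step values are assumptions chosen for clarity.

```python
# Minimal mass-spring tissue sketch (illustrative; not the paper's model).
import numpy as np

def mass_spring_step(pos, vel, springs, rest_len, k, mass, dt, damping=0.5):
    """One explicit-Euler step for point masses linked by springs.

    pos, vel : (N, 3) node positions and velocities
    springs  : (M, 2) integer index pairs of connected nodes
    rest_len : (M,) spring rest lengths; k, mass, dt, damping are scalars
    """
    forces = np.zeros_like(pos)
    d = pos[springs[:, 1]] - pos[springs[:, 0]]            # spring vectors
    length = np.linalg.norm(d, axis=1, keepdims=True)
    direction = d / np.maximum(length, 1e-9)
    f = k * (length - rest_len[:, None]) * direction       # Hooke's law
    np.add.at(forces, springs[:, 0], f)                    # equal and opposite
    np.add.at(forces, springs[:, 1], -f)
    forces -= damping * vel                                 # viscous damping
    vel = vel + dt * forces / mass
    pos = pos + dt * vel
    return pos, vel
```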

    Haptics in Robot-Assisted Surgery: Challenges and Benefits

    Robotic surgery is transforming current surgical practice, not only by improving conventional surgical methods but also by introducing innovative robot-enhanced approaches that broaden the capabilities of clinicians. Being mainly of a man-machine collaborative type, surgical robots are seen as media that transfer pre- and intra-operative information to the operator and reproduce his or her motion, with appropriate filtering, scaling, or limitation, to physically interact with the patient. The field, however, is far from maturity and, more critically, is still a subject of controversy in medical communities. Limited or absent haptic feedback is reputed to be among the reasons that impede the further spread of surgical robots. In this paper, the objectives and challenges of deploying haptic technologies in surgical robotics are discussed, and a systematic review is performed of works that have studied the effects of providing haptic information to users in the major branches of robotic surgery. We have tried to encompass both classical works and state-of-the-art approaches, aiming to deliver a comprehensive and balanced survey both for researchers starting their work in this field and for experts
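
    A recurring theme in this survey is how sensed tool-tissue interaction can be displayed back to the operator's hand controller. Purely as an illustration of the basic idea (not any specific system reviewed here), the sketch below renders a spring-damper contact force against a locally planar tissue surface; the stiffness and damping gains are arbitrary placeholder values.

```python
# Illustrative spring-damper haptic force rendering (placeholder gains).
import numpy as np

def contact_force(tool_pos, tool_vel, surface_point, surface_normal,
                  stiffness=800.0, damping=5.0):
    """Force to display on the haptic device when the tool penetrates a
    locally planar surface; zero when the tool is outside the tissue."""
    penetration = np.dot(surface_point - tool_pos, surface_normal)
    if penetration <= 0.0:                        # no contact
        return np.zeros(3)
    normal_vel = np.dot(tool_vel, surface_normal)
    magnitude = stiffness * penetration - damping * normal_vel
    return max(magnitude, 0.0) * surface_normal   # push back along the normal
```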

    A Virtual-Based Haptic Endoscopic Sinus Surgery (ESS) Training System: from Development to Validation

    Simulated training platforms offer a suitable avenue for surgical students and professionals to build and improve their skills without the drawbacks of traditional training methods. To enhance the realism of the interaction paradigms of training simulators, considerable work has been done both on modelling simulated anatomy more realistically and on providing appropriate haptic feedback to the trainee. This chapter discusses ongoing research on haptic-feedback-enabled simulators specifically for Endoscopic Sinus Surgery (ESS). It offers a brief comparative analysis of existing ESS simulators, along with a deeper quantitative and qualitative look into our approach to designing and prototyping a complete virtual-based haptic ESS training platform

    Surgeon Training in Telerobotic Surgery via a Hardware-in-the-Loop Simulator

    Da Vinci robot at Hospital Clinic. Manoeuvrability devices and performance in robotic tech

    Bachelor's Final Thesis in Biomedical Engineering, Facultat de Medicina i Ciències de la Salut, Universitat de Barcelona. Academic year: 2020-2021. Tutor: Manel Puig Vidal. Robot-assisted surgical systems are becoming increasingly common in medical procedures, as they bring many of the benefits of minimally invasive surgery, including less trauma, shorter recovery time and lower post-operative treatment costs. These robotic systems allow surgeons to navigate within confined spaces where an operator's hand would normally be greatly limited. This dexterity is further strengthened through motion scaling, which translates large motions by the operator into diminutive actions of the robotic end effector. An example of this is the Da Vinci System, which is coupled to the EndoWrist end-effector tool. Nevertheless, these systems also have drawbacks, such as the high cost of the surgery itself and the lack of tactile or haptic feedback. Because the surgeon performs the procedure from outside the patient's body, they cannot feel the resistance of the tissue when cutting. One therefore risks damaging healthy tissue if force is not controlled or, when suturing, exerting excessive force and breaking the thread. In this project, a new system is created based on the UR5 robot (Universal Robots) and an EndoWrist needle to mimic the behaviour of the Da Vinci System and implement some improvements regarding manoeuvrability and haptic feedback performance
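
    Motion scaling, mentioned above, simply maps a large master (operator) displacement to a small slave (tool) displacement. A minimal sketch of that mapping follows; the scale factor and the per-cycle safety clamp are illustrative values, not parameters of the Da Vinci or of the UR5 setup described here.

```python
# Illustrative master-to-slave motion scaling with a per-cycle clamp.
import numpy as np

def scale_motion(master_delta_m, scale=0.2, max_step_m=0.002):
    """Map an operator hand displacement (metres) to a scaled tool step.

    scale      : 0.2 corresponds to a 5:1 motion reduction
    max_step_m : crude per-cycle limit on the commanded tool displacement
    """
    step = scale * np.asarray(master_delta_m, dtype=float)
    norm = np.linalg.norm(step)
    if norm > max_step_m:
        step *= max_step_m / norm                 # clamp, preserving direction
    return step

# Example: a 1 cm hand motion becomes (at most) a 2 mm tool motion.
print(scale_motion([0.01, 0.0, 0.0]))
```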

    Virtual Reality Based Control System for a Robotic Arm (Virtuaalitodellisuuteen perustuva robottikäsivarren ohjausjärjestelmä)

    Virtual reality (VR) is a rising technology for creating previously unseen human-machine interfaces. At the same time, general-purpose robotic arms are becoming more common in various use cases. The goal of this thesis is to develop a prototype system that enables controlling a robotic arm remotely using VR technology. To limit the scope, the system is intended for future research in remote medical applications such as diagnosis and surgery. Furthermore, the system will be used especially for researching safety, security and network aspects. The developed prototype system comprises a Universal Robots UR3 robotic arm and an HTC Vive VR system. To study the functionality of the prototype, two use cases are described and tested. In the first use case, the robotic arm is used to remotely pick up tools and move them on an operation table. In the second use case, the robotic arm is used for remote medical observations and diagnosis with a video feedback link. The conclusion is that a VR-controlled robotic system has clear potential in medical applications. Several possibilities are studied in this thesis, but the system would need further improvement to reach the level of safe usability required for such critical applications. Moreover, this kind of control system appears to offer unexpected possibilities beyond remote control, such as continuous user identification and the training of artificial-intelligence robotic systems, but further research is required. The system is a holistic combination of mechanical engineering, mechatronics and computer science, which is a source of complexity, as the tools, processes and software used in the different engineering areas do not always fit together well
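
    At its core, the prototype maps HTC Vive controller poses to motion targets for the UR3. The sketch below outlines such a mapping loop; read_controller_pose and send_robot_target are hypothetical placeholders standing in for the actual tracking and robot-control drivers, and the scaling and update rate are illustrative.

```python
# Illustrative VR-to-robot teleoperation loop. The I/O functions are
# hypothetical placeholders for the Vive tracking and UR3 control drivers.
import time
import numpy as np

def read_controller_pose():
    """Placeholder: return (position[3], quaternion[4]) of the VR controller."""
    return np.zeros(3), np.array([0.0, 0.0, 0.0, 1.0])

def send_robot_target(position, orientation):
    """Placeholder: command the robot tool frame toward the given pose."""
    pass

def teleop_loop(scale=0.5, rate_hz=125, steps=1000):
    ref_pos, _ = read_controller_pose()           # reference pose at start
    period = 1.0 / rate_hz
    for _ in range(steps):
        pos, quat = read_controller_pose()
        target = scale * (pos - ref_pos)          # relative, scaled position
        send_robot_target(target, quat)           # orientation passed through
        time.sleep(period)

teleop_loop(steps=10)
```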

    Medical image computing and computer-aided medical interventions applied to soft tissues. Work in progress in urology

    Until recently, Computer-Aided Medical Interventions (CAMI) and medical robotics have focused on rigid, non-deformable anatomical structures. Nowadays, special attention is paid to soft tissues, which raise complex issues due to their mobility and deformation. Minimally invasive digestive surgery was probably one of the first fields where soft tissues were handled, through the development of simulators, tracking of anatomical structures and dedicated assistance robots. However, other clinical domains, for instance urology, are also concerned. Indeed, laparoscopic surgery, new tumour destruction techniques (e.g. HIFU, radiofrequency or cryoablation), increasingly early detection of cancer, and the use of interventional and diagnostic imaging modalities have recently opened new challenges for urologists and the scientists involved in CAMI. This has resulted, over the last five years, in a very significant increase in research and development of computer-aided urology systems. In this paper, we describe the main problems related to computer-aided diagnosis and therapy of soft tissues and survey the different types of assistance offered to the urologist: robotization, image fusion and surgical navigation. Both research projects and operational industrial systems are discussed
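
    Image fusion and surgical navigation both rest on registering one coordinate frame to another. As a generic illustration of the simplest case (not a system from this survey), the sketch below performs rigid point-based registration with the SVD (Kabsch) method, assuming that corresponding fiducial points have already been matched.

```python
# Minimal rigid point-based registration (Kabsch/SVD) from matched fiducials.
import numpy as np

def rigid_register(src, dst):
    """Return rotation R and translation t such that R @ src[i] + t ~ dst[i]."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)           # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                      # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t
```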

    Recent Advancements in Augmented Reality for Robotic Applications: A Survey

    Robots are expanding from industrial applications into daily life, in areas such as medical robotics, rehabilitation robotics, social robotics, and mobile/aerial robotic systems. In recent years, augmented reality (AR) has been integrated into many robotic applications, including medical, industrial, human–robot interaction, and collaboration scenarios. In this work, AR for both medical and industrial robot applications is reviewed and summarized. For medical robot applications, we investigated the integration of AR in (1) preoperative and surgical task planning; (2) image-guided robotic surgery; (3) surgical training and simulation; and (4) telesurgery. AR for industrial scenarios is reviewed in (1) human–robot interaction and collaboration; (2) path planning and task allocation; (3) training and simulation; and (4) teleoperation control/assistance. In addition, the limitations and challenges are discussed. Overall, this article serves as a valuable resource for those working in the field of AR and robotics research, offering insights into the recent state of the art and prospects for improvement

    An Open-Source 7-Axis, Robotic Platform to Enable Dexterous Procedures within CT Scanners

    This paper describes the design, manufacture, and performance of a highly dexterous, low-profile, 7-degree-of-freedom (DOF) robotic arm for CT-guided percutaneous needle biopsy. Direct CT guidance allows physicians to localize tumours quickly; however, needle insertion is still performed by hand. This system is mounted to a fully active gantry superior to the patient's head and is teleoperated by a radiologist. Unlike other similar robots, this robot's fully serial-link approach uses a unique combination of belt and cable drives for high transparency and minimal backlash, allowing an expansive working area and numerous approach angles to targets, all while maintaining a small in-bore cross-section of less than 16 cm². Simulations verified the system's expansive collision-free workspace and its ability to hit targets across the entire chest, as required for lung cancer biopsy. Targeting error averages below 1 mm on a teleoperated accuracy task, illustrating that the system is sufficiently accurate to perform biopsy procedures. The system is designed for lung biopsies because of the large working volume required for reaching peripheral lung lesions; however, with its large working volume and small in-bore cross-sectional area, the robotic system is effectively a general-purpose, CT-compatible manipulation device for percutaneous procedures. Finally, given the considerable development time invested in designing a precise and flexible system and the desire to reduce the burden on other researchers developing algorithms for image-guided surgery, this system is provided as open hardware and, to the best of our knowledge, is the first open-hardware image-guided biopsy robot of its kind. Comment: 8 pages, 9 figures, final submission to IROS 201
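
    The reported accuracy figure is a mean Euclidean targeting error over repeated teleoperated trials. For clarity, the sketch below shows how such a metric can be computed from commanded target points and measured tip positions; the sample data are made up for illustration, not taken from the paper.

```python
# Illustrative computation of per-trial and mean targeting error.
import numpy as np

def targeting_error(targets_mm, reached_mm):
    """Euclidean error per trial and its mean, in the units of the inputs."""
    targets = np.asarray(targets_mm, float)
    reached = np.asarray(reached_mm, float)
    err = np.linalg.norm(reached - targets, axis=1)
    return err, err.mean()

# Made-up example data (millimetres):
per_trial, mean_err = targeting_error(
    [[10.0, 5.0, 30.0], [12.0, 4.0, 28.0]],
    [[10.4, 5.3, 29.8], [11.7, 4.2, 28.5]],
)
print(per_trial, mean_err)
```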

    An Asynchronous Simulation Framework for Multi-User Interactive Collaboration: Application to Robot-Assisted Surgery

    The field of surgery is continually evolving, as there is always room to improve both the post-operative health of the patient and the comfort of the Operating Room (OR) team. While the success of surgery is contingent upon the skills of the surgeon and the OR team, the use of specialized robots has been shown to improve surgery-related outcomes in some cases. These outcomes are currently measured using a wide variety of metrics, including patient pain and recovery, the surgeon's comfort, the duration of the operation and the cost of the procedure. Additional research is needed to better understand the optimal criteria for benchmarking surgical performance. Presently, surgeons are trained to perform robot-assisted surgeries using interactive simulators. However, in the absence of well-defined performance standards, these simulators focus primarily on the simulation of the operative scene and not on the complexities associated with the multiple inputs to a real-world surgical procedure. Because interactive simulators are typically designed for specific robots that perform a small number of tasks controlled by a single user, they are inflexible in terms of portability to different robots and the inclusion of multiple operators (e.g., nurses, medical assistants). Additionally, while most simulators provide high-quality visuals, simplification techniques are often employed to avoid stability issues in physics computation, contact dynamics and multi-manual interaction. This study addresses the limitations of existing simulators by outlining the specifications required to develop techniques that mimic real-world interactions and collaboration. Moreover, it focuses on the inclusion of distributed control, shared task allocation and assistive feedback (through machine learning and secondary and tertiary operators) alongside the primary human operator
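
    The central idea of an asynchronous multi-user framework is that several operators interact with one shared scene at their own rates while a physics loop consumes whatever commands have arrived. The sketch below illustrates that pattern with Python's asyncio; all names and the toy state model are placeholders, not the framework's actual API.

```python
# Illustrative asynchronous multi-operator simulation loop (placeholders only).
import asyncio

shared_scene = {"tool_positions": {}}             # state visible to all operators

async def operator(name, queue, period=0.05, steps=10):
    """Each operator pushes tool commands at its own, independent rate."""
    for step in range(steps):
        await queue.put((name, [0.001 * step, 0.0, 0.0]))
        await asyncio.sleep(period)

async def simulation(queue, period=0.01, steps=100):
    """The physics loop drains whichever commands have arrived so far."""
    for _ in range(steps):
        while not queue.empty():
            name, pos = queue.get_nowait()
            shared_scene["tool_positions"][name] = pos
        # ... physics/contact update of the shared scene would go here ...
        await asyncio.sleep(period)

async def main():
    queue = asyncio.Queue()
    await asyncio.gather(
        operator("surgeon", queue),
        operator("assistant", queue, period=0.08),
        simulation(queue),
    )

asyncio.run(main())
```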