
    Optical techniques for 3D surface reconstruction in computer-assisted laparoscopic surgery

    One of the main challenges for computer-assisted surgery (CAS) is to determine the intra-operative morphology and motion of soft tissues. This information is a prerequisite for the registration of multi-modal patient-specific data, for enhancing the surgeon’s navigation capabilities by observing beyond exposed tissue surfaces, and for providing intelligent control of robotic-assisted instruments. In minimally invasive surgery (MIS), optical techniques are an increasingly attractive approach for in vivo 3D reconstruction of the soft-tissue surface geometry. This paper reviews the state-of-the-art methods for optical intra-operative 3D reconstruction in laparoscopic surgery and discusses the technical challenges and future perspectives towards clinical translation. With the recent paradigm shift of surgical practice towards MIS and new developments in 3D optical imaging, this is a timely discussion of technologies that could facilitate complex CAS procedures in dynamic and deformable anatomical regions.
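
    As a rough illustration of the passive-stereo principle that many of the reviewed optical techniques build on, the sketch below converts disparity to depth for a rectified stereo pair (Z = f*B/d). It is not taken from the paper; the focal length, baseline and disparity values are hypothetical.

        # Minimal sketch of disparity-to-depth conversion for a rectified stereo pair.
        # Hypothetical example values; a real stereo laparoscope would be calibrated first.
        import numpy as np

        def disparity_to_depth(disparity_px, focal_px, baseline_mm):
            """Pinhole-stereo relation: depth Z = f * B / d."""
            d = np.asarray(disparity_px, dtype=float)
            with np.errstate(divide="ignore"):
                return np.where(d > 0, focal_px * baseline_mm / d, np.inf)

        # e.g. a ~4 mm baseline and an 800 px focal length
        print(disparity_to_depth([40.0, 80.0, 160.0], focal_px=800.0, baseline_mm=4.0))
        # depths in mm: [80. 40. 20.]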

    Microfabricated tactile sensors for biomedical applications: a review

    During the last decades, tactile sensors based on different sensing principles have been developed due to the growing interest in robotics and, mainly, in medical applications. Several technological solutions have been employed to design tactile sensors; in particular, solutions based on microfabrication present several attractive features. Microfabrication technologies allow for developing miniaturized sensors with good performance in terms of metrological properties (e.g., accuracy, sensitivity, low power consumption, and frequency response). Small size and good metrological properties heighten the potential role of tactile sensors in medicine, making them especially attractive for integration in smart interfaces and microsurgical tools. This paper provides an overview of microfabricated tactile sensors, focusing on the main principles of sensing, i.e., piezoresistive, piezoelectric and capacitive sensors. These sensors are employed for measuring contact properties, in particular force and pressure, in three main medical fields, i.e., prosthetics and artificial skin, minimal access surgery and smart interfaces for biomechanical analysis. The working principles and the metrological properties of the most promising microfabricated tactile sensors are analyzed, together with their applications in medicine. Finally, the emerging technologies in these fields are briefly described.
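
    To illustrate the capacitive sensing principle mentioned above, the sketch below evaluates the parallel-plate relation C = eps0 * eps_r * A / d for a tactile cell whose gap shrinks under load. It is a back-of-the-envelope example, not from the paper; the cell geometry and deflection are made-up values.

        # Minimal sketch of the parallel-plate relation behind a capacitive tactile cell.
        # Cell area, gap and load-induced deflection are hypothetical example values.
        EPS0 = 8.854e-12  # vacuum permittivity, F/m

        def plate_capacitance(area_m2, gap_m, eps_r=1.0):
            """Parallel-plate relation C = eps0 * eps_r * A / d."""
            return EPS0 * eps_r * area_m2 / gap_m

        c_rest = plate_capacitance(area_m2=1e-6, gap_m=10e-6)   # 1 mm x 1 mm cell, 10 um gap
        c_loaded = plate_capacitance(area_m2=1e-6, gap_m=9e-6)  # gap compressed by 1 um under load
        print(f"rest: {c_rest * 1e15:.0f} fF, loaded: {c_loaded * 1e15:.0f} fF")  # capacitance rises with load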

    Evaluation of Wearable Sensing in Mixed Reality for Mobile Teleoperation

    Teleoperation platforms often require the user to be situated at a fixed location to both visualize and control the movement of the robot and thus do not provide the operator with much mobility. One example of such systems is existing robotic surgery solutions that require the surgeons to be away from the patient, attached to consoles where their heads must be fixed and their arms can only move in a limited space. This creates a barrier between physicians and patients that does not exist in conventional surgery. To address this issue, we propose a mobile telesurgery solution in which the surgeons are no longer mechanically limited to control consoles and are able to teleoperate the robots from the patient bedside, using their arms equipped with wireless sensors and viewing the endoscope video via optical see-through HMDs. In this work, we develop and evaluate a mobile telesurgery system based on a Microsoft HoloLens HMD and three Inertial Measurement Units (IMUs) mounted on the user's arm. Two IMUs are strapped to the upper arm and forearm, with the third IMU in a hand-held device. We perform experiments to compare the proposed system to a conventional telesurgery platform based on the master console of a da Vinci surgical system.
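
    One simple way such an IMU-instrumented arm could be turned into a hand pose is to chain each segment's direction from its IMU orientation. The sketch below shows that idea under assumptions not stated in the abstract: made-up segment lengths and IMU orientations already expressed in a common reference frame; it is not the authors' implementation.

        # Minimal sketch: chain three IMU orientations into a hand position.
        import numpy as np
        from scipy.spatial.transform import Rotation as R

        UPPER_ARM_M, FOREARM_M, HAND_M = 0.30, 0.26, 0.10  # made-up segment lengths (m)

        def hand_position(q_upper, q_fore, q_hand, shoulder=(0.0, 0.0, 0.0)):
            """Sum each segment's rest vector (along +x) rotated by its IMU orientation (xyzw quaternions)."""
            p = np.asarray(shoulder, dtype=float).copy()
            for quat, length in ((q_upper, UPPER_ARM_M), (q_fore, FOREARM_M), (q_hand, HAND_M)):
                p += R.from_quat(quat).apply([length, 0.0, 0.0])
            return p

        q_up = R.identity().as_quat()                          # upper arm pointing along +x
        q_fo = R.from_euler("y", -90, degrees=True).as_quat()  # forearm and hand pointing up (+z)
        print(hand_position(q_up, q_fo, q_fo))                 # -> [0.3  0.   0.36]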

    Publications and Presentations 2009 by Members of the Faculty of Informatics


    Directional Estimation for Robotic Beating Heart Surgery

    In robotic beating heart surgery, a remote-controlled robot can be used to carry out the operation while automatically canceling out the heart motion. The surgeon controlling the robot is shown a stabilized view of the heart. First, we consider the use of directional statistics for estimation of the phase of the heartbeat. Second, we deal with reconstruction of a moving and deformable surface. Third, we address the question of obtaining a stabilized image of the heart.
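
    To illustrate why directional statistics matter for phase estimation, the sketch below compares a naive average of phase angles with the circular mean, which handles the 0 / 2*pi wrap-around correctly. It is an illustrative example, not the authors' estimator, and the phase samples are made up.

        # Minimal sketch: circular mean of phase angles vs. a naive arithmetic mean.
        import numpy as np

        def circular_mean(phases_rad):
            """Mean direction of angles, computed on the unit circle."""
            return np.angle(np.mean(np.exp(1j * np.asarray(phases_rad))))

        phases = np.array([6.20, 6.25, 0.05, 0.10])   # samples straddling the 0 / 2*pi wrap
        print(np.mean(phases))                        # naive mean ~3.15 rad, half a cycle off
        print(circular_mean(phases) % (2 * np.pi))    # circular mean ~0.01 rad, near the true phase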

    Multimodal Noncontact Tracking of Surgical Instruments

    For many procedures, open surgery is being replaced with minimally invasive surgical (MIS) techniques. The advantages of MIS include reduced operative trauma and fewer complications leading to faster patient recovery, better cosmetic results and shorter hospital stays. As the demand for MIS procedures increases, effective surgical training tools must be developed to improve procedure efficiency and patient safety. Motion tracking of laparoscopic instruments can provide objective skills assessment for novices and experienced users. The most common approaches to noncontact motion capture are optical and electromagnetic (EM) tracking systems, though each approach has operational limitations. Optical trackers are prone to occlusion and the performance of EM trackers degrades in the presence of magnetic and ferromagnetic material. The cost of these systems also limits their availability for surgical training and clinical environments. This thesis describes the development and validation of a novel, noncontact laparoscopic tracking system as an inexpensive alternative to current technology. This system is based on the fusion of inertial, magnetic and distance sensing to generate real-time, 6-DOF pose data. Orientation is estimated using a Kalman-filtered attitude-heading reference system (AHRS) and restricted motion at the trocar provides a datum from which position information can be recovered. The Inertial and Range-Enhanced Surgical (IRES) Tracker was prototyped, then validated using an MIS training box and by comparison to an EM tracking system. Results of IRES tracker testing showed similar performance to an EM tracker with position error as low as 1.25 mm RMS and orientation error <0.58 degrees RMS along each axis. The IRES tracker also displayed greater precision and superior magnetic interference rejection capabilities. At a fraction of the cost of current laparoscopic tracking methods, the IRES tracking system would provide an excellent alternative for use in surgical training and skills assessment.
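
    A minimal sketch of how a trocar constraint can recover position from orientation plus insertion depth follows. The trocar location, shaft axis, depth and orientation values are hypothetical and the actual IRES filtering is more involved; this only illustrates the geometric datum described above.

        # Minimal sketch: instrument tip from a fixed trocar point, tool orientation and insertion depth.
        import numpy as np
        from scipy.spatial.transform import Rotation as R

        def tip_position(trocar_xyz, q_tool_xyzw, insertion_depth_m, shaft_axis=(0.0, 0.0, -1.0)):
            """Tip = trocar + depth * (tool orientation applied to the shaft axis)."""
            direction = R.from_quat(q_tool_xyzw).apply(shaft_axis)
            return np.asarray(trocar_xyz, dtype=float) + insertion_depth_m * direction

        trocar = [0.10, 0.05, 0.00]                         # trocar port in a table-fixed frame (m)
        q = R.from_euler("x", 20, degrees=True).as_quat()   # shaft tilted 20 degrees about x
        print(tip_position(trocar, q, insertion_depth_m=0.12))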

    On Simultaneous Localization and Mapping inside the Human Body (Body-SLAM)

    Wireless capsule endoscopy (WCE) offers a patient-friendly, non-invasive and painless investigation of the entire small intestine, where other conventional wired endoscopic instruments can barely reach. As a critical component of the capsule endoscopic examination, physicians need to know the precise position of the endoscopic capsule in order to identify the location of intestinal disease after it is detected in the video source. To define the position of the endoscopic capsule, we need a map of the inside of the human body. However, since the shape of the small intestine is extremely complex and the RF signal propagates differently through non-homogeneous body tissues, accurate mapping and localization inside the small intestine are very challenging. In this dissertation, we present an in-body simultaneous localization and mapping technique (Body-SLAM) to enhance the positioning accuracy of the WCE inside the small intestine and reconstruct the trajectory the capsule has traveled. In this way, the positions of intestinal diseases can be accurately located on the map of the inside of the human body, thereby facilitating follow-up therapeutic operations. The proposed approach takes advantage of data fusion from two sources that come with the WCE: image sequences captured by the WCE's embedded camera and the RF signal emitted by the capsule. The approach estimates the speed and orientation of the endoscopic capsule by analyzing displacements of feature points between consecutive images. It then integrates this motion information with the RF measurements by employing a Kalman filter to smooth the localization results and generate the route that the WCE has traveled. The performance of the proposed motion tracking algorithm is validated using empirical data from patients, and this motion model is later imported into a virtual testbed to test the performance of alternative Body-SLAM algorithms. Experimental results show that the proposed Body-SLAM technique is able to provide accurate tracking of the WCE with an average error of less than 2.3 cm.
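
    A minimal 1D sketch of the kind of fusion described above follows: a camera-derived speed drives the prediction step and noisy RF position fixes drive the update step of a Kalman filter. It is not the dissertation's implementation; all noise levels and measurements are invented.

        # Minimal 1D Kalman filter fusing camera-estimated speed with noisy RF position fixes.
        import numpy as np

        def kalman_1d(rf_positions, cam_speeds, dt=1.0, q=0.02, r=2.0):
            x, p = 0.0, 1.0                            # state (position, cm) and its variance
            track = []
            for z, v in zip(rf_positions, cam_speeds):
                x, p = x + v * dt, p + q               # predict: advance by the camera-estimated speed
                k = p / (p + r)                        # Kalman gain
                x, p = x + k * (z - x), (1.0 - k) * p  # update with the RF position reading
                track.append(x)
            return np.array(track)

        rf = [1.2, 2.1, 2.8, 4.3, 5.1]    # noisy RF position fixes (cm along the intestine)
        cam = [1.0, 1.0, 1.1, 1.0, 0.9]   # per-step speeds estimated from image feature motion
        print(kalman_1d(rf, cam))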

    Ameliorating integrated sensor drift and imperfections: an adaptive "neural" approach

