
    Robot Autonomy for Surgery

    Autonomous surgery involves having surgical tasks performed by a robot operating under its own control, with partial or no human involvement. Automation offers several important advantages in surgery, including increased precision of care due to sub-millimeter robot control, real-time use of biosignals for interventional care, improved surgical efficiency and execution, and computer-aided guidance under various medical imaging and sensing modalities. While these methods may displace some tasks of surgical teams and individual surgeons, they also enable interventions that are too difficult for, or beyond the skills of, a human. In this chapter, we provide an overview of robot autonomy in commercial use and in research, and present some of the challenges faced in developing autonomous surgical robots.

    The FreeD - A Handheld Digital Milling Device for Craft and Fabrication

    We present an approach to combining digital fabrication and craft that is focused on a new fabrication experience. The FreeD is a hand-held, digitally controlled milling device. It is guided and monitored by a computer while still preserving gestural freedom. The computer intervenes only when the milling bit approaches the 3D model, which was designed beforehand, either by slowing the spindle's speed or by drawing back the shaft. The rest of the time, the device allows complete freedom, letting the user manipulate and shape the work in any creative way. We believe the FreeD will enable a designer to move between the rigid boundaries of established CAD systems and the free expression of handcraft.
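    The intervention logic described above — free movement until the bit nears the model, then slowing the spindle or retracting the shaft — can be sketched as a per-tick guard function. This is a minimal illustrative sketch, not the FreeD's actual controller; all names, thresholds, and the linear speed ramp are assumptions.

```python
# Hedged sketch of a FreeD-style guard loop: the tool runs freely until
# the bit nears the target surface, then the controller slows the
# spindle and finally retracts the shaft. Zone sizes are illustrative.

FULL_SPEED = 1.0       # normalized spindle speed
SLOW_ZONE = 2.0        # mm: begin slowing inside this distance
RETRACT_ZONE = 0.5     # mm: draw the shaft back inside this distance

def guard(distance_to_model_mm):
    """Return (spindle_speed, retract_shaft) for one control tick."""
    if distance_to_model_mm <= RETRACT_ZONE:
        return 0.0, True                      # protect the model
    if distance_to_model_mm <= SLOW_ZONE:
        # linear ramp from full speed down to zero across the slow zone
        frac = (distance_to_model_mm - RETRACT_ZONE) / (SLOW_ZONE - RETRACT_ZONE)
        return FULL_SPEED * frac, False
    return FULL_SPEED, False                  # free gestural milling
```

    Called once per sensing cycle with the bit's distance to the designed surface, this preserves gestural freedom far from the model and only intervenes near it.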

    Robots and tools for remodeling bone

    The field of robotic surgery has progressed from small teams of researchers repurposing industrial robots, to a competitive and highly innovative subsection of the medical device industry. Surgical robots allow surgeons to perform tasks with greater ease, accuracy, or safety, and fall under one of four levels of autonomy: active, semi-active, passive, and remote manipulator. The increased accuracy afforded by surgical robots has allowed for cementless hip arthroplasty, improved postoperative alignment following knee arthroplasty, and reduced duration of intraoperative fluoroscopy, among other benefits. Cutting of bone has historically used tools such as hand saws and drills, and more elaborate cutting tools are now used routinely to remodel bone. Improvements in cutting accuracy and additional options for safety and monitoring during surgery give robotic surgeries some advantages over conventional techniques. This article aims to provide an overview of current robots and tools with a common target tissue of bone, proposes a new process for defining the level of autonomy for a surgical robot, and examines future directions in robotic surgery.
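    The four autonomy levels named in the abstract can be captured as a small enumeration. This is an illustrative sketch only; the ordering (higher value = more autonomous) and the helper are assumptions for demonstration, not the article's proposed classification process.

```python
from enum import Enum

class AutonomyLevel(Enum):
    """Four levels of surgical-robot autonomy, per the abstract.
    Numeric ordering (higher = more autonomous) is an assumption."""
    REMOTE_MANIPULATOR = 1   # surgeon teleoperates every motion
    PASSIVE = 2              # robot holds or guides; surgeon cuts
    SEMI_ACTIVE = 3          # robot constrains the cut (e.g. a boundary)
    ACTIVE = 4               # robot performs the resection itself

def is_more_autonomous(a, b):
    """Compare two levels under the assumed ordering."""
    return a.value > b.value
```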

    Thermal damage done to bone by burring and sawing with and without irrigation in knee arthroplasty

    Heat from bone-resecting tools used in knee surgery can induce thermal osteonecrosis, potentially causing aseptic implant loosening. This study compared oscillating saws to burrs in terms of temperature generation and histologic damage. Use of irrigation to reduce bone temperature was also investigated. Temperatures were recorded during sawing and burring with or without irrigation (uncooled or cooled). Histologic analyses were then carried out. Differences between groups were tested statistically (α = 0.05). On average, burring produced higher temperatures than sawing (P < .001). When uncooled irrigation was used, bone temperatures were significantly lower in sawed bone than in burred bone (P < .001). Irrigation lowered temperatures and thermal damage depths and increased osteocyte viability (P < .001). These results suggest that irrigating bone during resection could prevent the onset of osteonecrosis.

    Augmented manual fabrication methods for 2D tool positioning and 3D sculpting

    Thesis (Ph.D.), Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2013. Cataloged from PDF version of thesis. Includes bibliographical references (p. 67-75). By Alec Rivers.
    Augmented manual fabrication involves using digital technology to assist a user engaged in a manual fabrication task. Methods in this space aim to combine the abilities of a human operator, such as motion planning and large-range mechanical manipulation, with technological capabilities that compensate for the operator's areas of weakness, such as precise 3D sensing, manipulation of complex shape data, and millimeter-scale actuation. This thesis presents two new augmented manual fabrication methods. The first is a method for helping a sculptor create an object that precisely matches the shape of a digital 3D model. In this approach, a projector-camera pair is used to scan a sculpture in progress, and the resulting scan data is compared to the target 3D model. The system then computes the changes necessary to bring the physical sculpture closer to the target 3D shape, and projects guidance directly onto the sculpture that indicates where and how the sculpture should be changed, such as by adding or removing material. We describe multiple types of guidance that can be used to direct the sculptor, as well as several related applications of this technique. The second method described in this thesis is a means of precisely positioning a handheld tool on a sheet of material using a hybrid digital-manual approach. An operator is responsible for manually moving a frame containing the tool to the approximate neighborhood of the desired position. The device then detects the frame's position and uses digitally controlled actuators to move the tool within the frame to the exact target position. By doing this in a real-time feedback loop, a tool can be smoothly moved along a digitally specified 2D path, allowing many types of digital fabrication over an unlimited range using an inexpensive handheld tool.
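    The hybrid coarse/fine idea — the operator places the frame roughly, and in-frame actuators cover the residual error — can be sketched as one tick of the positioning loop. This is a minimal sketch under stated assumptions; the function name, coordinate convention, and actuator range are illustrative, not the thesis's implementation.

```python
# One tick of an assumed coarse/fine positioning loop: given the sensed
# frame center and the digitally specified target, compute the in-frame
# actuator offset, or signal that the operator must re-aim the frame.

ACTUATOR_RANGE = 10.0  # mm: assumed reach of the tool within the frame

def fine_position(frame_center, target):
    """Return the actuator offset (dx, dy) that puts the tool on
    `target`, or None if the target is outside actuator reach."""
    dx = target[0] - frame_center[0]
    dy = target[1] - frame_center[1]
    if max(abs(dx), abs(dy)) > ACTUATOR_RANGE:
        return None          # out of reach: ask the operator to move closer
    return dx, dy
```

    Run repeatedly as the operator sweeps the frame along a path, the fine stage absorbs hand jitter while the human supplies the unlimited coarse range.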

    Form giving through gestural interaction to shape changing objects

    Thesis (S.M.), Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2012. Cataloged from PDF version of thesis. Includes bibliographical references. By Dávid Lakatos.
    Shape-shifting materials have been part of sci-fi literature for decades. But if tomorrow we invent them, how are we going to communicate to them what shape we want them to morph into? For thousands of years, humans have used the dexterity of their hands as the primary means to alter the topology of their surroundings. While direct manipulation, as a primary method for form giving, allows for high-precision deformation, the scope of interaction is limited to the scale of the hand. To extend the scope of manipulation beyond the hand scale, tools were invented to reach further and to augment the capabilities of our hands. In this thesis, I propose "Amphorm", a perceptually equivalent example of Radical Atoms, our vision of interaction techniques for future, highly malleable, shape-shifting materials. "Amphorm" is a cylindrical kinetic sculpture that resembles a vase. Since "Amphorm" is a dual citizen between the digital and the physical world, its shape can be altered in both worlds. I describe novel interaction techniques for rapid shape deformation both in the physical world, through free-hand gestures, and in the digital world, through a graphical user interface. Additionally, I explore how the physical world could be synchronized with the digital world and how tools from both worlds can jointly alter dual citizens.

    Concept and Design of a Hand-held Mobile Robot System for Craniotomy

    This work demonstrates a highly intuitive robot for surgical craniotomy procedures. Utilising a wheeled hand-held robot to navigate the craniotomy drill over a patient's skull, the system does not remove the surgeon from the procedure, but supports them during this critical phase of the operation.

    Virtual sculpting with advanced gestural interface

    Ankara: Department of Computer Engineering and the Graduate School of Engineering and Science of Bilkent University, 2013. Thesis (Master's), Bilkent University, 2013. Includes bibliographical references (leaves 54-58). By Nurettin Çağrı Kılıboz.
    In this study, we propose a virtual reality application that can be utilized to design preliminary/conceptual models similar to real-world clay sculpting. The proposed system makes use of an innovative gestural interface that enhances the human-computer interaction experience. The gestural interface employs advanced motion capture hardware, namely data gloves and a six-degrees-of-freedom position tracker, instead of classical input devices such as a keyboard or mouse. The design process takes place in a virtual environment that contains a volumetric deformable model, design tools, and a virtual hand that is driven by the data glove and the tracker. The users manipulate the design tools and the deformable model via the virtual hand. Deformation is done by stuffing or carving material (voxels) into or out of the model with the help of the tools or directly with the virtual hand. The virtual sculpting system also includes a volumetric force feedback indicator that provides visual aid. We also offer a mouse-like interaction approach in which the users can still operate conventional graphical user interface items, such as buttons, with the data glove and tracker. The users can also control the application with gestural commands thanks to our real-time, trajectory-based dynamic gesture recognition algorithm. The gesture recognition technique exploits a fast learning mechanism that does not require extensive training data to teach gestures to the system. For recognition, gestures are represented as an ordered sequence of directional movements in 2D. In the learning phase, sample gesture data is filtered and processed to create gesture recognizers, which are essentially finite-state machine sequence recognizers. We achieve real-time gesture recognition with these recognizers without needing to specify gesture start and end points. The results of the conducted user study show that the proposed method is very promising in terms of gesture detection and recognition performance (73% accuracy) in a stream of motion. Additionally, the assessment of the user attitude survey indicates that the gestural interface is useful and satisfactory. One of the novel parts of the proposed approach is that it gives users the freedom to create gesture commands according to their preferences for selected tasks. Thus, the presented gesture recognition approach makes the human-computer interaction process more intuitive and user-specific.
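    The core idea — gestures encoded as ordered sequences of 2D directional movements and matched by finite-state sequence recognizers on a continuous stream — can be sketched as follows. This is an illustrative sketch, not the thesis's algorithm: the four-direction alphabet, the restart rule, and all names are assumptions.

```python
# Illustrative finite-state sequence recognizer for direction-encoded
# gestures. Each incoming directional sample advances the machine;
# reaching the final state fires the gesture, with no explicit
# gesture start/end markers needed in the stream.

class GestureFSM:
    def __init__(self, name, pattern):
        self.name = name
        self.pattern = pattern   # e.g. ["R", "D", "L"] for a hook shape
        self.state = 0           # index of the next expected symbol

    def feed(self, direction):
        """Consume one directional sample ("U"/"D"/"L"/"R"); return the
        gesture name on a complete match, else None."""
        if direction == self.pattern[self.state]:
            self.state += 1
        elif direction == self.pattern[0]:
            self.state = 1       # restart on the pattern's first symbol
        else:
            self.state = 0
        if self.state == len(self.pattern):
            self.state = 0       # reset so detection can repeat
            return self.name
        return None
```

    Feeding the stream U, R, R, D, L fires the "hook" gesture on the final sample, even though the gesture began mid-stream.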

    Accurate 3D reconstruction of bony surfaces using ultrasonic synthetic aperture techniques for robotic knee arthroplasty

    Robotically guided knee arthroplasty systems generally require an individualized, preoperative 3D model of the knee joint. This is typically measured using Computed Tomography (CT), which provides the required accuracy for preoperative surgical intervention planning. Ultrasound imaging presents an attractive alternative to CT, allowing for reductions in cost and the elimination of doses of ionizing radiation, whilst maintaining the accuracy of the 3D model reconstruction of the joint. Traditional phased array ultrasound imaging methods, however, are susceptible to poor resolution and signal-to-noise ratios (SNR). Alleviating these weaknesses by offering superior focusing power, synthetic aperture methods have been investigated extensively within ultrasonic non-destructive testing. Despite this, they have yet to be fully exploited in medical imaging. In this paper, the ability of a robotically deployed ultrasound imaging system based on synthetic aperture methods to accurately reconstruct bony surfaces is investigated. Employing the Total Focussing Method (TFM) and the Synthetic Aperture Focussing Technique (SAFT), two samples were imaged which were representative of the bones of the knee joint: a human-shaped, composite distal femur and a bovine distal femur. Data were captured using a 5 MHz, 128-element 1D phased array, which was manipulated around the samples using a robotic positioning system. Three-dimensional surface reconstructions were then produced and compared with reference models measured using a precision laser scanner. Mean errors of 0.82 mm and 0.88 mm were obtained for the composite and bovine samples, respectively, thus demonstrating the feasibility of the approach to deliver the sub-millimetre accuracy required for the application.
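    The Total Focussing Method named above is, at its core, a delay-and-sum over every transmit-receive element pair of a full-matrix capture: for each image point, each A-scan is sampled at the round-trip time of flight and the samples are summed. The sketch below shows that kernel for one pixel; the array geometry, sampling parameters, and data layout are assumptions for illustration, not the paper's setup.

```python
# Hedged, minimal delay-and-sum sketch of the Total Focusing Method
# (TFM) for a single image point.

import math

def tfm_pixel(fmc, elements, point, c, fs):
    """Focus one image point.

    fmc[tx][rx] is the sampled A-scan for a transmit/receive pair,
    elements is a list of (x, z) element positions in metres, c the
    sound speed (m/s), and fs the sampling rate (Hz)."""
    amplitude = 0.0
    for tx, (xt, zt) in enumerate(elements):
        d_tx = math.dist((xt, zt), point)          # transmit path length
        for rx, (xr, zr) in enumerate(elements):
            d_rx = math.dist((xr, zr), point)      # receive path length
            sample = int(round((d_tx + d_rx) / c * fs))
            trace = fmc[tx][rx]
            if 0 <= sample < len(trace):
                amplitude += trace[sample]         # coherent sum
    return amplitude
```

    Looping this kernel over a grid of points yields the TFM image; echoes from a true scatterer add coherently at its location and average out elsewhere.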

    Augmented reality for computer assisted orthopaedic surgery

    In recent years, computer assistance and robotics have established their presence in operating theatres and found success in orthopaedic procedures. The benefits of computer-assisted orthopaedic surgery (CAOS) have been thoroughly explored in research, which has found improvements in clinical outcomes through increased control and precision over surgical actions. However, human-computer interaction in CAOS remains an evolving field, through emerging display technologies including augmented reality (AR): a fused view of the real environment with virtual, computer-generated holograms. Interactions between clinicians and patient-specific data generated during CAOS are limited to basic 2D interactions on touchscreen monitors, potentially creating clutter and cognitive challenges in surgery. Work described in this thesis sought to explore the benefits of AR in CAOS through: an integration between commercially available AR and CAOS systems, creating a novel AR-centric surgical workflow to support various tasks of computer-assisted knee arthroplasty, and three pre-clinical studies exploring the impact of the new AR workflow on both existing and newly proposed quantitative and qualitative performance metrics. Early research focused on cloning the (2D) user interface of an existing CAOS system onto a virtual AR screen and investigating any resulting impacts on usability and performance. An infrared-based registration system is also presented, describing a protocol for calibrating commercial AR headsets with optical trackers by calculating a spatial transformation between surgical and holographic coordinate frames. The main contribution of this thesis is a novel AR workflow designed to support computer-assisted patellofemoral arthroplasty. The reported workflow provided 3D in-situ holographic guidance for CAOS tasks including patient registration, pre-operative planning, and assisted cutting.
    Pre-clinical experimental validation on a commercial system (NAVIO®, Smith & Nephew) for these contributions demonstrates encouraging early-stage results, showing successful deployment of AR to CAOS systems and promising indications that AR can enhance the clinician's interactions in the future. The thesis concludes with a summary of achievements, corresponding limitations, and future research opportunities.
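    The calibration step described above — computing a spatial transformation between the surgical (optical tracker) and holographic coordinate frames — is commonly solved as a paired-point rigid registration. The sketch below shows the closed-form solution reduced to 2D for brevity (the actual calibration solves the 3D case, typically via SVD); all names are illustrative assumptions.

```python
# Toy 2D paired-point rigid registration: given matched landmark
# positions in a "tracker" frame (src) and an "AR" frame (dst), recover
# the rotation angle and translation such that dst ≈ R(theta)·src + t.

import math

def register_2d(src, dst):
    """Least-squares rigid fit between two lists of matched 2D points."""
    n = len(src)
    cxs = sum(p[0] for p in src) / n; cys = sum(p[1] for p in src) / n
    cxd = sum(p[0] for p in dst) / n; cyd = sum(p[1] for p in dst) / n
    # accumulate cross- and dot-products of centred point pairs
    s_cross = s_dot = 0.0
    for (xs, ys), (xd, yd) in zip(src, dst):
        ax, ay = xs - cxs, ys - cys
        bx, by = xd - cxd, yd - cyd
        s_cross += ax * by - ay * bx
        s_dot += ax * bx + ay * by
    theta = math.atan2(s_cross, s_dot)
    tx = cxd - (math.cos(theta) * cxs - math.sin(theta) * cys)
    ty = cyd - (math.sin(theta) * cxs + math.cos(theta) * cys)
    return theta, (tx, ty)
```

    With the transform in hand, any point measured by the optical tracker can be mapped into the headset's holographic frame so that guidance renders in the right place.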