
    Graphic Processing Units (GPUs)-Based Haptic Simulator for Dental Implant Surgery

    This paper presents a haptics-based training simulator for dental implant surgery. Most previously developed dental simulators are targeted at exploration and drilling tasks only, and penalty-based contact force models with spherical dental tools are often adopted for simplicity and computational efficiency. In contrast, our simulator is equipped with a more precise force model adapted from the Voxmap-PointShell (VPS) method to capture the essential features of the drilling procedure, with no limitations on drill shape. In addition, a real-time torque model is proposed to simulate the torque resistance during implant insertion, based on patient-specific tissue properties and implant geometry. To achieve better anatomical accuracy, our oral model is reconstructed from cone beam computed tomography (CBCT) images with a voxel-based method. To enhance real-time response, the parallel computing power of GPUs is exploited through careful data structure design, algorithm parallelization, and graphics memory utilization. Results show that the developed system can produce appropriate force feedback at different tissue layers during pilot drilling and can create proper resistance torque responses during implant insertion.
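    The contact model described above can be illustrated with a minimal, hypothetical sketch in the spirit of a voxel-vs-point-shell penalty approach: the tool surface is sampled as points with inward normals and the tissue is a voxel grid of per-voxel stiffness. This is not the authors' implementation; the function, parameter names, and the crude penetration estimate are assumptions for illustration only.
    ```python
    # Illustrative penalty-based contact force for one haptic frame (assumed sketch,
    # not the paper's code). Tissue: (X, Y, Z) stiffness grid, 0 = empty voxel.
    import numpy as np

    def penalty_contact_force(shell_pts, shell_normals, stiffness, voxel_size, R, t):
        """shell_pts/shell_normals: (N, 3) arrays in the tool frame;
        stiffness: per-voxel stiffness (N/mm); R, t: tool pose (3x3, 3-vector)."""
        pts = shell_pts @ R.T + t                 # shell points in the world frame
        nrm = shell_normals @ R.T                 # normals rotated into the world frame
        idx = np.floor(pts / voxel_size).astype(int)

        force = np.zeros(3)
        for (i, j, k), n in zip(idx, nrm):
            inside = all(0 <= v < s for v, s in zip((i, j, k), stiffness.shape))
            if inside and stiffness[i, j, k] > 0.0:
                # crude penetration estimate: half a voxel along the shell normal
                depth = 0.5 * voxel_size
                force += stiffness[i, j, k] * depth * n
        return force
    ```
    In a GPU implementation of this kind, the per-point loop would typically be mapped to one thread per shell point with the partial forces reduced in parallel.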

    Virtual Reality Based Environment for Orthopedic Surgery (VEOS)

    The traditional way of teaching surgery involves students observing a "live" surgery and then gradually assisting experienced surgeons. The creation of a virtual reality environment for orthopedic surgery (VEOS) can improve the quality of training while decreasing the time needed for it, and such virtual environments for educational and training purposes can supplement existing approaches. In this research, the design and development of a virtual reality based environment for orthopedic surgery is described. The scope of the simulation environment is restricted to an orthopedic procedure known as Less Invasive Stabilization System (LISS) surgery. The primary knowledge source for the LISS surgical process was Miguel A. Pirela-Cruz (Head of Orthopedic Surgery and Rehabilitation, Texas Tech University Health Sciences Center (TTHSC)). The VEOS was designed and developed on a PC-based platform and was validated through interactions with surgical residents at TTHSC. Feedback from residents and from our collaborator Miguel A. Pirela-Cruz was used to make necessary modifications to the surgical environment.

    Investigation of a holistic human-computer interaction (HCI) framework to support the design of extended reality (XR) based training simulators

    In recent years, the use of Extended Reality (XR) based simulators for training has increased rapidly. In this context, there is a need to explore novel HCI-based approaches to designing more effective 3D training environments. A major impediment in this research area is the lack of an HCI-based framework that is holistic and serves as a foundation for integrating the design and assessment of HCI-based attributes such as affordance, cognitive load, and user-friendliness. This research addresses this need by investigating the creation of a holistic framework, along with a process for designing, building, and assessing training simulators using such a framework as a foundation. The core elements of the proposed framework include the adoption of participatory design principles, the creation of information-intensive process models of the target processes (relevant to the training activities), and design attributes related to affordance and cognitive load. A new attribute related to the affordance of 3D scenes, termed dynamic affordance, is proposed, and its role in shaping user comprehension in data-rich 3D training environments is studied. The framework is demonstrated for the domain of orthopedic surgery. Rigorous user-involved assessment of the framework and simulation approach has highlighted the positive impact of the HCI-based framework and attributes on the acquisition of skills and knowledge by healthcare users.

    Radiological Society of North America (RSNA) 3D printing Special Interest Group (SIG): Guidelines for medical 3D printing and appropriateness for clinical scenarios

    This issue of the journal Cadernos de Estudos Sociais was being organized when we were struck by the death of the sociologist Ernesto Laclau. His passing on 13 April 2014 surprised everyone, and particularly the editor Joanildo Burity, who was his doctoral advisee at the University of Essex, England, and who had recently brought him to the Fundação Joaquim Nabuco for a lecture, allowing many to engage in dialogue with one of the great contemporary Latin American intellectuals. We therefore pay tribute to the Argentine sociologist by publishing a previously unpublished interview given during his visit to Recife in 2013, closing this issue with a special section on his trajectory.

    Semi-Robotic Knee Arthroscopy System with Braking Mechanism

    To alleviate the poor ergonomics that surgeons suffer during knee arthroscopy, a semi-robotic device with a braking mechanism is created for intraoperative assistance. A slitted ball joint assembly is developed to transmit the clamping force to the arthroscope inside it. Ball deformation and stress at various angles to the vertical, and at various clamping forces, are recorded through Abaqus finite element analysis (FEA). Contact forces between the scope and the inner surfaces of the ball are also computed in FEA at different clamping forces. The von Mises stress occurring in the ball joint is below the yield stress limit for polyethylene, and there is a noticeable force preventing the scope from sliding along the ball through-hole under clamping. A prototype of the device is constructed as a proof of concept.
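    The yield check mentioned above can be illustrated with the standard von Mises criterion. The sketch below is not taken from the paper: the stress tensor and the polyethylene yield strength are placeholder numbers, and the function name is hypothetical.
    ```python
    # Illustrative von Mises yield check (standard formula; placeholder values,
    # not results reported in the paper).
    import numpy as np

    def von_mises(sigma):
        """von Mises equivalent stress from a 3x3 Cauchy stress tensor (MPa)."""
        s = sigma - np.trace(sigma) / 3.0 * np.eye(3)   # deviatoric part
        return np.sqrt(1.5 * np.sum(s * s))

    YIELD_PE = 25.0                                     # MPa, assumed polyethylene yield
    sigma = np.array([[12.0, 3.0, 0.0],
                      [ 3.0, 8.0, 1.0],
                      [ 0.0, 1.0, 5.0]])                # example nodal stress state
    print(von_mises(sigma) < YIELD_PE)                  # True -> below yield
    ```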

    Advanced Applications of Rapid Prototyping Technology in Modern Engineering

    Rapid prototyping (RP) technology has become widely known and appreciated due to its flexible and customized manufacturing capabilities. Widely studied RP techniques include stereolithography apparatus (SLA), selective laser sintering (SLS), three-dimensional printing (3DP), fused deposition modeling (FDM), 3D plotting, solid ground curing (SGC), multiphase jet solidification (MJS), and laminated object manufacturing (LOM). Different techniques are associated with different materials and/or processing principles and are thus devoted to specific applications. RP technology is no longer used only for prototype building; it has been extended to real industrial manufacturing solutions. Today, RP technology has contributed to almost all engineering areas, including mechanical, materials, industrial, aerospace, electrical, and, most recently, biomedical engineering. This book aims to present the advanced development of RP technologies in various engineering areas as solutions to real-world engineering problems.

    XXII International Conference on Mechanics in Medicine and Biology - Abstracts Book

    This book contains the abstracts presented at the XXII ICMMB, held in Bologna in September 2022. The abstracts are organized according to the sessions scheduled during the conference.

    Radiological Society of North America (RSNA) 3D printing Special Interest Group (SIG): guidelines for medical 3D printing and appropriateness for clinical scenarios

    Medical three-dimensional (3D) printing has expanded dramatically over the past three decades, with growth in both facility adoption and the variety of medical applications. Careful consideration of each step required to create accurate 3D printed models from medical imaging data impacts patient care and management. In this paper, a writing group representing the Radiological Society of North America Special Interest Group on 3D Printing (SIG) provides recommendations that have been vetted and voted on by the SIG active membership. This body of work includes appropriate clinical use of anatomic models 3D printed for diagnostic use in the care of patients with specific medical conditions. The recommendations provide guidance on approaches and tools in medical 3D printing, from image acquisition and segmentation of the desired anatomy, through creation of a 3D-printable model, to post-processing of 3D printed anatomic models for patient care.
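    The pipeline the guidelines cover (image acquisition, segmentation, creation of a printable model) can be sketched in a minimal, hypothetical form: a fixed-threshold segmentation of a CT volume followed by surface extraction and STL export. This is not part of the RSNA recommendations; the libraries used (scikit-image, trimesh), the Hounsfield threshold, and the synthetic volume are assumptions for illustration.
    ```python
    # Hypothetical minimal pipeline: CT volume -> threshold segmentation ->
    # surface mesh -> STL for printing. Threshold and spacing are illustrative.
    import numpy as np
    from skimage import measure
    import trimesh

    def ct_to_stl(ct_hu, spacing_mm, threshold_hu=300.0, out_path="model.stl"):
        """Segment bone-like voxels by a fixed HU threshold and export an STL."""
        mask = (ct_hu >= threshold_hu).astype(np.float32)
        # Marching cubes extracts the iso-surface of the binary mask;
        # `spacing` scales vertices to millimetres.
        verts, faces, _, _ = measure.marching_cubes(mask, level=0.5, spacing=spacing_mm)
        trimesh.Trimesh(vertices=verts, faces=faces).export(out_path)

    # Example with a synthetic volume (a real workflow would load DICOM data).
    volume = np.random.normal(0.0, 50.0, size=(64, 64, 64))
    volume[20:40, 20:40, 20:40] = 500.0          # fake "bone" block
    ct_to_stl(volume, spacing_mm=(1.0, 1.0, 1.0))
    ```
    A clinically usable workflow would, per the guidelines, also involve careful protocol selection, segmentation review, and model post-processing rather than a single global threshold.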

    Augmented Reality and Artificial Intelligence in Image-Guided and Robot-Assisted Interventions

    In minimally invasive orthopedic procedures, the surgeon places wires, screws, and surgical implants through the muscles and bony structures under image guidance. These interventions require alignment of the pre- and intra-operative patient data, the intra-operative scanner, surgical instruments, and the patient. Suboptimal interaction with patient data and the challenge of mastering 3D anatomy from ill-posed 2D interventional images are essential concerns in image-guided therapies. State-of-the-art approaches often support the surgeon with external navigation systems or ill-conditioned image-based registration methods, both of which have certain drawbacks. Augmented reality (AR) has been introduced into operating rooms in the last decade; however, in image-guided interventions it has often been considered only as a visualization device that improves traditional workflows. Consequently, the technology has not yet gained the maturity it requires to redefine procedures, user interfaces, and interactions. This dissertation investigates the applications of AR, artificial intelligence, and robotics in interventional medicine. Our solutions were applied to a broad spectrum of problems and tasks, namely improving imaging and acquisition, image computing and analytics for registration and image understanding, and enhancing interventional visualization. The benefits of these approaches were also demonstrated in robot-assisted interventions. We show how exemplary workflows can be redefined via AR by taking full advantage of head-mounted displays that are entirely co-registered with the imaging systems and the environment at all times. The proposed AR landscape is enabled by co-localizing the users and the imaging devices via the operating room environment and exploiting all involved frustums to move spatial information between different bodies. The system's awareness of the geometric and physical characteristics of X-ray imaging allows the exploration of different human-machine interfaces. We also leveraged the principles governing image formation and combined them with deep learning and RGBD sensing to fuse images and reconstruct interventional data. We hope that our holistic approaches towards improving the interface of surgery and enhancing the usability of interventional imaging not only augment the surgeon's capabilities but also improve the surgical team's experience in carrying out an effective intervention with reduced complications.
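    The co-localization idea described above, moving spatial information between bodies that are all tracked relative to the operating room environment, can be illustrated by chaining rigid transforms through a shared room frame. The sketch below is not the dissertation's implementation; the poses and frame names are made-up examples.
    ```python
    # Illustrative frame chaining: map a point from an imaging device frame into an
    # HMD frame via a shared room frame (all poses here are arbitrary examples).
    import numpy as np

    def rigid(rz_deg=0.0, t=(0.0, 0.0, 0.0)):
        """Build a 4x4 homogeneous transform: rotation about z, then translation."""
        a = np.radians(rz_deg)
        T = np.eye(4)
        T[:3, :3] = [[np.cos(a), -np.sin(a), 0], [np.sin(a), np.cos(a), 0], [0, 0, 1]]
        T[:3, 3] = t
        return T

    T_room_from_cam = rigid(30.0, (1.2, 0.0, 2.0))    # imaging device pose in the room
    T_room_from_hmd = rigid(-45.0, (0.3, 1.5, 1.7))   # HMD pose in the room
    # Compose: point expressed in the imaging device frame, mapped into the HMD frame.
    T_hmd_from_cam = np.linalg.inv(T_room_from_hmd) @ T_room_from_cam
    p_cam = np.array([0.1, 0.0, 0.5, 1.0])            # homogeneous point in device frame
    p_hmd = T_hmd_from_cam @ p_cam
    print(p_hmd[:3])
    ```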