    A novel haptic model and environment for maxillofacial surgical operation planning and manipulation

    This paper presents a practical method and a new haptic model to support manipulation of bones and their segments during the planning of a surgical operation in a virtual environment using a haptic interface. To perform effective dental surgery, it is important to have all of the patient's operation-related information available beforehand in order to plan the operation and avoid complications. A haptic interface with an accurate virtual patient model that supports the planning of bone cuts is therefore critical for surgeons. The proposed system uses DICOM images taken from a digital tomography scanner to create a mesh model of the filtered skull, from which the jaw bone can be isolated for further use. A novel solution for cutting the bones has been developed: the haptic tool determines and defines the bone-cutting plane, and the cut produces three new meshes from the original model. With this approach the computational cost is optimized and real-time feedback can be achieved during all bone manipulations. While the cut mesh is being moved, a predefined friction profile in the haptic system simulates the force-feedback feel of the different densities in the bone.
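
    The density-dependent friction profile described above can be illustrated with a simple Coulomb-style model in which the resistance opposing the tool scales with a coefficient looked up from the local bone density. The sketch below is only a Python analogue of that idea; the coefficient table, function names and force law are assumptions, not the authors' published formulation.

        import numpy as np

        # Hypothetical friction coefficients per bone-density class;
        # these values are illustrative only.
        FRICTION_BY_DENSITY = {"cortical": 0.9, "trabecular": 0.4, "marrow": 0.1}

        def friction_force(tool_velocity, normal_force_mag, density_class):
            """Coulomb-style friction opposing the haptic tool's motion."""
            mu = FRICTION_BY_DENSITY[density_class]
            speed = np.linalg.norm(tool_velocity)
            if speed < 1e-9:                    # tool at rest: no kinetic friction
                return np.zeros(3)
            direction = -tool_velocity / speed  # oppose the direction of motion
            return mu * normal_force_mag * direction

        # Example: tool sliding through dense cortical bone
        f = friction_force(np.array([0.02, 0.0, 0.0]),
                           normal_force_mag=3.0, density_class="cortical")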

    Real-time Medical Visualization of Human Head and Neck Anatomy and its Applications for Dental Training and Simulation

    The Digital Design Studio and NHS Education Scotland have developed ultra-high-definition, real-time interactive 3D anatomy of the head and neck for dental teaching, training and simulation purposes. In this paper we present an established workflow using state-of-the-art 3D laser scanning technology and software for the design and construction of medical data, and describe the workflow practices and protocols of the head and neck anatomy project. Anatomical data was acquired through topographical laser scanning of a destructively dissected cadaver. Each stage of model development was clinically validated to produce a normalised human dataset, which was transformed into a real-time environment capable of large-scale 3D stereoscopic display in medical teaching labs across Scotland whilst also supporting single users with laptops and PCs. Specific functionality supported within the 3D Head and Neck viewer includes anatomical labelling, guillotine tools and selection tools to expand specific local regions of anatomy. The software environment allows thorough and meaningful investigation of all major and minor anatomical structures and systems, whilst providing the user with the means to record sessions and individual scenes for learning and training purposes. The model and software have also been adapted to permit interactive haptic simulation of the injection of a local anaesthetic.
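
    The guillotine tools mentioned above amount to clipping the anatomy model against a user-positioned cutting plane. A minimal sketch of such a plane test follows, with hypothetical names; the viewer's actual implementation is not published here and presumably runs on the GPU.

        import numpy as np

        def guillotine_mask(vertices, plane_point, plane_normal):
            """Return a boolean mask of vertices on the visible side of the plane.

            vertices     : (N, 3) array of mesh vertex positions
            plane_point  : a point on the user-positioned cutting plane
            plane_normal : unit normal pointing towards the visible half-space
            """
            signed_dist = (vertices - plane_point) @ plane_normal
            return signed_dist >= 0.0   # True = keep, False = culled

        # Example: cull everything below the z = 0.1 axial plane
        verts = np.random.rand(1000, 3)
        keep = guillotine_mask(verts, np.array([0.0, 0.0, 0.1]),
                               np.array([0.0, 0.0, 1.0]))
        visible = verts[keep]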

    3D oceanographic data compression using 3D-ODETLAP

    This paper describes a 3D environmental data compression technique for oceanographic datasets. With proper point selection, our method approximates uncompressed marine data using an over-determined system of linear equations based on, but essentially different from, the Laplacian partial differential equation. This approximation is then refined via an error metric, and the two steps alternate until a predefined approximation quality is reached. Using several different datasets and metrics, we demonstrate that our method achieves an excellent compression ratio. To further evaluate our method, we compare it with 3D-SPIHT: 3D-ODETLAP averages 20% better compression than 3D-SPIHT on our eight test datasets from the World Ocean Atlas 2005, and provides up to approximately six times better compression on datasets with relatively small variance. Meanwhile, at the same approximate mean error, we demonstrate a significantly smaller maximum error than 3D-SPIHT and provide a feature to keep the maximum error under a user-defined limit.
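
    The core of an ODETLAP-style reconstruction is an over-determined sparse system that combines smoothness equations with weighted data equations at the selected points, solved in the least-squares sense. The sketch below shows that idea on a 1D grid for brevity; the weighting r, the function name and the omission of the paper's 3D formulation and iterative point-selection loop are all simplifying assumptions.

        import numpy as np
        from scipy.sparse import coo_matrix
        from scipy.sparse.linalg import lsqr

        def odetlap_1d(n, known_idx, known_val, r=0.1):
            """Over-determined ODETLAP-style reconstruction on a 1D grid.

            Smoothness equations: 2*x[i] - x[i-1] - x[i+1] = 0 for interior samples.
            Data equations (weighted by 1/r): x[k] = known value at selected points.
            Both sets are solved together in the least-squares sense.
            """
            rows, cols, vals, rhs = [], [], [], []
            eq = 0
            for i in range(1, n - 1):               # smoothness equations
                rows += [eq, eq, eq]
                cols += [i - 1, i, i + 1]
                vals += [-1.0, 2.0, -1.0]
                rhs.append(0.0)
                eq += 1
            w = 1.0 / r                             # small r => honour data more
            for k, v in zip(known_idx, known_val):  # weighted data equations
                rows.append(eq); cols.append(k); vals.append(w)
                rhs.append(w * v)
                eq += 1
            A = coo_matrix((vals, (rows, cols)), shape=(eq, n))
            return lsqr(A, np.asarray(rhs))[0]

        # Example: reconstruct 50 samples from 5 selected points
        x = odetlap_1d(50, known_idx=[0, 12, 25, 37, 49],
                       known_val=[0.0, 1.0, 0.5, 1.5, 0.0])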

    A Review of Virtual Reality Based Training Simulators for Orthopaedic Surgery

    This review presents current virtual-reality-based training simulators for hip, knee and other orthopaedic surgery, including elective and trauma surgical procedures. No previous reviews have focussed on hip and knee orthopaedic simulators. A comparison of existing simulator features is provided to identify what is missing and what is required to improve upon current simulators. In total, 11 total hip replacement pre-operative planning tools were analysed, along with 9 hip trauma fracture training simulators; additionally, 9 knee arthroscopy simulators and 8 other orthopaedic simulators were included for comparison. The findings are that, for orthopaedic surgery simulators in general, there is increasing use of patient-specific virtual models, which reduce the learning curve. Modelling is also being used for patient-specific implant design and manufacture, and simulators are increasingly being validated for assessment as well as training. There are very few training simulators available for hip replacement, yet more advanced virtual reality is being used for other procedures such as hip trauma and drilling. Training simulators for hip replacement, and for orthopaedic surgery in general, lag behind other surgical procedures for which virtual reality has become more common, and further development is required to bring hip replacement training simulation up to date with other procedures. This suggests there is a gap in the market for a new high-fidelity hip replacement and resurfacing training simulator.

    Interactive 3D Digital Models for Anatomy and Medical Education

    This chapter explores the creation and use of interactive, three-dimensional (3D) digital models for anatomy and medical education. Firstly, it looks back over the history and development of virtual 3D anatomy resources before outlining some of the current means of their creation, including photogrammetry, CT and surface scanning, and digital modelling, noting the advantages and disadvantages of each. Various means of distribution are explored, including virtual learning environments, websites, interactive PDFs, virtual and augmented reality, bespoke applications, and 3D printing, with a particular focus on the level of interactivity each method offers. Finally, and perhaps most importantly, the use of such models for education is discussed. Questions addressed include: how can such models best be used to enhance student learning? How can they be used in the classroom? How can they be used for self-directed study? The chapter also explores whether such models could one day replace human specimens, and how they complement the rise of online and e-learning.

    Visualization and Analysis Tools for Neuronal Tissue

    The complex nature of neuronal cellular and circuit structure poses challenges for understanding tissue organization. New techniques in electron microscopy allow large datasets to be acquired from serial sections of neuronal tissue. These techniques reveal all cells in an unbiased fashion, so their segmentation produces complex structures that must be inspected and analyzed. Although several software packages provide 3D representations of these structures, they are limited to monoscopic projection and are tailored to the visualization of generic 3D data. Stereoscopic display, on the other hand, has been shown to improve the immersive experience, with significant gains in understanding spatial relationships and identifying important features. To leverage those benefits, we have developed a 3D immersive virtual reality data display system that, besides presenting data visually, allows users to augment and interact with the data in a form that facilitates human analysis. To provide a useful system for neuroscientists, we have developed BrainTrek, a suite of software applications for the organization, rendering, visualization, and modification of neuron model scenes. A mid-priced CAVE system provides high-vertex-count rendering of an immersive 3D environment. Standard head and wand tracking allows movement control and modification of the scene via an on-screen 3D menu, while a tablet touch screen provides multiple navigation modes and a 2D menu. Graphics optimization allows theoretically limitless volumes to be presented, and an on-screen mini-map allows users to quickly orient themselves. A custom voice note-taking mechanism allows scenes to be described and revisited. Finally, ray-casting support enables numerous analytical features, including 3D distance and volume measurements, computation and presentation of statistics, and point-and-click retrieval and presentation of raw electron microscopy data. The extension of this system to the Unity3D platform provides a low-cost alternative to the CAVE, allowing users to visualize, explore, and annotate 3D cellular data across multiple platforms and modalities: different operating systems, different hardware (e.g., tablets, PCs, or stereo head-mounted displays), and online or offline operation. Such an approach has the potential not only to address the visualization and analysis needs of neuroscientists, but also to become a tool for educational purposes and for crowdsourcing the vast amount of neuronal data annotation that lies ahead.
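
    The point-and-click measurements mentioned above reduce to casting a ray from the selection device into the scene, intersecting it with the model geometry and reporting distances between hit points. The sketch below uses the standard Moller-Trumbore ray/triangle test; the helper names and the hard-coded second pick are illustrative assumptions, not BrainTrek's actual code.

        import numpy as np

        def ray_triangle_hit(origin, direction, v0, v1, v2, eps=1e-9):
            """Moller-Trumbore ray/triangle intersection.
            Returns the hit point, or None if the ray misses the triangle."""
            e1, e2 = v1 - v0, v2 - v0
            p = np.cross(direction, e2)
            det = e1 @ p
            if abs(det) < eps:          # ray parallel to the triangle plane
                return None
            inv = 1.0 / det
            s = origin - v0
            u = (s @ p) * inv
            if u < 0.0 or u > 1.0:
                return None
            q = np.cross(s, e1)
            v = (direction @ q) * inv
            if v < 0.0 or u + v > 1.0:
                return None
            t = (e2 @ q) * inv
            return origin + t * direction if t > eps else None

        # Example: measure the 3D distance between two picked points
        a = ray_triangle_hit(np.array([0., 0., 5.]), np.array([0., 0., -1.]),
                             np.array([-1., -1., 0.]), np.array([1., -1., 0.]),
                             np.array([0., 1., 0.]))
        b = np.array([2.0, 0.0, 0.0])       # second pick, hard-coded for brevity
        print(np.linalg.norm(a - b))        # straight-line distance in scene units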

    Haptic Interaction with 3D oriented point clouds on the GPU

    Real-time point-based rendering and interaction with virtual objects is gaining popularity and importance as different haptic devices and technologies increasingly provide the basis for realistic interaction. Haptic interaction is used for a wide range of applications such as medical training, remote robot operation, tactile displays and video games. The main focus of this work is the visualization of, and interaction with, virtual objects using haptic devices, a process that involves several steps: Data Acquisition, Graphic Rendering, Haptic Interaction and Data Modification. This work presents a framework for haptic interaction using the GPU as a hardware accelerator, and includes an approach for enabling the modification of data during interaction. The results demonstrate the limits and capabilities of these techniques in the context of volume rendering for haptic applications. The use of dynamic parallelism as a technique for scaling the number of threads requested from the accelerator according to the interaction requirements is also studied, allowing the editing of data sets of up to one million points at interactive haptic frame rates.
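
    A common way to generate force feedback against an oriented point cloud is penalty-based: find the point nearest the tool tip, treat its tangent plane as the local surface, and push the tip back along the normal in proportion to the penetration depth. The sketch below is a CPU analogue of that idea; the class name, stiffness value and use of a k-d tree are assumptions rather than the GPU pipeline studied in this work.

        import numpy as np
        from scipy.spatial import cKDTree

        class OrientedPointCloudHaptics:
            """Penalty-based force feedback against an oriented point cloud.
            Each point carries a normal; the surface near the tool tip is
            approximated by that point's tangent plane."""

            def __init__(self, points, normals, k=400.0):
                self.points = points           # (N, 3) positions
                self.normals = normals         # (N, 3) unit normals
                self.k = k                     # spring stiffness (illustrative)
                self.tree = cKDTree(points)    # nearest-neighbour queries

            def force(self, tip):
                """Return the reaction force for a haptic tool tip position."""
                _, i = self.tree.query(tip)
                n = self.normals[i]
                depth = (self.points[i] - tip) @ n   # > 0: tip below the surface
                if depth <= 0.0:
                    return np.zeros(3)               # no contact, no force
                return self.k * depth * n            # push back along the normal

        # Example with a flat synthetic patch of one thousand oriented points
        pts = np.c_[np.random.rand(1000, 2), np.zeros(1000)]
        nrm = np.tile([0.0, 0.0, 1.0], (1000, 1))
        haptics = OrientedPointCloudHaptics(pts, nrm)
        print(haptics.force(np.array([0.5, 0.5, -0.002])))  # tip slightly inside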