17 research outputs found

    Modeling and Visualization for Virtual Interaction with Medical Image Data

    Interactive systems for exploring and analysing medical three-dimensional (3D) volume image data using techniques such as stereoscopic rendering and haptics can lead to new workflows for virtual surgery planning. This includes the design of patient-specific surgical guides and plates for additive manufacturing (3D printing). Our applications, medical visualization and cranio-maxillofacial surgery planning, involve large volume data such as computed tomography (CT) images with millions of data points. This motivates the development of fast and efficient methods for visualization and haptic rendering, as well as the development of efficient modeling techniques for simplifying the design of 3D printable parts. In this thesis, we develop methods for visualization and haptic rendering of isosurfaces in volume image data, and show applications of these methods to medical visualization and virtual surgery planning. We further develop methods for modeling surgical guides and plates for cranio-maxillofacial surgery, and integrate them into our system for haptics-assisted surgery planning called HASP. This system is now installed at the Department of Surgical Sciences, Uppsala University, and is being evaluated for use in clinical research.

    Rendering Software for Multiple Projectors


    Clustered Grid Cell Data Structure for Isosurface Rendering

    Active grid cells in scalar volume data are typically identified by many isosurface rendering methods when extracting another representation of the data for rendering. However, the use of grid cells themselves as rendering primitives has not been extensively explored in the literature. In this paper, we propose a cluster-based data structure for storing the data of active grid cells for fast cell rasterisation via billboard splatting. Compared to previous cell rasterisation approaches, eight corner scalar values are stored with each active grid cell, so that the full volume data is not required during rendering. The grid cells can be quickly extracted and use about 37 percent of the memory of a typical efficient mesh-based representation, while supporting large grid sizes. We present further improvements such as a visibility buffer for cluster culling and EWA-based interpolation of attributes such as normals. We also show that our data structure can be used for hybrid ray tracing or path tracing to compute global illumination.
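
    The data structure is only described at a high level above, so the following C++ sketch illustrates one possible layout: each active cell stores its grid position together with its eight corner scalar values, and cells are grouped into clusters with bounding boxes that can be culled (for example against a visibility buffer) before billboard splatting. All type and field names are illustrative assumptions rather than the paper's actual implementation.

        // Minimal sketch (illustrative names, not the paper's actual layout) of an
        // active-cell record and a cluster of such cells for billboard splatting.
        #include <cstdint>
        #include <vector>

        // One active grid cell: its integer grid position plus the eight corner
        // scalar values, so the full volume is not needed at render time.
        struct ActiveCell {
            std::uint32_t x, y, z;     // cell position in the grid
            std::uint8_t  corners[8];  // corner scalars (16-bit CT data would need uint16_t)
        };

        // A cluster groups nearby active cells so they can be culled together
        // (for example against a visibility buffer) before splatting.
        struct CellCluster {
            float         boundsMin[3], boundsMax[3]; // bounding box used for culling
            std::uint32_t firstCell;                  // index into the cell array
            std::uint32_t cellCount;
        };

        struct ClusteredCellGrid {
            std::vector<ActiveCell>  cells;     // all active cells, ordered by cluster
            std::vector<CellCluster> clusters;  // coarse units for visibility culling
        };

        // A cell is "active" for isovalue t if its corner values straddle t,
        // i.e. min(corners) <= t <= max(corners).
        inline bool isActive(const std::uint8_t c[8], std::uint8_t t) {
            std::uint8_t lo = c[0], hi = c[0];
            for (int i = 1; i < 8; ++i) {
                lo = c[i] < lo ? c[i] : lo;
                hi = c[i] > hi ? c[i] : hi;
            }
            return lo <= t && t <= hi;
        }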

    RayCaching : Amortized Isosurface Rendering for Virtual Reality

    Real-time virtual reality requires efficient rendering methods to deal with high-resolution stereoscopic displays and low-latency head-tracking. Our proposed RayCaching method renders isosurfaces of large volume datasets by amortizing raycasting over several frames and caching primary rays as small bricks that can be efficiently rasterized. An occupancy map in the form of a clipmap provides level of detail and ensures that only bricks corresponding to visible points on the isosurface are cached and rendered. Hard shadows and ambient occlusion from secondary rays are also accumulated and stored in the cache. Our method supports real-time isosurface rendering with a dynamic isovalue and allows stereoscopic visualization and exploration of large volume datasets at framerates suitable for virtual reality applications.
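
    To make the amortization idea concrete, the following C++ sketch traces only a fixed budget of primary rays per frame and keeps the resulting surface points in a cache that is rasterized every frame; changing the isovalue clears the cache so that it refills progressively. The brick and clipmap bookkeeping from the paper is omitted, and all names are assumptions rather than the published implementation.

        // Hedged sketch of the amortization idea only: each frame, trace at most a
        // fixed budget of primary rays and append the hits to a point cache that is
        // rasterized every frame. Brick and clipmap bookkeeping is omitted, and
        // traceIsosurface() is a placeholder for any raymarching/DDA routine.
        #include <algorithm>
        #include <cstddef>
        #include <optional>
        #include <vector>

        struct Ray          { float origin[3]; float dir[3]; };
        struct SurfacePoint { float pos[3];    float normal[3]; };

        // Placeholder: march the ray through the volume and return the first
        // isosurface crossing for the given isovalue, if any.
        std::optional<SurfacePoint> traceIsosurface(const Ray& r, float isovalue);

        class RayCache {
        public:
            explicit RayCache(float isovalue) : isovalue_(isovalue) {}

            // Per-frame update: only 'budget' rays from the pending queue are traced,
            // so a full-resolution set of primary rays is spread over several frames.
            void update(std::vector<Ray>& pending, std::size_t budget) {
                const std::size_t n = std::min(budget, pending.size());
                for (std::size_t i = 0; i < n; ++i)
                    if (auto hit = traceIsosurface(pending[i], isovalue_))
                        cache_.push_back(*hit);
                pending.erase(pending.begin(), pending.begin() + static_cast<std::ptrdiff_t>(n));
            }

            // A new isovalue invalidates the cache; it then refills over the next frames.
            void setIsovalue(float v) { isovalue_ = v; cache_.clear(); }

            // The cached points are what gets splatted/rasterized every frame.
            const std::vector<SurfacePoint>& points() const { return cache_; }

        private:
            float isovalue_;
            std::vector<SurfacePoint> cache_;
        };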

    A haptics-assisted cranio-maxillofacial surgery planning system for restoring skeletal anatomy in complex trauma cases

    Cranio-maxillofacial (CMF) surgery to restore normal skeletal anatomy in patients with serious trauma to the face can be both complex and time-consuming, but it is generally accepted that careful pre-operative planning leads to a better outcome with a higher degree of function and reduced morbidity, in addition to reduced time in the operating room. However, today's surgery planning systems are primitive, relying mostly on the user's ability to plan complex tasks with a two-dimensional graphical interface. We present a system for planning the restoration of skeletal anatomy in facial trauma patients using a virtual model derived from patient-specific CT data. The system combines stereo visualization with six degrees-of-freedom, high-fidelity haptic feedback that enables analysis, planning, and preoperative testing of alternative solutions for restoring bone fragments to their proper positions. The stereo display provides accurate visual spatial perception, and the haptics system provides intuitive haptic feedback when bone fragments are in contact, as well as six degrees-of-freedom attraction forces for precise bone fragment alignment. A senior surgeon without prior experience of the system received 45 min of system training. Following the training session, he completed a virtual reconstruction of a complex mandibular fracture in 22 min, with an adequately reduced result. Preliminary testing with one surgeon indicates that our surgery planning system, which combines stereo visualization with sophisticated haptics, has the potential to become a powerful tool for CMF surgery planning. With little training, it allows a surgeon to complete a complex plan in a short amount of time.
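
    The six degrees-of-freedom attraction mentioned above can be pictured as a spring-damper that pulls a grasped fragment toward a target pose. The C++ sketch below shows one such formulation, with a translational spring force and a rotational torque derived from the quaternion error; the gains, types, and pose representation are illustrative assumptions, not HASP's actual parameters.

        // Hedged sketch of a six degrees-of-freedom attraction: a spring-damper that
        // pulls a grasped bone fragment toward a target pose (translation + rotation).
        // Gains and the pose representation are illustrative assumptions.
        #include <cmath>

        struct Vec3 { float x, y, z; };
        struct Quat { float w, x, y, z; };  // unit quaternion (rotation part of a pose)

        static Vec3 sub(Vec3 a, Vec3 b)    { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
        static Vec3 scale(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

        // Rotation taking 'current' to 'target', expressed as an axis-angle vector.
        static Vec3 rotationError(Quat target, Quat current) {
            Quat c{current.w, -current.x, -current.y, -current.z};  // conjugate(current)
            Quat d{  // d = target * conjugate(current)
                target.w * c.w - target.x * c.x - target.y * c.y - target.z * c.z,
                target.w * c.x + target.x * c.w + target.y * c.z - target.z * c.y,
                target.w * c.y - target.x * c.z + target.y * c.w + target.z * c.x,
                target.w * c.z + target.x * c.y - target.y * c.x + target.z * c.w};
            if (d.w < 0.0f) { d.w = -d.w; d.x = -d.x; d.y = -d.y; d.z = -d.z; }  // shortest arc
            float s = std::sqrt(d.x * d.x + d.y * d.y + d.z * d.z);
            if (s < 1e-6f) return {0.0f, 0.0f, 0.0f};
            float angle = 2.0f * std::atan2(s, d.w);            // rotation angle in radians
            return scale({d.x / s, d.y / s, d.z / s}, angle);   // axis * angle
        }

        // Spring-damper attraction: force on the fragment's translation and torque on
        // its rotation, both damped by the current linear/angular velocities.
        void attractionForce(Vec3 pos, Quat rot, Vec3 vel, Vec3 omega,
                             Vec3 targetPos, Quat targetRot,
                             Vec3& forceOut, Vec3& torqueOut) {
            const float kT = 200.0f, cT = 5.0f;  // translational stiffness/damping (assumed)
            const float kR = 2.0f,   cR = 0.1f;  // rotational stiffness/damping (assumed)
            forceOut  = sub(scale(sub(targetPos, pos), kT), scale(vel, cT));
            torqueOut = sub(scale(rotationError(targetRot, rot), kR), scale(omega, cR));
        }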

    Using anti-aliased signed distance fields for generating surgical guides and plates from CT images

    We present a method for generating shell-like objects such as surgical guides and plates from segmented computed tomography (CT) images, using signed distance fields and constructive solid geometry (CSG). We develop a user-friendly modeling tool which allows a user to quickly design such models with the help of stereo graphics, six degrees-of-freedom input, and haptic feedback, in our existing software for virtual cranio-maxillofacial surgery planning, HASP. To improve the accuracy and precision of the modeling, we use an anti-aliased distance transform to compute signed distance field values from fuzzy coverage representations of the bone. The models can be generated within a few minutes, with only a few interaction steps, and are 3D printable. The tool has the potential to be used by the surgeons themselves, as an alternative to traditional surgery planning services.
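
    The CSG operations on signed distance fields mentioned above reduce to simple per-point min/max expressions, which the short C++ sketch below illustrates, together with a shell (offset band) of a given thickness and a plate built by intersecting a shell around the bone with a user-defined region. The plate construction is an illustrative assumption, not the paper's exact modeling pipeline.

        // Hedged sketch of CSG on signed distance fields (negative = inside): union,
        // intersection and difference are per-point min/max operations, and a shell of
        // given thickness is an offset band around the zero level set.
        #include <algorithm>
        #include <cmath>

        inline float csgUnion(float a, float b)        { return std::min(a, b); }
        inline float csgIntersection(float a, float b) { return std::max(a, b); }
        inline float csgDifference(float a, float b)   { return std::max(a, -b); }

        // A shell of total thickness 't' centred on the zero level set of 'd':
        // points within t/2 of the surface are inside the shell.
        inline float shell(float d, float t) { return std::fabs(d) - 0.5f * t; }

        // Example: a plate that follows the bone surface, restricted to a user-defined
        // region ('dBone' and 'dRegion' are signed distances sampled at the same point).
        inline float plate(float dBone, float dRegion, float thickness) {
            return csgIntersection(shell(dBone, thickness), dRegion);
        }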

    Evaluation of in-house, haptic assisted surgical planning for virtual reduction of complex mandibular fractures

    The management of complex mandible fractures, i.e., severely comminuted fractures or fractures of edentulous/atrophic mandibles, can be challenging. This is due to the three-dimensional loss of bone, which limits the possibility of accurate anatomic reduction. Virtual surgery planning (VSP) can provide improved accuracy and shorter operating times, but is often not employed for trauma cases because of time constraints and complex user interfaces limited to two-dimensional interaction with three-dimensional data. In this study, we evaluate the accuracy, precision, and time efficiency of the Haptic Assisted Surgery Planning system (HASP), an in-house VSP system that supports stereo graphics, six degrees-of-freedom input, and haptics, to improve surgical planning. Three operators performed planning in HASP on Computed Tomography (CT) and Cone Beam Computed Tomography (CBCT) images of a plastic skull model and on twelve retrospective cases with complex mandible fractures. The results show an accuracy and reproducibility of less than 2 mm when using HASP, with an average planning time of 15 minutes, including time for segmentation in the software BoneSplit. This study presents an in-house haptic-assisted planning tool for cranio-maxillofacial surgery with high usability that can be used for preoperative planning and evaluation of complex mandible fractures.

    Haptic-Assisted Surgical Planning (HASP) in a Case of Bilateral Mandible Fracture

    Restoring normal skeletal anatomy in patients with complex trauma to the mandible can be difficult, and the difficulty often increases with an edentulous mandible. This study describes a case of a displaced edentulous bilateral mandibular fracture, which was preoperatively planned with the in-house haptic-assisted surgery planning system (HASP). A model of the virtually restored mandible was 3D-printed at the hospital, and a reconstruction plate was outlined beforehand with the printed mandible as a template and served as a guide during surgery. This case suggests that HASP is a valuable preoperative tool in the planning phase when dealing with maxillofacial trauma cases. With the application of virtual planning, the authors could analyze the desired outcome and were further supported in surgery by the guidance of the reconstruction plate outlined on the restored model of the mandible.