71 research outputs found
Computational field visualization
Today, scientists, engineers, and medical researchers routinely use computers to simulate complex physical phenomena. Such simulations present new challenges for computational scientists, including the need to effectively analyze and visualize complex three-dimensional data. As simulations grow more complex and produce larger amounts of data, the effective use of such high-resolution data will hinge on the ability of human experts to interact with their data and extract useful information. Here we describe recent work at the SCI Institute on large-scale scalar, vector, and tensor visualization techniques. We end with a discussion of ideas for integrating these techniques to create computational multi-field visualizations.
Haptically assisted connection procedure for the reconstruction of dendritic spines
Dendritic spines are thin protrusions that cover the dendritic surface of numerous neurons in the brain and appear to play a key role in neural circuits. Correct segmentation of these structures is difficult due to their small size, and the reconstructed spines can appear incomplete. This paper presents a four-step procedure for the complete reconstruction of dendritic spines. The haptically driven procedure is intended to work as an image processing stage before the automatic segmentation step that gives the final representation of the dendritic spines. The procedure is designed to allow both navigation and volume image editing to be carried out using a haptic device. A use case employing our procedure together with a commercial software package for the segmentation stage is illustrated. Finally, the haptic editing is evaluated in two experiments: the first concerns the benefits of force feedback, and the second checks the suitability of a haptic device as an input device. In both cases, the results show that the procedure improves editing accuracy.
Real-time hybrid cutting with dynamic fluid visualization for virtual surgery
It is widely accepted that medical teaching must be reformed to meet today's high-volume training requirements. Virtual simulation offers a potential method of providing such training, and some current medical training simulations integrate haptic and visual feedback to enhance procedure learning. The purpose of this project is to explore the capability of Virtual Reality (VR) technology to develop a training simulator for surgical cutting and bleeding in general surgery.
MTCut: GPU-based marching tetra cuts
Isosurface construction and rendering based on tetrahedral grids has been shown to be feasible on programmable graphics hardware. In this paper we present MTCut: a volume cutting algorithm that is able to cut isosurfaces obtained by a Marching Tetrahedra algorithm on volume data. It does not require a tetrahedral representation and runs in real time for complex meshes of up to 1.8M triangles. Our algorithm takes as input the isosurface to be cut, slices it, and produces the cut geometry in response to the user's interaction with a haptic device. The result is a watertight manifold that can be interactively transferred back to the CPU in response to a user request.
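As a rough, CPU-side illustration of the slicing step (not the MTCut GPU implementation, which the abstract does not detail), the sketch below clips a triangle mesh against a single cutting plane, splitting straddling triangles so the cut boundary lies exactly on the plane. All names are hypothetical, and the winding of emitted triangles is not guaranteed consistent:

```python
import numpy as np

def clip_mesh_by_plane(vertices, triangles, plane_point, plane_normal):
    """Keep the part of a triangle mesh on the positive side of a plane.

    Triangles crossing the plane are split at the intersection edge, so the
    cut boundary lies on the plane. Returns new (vertices, triangles).
    """
    n = np.asarray(plane_normal, dtype=float)
    n /= np.linalg.norm(n)
    d = (np.asarray(vertices, dtype=float) - plane_point) @ n  # signed distances

    new_verts = [tuple(v) for v in vertices]
    new_tris = []

    def edge_cut(i, j):
        # Intersection point of edge (i, j) with the plane; returns its index.
        t = d[i] / (d[i] - d[j])
        p = (1 - t) * np.asarray(vertices[i], float) + t * np.asarray(vertices[j], float)
        new_verts.append(tuple(p))
        return len(new_verts) - 1

    for tri in triangles:
        keep = [v for v in tri if d[v] >= 0]
        drop = [v for v in tri if d[v] < 0]
        if len(keep) == 3:                       # fully on the kept side
            new_tris.append(tuple(tri))
        elif len(keep) == 2:                     # a quad remains: two triangles
            a, b = keep
            c = drop[0]
            ac, bc = edge_cut(a, c), edge_cut(b, c)
            new_tris += [(a, b, bc), (a, bc, ac)]
        elif len(keep) == 1:                     # a corner triangle remains
            a = keep[0]
            ab, ac = edge_cut(a, drop[0]), edge_cut(a, drop[1])
            new_tris.append((a, ab, ac))
        # len(keep) == 0: triangle discarded entirely
    return np.array(new_verts), new_tris
```

A full watertight cut as in MTCut would additionally triangulate the exposed cross-section; this sketch only produces the clipped surface.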
Haptic Interaction with 3D oriented point clouds on the GPU
Real-time point-based rendering and interaction with virtual objects is gaining popularity and importance as different haptic devices and technologies increasingly provide the basis for realistic interaction. Haptic interaction is used in a wide range of applications such as medical training, remote robot operation, tactile displays, and video games. Our main focus is virtual object visualization and interaction using haptic devices, a process that involves several steps: data acquisition, graphic rendering, haptic interaction, and data modification. This work presents a framework for haptic interaction that uses the GPU as a hardware accelerator and includes an approach for modifying the data during interaction. The results demonstrate the limits and capabilities of these techniques in the context of volume rendering for haptic applications. We also study dynamic parallelism as a technique to scale the number of accelerator threads to the interaction requirements, allowing the editing of data sets of up to one million points at interactive haptic frame rates.
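As a plain-CPU sketch of the per-frame force computation such a framework performs (the actual work runs on the GPU; the k-nearest-neighbor plane averaging and all names here are a simplification of ours, not the thesis's method):

```python
import numpy as np

def haptic_force(probe, points, normals, stiffness=800.0, k=8):
    """Penalty force for a haptic probe against an oriented point cloud.

    Averages the tangent planes of the k nearest surface points; if the probe
    lies behind that local plane (negative signed distance), a spring force
    pushes it back out along the averaged normal. Meant to run once per
    haptic frame (~1 kHz).
    """
    dists = np.linalg.norm(points - probe, axis=1)
    nearest = np.argsort(dists)[:k]          # brute force; a GPU version
    n = normals[nearest].mean(axis=0)        # would use a spatial grid
    n /= np.linalg.norm(n)
    center = points[nearest].mean(axis=0)
    depth = (probe - center) @ n             # signed distance to local plane
    if depth >= 0:
        return np.zeros(3)                   # probe is outside the surface
    return -stiffness * depth * n            # Hooke-style restoring force
```

Dynamic parallelism, as studied in the work, would let a GPU kernel launch child kernels sized to the number of points near the probe instead of a fixed thread count.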
Graphic Processing Units (GPUs)-Based Haptic Simulator for Dental Implant Surgery
This paper presents a haptics-based training simulator for dental implant surgery. Most previously developed dental simulators target exploring and drilling only, and penalty-based contact force models with spherical dental tools are often adopted for simplicity and computational efficiency. In contrast, our simulator is equipped with a more precise force model adapted from the Voxmap-PointShell (VPS) method to capture the essential features of the drilling procedure, with no limitations on drill shape. In addition, a real-time torque model is proposed to simulate the torque resistance in the implant insertion procedure, based on patient-specific tissue properties and implant geometry. To achieve better anatomical accuracy, our oral model is reconstructed from cone beam computed tomography (CBCT) images with a voxel-based method. To enhance real-time response, the parallel computing power of GPUs is exploited through extra effort in data structure design, algorithm parallelization, and graphics memory utilization. Results show that the developed system can produce appropriate force feedback at different tissue layers during pilot drilling and can create proper resistance torque responses during implant insertion.
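The two force models can be caricatured as follows. This is neither the VPS adaptation nor the paper's torque model, just a generic penalty-force and friction-torque sketch with hypothetical names: each penetrating point of the tool's point shell contributes a spring force, and insertion torque sums friction over the implant's contact patches.

```python
import numpy as np

def drilling_force(tool_points, inside, depths, normals, stiffness=500.0):
    """Point-shell-style penalty force (simplified sketch).

    Each tool-shell point that penetrates the voxelized bone contributes a
    spring force along the local surface normal, scaled by penetration depth;
    the net force is the sum over contacting points.
    """
    f = np.zeros(3)
    for _p, hit, depth, n in zip(tool_points, inside, depths, normals):
        if hit:
            f += stiffness * depth * np.asarray(n, float)
    return f

def insertion_torque(mu, radius, pressures, contact_areas):
    """Friction torque resisting implant insertion: tau = sum(mu * p * A * r)."""
    return sum(mu * p * a * radius for p, a in zip(pressures, contact_areas))
```

The real system additionally makes the stiffness and friction terms depend on the patient-specific tissue layer (enamel, cortical, cancellous bone) sampled from the CBCT voxels.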
Isosurface extraction and haptic rendering of volumetric data.
Chen, Kwong-Wai. Thesis (M.Phil.), Chinese University of Hong Kong, 2000. Includes bibliographical references (leaves 114-118). Abstracts in English and Chinese. Part I presents a multi-body surface extraction algorithm that extends Marching Cubes and adaptive skeleton climbing: it builds 0- and 1-skeletons, handles non-binary faces and cubes via graph reduction and tetrapoint placement, and generates triangular meshes from edge loops. Part II addresses haptic rendering of volumetric data with the PHANToM haptic interface, covering penalty-based and constraint-based methods, force shading and surface properties, volume haptization, isosurface haptic rendering, and an intermediate virtual-plane representation with update rules that prevent force-discontinuity artifacts. Appendices give two correctness proofs and a worked example of the multi-body surface extraction algorithm.
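The intermediate virtual-plane approach from the thesis's haptic rendering part can be sketched as follows (a minimal reading of ours, not the thesis code): a slow visualization-rate loop fits a tangent plane to the isosurface near the probe, while the fast haptic loop computes a spring force against only that plane, decoupling the two update rates.

```python
import numpy as np

class VirtualPlane:
    """Intermediate-representation haptic rendering (simplified sketch).

    update() is called at graphics rate (~30 Hz) with a fresh tangent plane
    fitted to the isosurface near the probe; force() is called at haptic rate
    (~1 kHz) and needs only a cheap point-plane test.
    """

    def __init__(self, stiffness=600.0):
        self.stiffness = stiffness
        self.point = None
        self.normal = None

    def update(self, surface_point, surface_normal):
        # Slow loop: replace the plane with the latest local approximation.
        n = np.asarray(surface_normal, float)
        self.point = np.asarray(surface_point, float)
        self.normal = n / np.linalg.norm(n)

    def force(self, probe):
        # Fast loop: spring force only if the probe is behind the plane.
        if self.point is None:
            return np.zeros(3)
        depth = (np.asarray(probe, float) - self.point) @ self.normal
        if depth >= 0:
            return np.zeros(3)
        return -self.stiffness * depth * self.normal
```

Abruptly swapping planes between updates can cause force jumps; the thesis devotes a section to preventing exactly those discontinuity artifacts, which this sketch omits.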
Modeling and rendering for development of a virtual bone surgery system
A virtual bone surgery system is developed to provide the potential of a realistic, safe, and controllable environment for surgical education. It can be used for training in orthopedic surgery, as well as for planning and rehearsal of bone surgery procedures...Using the developed system, the user can perform virtual bone surgery by simultaneously seeing bone material removal through a graphic display device, feeling the force via a haptic device, and hearing the sound of tool-bone interaction --Abstract, page iii
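A minimal sketch of the voxel-based material removal such a system performs each time step (hypothetical names; the real simulator's force and sound models are far richer): densities of voxels inside the spherical burr decay at a fixed removal rate, and the amount removed can drive both force magnitude and sound.

```python
import numpy as np

def remove_material(density, tool_center, tool_radius, rate, dt, spacing=1.0):
    """Sketch of voxel-based bone removal for a spherical burr.

    Densities of voxels whose centers fall inside the tool sphere decrease
    toward zero at a fixed removal rate; a voxel at zero density counts as
    drilled away. Returns the total density removed this step.
    """
    zi, yi, xi = np.indices(density.shape)
    centers = np.stack([xi, yi, zi], axis=-1) * spacing  # voxel centers (x, y, z)
    inside = np.linalg.norm(centers - tool_center, axis=-1) <= tool_radius
    removed = np.minimum(density[inside], rate * dt).sum()
    density[inside] = np.maximum(density[inside] - rate * dt, 0.0)
    return removed
```

In practice the update is restricted to the voxels in a small bounding box around the tool rather than scanning the whole volume as done here.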
- …