Methoden der Partikelverfolgung in der Visualisierung und der Computergrafik
This thesis discusses the broad variety of particle tracing algorithms with a focus on flow visualization. Starting from a general overview of the basics of visualization and computer graphics, mathematics, and fluid dynamics, a number of methods that use particle tracing for flow visualization and computer graphics are proposed. The first part of this thesis considers mostly texture-based techniques that are implemented on the graphics processing unit (GPU) in order to provide an interactive dense representation of 3D flow fields. This part covers particle tracing methods that can be applied to general vector fields and includes texture-based visualization in volumes as well as on surfaces. Furthermore, it is described how particle tracing can be used to extract flow structures, such as path surfaces, from the given vector field.
The second part of this thesis considers particle tracing on derived vector fields for flow visualization. To this end, a feature extraction criterion is first applied to a fluid flow field; in most cases this results in a scalar field that serves as the basis for the particle tracing methods. It is shown how higher-order derivatives of such scalar fields can be used to extract flow features like 1D vortex core lines or 2D shear sheets. The extracted structures are further processed in terms of feature tracking.
The third part generalizes particle tracing to arbitrary applications in visualization and computer graphics. Here, the particles' paths may be defined either by the perspective of the human eye or by a force field that influences the particles' motion through second-order ordinary differential equations.
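As an illustration of the setting described in this third part, the following minimal sketch (not code from the thesis; the force field, step size, and initial conditions are placeholder assumptions) integrates particle motion governed by a second-order ordinary differential equation with a semi-implicit Euler scheme:

```python
import numpy as np

def trace_particle(force, x0, v0, dt=0.01, steps=1000):
    """Integrate x'' = force(x, v) with semi-implicit (symplectic) Euler.

    `force` is any callable returning a 3D acceleration vector; the
    concrete field (gravity, user-defined deformation forces, ...) is
    application dependent.
    """
    x, v = np.asarray(x0, float), np.asarray(v0, float)
    path = [x.copy()]
    for _ in range(steps):
        v = v + dt * force(x, v)   # update velocity from the force field
        x = x + dt * v             # advance position with the new velocity
        path.append(x.copy())
    return np.array(path)

# Example: particles thrown sideways in a uniform downward force field
path = trace_particle(lambda x, v: np.array([0.0, 0.0, -9.81]),
                      x0=[0, 0, 10], v0=[1, 0, 0])
```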
All three parts illustrate, through various examples, the importance of particle tracing methods for a wide range of applications in flow visualization and computer graphics. Furthermore, it is shown how the flexibility of this method strongly depends on the underlying vector field, and how those vector fields can be generated in order to solve problems that go beyond traditional particle tracing in fluid flow fields.
INTERACTIVE INVESTIGATION AND VISUALIZATION OF 3D VORTEX STRUCTURES
In this paper, we present a fast and accurate method for the combined detection and interactive visualization of vortex structures in 3D flow and of their corresponding velocity field. Vortex surfaces are extracted by identifying vortex core lines in combination with isosurfaces of λ2. The velocity field on those vortex surfaces is visualized by a surface LIC method that works in image space. An efficient GPU implementation of surface LIC supports interactive visualization and thus enables a user-centered exploration of the flow behavior in vortical regions. The surface LIC is further improved by applying several illumination techniques. The visualization system also supports the combination of this new visual representation with conventional techniques such as vector arrows, geometric streamlines, and color coding of parameters like vortex strength or velocity magnitude. Furthermore, a detailed investigation of vortices is facilitated by selectively picking vortices, which allows the user to focus on certain areas of interest.
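For readers unfamiliar with the λ2 criterion used above, the following sketch shows one common way to evaluate it on a regular grid (a NumPy prototype; grid spacing and array layout are assumptions, not the paper's GPU implementation):

```python
import numpy as np

def lambda2_field(u, v, w, spacing=1.0):
    """Evaluate the lambda2 vortex criterion on a regular 3D grid.

    u, v, w are 3D arrays of velocity components. A point belongs to a
    vortex where the second-largest eigenvalue of S^2 + Omega^2 is
    negative (S, Omega: symmetric / antisymmetric parts of the velocity
    gradient tensor).
    """
    # velocity gradient tensor J[..., i, j] = d(u_i)/d(x_j) per grid point
    grads = [np.gradient(f, spacing) for f in (u, v, w)]
    J = np.stack([np.stack(g, axis=-1) for g in grads], axis=-2)
    S = 0.5 * (J + np.swapaxes(J, -1, -2))   # strain-rate tensor
    O = 0.5 * (J - np.swapaxes(J, -1, -2))   # vorticity tensor
    M = S @ S + O @ O                        # symmetric 3x3 per point
    eig = np.linalg.eigvalsh(M)              # eigenvalues in ascending order
    return eig[..., 1]                       # middle eigenvalue = lambda2
```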
Real-time rendering of planets with atmospheres
This paper presents a real-time technique for planetary rendering and atmospheric scattering effects. Our implementation is based on Nishita's atmospheric model, which describes actual physical phenomena by taking into account air molecules and aerosols, and on a continuous level-of-detail planetary renderer. We obtain interactive frame rates by combining the CPU-bound spherical terrain rendering with the GPU computation of the atmospheric scattering. In contrast to volume rendering approaches, the parametrization of the light attenuation integral we use makes it possible to pre-compute it completely. The GPU is used for determining the texture coordinates of the pre-computed 3D texture, taking into account the actual spatial parameters. Our approach benefits from its independence of the rendered terrain geometry. We demonstrate the utility of our approach by showing planetary renderings of Earth and Mars.
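The following sketch illustrates the general idea of pre-computing the light attenuation (optical depth) integral into a lookup table; the constants and the simple (altitude, view angle) parametrization are illustrative assumptions and not the paper's 3D-texture parametrization:

```python
import numpy as np

R_PLANET = 6360e3      # planet radius [m]        (assumed value)
R_ATMO   = 6420e3      # atmosphere top [m]       (assumed value)
H_SCALE  = 8000.0      # density scale height [m] (assumed value)

def optical_depth(height, cos_theta, samples=64):
    """Integrate exp(-h/H) along a ray until it leaves the atmosphere."""
    origin = np.array([0.0, R_PLANET + height])
    sin_theta = np.sqrt(max(0.0, 1.0 - cos_theta**2))
    direction = np.array([sin_theta, cos_theta])
    # distance to the atmosphere boundary (ray-circle intersection)
    b = np.dot(origin, direction)
    c = np.dot(origin, origin) - R_ATMO**2
    t_exit = -b + np.sqrt(max(0.0, b * b - c))
    ds = t_exit / samples
    depth = 0.0
    for i in range(samples):
        p = origin + (i + 0.5) * ds * direction
        h = np.linalg.norm(p) - R_PLANET
        depth += np.exp(-h / H_SCALE) * ds
    return depth

# Tabulate once; at runtime the renderer only performs a texture lookup.
heights = np.linspace(0.0, R_ATMO - R_PLANET, 32)
angles  = np.linspace(-1.0, 1.0, 64)
table = np.array([[optical_depth(h, mu) for mu in angles] for h in heights])
```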
Real-Time Advection and Volumetric Illumination for the Visualization of 3D Unsteady Flow
This paper presents an interactive technique for the dense texture-based visualization of unsteady 3D flow, taking into account issues of computational efficiency and visual perception. High efficiency is achieved by a novel 3D GPU-based texture advection mechanism that implements logical 3D grid structures by physical memory in the form of 2D textures. This approach results in fast read and write access to physical memory, independent of GPU architecture. Slice-based direct volume rendering is used for the final display. A real-time computation of gradients is employed to achieve volume illumination. Perception-guided volume shading methods are included, such as halos, cool/warm shading, and color-based depth cueing. The problems of clutter and occlusion are addressed by supporting a volumetric importance function that enhances features of the flow and reduces visual complexity in less interesting regions.
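A minimal sketch of the underlying idea of emulating a logical 3D grid with 2D texture memory, with tile layout and resolutions chosen arbitrarily for illustration (the paper's actual memory layout may differ in detail):

```python
import numpy as np

NX, NY, NZ = 64, 64, 32          # logical volume resolution (assumed)
TILES_X = 8                      # slices per texture row (assumed)
TILES_Y = (NZ + TILES_X - 1) // TILES_X

def to_2d(i, j, k):
    """Map a logical 3D voxel index (i, j, k) to 2D texture coordinates."""
    tile_x = k % TILES_X
    tile_y = k // TILES_X
    return tile_x * NX + i, tile_y * NY + j

# 'physical' 2D texture holding the whole tiled volume
tex = np.zeros((TILES_X * NX, TILES_Y * NY))

# write voxel (10, 20, 5) and read it back through the same mapping
u, v = to_2d(10, 20, 5)
tex[u, v] = 1.0
assert tex[to_2d(10, 20, 5)] == 1.0
```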
Real-time classification of ground from LIDAR data for helicopter navigation
Helicopter pilots often have to deal with bad weather conditions and degraded views. Such situations may significantly decrease the pilots' situational awareness. The worst-case scenario would be a complete loss of visual reference during an off-field landing due to brownout or whiteout. In order to increase the pilots' situational awareness, helicopters nowadays are equipped with different sensors that are used to gather information about the terrain ahead of the helicopter. Synthetic vision systems are used to capture and classify sensor data and to visualize them on multifunction displays or the pilot's head-up display. This requires the input data to be reliably classified into obstacles and ground. In this paper, we present a regularization-based terrain classifier. Regularization is a popular segmentation method in computer vision and is used in active contours. For a real-time application scenario with LIDAR data, we developed an optimization that uses different levels of detail depending on the accuracy of the sensor. After a preprocessing step in which points that cannot be ground are removed, the method fits a shape underneath the recorded point cloud. Once this shape is calculated, the points below this shape can be distinguished from elevated objects and are classified as ground. Finally, we demonstrate the quality of our segmentation approach by applying it to operational flight recordings. This method forms part of an entire synthetic vision processing chain, in which the classified points are used to support the generation of a real-time synthetic view of the terrain as an assistance tool for the helicopter pilot.
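As a rough illustration of the ground/obstacle labelling step (not the paper's regularization-based method; a simple per-cell lower envelope stands in for the fitted shape, and all parameters are assumptions):

```python
import numpy as np

def classify_ground(points, cell=1.0, tol=0.3):
    """Very simplified stand-in for a ground classifier: estimate a ground
    surface as the per-cell minimum height of the point cloud, then label
    points close to that surface as ground.

    points : (N, 3) array of x, y, z coordinates
    cell   : grid cell size in the units of the points (assumed)
    tol    : max height above the estimated surface to count as ground
    """
    ij = np.floor(points[:, :2] / cell).astype(int)
    ij -= ij.min(axis=0)                     # shift cell indices to start at 0
    shape = ij.max(axis=0) + 1
    ground = np.full(shape, np.inf)
    # lower envelope: minimum z per grid cell
    np.minimum.at(ground, (ij[:, 0], ij[:, 1]), points[:, 2])
    # points within `tol` of the envelope in their cell are labelled ground
    return points[:, 2] - ground[ij[:, 0], ij[:, 1]] <= tol

# usage: mask = classify_ground(lidar_xyz); obstacles = lidar_xyz[~mask]
```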
Hardware-accelerated point-based rendering of surfaces and volumes
In this paper, we present a fast GPU-based algorithm for ray-tracing point-based models, which includes an efficient computation of secondary and shadow rays, in contrast to previous work that supported ray-surface intersections for primary rays only. Volumetric effects are added to the models by means of scattered data interpolation in order to combine point-based surface and volume rendering in the same scene. This allows us to obtain effects such as refraction within volumetric objects. The flexibility of our method is demonstrated by combining shadows, textured objects, refraction, and volumetric effects in the same scene, which is composed solely of point sets.
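As an illustration of the basic primitive operation such a point-based ray tracer needs for primary, secondary, and shadow rays alike, the following sketch intersects a ray with a circular splat; names and conventions are assumptions, not the paper's implementation:

```python
import numpy as np

def intersect_splat(ray_o, ray_d, center, normal, radius):
    """Intersect a ray with a circular splat (oriented disk).

    Returns the ray parameter t of the hit, or None on a miss.
    """
    denom = np.dot(ray_d, normal)
    if abs(denom) < 1e-8:                 # ray parallel to the splat plane
        return None
    t = np.dot(center - ray_o, normal) / denom
    if t <= 0:                            # hit lies behind the ray origin
        return None
    hit = ray_o + t * ray_d
    return t if np.linalg.norm(hit - center) <= radius else None
```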
Panorama maps with non-linear ray tracing
(Figure: compared with the undeformed view, the valleys are straightened to minimize occlusion, the area with a lake that is hidden by the middle mountain ridge becomes almost completely visible, and the background summits are made less striking through a progressive perspective.)
We present a framework for the interactive generation of 3D panorama maps. Our approach addresses the main issue that occurs during panorama map construction: non-linear projection or deformation of the terrain in order to minimize the occlusion of important information such as roads and trails. Traditionally, panorama maps are hand-drawn by skilled illustrators. In contrast, our approach provides computer support for the rendering of non-occluded views of 3D panorama maps, where deformations are modeled by non-linear ray tracing. The deflection of rays is influenced by 2D and 3D force fields that directly consider the shape of the terrain. In addition, our framework allows the user to further modify the force fields to have fine control over the deformations of the panorama map. User interaction is facilitated by our real-time rendering system in terms of linked multiple views of both the linearly and the non-linearly projected terrain and the deformed view rays. Fast rendering is achieved by GPU-based non-linear ray tracing. We demonstrate the usefulness of our modeling and visualization method by several examples.
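As an illustration of the non-linear ray tracing idea, the following sketch marches a view ray whose direction is deflected by a user-supplied force field; the concrete terrain-dependent 2D/3D force fields of the paper are not reproduced, and the step size and field are placeholder assumptions:

```python
import numpy as np

def trace_bent_ray(origin, direction, force, step=0.5, n_steps=400):
    """March a view ray whose direction is deflected by a force field,
    the basic operation behind a non-linear projection of the terrain.

    `force` is a callable returning a 3D deflection vector at a position.
    """
    p = np.asarray(origin, float)
    d = np.asarray(direction, float)
    d /= np.linalg.norm(d)
    points = [p.copy()]
    for _ in range(n_steps):
        d = d + step * force(p)           # deflect the ray direction
        d /= np.linalg.norm(d)            # keep the direction unit length
        p = p + step * d                  # advance along the bent ray
        points.append(p.copy())
    return np.array(points)               # polyline to intersect with terrain

# e.g. a gentle upward pull that lifts rays over an occluding ridge
ray = trace_bent_ray([0, 0, 2], [1, 0, 0], lambda p: np.array([0, 0, 0.02]))
```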