Hierarchical Structure of Magnetohydrodynamic Turbulence In Position-Position-Velocity Space
Magnetohydrodynamic turbulence is able to create hierarchical structures in
the interstellar medium that are correlated on a wide range of scales via the
energy cascade. We use hierarchical tree diagrams known as dendrograms to
characterize structures in synthetic Position-Position-Velocity (PPV) emission
cubes of optically thin isothermal magnetohydrodynamic turbulence. We show that
the structures and degree of hierarchy observed in PPV space are related to the
physics of the gas, i.e. self-gravity and the global sonic and Alfvenic Mach
number. Simulations with higher Alfvenic Mach number, self-gravity and
supersonic flows display enhanced hierarchical structure. We observe a strong
sonic and Alfvenic dependency when we apply the statistical moments (i.e.
mean, variance, skewness, kurtosis) to the dendrogram distribution. Larger
magnetic field and sonic Mach number correspond to larger values of the
moments. Application of the dendrogram to 3D density cubes, also known as
Position-Position-Position cubes (PPP), reveals that the dominant emission
contours in PPP and PPV are related for supersonic gas but not for subsonic gas. We
also explore the effects of smoothing, thermal broadening and velocity
resolution on the dendrograms in order to make our study more applicable to
observational data. These results all point to hierarchical tree diagrams as
being a promising additional tool for studying ISM turbulence and star forming
regions in the direction of obtaining information on the degree of
self-gravity, the Mach numbers and the complicated relationship between PPV and
PPP.Comment: submitted to Ap
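The moment analysis applied to the dendrogram distribution can be sketched as follows. This is a generic sketch of the four statistical moments named in the abstract; the dendrogram construction itself (and the choice of leaf values) is assumed to come from elsewhere, and the sample values below are purely illustrative.

```python
import numpy as np

def distribution_moments(values):
    """Return (mean, variance, skewness, excess kurtosis) of a distribution,
    e.g. peak intensities of dendrogram leaves extracted from a PPV cube.
    Assumes a non-degenerate distribution (nonzero variance)."""
    v = np.asarray(values, dtype=float)
    mean = v.mean()
    var = v.var()
    std = np.sqrt(var)
    skew = np.mean(((v - mean) / std) ** 3)
    kurt = np.mean(((v - mean) / std) ** 4) - 3.0  # excess kurtosis
    return mean, var, skew, kurt
```

Larger higher moments (skewness, kurtosis) of such a distribution would then indicate the enhanced tails that the abstract associates with larger sonic Mach number and magnetic field.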
Pushing the Limits of 3D Color Printing: Error Diffusion with Translucent Materials
Accurate color reproduction is important in many applications of 3D printing,
from design prototypes to 3D color copies or portraits. Although full color is
available via other technologies, multi-jet printers have greater potential for
graphical 3D printing, in terms of reproducing complex appearance properties.
However, to date these printers cannot produce full color, and doing so poses
substantial technical challenges, from the sheer amount of data to the
translucency of the available color materials. In this paper, we propose an
error diffusion halftoning approach to achieve full color with multi-jet
printers, which operates on multiple isosurfaces or layers within the object.
We propose a novel traversal algorithm for voxel surfaces, which allows the
transfer of existing error diffusion algorithms from 2D printing. The resulting
prints faithfully reproduce colors, color gradients and fine-scale details.
Comment: 15 pages, 14 figures; includes supplemental figure
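The 2D error diffusion that the paper transfers to voxel surfaces is, in its classic Floyd-Steinberg form, roughly the following. This is a textbook 1-bit sketch, not the paper's multi-material surface-traversal algorithm: each pixel is quantized and the residual error is pushed onto not-yet-visited neighbors.

```python
import numpy as np

def floyd_steinberg(gray):
    """Binary Floyd-Steinberg error diffusion on a grayscale image in [0, 1].
    Scans left-to-right, top-to-bottom; each pixel is quantized to 0 or 1 and
    the quantization error is distributed to unvisited neighbors with the
    classic 7/16, 3/16, 5/16, 1/16 weights."""
    img = np.asarray(gray, dtype=float).copy()
    h, w = img.shape
    out = np.zeros_like(img)
    for y in range(h):
        for x in range(w):
            old = img[y, x]
            new = 1.0 if old >= 0.5 else 0.0
            out[y, x] = new
            err = old - new
            if x + 1 < w:
                img[y, x + 1] += err * 7 / 16          # right
            if y + 1 < h:
                if x > 0:
                    img[y + 1, x - 1] += err * 3 / 16  # down-left
                img[y + 1, x] += err * 5 / 16          # down
                if x + 1 < w:
                    img[y + 1, x + 1] += err * 1 / 16  # down-right
    return out
```

The paper's contribution is precisely the traversal order: on a voxel isosurface there is no canonical left-to-right scan, so a traversal must be constructed before a scheme like this can be applied.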
Doctor of Philosophy dissertation
High-order finite element methods, using either the continuous or discontinuous Galerkin formulation, are becoming more popular in fields such as fluid mechanics, solid mechanics and computational electromagnetics. While the use of these methods is becoming increasingly common, there has not been a corresponding increase in the availability and use of visualization methods and software that are capable of displaying visualizations of these volumes both accurately and interactively. A fundamental problem with the majority of existing visualization techniques is that they neither understand nor respect the structure of a high-order field, leading to visualization error. Visualizations of high-order fields are generally created by first approximating the field with low-order primitives and then generating the visualization using traditional methods based on linear interpolation. The approximation step introduces error into the visualization pipeline, which requires the user to balance the competing goals of image quality, interactivity and resource consumption. In practice, visualizations performed this way are often either undersampled, leading to visualization error, or oversampled, leading to unnecessary computational effort and resource consumption. Without an understanding of the sources of error, the simulation scientist is unable to determine whether artifacts in the image are due to visualization error, insufficient mesh resolution, or a failure in the underlying simulation. This uncertainty makes it difficult for scientists to make judgments based on the visualization, as attributing artifacts to visualization error when they actually reflect a more fundamental problem can lead to poor decision-making.
This dissertation presents new visualization algorithms that use the high-order data in its native state, drawing on knowledge of the structure and mathematical properties of these fields to create accurate images interactively while avoiding the error introduced by representing the fields with low-order approximations. First, a new algorithm for cut-surfaces is presented, specifically the accurate depiction of colormaps and contour lines on arbitrarily complex cut-surfaces. Second, a mathematical analysis of the evaluation of the volume rendering integral through a high-order field is presented, as well as an algorithm that uses this analysis to create accurate volume renderings. Finally, a new software system, the Element Visualizer (ElVis), is presented, which combines the ideas and algorithms created in this dissertation in a single software package that can be used by simulation scientists to create accurate visualizations. This system was developed and tested with the assistance of the ProjectX simulation team. The utility of our algorithms and visualization system is then demonstrated with examples from several high-order fluid flow simulations.
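The error introduced by approximating a high-order field with low-order primitives can be illustrated in one dimension. In this sketch the "high-order field" is an arbitrary cubic on a reference element (the coefficients are illustrative, not from any real simulation); sampling it at a few nodes and interpolating linearly, as traditional pipelines do, produces a measurable error that evaluating the native polynomial avoids entirely.

```python
import numpy as np

# "High-order" field on the reference element [-1, 1]: f(x) = 2x^3 - x.
# Illustrative coefficients only.
f = np.polynomial.Polynomial([0.0, -1.0, 0.0, 2.0])

x = np.linspace(-1.0, 1.0, 201)
exact = f(x)  # native evaluation of the high-order basis: no error

# Low-order pipeline: sample at 4 nodes, then interpolate linearly
# between them, as linear-primitive visualization tools do.
nodes = np.linspace(-1.0, 1.0, 4)
approx = np.interp(x, nodes, f(nodes))

max_err = np.max(np.abs(exact - approx))  # the undersampling error
```

The user of a linear pipeline can shrink `max_err` only by adding nodes (oversampling), which is exactly the quality/interactivity/resource trade-off the dissertation describes.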
ElVis: A system for the accurate and interactive visualization of high-order finite element solutions
Pre-print
This paper presents the Element Visualizer (ElVis), a new, open-source scientific visualization system for use with high-order finite element solutions to PDEs in three dimensions. This system is designed to minimize visualization errors of these types of fields by querying the underlying finite element basis functions (e.g., high-order polynomials) directly, leading to pixel-exact representations of solutions and geometry. The system interacts with simulation data through run-time plugins, which only require users to implement a handful of operations fundamental to finite element solvers. The data in turn can be visualized through the use of cut surfaces, contours, isosurfaces, and volume rendering. These visualization algorithms are implemented using NVIDIA's OptiX GPU-based ray-tracing engine, which provides accelerated ray traversal of the high-order geometry, and CUDA, which allows for effective parallel evaluation of the visualization algorithms. The direct interface between ElVis and the underlying data differentiates it from existing visualization tools. Current tools assume the underlying data is composed of linear primitives; as a result, high-order data must be interpolated with linear functions. In this work, examples drawn from aerodynamic simulations (in particular, high-order discontinuous Galerkin finite element solutions of aerodynamic flows) will demonstrate the superiority of ElVis's pixel-exact approach when compared with traditional linear-interpolation methods. Such methods can introduce a number of inaccuracies in the resulting visualization, making it unclear whether visual artifacts are genuine to the solution data or the result of interpolation errors. Linear methods additionally cannot properly visualize curved geometries (elements or boundaries), which can greatly inhibit developers' debugging efforts.
As we will show, pixel-exact visualization exhibits none of these issues, removing the visualization scheme as a source of uncertainty for engineers using ElVis.
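The difference between querying the basis functions directly and interpolating linear primitives shows up already in a single isocontour crossing along one ray. The sketch below is not ElVis's API; it assumes only that, restricted to a ray, the field is a polynomial in the ray parameter (here an illustrative quadratic), and contrasts a direct root solve with endpoint linear interpolation.

```python
import numpy as np

# Restricted to one ray through an element, the field is a polynomial in
# the ray parameter t; here p(t) = t^2 - 0.3 on t in [0, 1] (illustrative).
p = np.polynomial.Polynomial([-0.3, 0.0, 1.0])

# Pixel-exact approach: solve p(t) = 0 on the ray directly.
exact_t = [t.real for t in p.roots()
           if abs(t.imag) < 1e-12 and 0.0 <= t.real <= 1.0][0]

# Linear-primitive approach: sample the endpoints and interpolate,
# placing the crossing where the chord through p(0), p(1) hits zero.
linear_t = (0.0 - p(0.0)) / (p(1.0) - p(0.0))
```

For this quadratic the linear estimate misplaces the contour crossing by roughly a quarter of the ray's length, which is the kind of artifact that is hard to distinguish from a genuine solution feature.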
Application of electron tomography for comprehensive determination of III-V interface properties
We present an electron tomography method for the comprehensive characterization of buried III-V semiconductor interfaces that is based on chemically sensitive high-angle annular dark-field scanning transmission electron microscopy. For this purpose, an (Al,Ga)As/GaAs multi-layer system grown by molecular beam epitaxy is used as a case study. Isoconcentration surfaces are exploited to obtain topographic height maps of 120 nm × 120 nm area, revealing the interface morphology. By applying the height-height correlation function, we are able to determine important interface properties such as the root mean square roughness and lateral correlation length of various interfaces of the (Al,Ga)As/GaAs system characterized by different Al concentrations. Height-difference maps based on isosurfaces corresponding to 30% and 70% of the total compositional difference at the interfaces are used to create topographic maps of the interface width and to calculate an average interface width. This methodology reveals differences in the properties of direct and inverted interfaces and allows the observation of interfacial anisotropies.
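The height-height correlation function used here has a standard form, H(r) = ⟨(h(x+r) − h(x))²⟩, which for a rough surface saturates at twice the squared RMS roughness beyond the lateral correlation length. A minimal one-dimensional sketch, assuming only a height profile sampled on a uniform grid (the abstract's maps are two-dimensional, but the estimator is the same per direction):

```python
import numpy as np

def hhcf_1d(h, dx=1.0):
    """One-dimensional height-height correlation function
    H(r) = <(h(x + r) - h(x))^2> for a height profile h sampled
    with spacing dx. Returns (lags r, H(r)) up to half the profile."""
    h = np.asarray(h, dtype=float)
    n = h.size
    rs, H = [], []
    for lag in range(1, n // 2):
        diff = h[lag:] - h[:-lag]
        rs.append(lag * dx)
        H.append(np.mean(diff ** 2))
    return np.array(rs), np.array(H)
```

For uncorrelated (white-noise) heights H(r) sits at 2σ² for every lag; for a real interface the small-r growth rate and the saturation point give the roughness exponent and the lateral correlation length that the abstract extracts.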
What May Visualization Processes Optimize?
In this paper, we present an abstract model of visualization and inference
processes and describe an information-theoretic measure for optimizing such
processes. In order to obtain such an abstraction, we first examined six
classes of workflows in data analysis and visualization, and identified four
levels of typical visualization components, namely disseminative,
observational, analytical and model-developmental visualization. We noticed a
common phenomenon at different levels of visualization, that is, the
transformation of data spaces (referred to as alphabets) usually corresponds to
the reduction of maximal entropy along a workflow. Based on this observation,
we establish an information-theoretic measure of cost-benefit ratio that may be
used as a cost function for optimizing a data visualization process. To
demonstrate the validity of this measure, we examined a number of successful
visualization processes in the literature, and showed that the
information-theoretic measure can mathematically explain the advantages of such
processes over possible alternatives.
Comment: 10 page
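The central observation, that transforming a data space (alphabet) along a workflow reduces maximal entropy, can be made concrete with Shannon entropy. A minimal sketch, with illustrative alphabets not taken from the paper:

```python
import numpy as np

def shannon_entropy(probs):
    """Shannon entropy (in bits) of a probability distribution over an
    'alphabet' -- the possible states of a data space in the paper's model."""
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]  # 0 * log(0) is taken as 0
    return float(-np.sum(p * np.log2(p)))

# A visualization step that maps a uniform 8-letter alphabet onto a
# uniform 2-letter alphabet (e.g. binning a variable into "high"/"low")
# reduces the maximal entropy from 3 bits to 1 bit.
h_in = shannon_entropy([1 / 8] * 8)
h_out = shannon_entropy([1 / 2] * 2)
```

The paper's cost-benefit ratio then weighs such entropy reduction (the benefit of abstraction) against the potential distortion and the cost of performing the transformation.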
Doctor of Philosophy dissertation
In this dissertation, we advance the theory and practice of verifying visualization algorithms. We present techniques to assess visualization correctness through testing of important mathematical properties. Where applicable, these techniques allow us to distinguish whether anomalies in visualization features can be attributed to the underlying physical process or to artifacts from the implementation under verification. Such scientific scrutiny is at the heart of verifiable visualization: subjecting visualization algorithms to the same verification process that is used in other components of the scientific pipeline. The contributions of this dissertation are manifold. We derive the mathematical framework for the expected behavior of several visualization algorithms, and compare them to experimentally observed results in the selected codes. In the Computational Science & Engineering (CS&E) community, this technique is known as the Method of Manufactured Solutions (MMS). We apply MMS to the verification of geometrical and topological properties of isosurface extraction algorithms and direct volume rendering. We derive the convergence of geometrical properties of isosurface extraction techniques, such as function value and normals. For the verification of topological properties, we use stratified Morse theory and digital topology to design algorithms that verify topological invariants. In the case of volume rendering algorithms, we provide the expected discretization errors for three different error sources. The results of applying MMS are another important contribution of this dissertation. We report unexpected behavior for almost all implementations tested. In some cases, we were able to find and fix bugs that prevented the correctness of the visualization algorithm.
In particular, we address an almost 20-year-old bug with the core disambiguation procedure of Marching Cubes 33, one of the first algorithms intended to preserve the topology of the trilinear interpolant. Finally, an important by-product of this work is a range of responses practitioners can expect to encounter with the visualization technique under verification.
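The MMS workflow referred to above follows a standard pattern: manufacture an exact solution, run the numerical procedure at several resolutions, and check that the observed order of convergence matches theory. A minimal sketch, using a central-difference derivative (a stand-in for the dissertation's visualization algorithms) with manufactured solution u(x) = sin(x):

```python
import numpy as np

# Manufactured exact solution and its known derivative at x = 1.
u = np.sin
du_exact = np.cos(1.0)

# Run the "code under verification" (central differences, expected
# order 2) at a sequence of halved step sizes.
hs = [0.1, 0.05, 0.025]
errors = [abs((u(1.0 + h) - u(1.0 - h)) / (2 * h) - du_exact) for h in hs]

# Observed order of convergence between successive refinements; a value
# far from 2 would flag a bug, exactly as MMS flags implementation errors.
orders = [np.log2(errors[i] / errors[i + 1]) for i in range(len(errors) - 1)]
```

In the dissertation this same logic is applied to richer quantities (isosurface normals, volume-rendering integrals, topological invariants) rather than a scalar derivative, but the verification structure is the one above.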
Negative magnetic eddy diffusivities from test-field method and multiscale stability theory
The generation of large-scale magnetic field in the kinematic regime in the
absence of an alpha-effect is investigated by following two different
approaches, namely the test-field method and multiscale stability theory
relying on the homogenisation technique. We show analytically that the former,
applied for the evaluation of magnetic eddy diffusivities, yields results that
fully agree with the latter. Our computations of the magnetic eddy diffusivity
tensor for the specific instances of the parity-invariant flow-IV of G.O.
Roberts and the modified Taylor-Green flow in a suitable range of parameter
values confirm the findings of previous studies, and also explain some of their
apparent contradictions. The two flows have large symmetry groups; this is used
to considerably simplify the eddy diffusivity tensor. Finally, a new analytic
result is presented: upon expressing the eddy diffusivity tensor in terms of
solutions to auxiliary problems for the adjoint operator, we derive relations
between magnetic eddy diffusivity tensors that arise for opposite small-scale
flows v(x) and -v(x).
Comment: 29 pp., 19 figures, 42 reference