Interactive Visual Analytics for Large-scale Particle Simulations
Particle-based model simulations are widely used in scientific visualization. In cosmology, particles are used to simulate the evolution of dark matter in the universe. Clusters of particles with special statistical properties are called halos. From a visualization point of view, halos are clusters of particles, each having a position, mass, and velocity in three-dimensional space, and they can be represented as point clouds containing various structures of geometric interest such as filaments, membranes, satellites of points, clusters, and clusters of clusters.
The thesis investigates methods for interacting with large-scale datasets represented as point clouds. The work mainly aims at the interactive visualization of cosmological simulations based on large particle systems. The study consists of three components: a) two human-factors experiments into the perceptual factors that make it possible to see features in point clouds; b) the design and implementation of a user interface making it possible to rapidly navigate through and visualize features in the point cloud; and c) software development and integration to support visualization.
Cinema Darkroom: A Deferred Rendering Framework for Large-Scale Datasets
This paper presents a framework that fully leverages the advantages of a
deferred rendering approach for the interactive visualization of large-scale
datasets. Geometry buffers (G-Buffers) are generated and stored in situ, and
shading is performed post hoc in an interactive image-based rendering front
end. This decoupled framework has two major advantages. First, the G-Buffers
only need to be computed and stored once---which corresponds to the most
expensive part of the rendering pipeline. Second, the stored G-Buffers can
later be consumed in an image-based rendering front end that enables users to
interactively adjust various visualization parameters---such as the applied
color map or the strength of ambient occlusion---where suitable choices are
often not known a priori. This paper demonstrates the use of Cinema Darkroom on
several real-world datasets, highlighting CD's ability to effectively decouple
the complexity and size of the dataset from its visualization.
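The split the abstract describes can be illustrated with a minimal sketch (illustrative names and fields, not the Cinema Darkroom API): an expensive pass produces a G-buffer once, and any number of cheap image-space passes re-shade it with different color maps.

```python
import numpy as np

# Hypothetical sketch of the deferred-rendering idea: the expensive pass
# produces a G-buffer (here: per-pixel depth and a scalar data value);
# changing the color map afterwards only touches the cheap image-space pass.

def make_gbuffer(width, height):
    """Stand-in for the expensive in-situ pass: one-time geometry render."""
    yy, xx = np.mgrid[0:height, 0:width]
    depth = np.hypot(xx - width / 2, yy - height / 2) / max(width, height)
    scalar = np.sin(xx / 7.0) * np.cos(yy / 5.0)   # simulated data value
    return {"depth": depth, "scalar": scalar}       # depth could drive e.g. AO

def shade(gbuffer, colormap):
    """Cheap post-hoc pass: apply any color map to the stored scalar values."""
    s = gbuffer["scalar"]
    t = (s - s.min()) / (s.max() - s.min())         # normalize to [0, 1]
    return colormap(t)                               # RGB image

gbuf = make_gbuffer(64, 64)                          # computed and stored once
warm = shade(gbuf, lambda t: np.stack([t, t * 0.5, 1 - t], axis=-1))
cool = shade(gbuf, lambda t: np.stack([1 - t, t, t], axis=-1))  # no re-render
```

Re-shading here never touches the geometry again, which is exactly the decoupling the paper exploits: the G-buffer cost is paid once, while visualization parameters remain freely adjustable.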
Visual Analysis of Large Particle Data
Particle simulations are a well-established and widely used numerical method in research and engineering. For example, particle simulations are used to study fuel atomization in aircraft turbines, and the formation of the universe is investigated by simulating dark matter particles. The amounts of data produced in this way are immense: current simulations contain trillions of particles that move and interact with each other over time. Visualization offers great potential for the exploration, validation, and analysis of scientific datasets as well as their underlying models. However, the focus is usually on structured data with a regular topology. In contrast, particles move freely through space and time; this point of view is known in physics as the Lagrangian frame of reference. Particles can be converted from the Lagrangian into a regular Eulerian frame of reference, such as a uniform grid, but for large numbers of particles this conversion comes at considerable cost. Moreover, it usually leads to a loss of precision along with increased memory consumption. In this dissertation I will investigate new visualization techniques that are based specifically on the Lagrangian view, enabling an efficient and effective visual analysis of large particle data.
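The Lagrangian-to-Eulerian conversion the abstract mentions can be sketched as follows (an illustrative nearest-grid-point deposit, not a technique from the dissertation): the grid's memory footprint grows cubically with resolution, while any structure finer than a cell is lost — the precision/memory trade-off described above.

```python
import numpy as np

# Illustrative sketch: depositing Lagrangian particles onto a uniform
# Eulerian grid via nearest-grid-point binning. Resolving fine particle
# structure requires resolution**3 cells, which is the memory and
# precision trade-off the text refers to.

def deposit_to_grid(positions, masses, resolution):
    """Bin particle masses into a uniform [0,1)^3 grid (nearest grid point)."""
    idx = np.clip((positions * resolution).astype(int), 0, resolution - 1)
    flat = np.ravel_multi_index(idx.T, (resolution,) * 3)
    grid = np.bincount(flat, weights=masses, minlength=resolution ** 3)
    return grid.reshape((resolution,) * 3)

rng = np.random.default_rng(0)
pos = rng.random((10_000, 3))            # 10k particles in the unit cube
mass = np.ones(10_000)
grid = deposit_to_grid(pos, mass, 32)    # 32**3 = 32,768 cells
assert grid.sum() == mass.sum()          # mass is conserved by the binning
```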
Void-and-Cluster Sampling of Large Scattered Data and Trajectories
We propose a data reduction technique for scattered data based on statistical
sampling. Our void-and-cluster sampling technique finds a representative subset
that is optimally distributed in the spatial domain with respect to the blue
noise property. In addition, it can adapt to a given density function, which we
use to sample regions of high complexity in the multivariate value domain more
densely. Moreover, our sampling technique implicitly defines an ordering on the
samples that enables progressive data loading and a continuous level-of-detail
representation. We extend our technique to sample time-dependent trajectories,
for example pathlines in a time interval, using an efficient and iterative
approach. Furthermore, we introduce a local and continuous error measure to
quantify how well a set of samples represents the original dataset. We apply
this error measure during sampling to guide the number of samples that are
taken. Finally, we use this error measure and other quantities to evaluate the
quality, performance, and scalability of our algorithm.
Comment: To appear in IEEE Transactions on Visualization and Computer Graphics as a special issue from the proceedings of VIS 201
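The core idea of picking samples that fill the largest spatial "voids" can be sketched with a simplified greedy variant (farthest-point sampling, not the authors' exact void-and-cluster algorithm): each new sample is the point farthest from all samples chosen so far, which spreads samples evenly and, like the paper's method, implicitly orders them so that any prefix is a valid level of detail.

```python
import numpy as np

# Simplified sketch of the idea, not the paper's algorithm: greedily pick
# each next sample as the point farthest from all chosen samples.

def farthest_point_sample(points, k, seed=0):
    rng = np.random.default_rng(seed)
    chosen = [int(rng.integers(len(points)))]       # arbitrary first sample
    dist = np.linalg.norm(points - points[chosen[0]], axis=1)
    for _ in range(k - 1):
        nxt = int(np.argmax(dist))                  # point in the largest void
        chosen.append(nxt)
        # track each point's distance to its nearest chosen sample
        dist = np.minimum(dist, np.linalg.norm(points - points[nxt], axis=1))
    return chosen          # ordered: any prefix is a coarser representation

pts = np.random.default_rng(1).random((2_000, 2))
order = farthest_point_sample(pts, 100)
```

The ordering property is what enables progressive loading: streaming the samples in selection order refines the picture continuously, mirroring the level-of-detail behaviour described in the abstract.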
ParaView + Alya + D8tree: Integrating high performance computing and high performance data analytics
Large-scale time-dependent particle simulations can generate massive amounts of data, so much that storing the results is often the slowest phase and the primary time bottleneck of the simulation. Furthermore, analysing this amount of data with traditional tools has become increasingly challenging, and it is often virtually impossible to obtain a visual representation of the full set.
We propose a novel architecture that integrates an HPC-based multi-physics simulation code, a NoSQL database, and a data analysis and visualisation application. The goals are twofold: on the one hand, we aim to speed up the simulations by taking advantage of the scalability of key-value data stores, while at the same time enabling real-time approximate data visualisation and interactive exploration. On the other hand, we want to make it efficient to explore and analyse the large database of results produced. This work therefore represents a clear example of integrating High Performance Computing with High Performance Data Analytics. Our prototype proves the validity of our approach and shows great performance improvements: we reduced the time to store the simulation by 67.5% while making real-time queries run 52 times faster than alternative solutions.
This work has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 720270 (HBP SGA1). It is also partially supported by grant SEV-2011-00067 of the Severo Ochoa Program and the TIN2015-65316-P project, with funding from the Spanish Ministry of Economy and Competitiveness, the European Union FEDER funds, and SGR 2014-SGR-1051.
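The storage pattern such an architecture relies on can be sketched minimally (all names here are illustrative; the paper's D8tree index is far more sophisticated): simulation output is written under (timestep, spatial-cell) keys in a key-value store, so a query fetches only the cells its region touches instead of scanning whole result files.

```python
import numpy as np

# Hypothetical sketch: particle chunks keyed by (timestep, spatial cell)
# in a dict standing in for the NoSQL backend.

store = {}

def cell_key(step, position, cell_size=0.25):
    cx, cy, cz = (int(c / cell_size) for c in position)
    return (step, cx, cy, cz)

def write_particles(step, positions):
    """Simulation side: append each particle to its spatial cell's chunk."""
    for p in positions:
        store.setdefault(cell_key(step, p), []).append(tuple(p))

def query_region(step, lo, hi):
    """Analysis side: return particles inside the box [lo, hi] at a timestep."""
    out = []
    for key, parts in store.items():
        if key[0] != step:
            continue
        out.extend(p for p in parts
                   if all(l <= c <= h for c, l, h in zip(p, lo, hi)))
    return out

write_particles(0, np.random.default_rng(2).random((1_000, 3)))
hits = query_region(0, (0.0, 0.0, 0.0), (0.5, 0.5, 0.5))
```

A real key-value store would additionally let the query skip non-overlapping cells by key range rather than filtering every chunk, which is where the scalability gains reported in the abstract come from.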
Stochastic Volume Rendering of Multi-Phase SPH Data
In this paper, we present a novel method for the direct volume rendering of large smoothed-particle hydrodynamics (SPH) simulation data without transforming the unstructured data to an intermediate representation. By directly visualizing the unstructured particle data, we avoid long preprocessing times and large storage requirements. This enables the visualization of large, time-dependent, and multivariate data both as a post-process and in situ. To address the computational complexity, we introduce stochastic volume rendering, which considers only a subset of particles at each step during ray marching. The sample probabilities for selecting this subset at each step are determined both in a view-dependent manner and based on the spatial complexity of the data. Our stochastic volume rendering enables us to scale continuously from a fast, interactive preview to a more accurate volume rendering at higher cost. Lastly, we discuss the visualization of free-surface and multi-phase flows by including a multi-material model with volumetric and surface shading in the stochastic volume rendering.
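The subset-per-step idea can be sketched as follows (an illustrative estimator with a crude kernel, not the paper's implementation): at a ray-marching sample position, density is accumulated from a random fraction of the particles and reweighted by the inverse sampling fraction, so a small subset gives a fast, unbiased preview and the full set recovers the exact sum.

```python
import numpy as np

# Minimal sketch: stochastic density estimation at one ray-marching sample.
# subset_frac = 1.0 is the exact (expensive) evaluation; smaller fractions
# trade accuracy for speed, as in the interactive-preview mode.

def stochastic_density(sample_pos, particles, h, subset_frac, rng):
    n = len(particles)
    k = max(1, int(n * subset_frac))
    idx = rng.choice(n, size=k, replace=False)    # random particle subset
    d = np.linalg.norm(particles[idx] - sample_pos, axis=1)
    w = np.clip(1.0 - d / h, 0.0, None)           # crude SPH-like kernel
    return w.sum() * (n / k)                      # reweight: unbiased estimate

rng = np.random.default_rng(3)
parts = rng.random((5_000, 3))
center = np.array([0.5, 0.5, 0.5])
full = stochastic_density(center, parts, 0.2, 1.0, rng)   # exact
fast = stochastic_density(center, parts, 0.2, 0.1, rng)   # 10% of particles
```

In a full renderer this estimate would feed the usual emission-absorption compositing along the ray; the paper additionally makes the subset probabilities view- and complexity-dependent rather than uniform as here.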