11 research outputs found

    Topology for statistical modeling of petascale data.

    Void-and-Cluster Sampling of Large Scattered Data and Trajectories

    We propose a data reduction technique for scattered data based on statistical sampling. Our void-and-cluster sampling technique finds a representative subset that is optimally distributed in the spatial domain with respect to the blue noise property. In addition, it can adapt to a given density function, which we use to sample regions of high complexity in the multivariate value domain more densely. Moreover, our sampling technique implicitly defines an ordering on the samples that enables progressive data loading and a continuous level-of-detail representation. We extend our technique to sample time-dependent trajectories, for example pathlines in a time interval, using an efficient and iterative approach. Furthermore, we introduce a local and continuous error measure to quantify how well a set of samples represents the original dataset. We apply this error measure during sampling to guide the number of samples that are taken. Finally, we use this error measure and other quantities to evaluate the quality, performance, and scalability of our algorithm. Comment: To appear in IEEE Transactions on Visualization and Computer Graphics as part of the special issue for the proceedings of VIS 2019.
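
    As a rough, hypothetical illustration of this kind of subset selection (a greedy farthest-point stand-in, not the paper's actual void-and-cluster algorithm), the following Python sketch picks samples that maximize the minimum distance to the already selected set and lets a user-supplied density boost regions that should be sampled more densely; the selection order can then double as a progressive level-of-detail ordering. All function and variable names are invented for this sketch.

    import numpy as np

    def greedy_blue_noise_subset(points, n_samples, density=None, seed=0):
        # Greedy farthest-point selection: each step adds the point whose
        # (density-weighted) distance to the selected set is largest. This only
        # approximates a blue-noise distribution; it is a simplified stand-in
        # for the void-and-cluster technique described in the abstract.
        rng = np.random.default_rng(seed)
        n = len(points)
        density = np.ones(n) if density is None else np.asarray(density, dtype=float)
        selected = [int(rng.integers(n))]                       # random first sample
        dist = np.linalg.norm(points - points[selected[0]], axis=1)
        for _ in range(n_samples - 1):
            score = dist * density                              # high density -> sampled more densely
            nxt = int(np.argmax(score))
            selected.append(nxt)
            dist = np.minimum(dist, np.linalg.norm(points - points[nxt], axis=1))
        return np.array(selected)                               # order doubles as a progressive LoD

    # Toy usage: keep 500 of 10,000 scattered 2D points.
    pts = np.random.rand(10000, 2)
    subset = pts[greedy_blue_noise_subset(pts, 500)]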

    Visual Analysis of Multiple Dynamic Sensitivities along Ascending Trajectories in the Atmosphere

    Numerical weather prediction models rely on parameterizations for subgrid-scale processes, e.g., for cloud microphysics. These parameterizations are a well-known source of uncertainty in weather forecasts that can be quantified via algorithmic differentiation, which computes the sensitivities of prognostic variables to changes in model parameters. It is particularly interesting to use sensitivities to analyze the validity of physical assumptions on which microphysical parameterizations in the numerical model source code are based. In this article, we consider the use case of strongly ascending trajectories, so-called warm conveyor belt trajectories, known to have a significant impact on intense surface precipitation rates in extratropical cyclones. We present visual analytics solutions to interactively analyze the sensitivities of a selected prognostic variable, i.e., rain mass density, to multiple model parameters along such trajectories. We propose a visual interface that enables users to a) compare the values of multiple sensitivities at a single time step on multiple trajectories, b) assess the spatio-temporal relationships between sensitivities and the shape and location of trajectories, and c) comparatively analyze the temporal development of sensitivities along multiple trajectories. We demonstrate how our approach enables atmospheric scientists to interactively analyze the uncertainty in the microphysical parameterizations along the trajectories with respect to a selected prognostic variable. We apply our approach to the analysis of convective trajectories within the extratropical cyclone "Vladiana", which occurred between 22 and 25 September 2016 over the North Atlantic.
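
    To make the notion of a sensitivity concrete, here is a small, self-contained Python sketch that integrates a toy surrogate for rain mass density along a trajectory and estimates d(rain mass)/d(parameter) by central finite differences. The surrogate equation and its two parameters are purely illustrative assumptions; the paper uses algorithmic differentiation of the actual model source code rather than finite differences on a toy model.

    import numpy as np

    def rain_mass_along_trajectory(params, n_steps=100, dt=1.0):
        # Toy surrogate for a microphysics scheme: convert cloud mass q_c into
        # rain mass q_r along a trajectory. Both parameters are invented stand-ins
        # for tunable constants in a real parameterization.
        autoconversion, accretion = params
        q_c, q_r = 1e-3, 0.0                       # toy mixing ratios (kg/kg)
        history = []
        for _ in range(n_steps):
            conversion = autoconversion * q_c + accretion * q_c * q_r
            q_c = max(q_c - conversion * dt, 0.0)
            q_r += conversion * dt
            history.append(q_r)
        return np.array(history)

    def sensitivities(params, eps=1e-6):
        # Central differences as a stand-in for algorithmic differentiation:
        # row i holds d q_r(t) / d params[i] along the trajectory.
        params = np.asarray(params, dtype=float)
        rows = []
        for i in range(len(params)):
            up, down = params.copy(), params.copy()
            up[i] += eps
            down[i] -= eps
            rows.append((rain_mass_along_trajectory(up)
                         - rain_mass_along_trajectory(down)) / (2.0 * eps))
        return np.stack(rows)                      # shape: (n_params, n_steps)

    S = sensitivities([1e-4, 2.0])                 # one sensitivity curve per parameter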

    Visuelle Analyse großer Partikeldaten (Visual Analysis of Large Particle Data)

    Particle simulations are a proven and widely used numerical method in research and engineering. For example, particle simulations are employed to study fuel atomization in aircraft turbines, and the formation of the universe is investigated by simulating dark matter particles. The amounts of data produced in the process are immense: current simulations contain trillions of particles that move over time and interact with one another. Visualization offers great potential for the exploration, validation, and analysis of scientific datasets as well as of the underlying models. However, the focus usually lies on structured data with a regular topology. Particles, in contrast, move freely through space and time; this point of view is known in physics as the Lagrangian frame of reference. Although particles can be converted from the Lagrangian frame into a regular Eulerian frame of reference, such as a uniform grid, this conversion involves considerable effort for a large number of particles and usually entails a loss of precision along with increased memory consumption. In this dissertation I investigate new visualization techniques that build directly on the Lagrangian view and enable an efficient and effective visual analysis of large particle data.
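
    As a minimal sketch of the Lagrangian-to-Eulerian conversion mentioned above (simple nearest-cell binning onto a uniform grid, assuming particles in the unit cube; all names are hypothetical), the following Python snippet averages a per-particle attribute into grid cells. It also makes the trade-off visible: sub-cell particle positions are discarded, and memory grows with the grid resolution rather than the particle count.

    import numpy as np

    def particles_to_uniform_grid(positions, values, resolution, bounds=(0.0, 1.0)):
        # Deposit Lagrangian particle values onto a uniform Eulerian grid by
        # averaging every particle that falls into a cell (nearest-cell binning).
        lo, hi = bounds
        dim = positions.shape[1]
        cells = ((positions - lo) / (hi - lo) * resolution).astype(int)
        cells = np.clip(cells, 0, resolution - 1)
        flat = np.ravel_multi_index(cells.T, (resolution,) * dim)
        total = np.bincount(flat, weights=values, minlength=resolution**dim)
        count = np.bincount(flat, minlength=resolution**dim)
        grid = np.divide(total, count, out=np.zeros_like(total), where=count > 0)
        return grid.reshape((resolution,) * dim)

    # Toy usage: one million 3D particles with a scalar attribute onto a 64^3 grid.
    pos = np.random.rand(1_000_000, 3)
    val = np.random.rand(1_000_000)
    grid = particles_to_uniform_grid(pos, val, 64)   # shape (64, 64, 64)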

    Visualizing Big Data with augmented and virtual reality: challenges and research agenda

    This paper provides a multi-disciplinary overview of the research issues and achievements in the field of Big Data and its visualization techniques and tools. The main aim is to summarize the challenges of visualizing existing Big Data, as well as to offer novel solutions for issues related to the current state of Big Data Visualization. The paper provides a classification of existing data types, analytical methods, visualization techniques and tools, with particular emphasis placed on surveying the evolution of visualization methodology over the past years. Based on the results, we reveal disadvantages of existing visualization methods. Despite the technological development of the modern world, human involvement (interaction), judgment and logical thinking remain necessary while working with Big Data. Therefore, the role of human perceptual limitations when dealing with large amounts of information is evaluated. Based on the results, a non-traditional approach is proposed: we discuss how the capabilities of Augmented Reality and Virtual Reality could be applied to the field of Big Data Visualization. We discuss the promising utility of integrating Mixed Reality technology with applications in Big Data Visualization. Placing the most essential data in the central area of the human visual field in Mixed Reality would allow one to take in the presented information in a short period of time without significant data losses due to human perceptual issues. Furthermore, we discuss the impact of new technologies, such as Virtual Reality displays and Augmented Reality helmets, on Big Data visualization, as well as the classification of the main challenges of integrating these technologies.