
    Visual Analysis of Popping in Progressive Visualization

    Progressive visualization allows users to examine intermediate results while they are further refined in the background. This makes it increasingly popular when dealing with large data and computationally expensive tasks. The characteristics of how preliminary visualizations evolve over time are crucial for efficient analysis; in particular, unexpected disruptive changes between iterations can significantly hamper the user experience. This paper proposes a visualization framework to analyze the refinement behavior of progressive visualization. We particularly focus on sudden significant changes between iterations, which we denote as popping artifacts, in reference to undesirable visual effects in the context of level-of-detail representations in computer graphics. Our visualization approach conveys where in image space and when during the refinement popping artifacts occur. It allows comparison across different runs of stochastic processes, and supports parameter studies for gaining further insights and tuning the algorithms under consideration. We demonstrate the application of our framework and its effectiveness via two diverse use cases with underlying stochastic processes: adaptive image space sampling, and the generation of grid layouts.
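
    The per-pixel change test at the heart of such an analysis can be sketched in a few lines. The following is a minimal sketch, assuming frames are normalized 2D arrays and using a simple fixed threshold; the abstract does not specify the paper's actual popping measure.

```python
# A sketch of popping detection between refinement iterations; the fixed
# threshold and the normalized-frame representation are assumptions.
import numpy as np

def popping_map(frames, threshold=0.1):
    """Flag where and when large inter-iteration changes ("popping") occur.

    frames    -- list of 2D arrays (one per iteration), values in [0, 1]
    threshold -- per-pixel change above which a jump counts as popping
    Returns a (T-1, H, W) boolean array: [t, y, x] is True if pixel (x, y)
    popped between iterations t and t+1.
    """
    stack = np.stack(frames)                 # (T, H, W)
    deltas = np.abs(np.diff(stack, axis=0))  # change between iterations
    return deltas > threshold

# Synthetic run: noise that decays as the refinement converges.
frames = [0.5 + 0.4 * (0.5 ** t) * np.random.rand(64, 64) for t in range(10)]
popped = popping_map(frames)
where = popped.any(axis=0)        # image-space map: where popping occurs
when = popped.sum(axis=(1, 2))    # per-iteration counts: when it occurs
print("popping pixels per iteration:", when)
```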

    A Review and Characterization of Progressive Visual Analytics

    Progressive Visual Analytics (PVA) has gained increasing attention in recent years. It brings the user into the loop during otherwise long-running and non-transparent computations by producing intermediate partial results. These partial results can be shown to the user for early and continuous interaction with the emerging end result, even while it is still being computed. Yet as clear-cut as this fundamental idea seems, the existing body of literature puts forth various interpretations and instantiations that have created a research domain of competing terms, various definitions, and long lists of practical requirements and design guidelines spread across different scientific communities. This makes it increasingly difficult to get a succinct understanding of PVA’s principal concepts, let alone an overview of this diverging field. The review and discussion of PVA presented in this paper address these issues and provide (1) a literature collection on this topic, (2) a conceptual characterization of PVA, and (3) a consolidated set of practical recommendations for implementing and using PVA-based visual analytics solutions.
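
    The computational pattern PVA builds on can be sketched as a long-running computation that yields partial results instead of blocking until completion. This is a minimal sketch, assuming a simple chunked analysis (a running mean stands in for any expensive computation); real PVA systems add visualization updates and user steering on top of this loop.

```python
# A sketch of the PVA pattern: a long-running computation that emits
# intermediate partial results instead of blocking until completion.
# The chunked running mean is an illustrative stand-in for any analysis.
import numpy as np

def progressive_mean(data, chunk_size=1000):
    """Yield (fraction_processed, running_mean) after each chunk."""
    total, count = 0.0, 0
    for start in range(0, len(data), chunk_size):
        chunk = data[start:start + chunk_size]
        total += chunk.sum()
        count += len(chunk)
        yield count / len(data), total / count  # partial result

data = np.random.randn(1_000_000)
snapshots = list(progressive_mean(data))
# In a PVA system each snapshot would refresh a visualization, letting the
# user judge convergence and act before the computation finishes.
print("after 1 chunk:", snapshots[0])
print("halfway:     ", snapshots[len(snapshots) // 2])
print("final:       ", snapshots[-1])
```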

    Prioritized Data Compression using Wavelets

    The volume of data and the velocity with which it is being generated by computational experiments on high performance computing (HPC) systems is quickly outpacing our ability to effectively store this information in its full fidelity. Therefore, it is critically important to identify and study compression methodologies that retain as much information as possible, particularly in the most salient regions of the simulation space. In this paper, we cast this in terms of a general decision-theoretic problem and discuss a wavelet-based compression strategy for its solution. We provide a heuristic argument as justification and illustrate our methodology on several examples. Finally, we will discuss how our proposed methodology may be utilized in an HPC environment on large-scale computational experiments.
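
    The wavelet route to lossy compression can be illustrated briefly. This is a minimal sketch, assuming the PyWavelets (pywt) package; keeping only the largest-magnitude coefficients is a simple global stand-in for the paper's decision-theoretic, saliency-driven prioritization.

```python
# A sketch of wavelet-based lossy compression with PyWavelets: decompose,
# zero out small coefficients, reconstruct. The global magnitude threshold
# is an assumption, not the paper's prioritization scheme.
import numpy as np
import pywt

def compress(field, wavelet="db2", level=3, keep=0.05):
    """Zero out all but the top `keep` fraction of wavelet coefficients."""
    coeffs = pywt.wavedec2(field, wavelet, level=level)
    flat, slices = pywt.coeffs_to_array(coeffs)
    cutoff = np.quantile(np.abs(flat), 1.0 - keep)  # magnitude threshold
    flat[np.abs(flat) < cutoff] = 0.0               # discard small details
    coeffs = pywt.array_to_coeffs(flat, slices, output_format="wavedec2")
    return pywt.waverec2(coeffs, wavelet)

field = np.outer(np.sin(np.linspace(0, 8, 256)), np.cos(np.linspace(0, 8, 256)))
recon = compress(field)
print("relative L2 error:", np.linalg.norm(recon - field) / np.linalg.norm(field))
```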

    The Mathematical Facts Of Games Of Chance Between Exposure, Teaching, And Contribution To Cognitive Therapies: Principles Of An Optimal Mathematical Intervention For Responsible Gambling

    On the question of whether gambling behavior can be changed as a result of teaching gamblers the mathematics of gambling, past studies have yielded contradictory results, and a clear conclusion has not yet been drawn. In this paper, I raise some criticisms of the empirical studies that tended to answer this question in the negative, regarding their sampling and laboratory testing, and I argue that an optimal mathematical scholastic intervention with the objective of preventing problem gambling is possible, by providing the principles that would optimize the structure and content of the teaching module. Given the ethical aspects of exposing the mathematical facts behind games of chance, and starting from the case of slots, where the parametric design is not disclosed, we have to draw a line between ethically required and optional information with respect to the mathematical content provided by a scholastic intervention. Arguing for the role of mathematics in problem-gambling prevention and treatment, interdisciplinary research directions are drawn toward implementing an optimal mathematical module in cognitive therapies.
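
    As an illustration of the kind of mathematical fact such a teaching module would expose, here is a worked expected-value calculation for a game whose parametric design, unlike that of slots, is public. This example is not taken from the paper.

```python
# Expected value of a single-number ("straight up") bet in European
# roulette: 37 pockets, a win pays 35 units profit on a 1-unit stake.
p_win = 1 / 37
payout = 35
ev = p_win * payout + (1 - p_win) * (-1)
print(f"expected value per unit staked: {ev:.4f}")  # about -0.0270
```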

    The Application of the Montage Image Mosaic Engine To The Visualization Of Astronomical Images

    The Montage Image Mosaic Engine was designed as a scalable toolkit, written in C for performance and portability across *nix platforms, that assembles FITS images into mosaics. The code is freely available and has been widely used in the astronomy and IT communities for research, product generation, and the development of next-generation cyber-infrastructure. Recently, it has begun to find applicability in the field of visualization. This has come about because the toolkit design allows easy integration into scalable systems that process data for subsequent visualization in a browser or client, and because it includes a visualization tool suitable for automation and for integration into Python: mViewer creates, with a single command, complex multi-color images overlaid with coordinate displays, labels, and observation footprints, and includes an adaptive image histogram equalization method that preserves the structure of a stretched image over its dynamic range. The Montage toolkit contains functionality originally developed to support the creation and management of mosaics that also offers value to visualization: a background rectification algorithm that reveals the faint structure in an image, and tools for creating cutout and down-sampled versions of large images. Version 5 of Montage offers support for visualizing data written in the HEALPix sky-tessellation scheme, and functionality for processing and organizing images to comply with the TOAST sky-tessellation scheme required for consumption by the WorldWide Telescope (WWT). Four online tutorials enable readers to reproduce and extend all the visualizations presented in this paper.
    Comment: 16 pages, 9 figures; accepted for publication in the PASP Special Focus Issue: Techniques and Methods for Astrophysical Data Visualization.
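
    The histogram-equalization idea behind mViewer's adaptive stretch can be illustrated generically: map pixel values through their empirical CDF so faint structure remains visible across the full dynamic range. The sketch below is a simplification and not Montage's actual algorithm.

```python
# Generic histogram-equalization stretch: remap each pixel to its rank in
# the empirical CDF. A simplified illustration, not Montage's method.
import numpy as np

def equalize(image, nbins=4096):
    """Histogram-equalize a 2D float image to [0, 1]."""
    hist, edges = np.histogram(image.ravel(), bins=nbins)
    cdf = np.cumsum(hist).astype(float)
    cdf /= cdf[-1]                             # normalize to [0, 1]
    centers = 0.5 * (edges[:-1] + edges[1:])
    return np.interp(image, centers, cdf)      # value -> CDF rank

# Synthetic "sky" with structure crowded into a narrow value range:
img = np.random.gamma(shape=2.0, scale=1.0, size=(128, 128))
stretched = equalize(img)
```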

    Methods and design issues for next generation network-aware applications

    Networks are becoming an essential component of modern cyberinfrastructure, and this work describes methods of designing distributed applications for high-speed networks to improve application scalability, performance, and capabilities. As the amount of data generated by scientific applications continues to grow, applications must be designed to use parallel, distributed resources and high-speed networks to handle and process it. For scalable application design, developers should move away from the current component-based approach and instead implement an integrated, non-layered architecture in which applications can use specialized low-level interfaces. The main focus of this research is on interactive, collaborative visualization of large datasets. This work describes how a visualization application can be improved by using distributed resources and high-speed network links to interactively visualize tens of gigabytes of data and handle terabyte datasets while maintaining high quality. The application supports interactive frame rates, high resolution, and collaborative visualization, and sustains remote I/O bandwidths of several Gbps (up to 30 times faster than local I/O). Motivated by the distributed visualization application, this work also investigates remote data access systems. Because wide-area networks may have high latency, the remote I/O system uses an architecture that effectively hides that latency. Five remote data access architectures are analyzed, and the results show that an architecture combining bulk and pipeline processing is the best solution for high-throughput remote data access. The resulting system, which also supports high-speed transport protocols and configurable remote operations, is up to 400 times faster than a comparable existing remote data access system. Transport protocols are compared to understand which can best utilize high-speed network connections, concluding that a rate-based protocol is the best solution, being 8 times faster than standard TCP. An experiment with an HD-based remote teaching application is conducted, illustrating the potential of network-aware applications in a production environment. Future research areas are presented, with emphasis on network-aware optimization, execution, and deployment scenarios.
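
    The latency-hiding pipeline idea can be sketched as prefetching remote blocks on a background thread so that network round-trips overlap with local processing. In this minimal sketch, fetch_block() is a hypothetical stand-in for a real remote read and all timings are simulated; the thesis combines this principle with bulk transfers and high-speed transport protocols.

```python
# A sketch of the latency-hiding pipeline: a background thread prefetches
# remote blocks so network round-trips overlap with local processing.
# fetch_block() is a hypothetical stand-in for a real remote read.
import queue
import threading
import time

def fetch_block(block_id):
    time.sleep(0.05)               # simulated WAN round-trip + transfer
    return b"x" * 1_000_000        # simulated 1 MB block

def prefetcher(block_ids, out):
    for bid in block_ids:
        out.put(fetch_block(bid))  # blocks when the pipeline is full
    out.put(None)                  # end-of-stream sentinel

def process(block):
    time.sleep(0.05)               # simulated per-block computation

q = queue.Queue(maxsize=4)         # bounded queue = pipeline depth
threading.Thread(target=prefetcher, args=(range(32), q), daemon=True).start()
while (block := q.get()) is not None:
    process(block)                 # overlaps with the next fetch in flight
```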