13 research outputs found

    New Visualization Techniques for Multi-Dimensional Variables in Complex Physical Domains

    Get PDF
    This work presents the new Synthesized Cell Texture (SCT) algorithm for visualizing multiple related scalar fields within the same 3D space. The SCT method is particularly well suited to scalar quantities that can be represented in the physical domain as size-fractionated particles, such as in the study of sedimentation, atmospheric aerosols, or precipitation. There are two components to this contribution. First, a Scaling and Distribution (SAD) algorithm provides a means of specifying a multi-scalar field in terms of a maximum cell resolution (or density of represented values). This information is used to scale the multi-scalar field values in each 3D cell relative to the maximum values found throughout the data set, and those values are then randomly distributed as particles varying in number, size, color, and opacity within a 2D cell slice. This approach facilitates viewing of the closely spaced layers commonly found in sigma-coordinate grids, and the SAD algorithm can be applied regardless of how the particles are rendered. The second contribution is the Synthesized Cell Texture (SCT) algorithm itself, which renders the multi-scalar values. In this approach, a texture is synthesized from the location information computed by the SAD algorithm and applied to each cell as a 2D slice within the volume. The SCT method trades off computation time (to synthesize the texture) and texture memory against the number of geometric primitives that must be sent through the graphics pipeline of the host system. Results from a user study demonstrate the effectiveness of the algorithm as a browsing method for multiple related scalar fields. The interactive rendering performance of the SCT method is compared with two common basic particle representations: flat-shaded, color-mapped OpenGL points and quadrilaterals. Frame-rate statistics show the SCT method to be up to 44 times faster, depending on the volume to be displayed and the host system. The SCT method has been successfully applied to oceanographic sedimentation data and can be applied to other problem domains as well. Future enhancements include extension to time-varying data and parallelization of the texture synthesis component to reduce startup time.
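
    A minimal sketch of the scaling-and-distribution step described above, under assumed data structures: each cell's scalar values are normalized against the data set maxima and converted into randomly placed particles within a 2D cell slice. The Cell and Particle shapes and the maxParticlesPerCell parameter are illustrative, not the paper's actual implementation.

```typescript
// Rough sketch of the Scaling and Distribution (SAD) idea: scale each cell's
// scalar values against the data set maxima, then scatter that many particles
// at random positions within the cell's 2D slice. Cell/Particle shapes and
// maxParticlesPerCell are illustrative assumptions, not the paper's data model.

interface Cell {
  values: number[];                       // one scalar per size fraction in this cell
}

interface Particle {
  x: number;                              // position within the unit-square cell slice
  y: number;
  size: number;
  field: number;                          // which scalar field the particle represents
}

function distributeParticles(cells: Cell[], maxParticlesPerCell: number): Particle[][] {
  const nFields = cells[0].values.length;

  // Global maximum of each scalar field across the whole data set.
  const maxima = Array.from({ length: nFields }, (_, f) =>
    Math.max(...cells.map(c => c.values[f]))
  );

  return cells.map(cell =>
    cell.values.flatMap((v, f) => {
      // Scale the cell value to [0, 1], then turn it into a particle count.
      const scaled = maxima[f] > 0 ? v / maxima[f] : 0;
      const count = Math.round(scaled * maxParticlesPerCell);

      // Randomly place the particles; size grows with the scaled value.
      return Array.from({ length: count }, () => ({
        x: Math.random(),
        y: Math.random(),
        size: 1 + 4 * scaled,
        field: f,
      }));
    })
  );
}
```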

    Visualization Support for Cognitive Sciences

    Get PDF
    The science of computer graphics and visualization is intertwined in many ways with the cognitive sciences. On the one hand, computer graphics can create virtual environments in which a person is exposed to a virtual scenario. Typically, 3D-capable display technology is combined with tracking systems that identify where the person is located, so that the person's point of view is recreated in the virtual scenario and maximal immersion is achieved. As a result, an impressive experience is created in which the person navigates the virtual scenario as if it were real. On the other hand, visualization techniques can be used to present the results of a cognitive science experiment in a way that provides easier access to the data, ranging from simple plots to more sophisticated approaches such as parallel coordinates. In addition, results from the cognitive sciences can feed back into visualization to make it more user-friendly. For example, more intuitive input devices, such as cyber gloves that track the positions of a user's fingers, could be used to make selections or modify the view. The Appenzeller Visualization Laboratory is in an excellent position to enable research in all of these areas. Sophisticated display systems that provide full immersion are available, ranging from single screens and head-mounted displays to full-size CAVE-type displays. This presentation will illustrate examples of visualizations of data from the cognitive science realm and showcase the display systems and some of their use cases.

    Parameter Space Visualization for Large-scale Datasets Using Parallel Coordinate Plots

    No full text
    Visualization is an important task in data analytics, as it allows researchers to see patterns within the data instead of reading through extensive raw data. The ability to interact with visualizations is essential, since it lets users intuitively explore data to find meaning and patterns more efficiently. Interactivity, however, becomes progressively more difficult as the size of the dataset increases. This project leverages existing web-based data visualization technologies and extends their functionality through parallel processing. The methodology uses state-of-the-art tools, such as Node.js, to split visualization rendering and user-interaction controls across a client–server infrastructure without having to rebuild the visualization technologies. The approach minimizes data transfer by performing the rendering step on the server, which also allows high-performance computing systems to render the visualizations more quickly. To improve scaling with larger datasets, parallel processing and visualization optimization techniques are used. This work uses parameter space data generated from mindmodeling.org to showcase the authors’ methodology for handling large-scale datasets while retaining interactivity and user friendliness.
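
    A minimal Node.js sketch of the client–server split described above: the server performs the rendering step and returns only the finished SVG to the browser, so the raw data never leaves the server. The /plot endpoint, the port, and the renderPlot helper are hypothetical stand-ins, not the project's actual code.

```typescript
// Sketch of server-side rendering: the server turns raw values into SVG and
// ships only the finished image to the client, keeping the large data server-side.
// The /plot endpoint, port, and renderPlot helper are illustrative assumptions.
import http from "node:http";

function renderPlot(data: number[]): string {
  // Server-side rendering step: turn the raw values into a small SVG scatter strip.
  const points = data
    .map((v, i) => `<circle cx="${i * 10 + 5}" cy="${100 - v}" r="3"/>`)
    .join("");
  return `<svg xmlns="http://www.w3.org/2000/svg" width="${data.length * 10}" height="100">${points}</svg>`;
}

const server = http.createServer((req, res) => {
  if (req.url?.startsWith("/plot")) {
    // In the real system the data would come from the large parameter-space files;
    // here a few made-up values stand in for it.
    const svg = renderPlot([20, 45, 30, 80, 60]);
    res.writeHead(200, { "Content-Type": "image/svg+xml" });
    res.end(svg);
  } else {
    res.writeHead(404).end();
  }
});

server.listen(3000);                                    // hypothetical port
```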

    Visualizing Confusion Matrices for Multidimensional Signal Detection Correlational Methods

    No full text
    Advances in modeling and simulation for General Recognition Theory (GRT) have produced more data than can be easily visualized using traditional techniques. In this area of psychological modeling, domain experts are struggling to find effective ways to compare large-scale simulation results. This paper describes methods that adapt the web-based D3 visualization framework, combined with pre-processing tools, to enable domain specialists to more easily interpret their data. The D3 framework uses JavaScript and Scalable Vector Graphics (SVG) to generate visualizations that run readily within a domain specialist's web browser. Parallel coordinate plots and heat maps were developed for identification-confusion matrix data, and the results were shown to a GRT expert for an informal evaluation of their utility. There is a clear benefit to model interpretation from these visualizations when researchers need to interpret larger amounts of simulated data.
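
    A small browser-side sketch of one of the views mentioned above: a D3 heat map of an identification-confusion matrix rendered as SVG. The labels and probabilities are made-up examples, and D3 v7 is assumed to be bundled with the page.

```typescript
// Sketch: render an identification-confusion matrix as an SVG heat map with D3.
// Runs in a browser page; the matrix values and labels are made-up examples.
import * as d3 from "d3";

const labels = ["A", "B", "C"];                       // hypothetical stimulus/response labels
const confusion = [
  [0.80, 0.15, 0.05],                                 // P(response | stimulus A)
  [0.10, 0.75, 0.15],
  [0.05, 0.20, 0.75],
];

const cell = 40;                                      // pixel size of one heat-map cell
const svg = d3.select("body")
  .append("svg")
  .attr("width", cell * labels.length)
  .attr("height", cell * labels.length);

// Map confusion probabilities in [0, 1] onto a sequential color scale.
const color = d3.scaleSequential(d3.interpolateBlues).domain([0, 1]);

// One <rect> per (stimulus, response) pair; darker cells mean higher probability.
svg.selectAll("rect")
  .data(confusion.flatMap((row, i) => row.map((v, j) => ({ i, j, v }))))
  .join("rect")
  .attr("x", d => d.j * cell)
  .attr("y", d => d.i * cell)
  .attr("width", cell)
  .attr("height", cell)
  .attr("fill", d => color(d.v));
```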

    Interactive Visualization of GRT and BioHTS Data

    Get PDF
    The scope of this project is to provide better tools for statistical and informational visual analysis for High-Throughput Screening of Biological Infectious Agents (BioHTS), General Recognition Theory (GRT) modeling, and other areas where pipelines of unstructured datasets of all types must be analyzed. A parallel coordinates plot is one of the more effective methods for visualizing multivariate data.
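
    A minimal D3 sketch of the parallel coordinates plot mentioned above: one vertical axis per variable and one polyline per record. The dimensions and sample records are made-up examples, and D3 v7 is assumed to be available in the page.

```typescript
// Sketch of a parallel coordinates plot: one vertical axis per variable and one
// polyline per record, drawn with D3 into an SVG. The variables and sample
// records below are illustrative, not actual BioHTS or GRT data.
import * as d3 from "d3";

const dims = ["dose", "response", "viability"];          // hypothetical variables
const records: Array<Record<string, number>> = [
  { dose: 0.1, response: 0.8, viability: 0.9 },
  { dose: 0.5, response: 0.4, viability: 0.7 },
  { dose: 1.0, response: 0.1, viability: 0.3 },
];

const width = 400;
const height = 200;

// One x position per axis; one y scale per variable, fit to that variable's range.
const x = d3.scalePoint(dims, [40, width - 40]);
const y: Record<string, d3.ScaleLinear<number, number>> = {};
for (const dim of dims) {
  const [lo, hi] = d3.extent(records, r => r[dim]) as [number, number];
  y[dim] = d3.scaleLinear([lo, hi], [height - 20, 20]);
}

const svg = d3.select("body").append("svg")
  .attr("width", width)
  .attr("height", height);

// One polyline per record, passing through its value on every axis.
svg.selectAll("path")
  .data(records)
  .join("path")
  .attr("fill", "none")
  .attr("stroke", "steelblue")
  .attr("d", r =>
    d3.line()(dims.map(dim => [x(dim), y[dim](r[dim])] as [number, number]))
  );

// Draw the vertical axes so each variable's value range stays readable.
for (const dim of dims) {
  svg.append("g")
    .attr("transform", `translate(${x(dim)},0)`)
    .call(d3.axisLeft(y[dim]));
}
```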

    HPC Enabled Data Analytics for High-Throughput High-Content Cellular Analysis

    Get PDF
    Biologists doing high-throughput, high-content cellular analysis are generally not computer scientists or high-performance computing (HPC) experts, and they want their workflow to support their science without having to become either. We describe a new HPC-enabled data analytics workflow with a web interface, an HPC pipeline for analysis, and both traditional and new analytics tools that help these researchers transition from a single-workstation mode of operation to that of power HPC users. This allows multiple plates to be processed over a short period of time, ensuring timely query and analysis for matching potential countermeasures to individual responses.
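
    As a loose illustration of processing multiple plates in parallel, the sketch below launches one analysis job per plate and waits for them all to finish. The analyze_plate command, its flags, and the plate identifiers are hypothetical stand-ins; the workflow described above dispatches such work to an HPC pipeline rather than local processes.

```typescript
// Sketch of fanning per-plate analysis out to parallel worker processes.
// The analyze_plate command and its arguments are hypothetical placeholders.
import { execFile } from "node:child_process";
import { promisify } from "node:util";

const run = promisify(execFile);

async function analyzePlates(plateIds: string[]): Promise<void> {
  // Launch one analysis process per plate and wait for all of them to finish.
  const jobs = plateIds.map(id =>
    run("analyze_plate", ["--plate", id, "--out", `results/${id}.csv`])
  );
  await Promise.all(jobs);
  console.log(`Finished ${plateIds.length} plates`);
}

analyzePlates(["plate-001", "plate-002", "plate-003"]).catch(console.error);
```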
