6 research outputs found
Diva: A Declarative and Reactive Language for In-Situ Visualization
The use of adaptive workflow management for in situ visualization and
analysis has been a growing trend in large-scale scientific simulations.
However, coordinating adaptive workflows with traditional procedural
programming languages can be difficult because system flow is determined by
unpredictable scientific phenomena, which often appear in an unknown order and
can evade event handling. This makes the implementation of adaptive workflows
tedious and error-prone. Recently, reactive and declarative programming
paradigms have been recognized as well-suited solutions to similar problems in
other domains. However, there is a dearth of research on adapting these
approaches to in situ visualization and analysis. With this paper, we present a
language design and runtime system for developing adaptive systems through a
declarative and reactive programming paradigm. We illustrate how an adaptive
workflow programming system is implemented using our approach and demonstrate
it with a use case from a combustion simulation.

Comment: 11 pages, 5 figures, 6 listings, 1 table; to be published in LDAV 2020. The article has gone through two major revisions: emphasized the contributions, features, and examples; addressed the connections between DIVA and FRP; fixed a design flaw in Sec. 3 and addressed it in Secs. 3.3-3.4; re-designed Sec. 5 with a more concrete example and benchmark results; and simplified the syntax of DIVA.
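The abstract describes DIVA's declarative, reactive approach only at a high level. As a rough illustration of what a rule-based, reactive trigger for an adaptive in situ workflow can look like, the C++ sketch below registers a rule that fires a visualization action whenever a monitored simulation quantity crosses a threshold; every name in it (ReactiveRuntime, on, step) is invented for illustration and is not DIVA's actual syntax or API.

```cpp
// Hypothetical sketch of a reactive, rule-based trigger (not DIVA's actual API).
#include <functional>
#include <iostream>
#include <vector>

// A minimal "runtime" that re-evaluates declarative rules at every simulation step.
struct ReactiveRuntime {
    using Predicate = std::function<bool(double)>;
    using Action    = std::function<void(int, double)>;
    struct Rule { Predicate when; Action then; };

    std::vector<Rule> rules;

    // Declare a rule: "whenever <when> holds, do <then>".
    void on(Predicate when, Action then) {
        rules.push_back({std::move(when), std::move(then)});
    }

    // Called once per simulation step. The order in which phenomena appear does
    // not matter, because every rule is re-checked against the current state.
    void step(int t, double heat_release) {
        for (auto& r : rules)
            if (r.when(heat_release)) r.then(t, heat_release);
    }
};

int main() {
    ReactiveRuntime rt;
    // Declarative trigger: render an isosurface whenever heat release exceeds a threshold.
    rt.on([](double q) { return q > 0.8; },
          [](int t, double q) {
              std::cout << "step " << t << ": render isosurface (q=" << q << ")\n";
          });

    // Toy driver standing in for the combustion simulation.
    for (int t = 0; t < 6; ++t) rt.step(t, 0.2 * t);
}
```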
Análise de imagens tomográficas: visualização e paralelização de processamento
Master's dissertation in Informatics Engineering. Synchrotron X-ray micro-tomography is a well-established technique in medicine and has more recently been adopted in other fields, notably Materials Engineering. It is a non-destructive technique that makes it possible to analyze the internal structure of components. The object under study is swept by a radiation beam that penetrates the material, and a set of detectors records the intensity of the rays as the object rotates. This procedure produces a data file, which can be on the order of gigabytes in size and must be processed before the internal structure can be visualized. Given the large volume of data and the complexity of some processing algorithms, certain operations can take days. Tritom, the program by the French scientist Gerard Vignoles, processed the tomographic data sequentially and took a long time on some operations. In earlier work, Paulo Quaresma optimized and parallelized some of the most time-consuming operations using a computer cluster and message-passing programming. In this thesis, the most computationally demanding Tritom operations were parallelized using the shared-memory model, implemented with OpenMP. Obtaining results faster benefits materials research and takes advantage of the multi-core processors now common in ordinary personal computers. Tests were carried out to measure the execution-time improvements obtained with this approach. This thesis also integrated Tritom into an interactive graphical data-visualization environment called OpenDX, which makes the program much easier to use for less experienced users. The user can select the processing steps to perform, in the form of modules and in any order, entirely within a graphical environment. OpenDX also provides three-dimensional data visualization, which is essential for perceiving certain phenomena in the objects under study. Several new modules were also created at the request of Materials Engineering researchers. Tritom, parallelized in this way, offers materials scientists a good tool for tomographic image analysis with simple and intuitive interaction. They gain a valuable means of quickly examining their objects of study without resorting to clusters or complex, hard-to-access computer configurations.
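The abstract does not show Tritom's code. As a rough illustration of the shared-memory approach it describes, the C++/OpenMP sketch below parallelizes a simple per-voxel operation over a tomographic volume; the filter, data layout, and sizes are invented for illustration and are not Tritom's actual implementation.

```cpp
// Illustrative OpenMP sketch (not Tritom's actual code): parallelize a
// per-voxel operation over a tomographic volume using shared memory.
#include <omp.h>
#include <cstdio>
#include <vector>

int main() {
    const int nx = 256, ny = 256, nz = 256;
    std::vector<float> volume(static_cast<size_t>(nx) * ny * nz, 1.0f);
    std::vector<float> result(volume.size());

    // Threads split the z/y iteration space among themselves; the volume is
    // shared, so no message passing is needed (unlike the earlier MPI version).
    #pragma omp parallel for collapse(2) schedule(static)
    for (int z = 0; z < nz; ++z)
        for (int y = 0; y < ny; ++y)
            for (int x = 0; x < nx; ++x) {
                size_t i = (static_cast<size_t>(z) * ny + y) * nx + x;
                result[i] = 0.5f * volume[i];  // placeholder for a real filter
            }

    std::printf("processed %zu voxels using up to %d threads\n",
                result.size(), omp_get_max_threads());
}
```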
A Programmable Streaming Framework for Extreme-Scale Scientific Visualizations
Emerging computational and acquisition technologies are empowering scientists to conduct simulations and experiments on an unprecedented scale. These advancements can push the frontiers of science and technology with groundbreaking discoveries. However, they also pose significant challenges to traditional scientific visualization workflows. Firstly, the data generated by modern scientific studies using these technologies tends to be extremely large and complex, often resulting in slow processing and rendering times; this demands visualization algorithms that scale effectively with the size of the data. Secondly, state-of-the-art simulations and experiments produce data at extraordinary rates, complicating the task of generating valuable visualization results for scientists, so there is a pressing need for more adaptive and intelligent visualization workflows. Lastly, although new computer hardware and architectures can speed up the visualization process, significant performance variations still exist among visualization algorithms due to differing design choices; as a result, optimizing algorithms to better leverage emerging hardware features for enhanced efficiency remains an ongoing necessity.

This dissertation addresses these challenges by introducing a programmable streaming framework, enhanced with implicit neural representation, designed for visualizing extreme-scale scientific data. Specifically, it presents three methodologies.

Firstly, the framework offers a reactive and declarative programming language for streamlining image generation, layout and interaction creation, and I/O processes, eliminating the need for users to manually control all visualization parameters and procedures. This language enables scientists to define highly adaptive visualization workflows through high-level, rule-based grammars; the system then automatically optimizes the low-level implementation according to these specifications, facilitating the creation of more efficient visualization workflows with simpler code.

Secondly, the framework features a scalable, hardware-accelerated streaming visualization system that allows visualization processes to run concurrently with I/O operations. This system not only achieves state-of-the-art scalability but also effectively manages complex, multi-resolution data structures. It delivers accurate rendering results, reduces memory usage, and leverages emerging hardware capabilities more efficiently.

Finally, the framework integrates implicit neural representation (INR) techniques for data compression and interactive visualization. INRs significantly reduce data size while preserving high-frequency details, and they enable direct access to spatial locations at any desired resolution, obviating the need for decompression or interpolation.

In summary, this dissertation addresses long-standing challenges in extreme-scale scientific visualization by introducing novel designs and methodologies. The presented framework not only enables more efficient and adaptive visualization workflows but also leverages the latest hardware acceleration and data compression techniques. The implications of these advancements extend beyond mere technical improvements: they pave the way for deeper insights and discoveries across a broad spectrum of scientific studies. This research therefore represents a significant leap forward, with the potential to transform the landscape of scientific visualization.
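The abstract describes the streaming system only at a high level. As a rough illustration of the general idea of running visualization concurrently with I/O (not the dissertation's actual design), the C++ sketch below overlaps loading and rendering with a producer thread, a consumer loop, and a small shared queue, so the next data block is read while the current one is processed.

```cpp
// Illustrative sketch (not the framework's actual code): overlap data loading
// (I/O) with rendering using a producer thread and a shared, mutex-guarded queue.
#include <condition_variable>
#include <iostream>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

struct Block { int id; std::vector<float> data; };

std::queue<Block> queue_;
std::mutex m_;
std::condition_variable cv_;
bool done_ = false;

void loader(int nblocks) {                       // producer: stands in for streaming I/O
    for (int i = 0; i < nblocks; ++i) {
        Block b{i, std::vector<float>(1024, float(i))};
        {
            std::lock_guard<std::mutex> lk(m_);
            queue_.push(std::move(b));
        }
        cv_.notify_one();
    }
    { std::lock_guard<std::mutex> lk(m_); done_ = true; }
    cv_.notify_one();
}

int main() {
    std::thread io(loader, 4);
    while (true) {                               // consumer: "renders" each block
        std::unique_lock<std::mutex> lk(m_);
        cv_.wait(lk, [] { return !queue_.empty() || done_; });
        if (queue_.empty() && done_) break;
        Block b = std::move(queue_.front());
        queue_.pop();
        lk.unlock();
        std::cout << "rendering block " << b.id << " while I/O continues\n";
    }
    io.join();
}
```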
Visualization Techniques in Space and Atmospheric Sciences
Unprecedented volumes of data will be generated by research programs that investigate the Earth as a system and the origin of the universe, which will in turn require analysis and interpretation that will lead to meaningful scientific insight. Providing a widely distributed research community with the ability to access, manipulate, analyze, and visualize these complex, multidimensional data sets depends on a wide range of computer science and technology topics. Data storage and compression, database management, computational methods and algorithms, artificial intelligence, telecommunications, and high-resolution displays are just a few of the topics addressed. A unifying theme throughout the papers with regard to advanced data handling and visualization is the need for interactivity, speed, user-friendliness, and extensibility.