11 research outputs found

    Logging in a computational steering environment

    Logging of input and output variables is very useful in computational steering. In this paper we describe how we added logging functionality to a computational steering environment developed at CWI. We show how a 2D interface can be augmented with logging by using the third dimension for the display of the logged variables. The user specifies which graphical representations of variables must be logged, and this log is displayed together with the current state of the simulation. Two examples show that logging in computational steering gives more insight into the simulation, that it can be used for monitoring, and that it can be used to undo erroneous actions.
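
    The bookkeeping this abstract describes can be pictured with a small, hypothetical Python sketch (not the CWI environment's actual API): a log that records snapshots of input parameters and output variables at each step, exposes a per-variable trace that a 3D view could lay out along the extra axis, and returns an earlier parameter state so an erroneous steering action can be undone. All class and variable names below are invented for the example.

        # Minimal sketch (not the CWI system's API): a log of steered input
        # parameters and observed outputs that supports monitoring and undo.
        from collections import deque

        class SteeringLog:
            def __init__(self, max_steps=1000):
                self.history = deque(maxlen=max_steps)  # one snapshot per simulation step

            def record(self, step, inputs, outputs):
                # store copies so later edits to the live dicts do not alter the log
                self.history.append((step, dict(inputs), dict(outputs)))

            def trace(self, name):
                # time series of one logged variable, e.g. for display along a third axis
                return [(step, inp.get(name, out.get(name)))
                        for step, inp, out in self.history]

            def undo(self, steps_back=1):
                # return the input parameters as they were a few steps ago,
                # so an erroneous steering action can be reverted
                step, inputs, _ = self.history[-1 - steps_back]
                return step, dict(inputs)

        # usage
        log = SteeringLog()
        log.record(0, {"inflow": 1.0}, {"pressure": 0.8})
        log.record(1, {"inflow": 5.0}, {"pressure": 2.4})   # erroneous change
        step, inputs = log.undo()                            # back to the step-0 values
        print(step, inputs, log.trace("pressure"))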

    Computational steering in the CAVE

    Scientists can gain much more insight from their simulations if they are able to change simulation parameters on the fly while observing the results immediately. A crucial aspect of such computational steering is an intuitive user interface. We have developed an environment that enables researchers to construct such interfaces efficiently and effectively for graphical workstations. In this paper we report on our next step towards more intuitive user interfaces: we have modified our system for use in the CAVE. The CAVE is a projection-based virtual environment. Virtual environments are designed to provide the effect of immersion in an interactive three-dimensional computer-generated environment. We show that the use of virtual environments for computational steering interfaces can improve interaction with the simulation and immersion in the computational process. We present our system, describe the methods we have developed for improved 3D interaction, and discuss three applications.

    3D computational steering with parametrized geometric objects

    Computational Steering is the ultimate goal of interactive simulation: researchers change parameters of their simulation and immediately receive feedback on the effect. We present a general and flexible graphics tool that is part of an environment for Computational Steering developed at CWI. It enables the researcher to interactively develop his own interface with the simulation. This interface is constructed with 3D Parametrized Geometric Objects. The properties of the objects are parametrized to output data and input parameters of the simulation. The objects visualize the output of the simulation, while the researcher can steer the simulation by direct manipulation of the objects. Several applications of 3D Computational Steering are presented.
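
    As an illustration only, and not the CWI tool's interface, the following hypothetical Python sketch shows the binding idea: a geometric object whose drawn radius is parametrized to a simulation output and whose position writes back to a simulation input, so dragging the object steers the simulation. All names (ParametrizedSphere, Sim, source_pos, temperature) are invented for the example.

        # Illustrative sketch only (not the CWI tool's API): a geometric object
        # whose properties are bound to simulation variables, so drawing reflects
        # output and dragging the object writes back an input parameter.
        class Sim:
            def __init__(self):
                self.inputs = {"source_pos": (0.0, 0.0, 0.0)}   # steerable parameter
                self.outputs = {"temperature": 1.0}             # simulation result

        class ParametrizedSphere:
            def __init__(self, sim, radius_from, position_to):
                self.sim = sim
                self.radius_from = radius_from    # name of an output variable
                self.position_to = position_to    # name of an input parameter

            def draw(self):
                # visualize the current output and parameter state
                return {"shape": "sphere",
                        "radius": self.sim.outputs[self.radius_from],
                        "position": self.sim.inputs[self.position_to]}

            def on_drag(self, new_position):
                # direct manipulation of the object steers the simulation
                self.sim.inputs[self.position_to] = new_position

        # usage
        sim = Sim()
        sphere = ParametrizedSphere(sim, radius_from="temperature", position_to="source_pos")
        sphere.on_drag((1.0, 0.5, 0.0))   # steering: writes the input parameter
        print(sphere.draw())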

    A survey of computational steering environments

    Computational steering is a powerful concept that allows scientists to interactively control a computational process during its execution. In this paper, a survey of computational steering environments for the on-line steering of ongoing scientific and engineering simulations is presented. These environments can be used to create steerable applications for model exploration, algorithm experimentation, or performance optimization. For each environment the scope is identified, the architecture is summarized, and the concepts of the user interface are described. The environments are compared, and conclusions and future research issues are given.

    Análise multitemática de dados geológicos e sísmica de reflexão: um ensaio metodológico - estudo de caso Grupo Itararé

    Advisor: Sidnei Pires Rostirolla. Co-advisors: Augustinho Rigoti and Ciro Jorge Appi. Master's dissertation - Universidade Federal do Paraná, Setor de Ciências da Terra, Programa de Pós-Graduação em Geologia. Defense: Curitiba, 2004. Includes bibliography. Area of concentration: exploration geology.
    Abstract (translated from the Portuguese resumo): The study areas of this research involve rocks of the Itararé Group (Paraná Basin), exposed in outcrops near the cities of Ponta Grossa (close to the Vila Velha state park) and Lapa, both in the state of Paraná. The main objective was to describe a methodological sequence combining several techniques: geological mapping, acquisition and processing of shallow seismic reflection data, a geostatistical assay of seismic amplitudes, and the generation of seismic volumes. Two study areas were needed to cover the main stages of the proposed methodological development, with a geological focus in Lapa and a geophysical focus in Vila Velha. The main lithotypes mapped in Lapa were sandstones, diamictites, conglomerates and shales, described in the geological map. The geophysical response at this site did not reach the expected results; few reflectors were found, making it necessary to acquire geophysical data in another correlated area, in this case Ponta Grossa. The geological investigation in Lapa produced knowledge of the main lithological variations in the Serra do Monge, as well as their mapping. Five lithological units were defined on the basis of the predominant lithotype and stratigraphic position. The main structural lineaments were correlated with large basin-scale features defined in previous work, and faults and fractures at outcrop scale were also investigated. In the region near the Vila Velha park, seismic reflection surveys were carried out for shallow targets, down to approximately 250 metres depth. The result was the imaging of the substrate, including a correlation with the lithological associations and faults visible in the field. The seismic calibration and parameterization tests carried out in both study areas were essential for solving several problems, such as fixing geophones on rock, array geometry and positioning of the seismic source. In the processing of the seismic data, exhaustive tests were performed, resulting in sections of good quality in the Ponta Grossa area and still poorly refined sections in the Lapa area. The main numerical processes that contributed to improving the seismic sections were predictive deconvolution, frequency filtering and muting of first breaks. The CDP (common depth point) technique was applied through CVS (constant velocity stack) velocity analysis and NMO (normal moveout) correction. Through the construction of semivariograms, the geostatistical assay demonstrated the probable spatial dependence of the seismic amplitudes, making it possible to transform 2D data into volumetric information for three-dimensional interpretation. The assay showed that the technique can be used with seismic data, as seen in the well-defined sill of some semivariograms, mainly in sample searches carried out at azimuths parallel to the direction of the seismic lines. A test was performed generating a preliminary seismic cube, aiming at its correlation with geological information extracted from aerial photographs.
    Abstract (English): The study areas of this research involve rocks of the Itararé Group (Paraná Basin), which outcrop in the region of the Ponta Grossa (near Vila Velha park) and Lapa cities, in Paraná state. The main objective was the description of a methodological sequence that uses various techniques: geological mapping, shallow reflection seismics, a geostatistical assay of seismic amplitudes, and seismic cube generation. To cover the main stages of the methodological development, it was necessary to adopt two study areas, with a geological approach in Lapa and a geophysical approach in Vila Velha. The main rock types mapped in the Lapa area are sandstones, diamictites, conglomerates and shales, described in the geological map. The geophysical response of this area did not provide the expected results, making geophysical data acquisition necessary in another correlated area, in this case the Ponta Grossa area. The geological investigation in the Lapa region generated the knowledge and mapping of the main lithological variations in the Monge mountain range. In this mapping, five lithological units are defined, based on the predominant rock type and its stratigraphic position. The main structural lineaments are correlated to large alignments present at basin scale, as defined by previous works. Faults and fractures at outcrop scale were also studied. The seismic reflection surveys carried out in the Ponta Grossa region were processed and interpreted for shallow targets, down to approximately 250 meters depth. The result was the imaging of subsurface anisotropy, including an early correlation with lithological associations and faults observed in the field. Several seismic calibration and parameterization field tests were done in both areas to solve problems such as the placement of geophones on outcropping rock, array geometry and seismic source positioning. In the seismic data processing, exhaustive testing was done, yielding good-quality seismic sections in the Ponta Grossa area and still poorly refined sections in the Lapa area. The main numerical processes applied to the seismic data, which contributed to improving the seismic sections, are predictive deconvolution, frequency filtering and muting of first breaks. The CDP (common depth point) technique was used by means of CVS (constant velocity stack) velocity analysis and NMO (normal moveout) correction. Through the construction of semivariograms, the geostatistical assay shows the probable spatial dependence of the seismic amplitudes, making it possible to transform 2D data into volumetric information for 3D interpretation. The assay demonstrated that this technique can be used with seismic data, as observed in the well-defined sill of some semivariograms, especially in sample searches done in directions parallel to the survey lines. A preliminary seismic cube was generated, aiming at its correlation with geological information extracted from aerial photographs.
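
    To make the geostatistical step concrete, here is a minimal, hypothetical Python sketch (not the thesis's software or data) of an experimental semivariogram computed along a single line of amplitudes; a curve that rises with lag and then flattens at a well-defined sill is the kind of evidence for spatial dependence the abstract refers to.

        # Hedged sketch: experimental semivariogram of amplitudes sampled along one
        # seismic line. Synthetic data; not the data or software used in the thesis.
        import numpy as np

        def semivariogram(values, max_lag):
            # gamma(h) = 0.5 * mean squared difference between samples h apart
            gamma = []
            for h in range(1, max_lag + 1):
                diffs = values[h:] - values[:-h]
                gamma.append(0.5 * np.mean(diffs ** 2))
            return np.array(gamma)

        rng = np.random.default_rng(0)
        # synthetic amplitudes with short-range correlation (moving average of noise)
        amplitudes = np.convolve(rng.normal(size=500), np.ones(10) / 10, mode="valid")
        gamma = semivariogram(amplitudes, max_lag=50)
        # gamma rises with lag and flattens near a sill once samples become uncorrelated
        print(gamma[:5], gamma[-5:])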

    An integrated software approach to interactive exploration and steering of fluid flow simulations on many-core architectures

    Abstract (translated from the German): Traditionally, numerical flow simulations are carried out as a cyclic sequence of autonomous sub-steps. Scientists, however, have long wished for more interaction with their running simulations. Since the influential report of the National Science Foundation in 1987, new forms of scientific visualization have therefore been developed that differ fundamentally from the traditional methods. In particular, the so-called computational steering approach has attracted considerable interest. Then as now, however, the use of the approach is the exception rather than the rule, largely because of the complexity and restrictions of traditional high-performance computing systems. In this work, as an alternative to the traditional approach, the immense computing power of modern graphics cards is therefore used for the calculations. This so-called GPGPU computing is particularly well suited to applying the lattice Boltzmann method in numerical flow simulation. Based on the LBM, this work prototypically develops an interactive simulation environment, built on the computational steering paradigm, that integrates all processes for solving flow problems within a single application. Bringing the high massively parallel computing power of GPUs and the interaction capabilities together in a single application yields a considerable increase in application quality. Using multiple GPUs, it is possible to compute three-dimensional flow problems of practically relevant size while enabling interactive manipulation and exploration of the flow domain at runtime, at a financial cost that is comparatively low next to traditional massively parallel approaches.
    Abstract (English): Traditionally, computational fluid dynamics is done in a cyclic sequence of independent steps. However, it is a long-term wish of scientists and engineers to closely interact with their running simulations. Since the influential report of the US National Science Foundation in 1987, new forms of scientific visualization have evolved that are quite different from traditional post-processing. Especially the approach commonly referred to as computational steering has been the subject of widespread interest. Although it is a very powerful paradigm, the use of computational steering is still the exception rather than the rule. The reasons for this are largely related to the complexity and restrictions of traditional HPC systems. As an alternative to the traditional massively parallel approach, in this thesis the parallel computational power of GPUs is used for general-purpose applications. This so-called GPGPU computing has gained great popularity in the CFD community, especially for its application to the lattice Boltzmann method. Using this technology, this work demonstrates a single desktop application that integrates a complete interactive CFD simulation environment at reasonable hardware cost. It shows that the convergence of massively parallel computational power and a steering environment into a single system significantly improves usability, application quality and user-friendliness. Using multiple GPUs, the efficiency of this approach allows CFD simulations in three-dimensional space to evolve close to real time even for reasonable grid sizes. Thereby, the simulation can be explored and also adjusted during runtime. The thesis also shows that responsiveness benefits significantly from avoiding the bandwidth and latency bottlenecks inherent in traditional HPC approaches. These can be avoided because GPGPU computing does not generally require network communication, which also reduces the complexity of the application.
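
    As a rough illustration of the numerical core, the following sketch implements one collide-and-stream step of a D2Q9 lattice Boltzmann solver in vectorized NumPy; it stands in for the GPU kernels the thesis describes and makes no claim about that software's actual data layout, boundary handling, or multi-GPU decomposition.

        # Hedged sketch: one BGK collide-and-stream step on a periodic D2Q9 lattice.
        import numpy as np

        # D2Q9 lattice: discrete velocities and weights
        c = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])
        w = np.array([4/9] + [1/9]*4 + [1/36]*4)

        def lbm_step(f, tau=0.6):
            # macroscopic density and velocity from the distributions f[i, x, y]
            rho = f.sum(axis=0)
            u = np.einsum('ia,ixy->axy', c, f) / rho
            # BGK collision: relax towards the local equilibrium distribution
            cu = np.einsum('ia,axy->ixy', c, u)
            usq = (u ** 2).sum(axis=0)
            feq = w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)
            f = f - (f - feq) / tau
            # streaming: shift each population along its lattice velocity
            for i, (cx, cy) in enumerate(c):
                f[i] = np.roll(np.roll(f[i], cx, axis=0), cy, axis=1)
            return f

        # usage: uniform fluid at rest on a 64x64 periodic grid
        f = np.tile(w[:, None, None], (1, 64, 64))
        f = lbm_step(f)
        print(f.sum(axis=0).mean())   # density stays ~1 for the resting fluid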

    The plenoptic sensor

    In this thesis, we introduce the innovative concept of a plenoptic sensor that can determine the phase and amplitude distortion in a coherent beam, for example a laser beam that has propagated through the turbulent atmosphere. The plenoptic sensor can be applied to situations involving strong or deep atmospheric turbulence. This can improve free-space optical communications by maintaining optical links more intelligently and efficiently. Also, in directed-energy applications, the plenoptic sensor and its fast reconstruction algorithm can give instantaneous instructions to an adaptive optics (AO) system to create intelligent corrections in directing a beam through atmospheric turbulence. The hardware structure of the plenoptic sensor uses an objective lens and a microlens array (MLA) to form a mini "Keplerian" telescope array that shares the common objective lens. In principle, the objective lens helps to detect the phase gradient of the distorted laser beam, and the MLA helps to retrieve the geometry of the distorted beam in its various gradient segments. The software layer of the plenoptic sensor is developed for the different applications. Intuitively, since the device maximizes the observation of the light field in front of the sensor, different algorithms can be developed, such as detecting atmospheric turbulence effects as well as retrieving undistorted images of distant objects. Efficient 3D simulations of atmospheric turbulence based on geometric optics have been established to help us optimize the system design and verify the correctness of our algorithms. A number of experimental platforms have been built to implement the plenoptic sensor in various application concepts and show its improvements over traditional wavefront sensors. As a result, the plenoptic sensor brings a revolution to the study of atmospheric turbulence and generates new approaches to handle turbulence effects better.
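
    The gradient-detection role of the objective lens can be illustrated with a small, assumption-laden sketch: in the small-angle limit, a locally tilted wavefront segment focuses a transverse distance of roughly f*tan(theta) from the on-axis focal point, so a measured spot shift can be mapped back to a local phase gradient. The function names and numbers below are invented for the example and are not taken from the thesis or its reconstruction algorithm.

        # Illustrative sketch only: the small-angle geometry by which a local
        # wavefront tilt appears as a transverse displacement in a lens's focal
        # plane, the principle that phase-gradient retrieval relies on.
        import numpy as np

        def focal_spot_shift(tilt_x_rad, tilt_y_rad, focal_length_m):
            # a plane-wave segment tilted by theta focuses f*tan(theta) away
            # from the on-axis focal point
            return (focal_length_m * np.tan(tilt_x_rad),
                    focal_length_m * np.tan(tilt_y_rad))

        def tilt_from_shift(dx_m, dy_m, focal_length_m):
            # inverse mapping: estimate the local tilt from the measured spot shift
            return np.arctan2(dx_m, focal_length_m), np.arctan2(dy_m, focal_length_m)

        # example: 50 microradian tilt, 0.4 m objective focal length
        dx, dy = focal_spot_shift(50e-6, 0.0, 0.4)   # roughly 20 micrometres of shift
        tilts = tilt_from_shift(dx, dy, 0.4)         # recovers ~(50e-6, 0.0) rad
        print(dx, dy, tilts)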

    Conception et mise en œuvre de multichronia, un cadre conceptuel de simulation visuelle interactive

    This thesis presents Multichronia, a conceptual framework for interactive simulation that provides a visual representation of a user's path while exploring the simulations of a complex system. As a complement to formal analysis methods, Multichronia aims to help its users understand a system under study by providing four interactive loops. The parameter-space exploration loop lets a user modify simulation parameters in order to test hypotheses. The simulation-space exploration loop lets the user manipulate the data corresponding to simulation instances; in particular, it offers selection and alignment operations through a graphical interface. The data-space exploration loop lets the user transform the data streams. Finally, the visual-space exploration loop lets the user display data and manipulate its visual appearance. To represent a user's path through the exploration of the parameter space, a graphical interface was developed: the multichronic tree, a formal view that gives an informative representation of the state of the analysis of a problem together with the ability to execute a wide range of interactive operations. In addition, the Multichronia framework forms a generic data pipeline running from a simulator to analysis software. A conceptual model can be extracted from this pipeline, along with the corresponding data flow. In this thesis, the pipeline was specialized with XML technology, which among other things makes it possible to define a design methodology for the data model associated with a simulator. The implementation of Multichronia made it possible to verify the validity of the proposed concepts. The adopted software architecture is an application framework, so that new simulators can be integrated easily. Two concrete applications were implemented: the tactical and the strategic simulation of attacks on military convoys. Minor modifications to the simulators were needed so that they met certain criteria established in this thesis. Overall, these applications showed that Multichronia can be deployed for arbitrary applications.
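
    The multichronic tree can be pictured with a short, hypothetical Python sketch (class and field names are invented, not Multichronia's API): each node is a simulation instance, branching a node records the parameter change that produced it, and the path from the root reconstructs the user's exploration history through parameter space.

        # Hedged sketch of the idea behind the multichronic tree: a tree of
        # simulation instances, each branched from its parent by a parameter change.
        class SimulationNode:
            def __init__(self, parameters, parent=None):
                self.parameters = dict(parameters)
                self.parent = parent
                self.children = []
                self.results = None          # filled in when this instance is run

            def branch(self, **changed_parameters):
                # parameter-space exploration loop: fork a new instance with some
                # parameters modified, keeping the rest from the parent
                params = {**self.parameters, **changed_parameters}
                child = SimulationNode(params, parent=self)
                self.children.append(child)
                return child

            def path_from_root(self):
                # the user's exploration history for this instance, root first
                node, path = self, []
                while node is not None:
                    path.append(node.parameters)
                    node = node.parent
                return list(reversed(path))

        # usage: branch two convoy-attack scenarios from a baseline
        root = SimulationNode({"convoy_speed": 20, "escorts": 2})
        fast = root.branch(convoy_speed=35)
        armored = root.branch(escorts=4)
        print(len(root.children), fast.path_from_root())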