320 research outputs found

    Unlocking Large Scale Uncertainty Quantification with In Transit Iterative Statistics

    Get PDF
    Multi-run numerical simulations using supercomputers are increasingly used by physicists and engineers to deal with input-data and model uncertainties. Most of the time, the input parameters of a simulation are modeled as random variables, and the simulation is then run a (possibly large) number of times with input parameters varied according to a specific design of experiments. Uncertainty quantification for numerical simulations is a hard computational problem, currently bounded by the large size of the produced results. This book chapter is about using in situ techniques to enable large-scale uncertainty quantification studies. We provide a comprehensive description of Melissa, a file-avoiding, adaptive, fault-tolerant, and elastic framework that computes statistical quantities of interest in transit. Melissa currently implements the on-the-fly computation of the statistics necessary for the realization of large-scale uncertainty quantification studies: moment-based statistics (mean, standard deviation, higher orders), quantiles, Sobol' indices, and threshold exceedances.
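    The moment statistics computed on the fly can be maintained with one-pass update formulas, so no simulation result ever needs to be written to disk before the statistics are updated. A minimal sketch of the idea using Welford's algorithm for the running mean and variance (a generic illustration of iterative statistics, not Melissa's actual code):

```python
# One-pass (iterative) mean/variance update in the style used by
# in-transit statistics engines: each arriving simulation result
# updates the running statistics and can then be discarded.
class RunningMoments:
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # sum of squared deviations from the current mean

    def update(self, x):
        # Welford's update: numerically stable, O(1) memory per quantity
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self):
        # unbiased sample variance
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0

stats = RunningMoments()
for x in [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]:
    stats.update(x)
print(stats.mean, stats.variance)
```

    The same pattern extends to higher-order moments, and per-quantile and Sobol'-index estimators admit analogous streaming updates.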

    Active Localization of Gas Leaks using Fluid Simulation

    Get PDF
    Sensors are routinely mounted on robots to acquire various forms of measurements in spatio-temporal fields. Locating features within these fields and reconstruction (mapping) of the dense fields can be challenging in resource-constrained situations, such as when trying to locate the source of a gas leak from a small number of measurements. In such cases, a model of the underlying complex dynamics can be exploited to discover informative paths within the field. We use a fluid simulator as a model to guide inference for the location of a gas leak. We perform localization via minimization of the discrepancy between observed measurements and gas concentrations predicted by the simulator. Our method is able to account for dynamically varying parameters of wind flow (e.g., direction and strength) and their effects on the observed distribution of gas. We develop algorithms for off-line inference as well as for on-line path discovery via active sensing. We demonstrate the efficiency, accuracy, and versatility of our algorithm using experiments with a physical robot conducted in outdoor environments. We deploy an unmanned air vehicle (UAV) mounted with a CO2 sensor to automatically seek out a gas cylinder emitting CO2 via a nozzle. We evaluate the accuracy of our algorithm by measuring the error in the inferred location of the nozzle, based on which we show that our proposed approach is competitive with respect to state-of-the-art baselines. Comment: Accepted as a journal paper at IEEE Robotics and Automation Letters (RA-L).
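    The discrepancy-minimization step can be illustrated with a toy forward model standing in for the fluid simulator: for each hypothesized leak location, predict the sensor readings and keep the candidate with the smallest misfit. The Gaussian concentration model, sensor layout, and grid search below are illustrative assumptions, not the paper's method:

```python
import numpy as np

# Localization by discrepancy minimization: a forward model predicts
# concentrations for a hypothesized source, and we keep the candidate
# minimizing the misfit with the sensor readings. The Gaussian profile
# is a crude stand-in for the fluid simulator.
def forward_model(source, sensors, spread=1.0):
    # predicted concentration at each sensor for a given source location
    d2 = np.sum((sensors - source) ** 2, axis=1)
    return np.exp(-d2 / (2 * spread ** 2))

rng = np.random.default_rng(0)
true_source = np.array([2.0, 3.0])
sensors = rng.uniform(0, 5, size=(30, 2))       # measurement locations
observed = forward_model(true_source, sensors)  # noise-free readings

# grid search over candidate source positions
xs = np.linspace(0, 5, 51)
candidates = np.array([(x, y) for x in xs for y in xs])
misfit = [np.sum((forward_model(c, sensors) - observed) ** 2) for c in candidates]
best = candidates[int(np.argmin(misfit))]
print(best)  # close to the true source (2.0, 3.0)
```

    The paper's on-line variant additionally chooses where to measure next (active sensing); the sketch only shows the off-line inference structure.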

    Individual-based modeling and predictive simulation of fungal infection dynamics

    Get PDF
    The human-pathogenic fungus Aspergillus fumigatus causes life-threatening infections in immunocompromised patients and poses increasing challenges for modern medicine. A. fumigatus is ubiquitously present and disseminates via small airborne conidia. Each human inhales several hundred to thousands of conidia every day. The small size of the conidia allows them to pass into the alveoli of the lung, where primary infections with A. fumigatus are typically observed. In the alveoli, the interaction between the fungus and the innate immune system of the host takes place. This interaction is the core topic of this thesis and is studied by mathematical modeling and computer simulation. Since in vivo laboratory studies of A. fumigatus infections under physiological conditions are hard to realize, a modular software framework was developed and implemented that allows for spatio-temporal agent-based modeling and simulation. A to-scale A. fumigatus infection model of a typical human alveolus was developed in order to simulate and analyze the infection scenario under physiological conditions. The process of conidial discovery by alveolar macrophages was modeled and simulated with different migration modes and parameter configurations, taking into account migration speed as well as the presence, diffusion, and chemotactic sensing of chemokines. It could be shown that chemotactic migration was required to find the pathogen before the onset of germination. A second model took advantage of evolutionary game theory on graphs. Here, the course of infection was modeled as a consecutive sequence of evolutionary games involving the complement system, alveolar macrophages, and polymorphonuclear neutrophilic granulocytes. The results revealed a central immunoregulatory role of alveolar macrophages. In the case of high infectious doses, the host was found to require fully active phagocytes and, in particular, a qualitative response by a quantitatively sufficient number of polymorphonuclear neutrophilic granulocytes.
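    The advantage of chemotactic migration over undirected search can be sketched with a minimal agent-based toy model: a single macrophage agent walks on a grid until it reaches a conidium, either randomly or by following an (here idealized) chemokine gradient. Grid size, movement rules, and seed are illustrative assumptions; the thesis model is to-scale, three-dimensional, and far more detailed:

```python
import random

# Minimal agent-based sketch of the conidium-search scenario: one
# macrophage agent per run, moving on a bounded grid until it reaches
# the conidium. Chemotaxis is idealized as "step along an axis that
# reduces the distance to the source".
def search(chemotaxis, grid=30, seed=1, max_steps=100_000):
    rng = random.Random(seed)
    agent, conidium = (0, 0), (grid - 1, grid - 1)
    for step in range(1, max_steps + 1):
        if chemotaxis:
            # gradient-following move: close the gap on one axis
            dx = 1 if conidium[0] > agent[0] else -1 if conidium[0] < agent[0] else 0
            dy = 1 if conidium[1] > agent[1] else -1 if conidium[1] < agent[1] else 0
            move = rng.choice([(dx, 0), (0, dy)])
        else:
            # unbiased random walk
            move = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        agent = (min(grid - 1, max(0, agent[0] + move[0])),
                 min(grid - 1, max(0, agent[1] + move[1])))
        if agent == conidium:
            return step
    return max_steps

# chemotactic search reaches the conidium far sooner than a random walk
print(search(chemotaxis=True), search(chemotaxis=False))
```

    In the thesis the same qualitative effect appears under physiological time constraints: only chemotactic migration finds the conidium before germination.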

    Machine Learning and Its Application to Reacting Flows

    Get PDF
    This open access book introduces and explains machine learning (ML) algorithms and techniques developed for statistical inference on complex processes or systems, and their application to simulations of chemically reacting turbulent flows. These two fields, ML and turbulent combustion, each have a large body of work and knowledge of their own; this book brings them together and explains the complexities and challenges involved in applying ML techniques to simulate and study reacting flows. This matters for the world’s total primary energy supply (TPES): more than 90% of this supply comes from combustion technologies, and combustion has non-negligible effects on the environment. Although alternative technologies based on renewable energies are emerging, their share of the TPES is currently less than 5%, and a complete paradigm shift would be needed to replace combustion sources. Whether this is practical is an entirely different question, and the answer depends on the respondent. However, a pragmatic analysis suggests that the combustion share of TPES is likely to remain above 70% even by 2070. Hence, it is prudent to take advantage of ML techniques to improve combustion science and technology so that efficient and “greener” combustion systems that are friendlier to the environment can be designed. The book covers the current state of the art in these two topics and outlines the challenges involved, the merits and drawbacks of using ML for turbulent combustion simulations, and avenues that can be explored to overcome the challenges. The required mathematical equations and background are discussed, with ample references for readers who wish to find further detail. This book is unique in its coverage of topics, ranging from big data analysis and machine learning algorithms to their applications in combustion science and system design for energy generation.

    Adaptive Asynchronous Control and Consistency in Distributed Data Exploration Systems

    Get PDF
    Advances in machine learning and streaming systems provide a backbone to transform vast arrays of raw data into valuable information. Leveraging distributed execution, analysis engines can process this information effectively within an iterative data exploration workflow to solve problems at unprecedented rates. However, with increased input dimensionality, a desire to simultaneously share and isolate information, and overlapping, dependent tasks, this process is becoming increasingly difficult to maintain. User interaction derails exploratory progress due to manual oversight of lower-level tasks such as tuning parameters, adjusting filters, and monitoring queries. We identify human-in-the-loop management of data generation and distributed analysis as an inhibiting problem that precludes efficient online, iterative data exploration and causes delays in knowledge discovery and decision making. The flexible and scalable systems implementing the exploration workflow require semi-autonomous methods integrated as architectural support to reduce human involvement. We thus argue that an abstraction layer providing adaptive asynchronous control and consistency management over a series of individual tasks coordinated to achieve a global objective can significantly improve data exploration effectiveness and efficiency. This thesis introduces methodologies that autonomously coordinate distributed execution at a lower level in order to synchronize multiple efforts as part of a common goal. We demonstrate the impact on data exploration through serverless simulation ensemble management and multi-model machine learning, showing improved performance and reduced resource utilization that enable a more productive semi-autonomous exploration workflow. We focus on the specific domains of molecular dynamics and personalized healthcare; however, the contributions are applicable to a wide variety of domains.

    Automating Camera Placement for In Situ Visualization

    Get PDF
    Trends in high-performance computing increasingly require visualization to be carried out using in situ processing. This processing most often occurs without a human in the loop, meaning that the in situ software must be able to carry out its tasks without human guidance. This dissertation explores this topic, focusing on automating camera placement for in situ visualization when there is no a priori knowledge of where to place the camera. We introduce a new approach for this automation process, which depends on Viewpoint Quality (VQ) metrics that quantify how much insight a camera position provides. This research involves three major sub-projects: (1) performing a user survey to determine the viewpoint preferences of scientific users, as well as developing new VQ metrics that can predict preference 68% of the time; (2) parallelizing VQ metrics and designing search algorithms so they can be executed efficiently in situ; and (3) evaluating the behavior of camera placement for time-varying data to determine how often a new camera placement should be considered. In all, this dissertation shows that automating in situ camera placement for scientific simulations is possible on exascale computers and provides insight into best practices.
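    The evaluate-then-search structure of VQ-metric-driven camera placement can be sketched with a deliberately simple proxy metric: score each candidate camera direction by the front-facing surface area it sees and keep the best viewpoint. The cube geometry, the spherical candidate grid, and the metric itself are illustrative assumptions, not the dissertation's VQ metrics:

```python
import numpy as np

# Camera placement by viewpoint-quality search: evaluate a VQ metric
# for every candidate camera direction and keep the argmax. The proxy
# metric here is total front-facing area (normal . view-direction).
def vq_score(cam_dir, normals, areas):
    # front-facing area visible from unit view direction cam_dir
    facing = normals @ cam_dir
    return float(np.sum(areas * np.clip(facing, 0.0, None)))

# toy geometry: the six faces of a unit cube (outward normals, equal areas)
normals = np.array([[1, 0, 0], [-1, 0, 0], [0, 1, 0],
                    [0, -1, 0], [0, 0, 1], [0, 0, -1]], dtype=float)
areas = np.ones(6)

# candidate camera directions on a sphere around the object
best_dir, best_score = None, -1.0
for theta in np.linspace(0, np.pi, 16):
    for phi in np.linspace(0, 2 * np.pi, 32, endpoint=False):
        d = np.array([np.sin(theta) * np.cos(phi),
                      np.sin(theta) * np.sin(phi),
                      np.cos(theta)])
        s = vq_score(d, normals, areas)
        if s > best_score:
            best_dir, best_score = d, s

print(best_score)  # best views see three faces of the cube at once
```

    An in situ version must parallelize the metric evaluation over distributed mesh data and amortize the search across time steps, which is what sub-projects (2) and (3) address.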

    Volumetric Isosurface Rendering with Deep Learning-Based Super-Resolution

    Full text link
    Rendering an accurate image of an isosurface in a volumetric field typically requires large numbers of data samples. Reducing the number of required samples lies at the core of research in volume rendering. With the advent of deep learning networks, a number of architectures have been proposed recently to infer missing samples in multi-dimensional fields, for applications such as image super-resolution and scan completion. In this paper, we investigate the use of such architectures for learning the upscaling of a low-resolution sampling of an isosurface to a higher resolution, with high-fidelity reconstruction of spatial detail and shading. We introduce a fully convolutional neural network to learn a latent representation generating a smooth, edge-aware normal field and ambient occlusions from a low-resolution normal and depth field. By adding a frame-to-frame motion loss into the learning stage, the upscaling can consider temporal variations and achieves improved frame-to-frame coherence. We demonstrate the quality of the network for isosurfaces which were never seen during training, and discuss remote and in-situ visualization as well as focus+context visualization as potential applications.
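    The frame-to-frame motion loss can be sketched as follows: warp the previous output along the known screen-space motion and penalize its difference from the current output. The integer-pixel np.roll warp below is a crude stand-in for the flow-based warping a real training pipeline would use, and the loss shown is a generic illustration rather than the paper's exact formulation:

```python
import numpy as np

# Temporal-coherence loss sketch: warp the previous frame by the known
# screen-space motion and compare with the current frame; minimizing
# this term during training discourages frame-to-frame flicker.
def temporal_loss(prev_frame, cur_frame, motion):
    dy, dx = motion  # integer pixel motion from prev to cur
    warped = np.roll(prev_frame, shift=(dy, dx), axis=(0, 1))
    return float(np.mean((warped - cur_frame) ** 2))

# a moving bright square: frame t+1 is frame t shifted by (1, 2) pixels
prev = np.zeros((16, 16))
prev[4:8, 4:8] = 1.0
cur = np.roll(prev, shift=(1, 2), axis=(0, 1))

print(temporal_loss(prev, cur, motion=(1, 2)))  # 0.0: perfectly coherent
print(temporal_loss(prev, cur, motion=(0, 0)))  # > 0: motion ignored
```

    In training, this term is added to the per-frame reconstruction loss so the network trades a little spatial fidelity for temporal stability.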

    Lagrangian studies, circulation and mixing in the Southern Ocean

    Get PDF
    Oceans play a vital role as one of the major components of Earth's climate system. The study of oceanic processes and the complexity inherent in dynamic flows is essential for understanding their regulatory character on the climate's variability. A key region for the study of such intrinsic oceanic variability is the Southern Ocean. In the form of a wind-driven, zonally unbounded, strong eastward flow, the Antarctic Circumpolar Current (ACC) circumnavigates the Antarctic continent, connecting each of the ocean basins. The dynamics of the ACC, which is characterised by the absence of land barriers apart from when crossing Drake Passage, have long been a topic of debate [Rintoul et al., 2001]. The main interests of this study focus on inferring and mapping the dynamic variability the ACC exhibits by means of transient disturbances [Hughes, 2005] (such as mesoscale eddies) and subsequent mixing from Lagrangian trajectories. The distribution of eddy transport and intensity, the mixing of conservative quantities, and flow dynamics through to the interaction of eddy kinetic energy, mean flow, and topography are examined. The sparseness of observations in the Southern Ocean and the necessity to understand the role of the oceanic circulation in the climate by a holistic approach highlight computational ocean circulation models as indispensable. In the context of this study, output from run401 of the Ocean Circulation and Climate Advanced Modelling (OCCAM) 1/12° ocean model, developed at the U.K. National Oceanography Centre, is utilised. In order to deduce the temporal and spatial variability of the flow dynamics, as well as its vertical distribution, simulations of monthly releases of passive particles using different schemes (i.e. cluster or linear alignment) on isobaric and isoneutral surfaces were conducted. An analysis of the Lagrangian trajectories reveals the characteristics of the dynamics that control the flow and depicts regions of enhanced eddy activity and mixing. 
    The model's ability to simulate real oceanic flows is established through comparison with a purposeful release of the tracer CF3SF5, conducted as part of the DIMES experiment (http://dimes.ucsd.edu/). We find that topography plays a fundamental role in the context of Southern Ocean mixing through the association of high-EKE regions, where the interaction of vortical elements and multi-filamented jets in non-parallel flows supports an effective mechanism for eddy stirring, resulting in the enhanced dispersion of particles. Suppression of mixing in regions where the flow is delineated by intensified and coherent (both in space and time) jets with strong PV gradients, signifying the separation of the flow into differentiated kinematic environments, is illustrated. The importance of a local approximation to mixing instead of the construction of zonal averages is presented. We present the caveats of classical diffusion theory in the presence of persistent structures and find that values of 1000-2000 m²
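    The Lagrangian estimation of an eddy diffusivity from particle dispersion can be sketched in one dimension: release a cluster of passive particles, advect them with a mean flow plus random eddy kicks, and read off K ≈ var(x)/(2t) from the growth of the cluster's spread. All numbers below are illustrative, not OCCAM output:

```python
import random

# Lagrangian dispersion sketch: passive particles advected by a mean
# flow plus Gaussian "eddy" kicks; the eddy diffusivity is estimated
# from the spread of the cluster, K ~ var(x) / (2 t).
def advect(n_particles=2000, n_steps=500, dt=1.0, u_mean=0.1, eddy=0.05, seed=0):
    rng = random.Random(seed)
    xs = [0.0] * n_particles
    for _ in range(n_steps):
        for i in range(n_particles):
            xs[i] += u_mean * dt + rng.gauss(0.0, eddy)  # mean flow + eddy kick
    t = n_steps * dt
    mean = sum(xs) / n_particles
    var = sum((x - mean) ** 2 for x in xs) / n_particles
    return mean / t, var / (2 * t)  # mean drift speed, eddy diffusivity

drift, k = advect()
print(drift, k)  # drift near u_mean; K near eddy**2 / (2 * dt)
```

    A local estimate of this kind, computed per region from trajectory clusters rather than from zonal averages, is the quantity the study maps across the Southern Ocean.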