
    Epistemic and Aleatoric Uncertainty Quantification and Surrogate Modelling in High-Performance Multiscale Plasma Physics Simulations

    This work presents several methods of uncertainty treatment in multiscale modelling and describes their application to a system of coupled turbulent transport simulations of a tokamak plasma. We propose a method to quantify the aleatoric uncertainty of a system in a quasi-stationary state, estimating the mean values and their errors for quantities of interest, which in the case of turbulence simulations are the average heat fluxes. The method defines the stationarity of the system and suggests a way to balance the computational cost of a simulation against the accuracy of the estimate. Contrary to many approaches, this allows aleatoric uncertainties to be incorporated in the analysis of the model and provides a quantifiable criterion for deciding on simulation runtime. Furthermore, the paper describes methods for quantifying the epistemic uncertainty of a model and reports the results of such a procedure for turbulence simulations, identifying the model's sensitivity to particular input parameters as well as its sensitivity to the uncertainties in total. Finally, we introduce a surrogate-model approach based on Gaussian process regression and present preliminary results of training such a model on turbulence simulation data and analysing its performance. This approach shows the potential to decrease the computational cost of the uncertainty propagation for the given model significantly, making it feasible on current HPC systems.
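    The abstract describes a two-step workflow: fit a Gaussian process surrogate to a modest number of expensive simulation runs, then propagate input uncertainty through the cheap surrogate instead of the simulator. The sketch below illustrates that idea only; the simulator, kernel choice, and input distribution are hypothetical stand-ins, not the paper's actual pipeline.

```python
# Illustrative sketch: GP surrogate trained on a few simulation
# input/output pairs, then used for cheap Monte Carlo uncertainty
# propagation. run_simulation() and all parameters are assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel, WhiteKernel

rng = np.random.default_rng(0)

def run_simulation(x):
    # Hypothetical stand-in for an expensive turbulence simulation
    # mapping input parameters to a scalar quantity of interest
    # (e.g., an average heat flux).
    return np.sin(3 * x[0]) + 0.5 * x[1] ** 2

# Small design of experiments in a 2-D input space.
X_train = rng.uniform(-1, 1, size=(30, 2))
y_train = np.array([run_simulation(x) for x in X_train])

kernel = ConstantKernel(1.0) * RBF(length_scale=[0.5, 0.5]) \
         + WhiteKernel(noise_level=1e-3)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(X_train, y_train)

# Propagate an assumed Gaussian input uncertainty through the
# surrogate: thousands of evaluations at negligible cost.
X_mc = rng.normal(loc=0.0, scale=0.3, size=(10_000, 2))
y_mc, y_std = gp.predict(X_mc, return_std=True)
print(f"QoI mean = {y_mc.mean():.3f}, epistemic std (avg) = {y_std.mean():.3f}")
```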

    Optimizing Nuclear Reaction Analysis (NRA) using Bayesian Experimental Design

    Nuclear Reaction Analysis with ³He holds the promise of measuring deuterium depth profiles up to large depths. However, the extraction of the depth profile from the measured data is an ill-posed inversion problem. Here we demonstrate how Bayesian experimental design can be used to optimize the number of measurements as well as the measurement energies to maximize the information gain. Comparison of the inversion properties of the optimized design with standard settings reveals huge possible gains. Application of the posterior sampling method allows the experimental settings to be optimized interactively during the measurement process.

    Comment: Bayesian Inference and Maximum Entropy Conference 2008, AIP Conference Proceedings 1073, pp. 348–358, 4 figures
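    Bayesian experimental design of this kind typically scores each candidate measurement setting by its expected information gain (EIG) and measures where the score is highest. The following sketch shows a generic nested Monte Carlo EIG estimator over candidate energies; the forward model, prior, and noise level are toy assumptions, not the NRA physics used in the paper.

```python
# Hedged sketch of design via expected information gain (EIG),
# estimated by nested Monte Carlo. forward() is a hypothetical
# detector response, not a real ³He reaction cross-section model.
import numpy as np

rng = np.random.default_rng(1)
SIGMA = 0.05  # assumed measurement noise

def forward(theta, energy):
    return theta * np.exp(-energy) + 0.1 * energy

def expected_information_gain(energy, n_outer=500, n_inner=500):
    theta_outer = rng.normal(1.0, 0.3, n_outer)          # prior draws
    y = forward(theta_outer, energy) + rng.normal(0, SIGMA, n_outer)
    theta_inner = rng.normal(1.0, 0.3, n_inner)
    # Log-likelihood of each simulated datum under its own theta...
    log_lik = -0.5 * ((y - forward(theta_outer, energy)) / SIGMA) ** 2
    # ...minus the log marginal likelihood over fresh prior draws
    # (the shared Gaussian normalization constant cancels).
    diff = y[:, None] - forward(theta_inner[None, :], energy)
    log_marg = np.log(np.mean(np.exp(-0.5 * (diff / SIGMA) ** 2), axis=1))
    return np.mean(log_lik - log_marg)

candidate_energies = np.linspace(0.5, 3.0, 11)
eig = [expected_information_gain(E) for E in candidate_energies]
best = candidate_energies[int(np.argmax(eig))]
print(f"most informative measurement energy: {best:.2f}")
```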

    MaxEnt 2019—Proceedings of the 39th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering

    This Proceedings book presents papers from the 39th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, MaxEnt 2019. The workshop took place at the Max Planck Institute for Plasma Physics in Garching near Munich, Germany, from 30 June to 5 July 2019, and invited contributions on all aspects of probabilistic inference, including novel techniques, applications, and work that sheds new light on the foundations of inference. Addressed are inverse problems and uncertainty quantification (UQ) problems arising from a large variety of applications, such as earth science, astrophysics, material and plasma science, imaging in geophysics and medicine, nondestructive testing, density estimation, remote sensing, Gaussian process (GP) regression, optimal experimental design, data assimilation, and data mining.

    Outlier-Robust Surrogate Modeling of Ion–Solid Interaction Simulations

    Data for complex plasma–wall interactions require long-running and expensive computer simulations. Furthermore, the number of input parameters is large, which results in low coverage of the (physical) parameter space. The unpredictable occurrence of outliers creates a need to explore this multi-dimensional space with robust analysis tools. We restate the Gaussian process (GP) method as a Bayesian adaptive exploration method for establishing surrogate surfaces in the variables of interest. On this basis, we expand the analysis with the Student-t process (TP) method in order to improve the robustness of the result with respect to outliers. The most obvious difference between the two methods shows up in the marginal likelihood for the hyperparameters of the covariance function, where the TP method features a broader marginal probability distribution in the presence of outliers. Finally, we provide first investigations of a mixture likelihood of two Gaussians within a Gaussian process ansatz, describing either outlier or non-outlier behaviour. The parameters of the two Gaussians are set such that the mixture likelihood resembles the shape of a Student-t likelihood.
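    The robustness effect described above can be seen directly in the marginal likelihood (evidence) of the two models. The sketch below contrasts the GP log evidence with that of a Student-t process (following the multivariate-t formulation of Shah et al., 2014) on the same kernel matrix, before and after injecting one gross outlier; the kernel, data, and degrees of freedom are illustrative assumptions.

```python
# Minimal sketch: GP vs. Student-t process (TP) log marginal
# likelihood on identical data. The TP evidence degrades far less
# when an outlier is injected. All settings are illustrative.
import numpy as np
from scipy.special import gammaln

def rbf_kernel(x, length_scale=1.0, jitter=1e-6):
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-0.5 * d2 / length_scale**2) + jitter * np.eye(len(x))

def gp_log_evidence(y, K):
    n = len(y)
    alpha = np.linalg.solve(K, y)
    _, logdet = np.linalg.slogdet(K)
    return -0.5 * (y @ alpha + logdet + n * np.log(2 * np.pi))

def tp_log_evidence(y, K, nu=3.0):
    # Multivariate Student-t with scale matrix K, nu degrees of freedom.
    n = len(y)
    alpha = np.linalg.solve(K, y)
    _, logdet = np.linalg.slogdet(K)
    return (gammaln((nu + n) / 2) - gammaln(nu / 2)
            - 0.5 * n * np.log(nu * np.pi) - 0.5 * logdet
            - 0.5 * (nu + n) * np.log1p(y @ alpha / nu))

rng = np.random.default_rng(2)
x = np.linspace(0, 1, 40)
K = rbf_kernel(x, length_scale=0.2)
y = rng.multivariate_normal(np.zeros_like(x), K)
y_out = y.copy()
y_out[10] += 5.0  # inject a single gross outlier

for label, data in [("clean", y), ("with outlier", y_out)]:
    print(f"{label:>13}: GP {gp_log_evidence(data, K):8.1f}  "
          f"TP {tp_log_evidence(data, K):8.1f}")
```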

    Sequential Batch Design for Gaussian Processes Employing Marginalization

    Within the Bayesian framework, we utilize Gaussian processes for parametric studies of long-running computer codes. Since the simulations are expensive, it is necessary to exploit the computational budget in the best possible manner. Employing the sum over variances, an indicator of the quality of the fit, as the utility function, we establish an optimized and automated sequential parameter selection procedure. However, it is often also desirable to utilize the parallel capabilities of present computer technology and abandon sequential parameter selection in favour of a faster overall turn-around (wall-clock) time. This paper proposes to achieve this by marginalizing over the expected outcomes at optimized test points in order to set up a pool of starting values for batch execution. For a one-dimensional test case, the numerical results are validated against the analytical solution. Finally, a systematic convergence study demonstrates the advantage of the optimized approach over randomly chosen parameter settings.
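    One common way to realize this batch idea, possibly differing in detail from the paper's marginalization scheme, is the "kriging believer" heuristic: pick the point of highest predictive variance, impute its expected outcome (the posterior mean), refit, and repeat until the batch is full. The sketch below shows that heuristic under stated assumptions; the 1-D test function, kernel, and names are illustrative.

```python
# Sketch of batch point selection for a GP surrogate: maximum
# predictive variance as the sequential criterion, with expected
# outcomes imputed at chosen points before the next selection
# ("kriging believer"). All settings are assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def propose_batch(gp, X_train, y_train, candidates, batch_size=4):
    X, y = X_train.copy(), y_train.copy()
    batch = []
    for _ in range(batch_size):
        gp.fit(X, y)
        _, std = gp.predict(candidates, return_std=True)
        x_next = candidates[int(np.argmax(std))]       # highest uncertainty
        y_fantasy = gp.predict(x_next.reshape(1, -1))  # expected outcome
        X = np.vstack([X, x_next])
        y = np.append(y, y_fantasy)
        batch.append(x_next)
    return np.array(batch)

f = lambda x: np.sin(5 * x).ravel()                    # stand-in simulator
X0 = np.array([[0.1], [0.5], [0.9]])
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2))
grid = np.linspace(0, 1, 201).reshape(-1, 1)
batch = propose_batch(gp, X0, f(X0), grid)
print("next batch of simulation inputs:", batch.ravel())
```

    All points in the returned batch can then be handed to the expensive code in parallel, which is the wall-clock advantage the abstract describes.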

    Radiometric Scale Transfer Using Bayesian Model Selection

    The key input quantity for climate modelling and weather forecasts is the solar beam irradiance, i.e., the primary amount of energy provided by the sun. Despite its importance, the absolute accuracy of the measurements is limited, which affects not only the modelling but also ground-truth tests of satellite observations. Here we focus on the problem of improving instrument calibration based on dedicated measurements. A Bayesian analysis reveals that the standard approach yields inferior results. An alternative method, based on a monomial-based selection of regression functions combined with model selection, is shown to yield superior estimates for a wide range of conditions. The approach is illustrated on selected data, and possible further enhancements are outlined.
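    Model selection over monomial regression bases can be illustrated with a simple evidence-style criterion. The sketch below uses the BIC approximation rather than the paper's exact Bayesian evidence, and synthetic data in place of radiometric measurements; both substitutions are assumptions made for brevity.

```python
# Hedged sketch: compare regression models built from different
# monomial sets via BIC (an evidence approximation) and keep the
# best. Data and candidate sets are synthetic stand-ins.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(3)
x = np.linspace(0, 1, 50)
y = 1.0 + 2.0 * x**2 + rng.normal(0, 0.05, x.size)  # synthetic signal

def bic(x, y, powers):
    # Least-squares fit using the chosen monomials x**p.
    A = np.column_stack([x**p for p in powers])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    rss = np.sum((y - A @ coef) ** 2)
    n, k = len(y), len(powers)
    return n * np.log(rss / n) + k * np.log(n)

candidate_powers = range(0, 4)
models = [c for r in (1, 2, 3) for c in combinations(candidate_powers, r)]
best = min(models, key=lambda p: bic(x, y, p))
print("selected monomial powers:", best)  # expect (0, 2) to win here
```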

    Measuring deuterium permeation through tungsten near room temperature under plasma loading using a getter layer and ion-beam based detection

    A method to measure deuterium permeation through tungsten near room temperature under plasma loading is presented. The permeating deuterium is accumulated in a getter layer of zirconium, titanium or erbium, respectively, on the unexposed side of the sample. Subsequently, the amount of deuterium in the getter is measured ex situ using nuclear reaction analysis. A cover-layer system on the getter prevents direct loading of the getter with deuterium from the gas phase during plasma loading. In addition, it enables the distinction between deuterium in the getter and deuterium at the cover surface. The method appears promising for adding permeation measurement capabilities to deuterium retention experiments, also in other plasma devices, without the need for a complex in-situ permeation measurement setup.

    Keywords: deuterium, plasma, permeation, tungsten, getter, ion beam

    A Bayesian Approach to the Estimation of Parameters and Their Interdependencies in Environmental Modeling

    We present a case study of Bayesian analysis and the proper representation of distributions and dependence among parameters when calibrating process-oriented environmental models. A simple water quality model for the Elbe River (Germany) serves as an example, but the approach is applicable to a wide range of environmental models with time-series output. Model parameters are estimated by Bayesian inference via Markov chain Monte Carlo (MCMC) sampling. While the best-fit solution matches the usual least-squares model calibration (with a penalty term for excessive parameter values), the Bayesian approach has the advantage of yielding a joint probability distribution for the parameters. This posterior distribution encompasses all parameter combinations that produce a simulation output fitting the observed data within measurement and modeling uncertainty. Bayesian inference further permits the introduction of prior knowledge, e.g., the positivity of certain parameters. The estimated distribution shows to what extent model parameters are controlled by observations through the process of inference, highlighting issues that cannot be settled unless more information becomes available. An interactive interface enables tracking of how the ranges of parameter values consistent with the observations change during a step-by-step assignment of fixed parameter values. Based on an initial analysis of the posterior via an undirected Gaussian graphical model, a directed Bayesian network (BN) is constructed. The BN transparently conveys information on the interdependence of parameters after calibration. Finally, a strategy to reduce the number of expensive model runs in MCMC sampling is introduced, based on a newly developed variant of delayed-acceptance sampling with a Gaussian process surrogate and linear dimensionality reduction to support function-valued outputs.
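    The core of delayed-acceptance sampling is a two-stage Metropolis test: a cheap surrogate posterior screens each proposal, and the expensive model is evaluated only for proposals that pass, with a second-stage correction that preserves the exact posterior (Christen and Fox, 2005). The sketch below shows that generic mechanism only; both log-posteriors are toy placeholders, and the paper's GP surrogate and dimensionality reduction for function-valued outputs are omitted.

```python
# Sketch of delayed-acceptance Metropolis: stage 1 screens with a
# cheap (here: toy) surrogate, stage 2 corrects with the expensive
# model so the exact posterior is preserved. Placeholders throughout.
import numpy as np

rng = np.random.default_rng(4)

def log_post_expensive(theta):   # stand-in for a full model run
    return -0.5 * np.sum((theta - 1.0) ** 2) / 0.1

def log_post_surrogate(theta):   # cheap, slightly biased approximation
    return -0.5 * np.sum((theta - 0.95) ** 2) / 0.12

theta = np.zeros(2)
lp_exp = log_post_expensive(theta)
chain, expensive_calls = [], 0
for _ in range(5000):
    prop = theta + rng.normal(0, 0.3, size=2)
    # Stage 1: accept/reject using the surrogate only.
    log_a1 = log_post_surrogate(prop) - log_post_surrogate(theta)
    if np.log(rng.uniform()) < log_a1:
        # Stage 2: expensive evaluation only for surviving proposals;
        # the ratio divides out the stage-1 surrogate ratio.
        expensive_calls += 1
        lp_prop = log_post_expensive(prop)
        log_a2 = (lp_prop - lp_exp) - log_a1
        if np.log(rng.uniform()) < log_a2:
            theta, lp_exp = prop, lp_prop
    chain.append(theta.copy())
print(f"expensive evaluations: {expensive_calls} / 5000 proposals")
```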