
    Distribution-free stochastic simulation methodology for model updating under hybrid uncertainties

    In the real world, a significant challenge faced in the safe operation and maintenance of infrastructures is the lack of available information or data. This results in a large degree of uncertainty and the requirement for robust and efficient uncertainty quantification (UQ) tools to derive the most realistic estimates of the behavior of structures. While the probabilistic approach has long been utilized as an essential tool for the quantitative mathematical representation of uncertainty, a common criticism is that the approach often involves unsubstantiated subjective assumptions because of the scarcity or imprecision of available information. To avoid the inclusion of subjectivity, the concepts of imprecise probabilities have been developed, and the distributional probability-box (p-box) has gained the most attention among the various types of imprecise probability models since it provides a clear separation between aleatory and epistemic uncertainty. This thesis concerns the realistic consideration and numerically efficient calibration and propagation of aleatory and epistemic uncertainties (hybrid uncertainties) based on the distributional p-box. Recent developments, including the Bhattacharyya distance-based approximate Bayesian computation (ABC) and non-intrusive imprecise stochastic simulation (NISS) methods, have strengthened the subjective assumption-free approach to uncertainty calibration and propagation. However, these methods based on the distributional p-box rely on the availability of prior knowledge determining a specific distribution family for the p-box. The target of this thesis is hence to develop a distribution-free approach for the calibration and propagation of hybrid uncertainties, strengthening the subjective assumption-free UQ approach. To achieve this target, the thesis presents five main developments that improve the Bhattacharyya distance-based ABC and NISS frameworks. The first development improves the scope of application and efficiency of the Bhattacharyya distance-based ABC. A dimension reduction procedure is proposed to evaluate the Bhattacharyya distance when the system under investigation is described by time-domain sequences. Moreover, an efficient Bayesian inference method within the Bayesian updating with structural reliability methods (BUS) framework is developed by combining BUS with the adaptive Kriging-based reliability method AK-MCMC. The second development is a distribution-free stochastic model updating framework based on the combined application of staircase density functions and the Bhattacharyya distance. Staircase density functions can approximate a wide range of distributions arbitrarily closely; this development therefore makes it possible to perform the Bhattacharyya distance-based ABC without limiting hypotheses on the distribution families of the parameters to be updated. The aforementioned two developments are then integrated in the third development to provide a solution to the latest edition (2019) of the NASA UQ challenge problem. The model updating tasks under very challenging conditions, where prior information on the aleatory parameters is limited to a common boundary, are successfully addressed with the above distribution-free stochastic model updating framework.
    Moreover, the NISS approach, which simplifies the high-dimensional optimization into a set of one-dimensional searches via a first-order high-dimensional model representation (HDMR) decomposition with respect to each design parameter, is developed to efficiently solve the reliability-based design optimization tasks. This challenge, at the same time, elucidates the limitations of the current developments; hence the fourth development addresses the limitation that staircase density functions are designed for univariate random variables and cannot account for parameter dependencies. In order to calibrate the joint distribution of correlated parameters, the distribution-free stochastic model updating framework is extended by characterizing the aleatory parameters using Gaussian copula functions whose marginals are staircase density functions. This further strengthens the assumption-free approach to uncertainty calibration, in which no prior information on the parameter dependencies is required. Finally, the fifth development is a distribution-free uncertainty propagation framework based on another application of the staircase density functions to the NISS class of methods, and it is applied to efficiently solve the reliability analysis subproblem of the NASA UQ challenge 2019. These five developments have successfully strengthened the assumption-free approach for both uncertainty calibration and propagation, thanks to the ability of the staircase density functions to approximate arbitrary distributions. The efficiency and effectiveness of the developments are demonstrated on real-world applications, including the NASA UQ challenge 2019.
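    As a concrete illustration of the metric at the heart of the first two developments, the following is a minimal sketch of a sample-based Bhattacharyya distance computed over shared histogram bins. The bin count, data, and binning scheme are illustrative assumptions, not the thesis's exact implementation.

```python
import numpy as np

def bhattacharyya_distance(samples_a, samples_b, n_bins=20):
    """Bhattacharyya distance between two sample sets via shared histogram bins."""
    lo = min(samples_a.min(), samples_b.min())
    hi = max(samples_a.max(), samples_b.max())
    edges = np.linspace(lo, hi, n_bins + 1)
    p, _ = np.histogram(samples_a, bins=edges)
    q, _ = np.histogram(samples_b, bins=edges)
    p = p / p.sum()                      # normalize counts to probability masses
    q = q / q.sum()
    bc = np.sum(np.sqrt(p * q))          # Bhattacharyya coefficient in [0, 1]
    return -np.log(max(bc, 1e-12))       # distance; guard against log(0)

rng = np.random.default_rng(0)
measured  = rng.normal(1.0, 0.2, 1000)   # stand-in for experimental data
simulated = rng.normal(1.1, 0.3, 1000)   # stand-in for model output samples
print(bhattacharyya_distance(measured, simulated))
```

    In an ABC setting, a candidate parameter sample would typically be accepted when this distance falls below a chosen tolerance.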

    Uncertainty management in multidisciplinary design of critical safety systems

    Managing uncertainty in the multidisciplinary design of safety-critical systems requires not a single approach or methodology for dealing with uncertainty but a set of different strategies and scalable computational tools (for instance, exploiting the computational power of cluster and grid computing). The availability of multiple tools and approaches for dealing with uncertainties allows cross-validation of the results and increases confidence in the performed analysis. This paper presents a unified theory and an integrated, open, general-purpose computational framework to deal with scarce data and aleatory and epistemic uncertainties. It allows solving the different tasks necessary to manage uncertainty, such as uncertainty characterization, sensitivity analysis, uncertainty quantification, and robust design. The proposed computational framework is applicable to problems in different fields and is numerically efficient and scalable, allowing for a significant reduction of the computational time required for uncertainty management and robust design. The applicability of the proposed approach is demonstrated by solving the multidisciplinary design of a critical system proposed by the NASA Langley Research Center in the multidisciplinary uncertainty quantification challenge problem.

    Investigation of robust optimization and evidence theory with stochastic expansions for aerospace applications under mixed uncertainty

    One of the primary objectives of this research is to develop a method to model and propagate mixed (aleatory and epistemic) uncertainty in aerospace simulations using the Dempster-Shafer theory of evidence (DSTE). In order to avoid the excessive computational cost associated with large-scale applications and the evaluation of Dempster-Shafer structures, stochastic expansions are implemented for efficient UQ. The mixed UQ with DSTE approach was demonstrated on an analytical example and a high-fidelity computational fluid dynamics (CFD) study of transonic flow over the RAE 2822 airfoil. Another objective is to devise a DSTE-based performance assessment framework through the use of quantification of margins and uncertainties. Efficient uncertainty propagation in system design performance metrics and performance boundaries is achieved through the use of stochastic expansions. The technique is demonstrated on: (1) a model problem with non-linear analytical functions representing the outputs and performance boundaries of two coupled systems and (2) a multi-disciplinary analysis of a supersonic civil transport. Finally, the stochastic expansions are applied to aerodynamic shape optimization under uncertainty. A robust optimization algorithm is presented for computationally efficient airfoil design under mixed uncertainty using a multi-fidelity approach. This algorithm exploits stochastic expansions to create surrogate models utilized in the optimization process. To reduce the computational cost, an output space mapping technique is implemented to replace the high-fidelity CFD model with a suitably corrected low-fidelity one. The proposed algorithm is demonstrated on the robust optimization of NACA 4-digit airfoils under mixed uncertainties in transonic flow. --Abstract, page iii
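    As a minimal illustration of the DSTE machinery underlying this approach, the sketch below computes belief and plausibility that a scalar quantity lies in a query interval, given a few focal elements with basic probability assignments. The focal elements and query interval are illustrative assumptions, not values from the study.

```python
import numpy as np

# Illustrative focal elements for a scalar output: (interval, basic probability assignment).
focal_elements = [((0.0, 0.4), 0.3),
                  ((0.2, 0.7), 0.5),
                  ((0.5, 1.0), 0.2)]

def belief_plausibility(query, elements):
    """Belief and plausibility that the quantity lies in the query interval."""
    ql, qu = query
    bel = sum(m for (l, u), m in elements if l >= ql and u <= qu)   # fully contained
    pl  = sum(m for (l, u), m in elements if u >= ql and l <= qu)   # any overlap
    return bel, pl

print(belief_plausibility((0.0, 0.6), focal_elements))   # e.g. (0.3, 1.0)
```

    Belief and plausibility bound the (imprecisely known) probability of the event, which is what makes the structure attractive for mixed aleatory/epistemic problems.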

    How to effectively compute the reliability of a thermal-hydraulic nuclear passive system

    The reliability of a thermal-hydraulic (T-H) passive system of a nuclear power plant can be computed by (i) Monte Carlo (MC) sampling the uncertainties of the system model and parameters, (ii) computing, for each sample, the system response by a mechanistic T-H code, and (iii) comparing the system response with pre-established safety thresholds, which define the success or failure of the safety function. The computational effort involved can be prohibitive because of the large number of (typically long) T-H code simulations that must be performed, one for each sample, for the statistical estimation of the probability of success or failure. The objective of this work is to provide operative guidelines for effectively handling the computation of the reliability of a nuclear passive system. Two directions of computational efficiency are considered. On one side, efficient Monte Carlo Simulation (MCS) techniques are indicated as a means of performing robust estimations with a limited number of samples; in particular, the Subset Simulation (SS) and Line Sampling (LS) methods are identified as the most valuable. On the other side, fast-running surrogate regression models (also called response surfaces or meta-models) are indicated as a valid replacement for the long-running T-H codes; in particular, the use of bootstrapped Artificial Neural Networks (ANNs) is shown to have interesting potential, also for uncertainty propagation. The recommendations drawn are supported by the results obtained in an illustrative application from the literature.
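    The three-step procedure (i)-(iii) can be summarized in a short sketch, with a cheap analytical function standing in for the long-running T-H code; the response function, input distributions, and threshold are illustrative assumptions. The standard-error line makes explicit why crude MC needs many samples for small failure probabilities, which is precisely what Subset Simulation, Line Sampling, and surrogate models aim to alleviate.

```python
import numpy as np

rng = np.random.default_rng(42)

def th_code(x):
    """Stand-in for the long-running thermal-hydraulic code: returns a system response."""
    return 3.0 - x[:, 0] ** 2 - 0.5 * x[:, 1]

n = 100_000
# (i) sample the uncertain model inputs
x = rng.normal(loc=[1.0, 0.5], scale=[0.3, 0.4], size=(n, 2))
# (ii) evaluate the system response for each sample
g = th_code(x)
# (iii) compare with the safety threshold: failure when the response drops below it
threshold = 0.0
p_fail = np.mean(g < threshold)
se = np.sqrt(p_fail * (1 - p_fail) / n)   # standard error of the MC estimator
print(f"P_fail ~ {p_fail:.4e} +/- {se:.1e}")
```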

    Development and Use of Engineering Standards for Computational Fluid Dynamics for Complex Aerospace Systems

    Computational fluid dynamics (CFD) and other advanced modeling and simulation (M&S) methods are increasingly relied on for predictive performance, reliability and safety of engineering systems. Analysts, designers, decision makers, and project managers, who must depend on simulation, need practical techniques and methods for assessing simulation credibility. The AIAA Guide for Verification and Validation of Computational Fluid Dynamics Simulations (AIAA G-077-1998 (2002)), originally published in 1998, was the first engineering standards document available to the engineering community for verification and validation (V&V) of simulations. Much progress has been made in these areas since 1998. The AIAA Committee on Standards for CFD is currently updating this Guide to incorporate the important developments that have taken place in V&V concepts, methods, and practices, particularly with regard to the broader context of predictive capability and uncertainty quantification (UQ) methods and approaches. This paper will provide an overview of the changes and extensions currently underway to update the AIAA Guide. Specifically, a framework for predictive capability will be described for incorporating a wide range of error and uncertainty sources identified during the modeling, verification, and validation processes, with the goal of estimating the total prediction uncertainty of the simulation. The Guide's goal is to provide a foundation for understanding and addressing major issues and concepts in predictive CFD. However, this Guide will not recommend specific approaches in these areas as the field is rapidly evolving. It is hoped that the guidelines provided in this paper, and explained in more detail in the Guide, will aid in the research, development, and use of CFD in engineering decision-making.

    Calibration Probe Uncertainty and Validation for the Hypersonic Material Environmental Test System

    This paper presents an uncertainty analysis of the stagnation-point calibration probe surface predictions for conditions that span the performance envelope of the Hypersonic Materials Environmental Test System facility located at NASA Langley Research Center. A second-order stochastic expansion was constructed over 47 uncertain parameters to evaluate the sensitivities, identify the most significant uncertain variables, and quantify the uncertainty in the stagnation-point heat flux and pressure predictions of the calibration probe for a low-enthalpy and a high-enthalpy test condition. A sensitivity analysis showed that measurement bias uncertainty is the most significant contributor to the stagnation-point pressure and heat flux variance for the low-enthalpy condition. For the high-enthalpy condition, a shift in the sensitivities revealed the computational fluid dynamics model input uncertainty as the main contributor. A comparison between the prediction and measurement of the stagnation-point conditions under uncertainty showed evidence of statistical disagreement. A validation metric was proposed and applied to the prediction uncertainty to account for the statistical disagreement when compared to the possible stagnation-point heat flux and pressure measurements.
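    The paper proposes its own validation metric to account for the statistical disagreement. As one common concrete instance of such a metric, the sketch below computes the area between the empirical CDFs of prediction and measurement samples (the so-called area validation metric); this is an illustrative assumption, not necessarily the metric proposed in the paper, and the sample data are invented.

```python
import numpy as np

def area_validation_metric(pred, meas):
    """Area between the empirical CDFs of prediction and measurement samples."""
    grid = np.sort(np.concatenate([pred, meas]))
    cdf_p = np.searchsorted(np.sort(pred), grid, side="right") / len(pred)
    cdf_m = np.searchsorted(np.sort(meas), grid, side="right") / len(meas)
    diff = np.abs(cdf_p - cdf_m)
    # step-function integral: CDFs are constant between consecutive grid points
    return np.sum(diff[:-1] * np.diff(grid))

rng = np.random.default_rng(1)
predicted = rng.normal(105.0, 8.0, 2000)   # stand-in for heat-flux prediction samples
measured  = rng.normal(100.0, 5.0,  200)   # stand-in for probe measurements
print(area_validation_metric(predicted, measured))
```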

    Stochastic Model Updating with Uncertainty Quantification: An Overview and Tutorial

    This paper presents an overview of the theoretical framework of stochastic model updating, including critical aspects of model parameterisation, sensitivity analysis, surrogate modelling, test-analysis correlation, parameter calibration, etc. Special attention is paid to uncertainty analysis, which extends model updating from the deterministic domain to the stochastic domain. This extension is significantly promoted by uncertainty quantification metrics, which no longer describe the model parameters as unknown-but-fixed constants but as random variables with uncertain distributions, i.e. imprecise probabilities. As a result, stochastic model updating no longer aims at a single model prediction with maximum fidelity to a single experiment, but rather at a reduced uncertainty space of the simulation enveloping the complete scatter of multiple experimental data sets. Quantification of such an imprecise probability requires a dedicated uncertainty propagation process to investigate how the uncertainty space of the input is propagated via the model to the uncertainty space of the output. The two key aspects, forward uncertainty propagation and inverse parameter calibration, along with key techniques such as P-box propagation, statistical distance-based metrics, Markov chain Monte Carlo sampling, and Bayesian updating, are elaborated in this tutorial. The overall technical framework is demonstrated by solving the NASA Multidisciplinary UQ Challenge 2014, with the purpose of encouraging readers to reproduce the results following this tutorial. A second practical demonstration is performed on a newly designed benchmark testbed, where a series of lab-scale aeroplane models are manufactured with varying geometry sizes, following pre-defined probabilistic distributions, and tested in terms of their natural frequencies and mode shapes. Such a measurement database naturally contains not only measurement errors but also, more importantly, controllable uncertainties from the pre-defined distributions of the structure geometry. Finally, open questions are discussed to fulfil the motivation of this tutorial in providing researchers, especially beginners, with further directions on stochastic model updating from an uncertainty treatment perspective.
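    Of the techniques listed above, P-box propagation is perhaps the most mechanical to sketch. The following minimal double-loop Monte Carlo example propagates a distributional p-box (a normal family with interval-valued parameters) through a toy model and collects the envelope of output CDFs; the model, distribution family, and parameter bounds are illustrative assumptions, not values from the tutorial.

```python
import numpy as np

rng = np.random.default_rng(7)

def model(x):
    """Stand-in for the simulation model."""
    return x ** 2 + 1.0

# Distributional p-box: normal family with interval-valued (epistemic) parameters.
mu_bounds, sigma_bounds = (0.8, 1.2), (0.1, 0.3)

n_epistemic, n_aleatory = 50, 2000
grid = np.linspace(0.0, 4.0, 200)
cdf_lower = np.ones_like(grid)
cdf_upper = np.zeros_like(grid)

for _ in range(n_epistemic):                      # outer loop: epistemic sampling
    mu = rng.uniform(*mu_bounds)
    sigma = rng.uniform(*sigma_bounds)
    y = model(rng.normal(mu, sigma, n_aleatory))  # inner loop: aleatory sampling
    cdf = np.searchsorted(np.sort(y), grid, side="right") / n_aleatory
    cdf_lower = np.minimum(cdf_lower, cdf)        # envelope of output CDFs = output p-box
    cdf_upper = np.maximum(cdf_upper, cdf)

print(cdf_lower[100], cdf_upper[100])   # bounds on P(Y <= grid[100])
```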

    Efficient uncertainty quantification in aerospace analysis and design

    The main purpose of this study is to apply a computationally efficient uncertainty quantification approach, stochastic expansions based on Non-Intrusive Polynomial Chaos (NIPC), to robust aerospace analysis and design under mixed (aleatory and epistemic) uncertainties, and to demonstrate this technique on model problems and robust aerodynamic optimization. The proposed optimization approach utilizes stochastic response surfaces obtained with NIPC methods to approximate the objective function and the constraints in the optimization formulation. The objective function includes stochastic measures which are minimized simultaneously to ensure the robustness of the final design to both aleatory and epistemic uncertainties. For model problems with mixed uncertainties, Quadrature-Based and Point-Collocation NIPC methods were used to create the response surfaces used in the optimization process. For the robust airfoil optimization under aleatory (Mach number) and epistemic (turbulence model) uncertainties, a combined Point-Collocation NIPC approach was utilized to create the response surfaces used as surrogates in the optimization process. Two stochastic optimization formulations were studied: optimization under pure aleatory uncertainty and optimization under mixed uncertainty. As shown in this work for various problems, the NIPC method is computationally more efficient than Monte Carlo methods for a moderate number of uncertain variables and can give highly accurate estimates of the various metrics used in robust design optimization under mixed uncertainties. This study also introduces a new adaptive sampling approach that refines the Point-Collocation NIPC method for a further improvement in computational efficiency. Two numerical problems demonstrated that the adaptive approach can produce the same accuracy level as a response surface obtained with an oversampling ratio of 2 while using fewer function evaluations. --Abstract, page iii
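    A minimal sketch of the Point-Collocation NIPC idea for a single standard-normal input: evaluate the model at random collocation points, fit orthonormal Hermite polynomial-chaos coefficients by least squares (here with the oversampling ratio of 2 mentioned in the abstract), and read the output mean and variance directly off the coefficients. The model function and polynomial degree are illustrative assumptions, not the study's cases.

```python
import numpy as np

rng = np.random.default_rng(3)

def model(xi):
    """Stand-in for an expensive simulation, as a function of a standard normal input."""
    return np.exp(0.3 * xi) + 0.1 * xi ** 2

def hermite_basis(xi):
    """Orthonormal probabilists' Hermite polynomials up to degree 3."""
    return np.column_stack([np.ones_like(xi),
                            xi,
                            (xi ** 2 - 1) / np.sqrt(2.0),
                            (xi ** 3 - 3 * xi) / np.sqrt(6.0)])

p = 4                                   # number of basis terms
n = 2 * p                               # oversampling ratio of 2
xi = rng.standard_normal(n)             # collocation points
y = model(xi)                           # model evaluations (the expensive step)

coeffs, *_ = np.linalg.lstsq(hermite_basis(xi), y, rcond=None)   # point collocation

mean = coeffs[0]                        # orthonormal basis: mean is the first coefficient
variance = np.sum(coeffs[1:] ** 2)      # variance is the sum of squared higher coefficients
print(mean, variance)
```

    The same least-squares fit replaces a Monte Carlo loop over thousands of model runs with a handful of evaluations, which is the source of the efficiency gain claimed above.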