
    Distribution-free stochastic simulation methodology for model updating under hybrid uncertainties

    In the real world, a significant challenge faced in the safe operation and maintenance of infrastructure is the lack of available information or data. This results in a large degree of uncertainty and the need for robust and efficient uncertainty quantification (UQ) tools to derive the most realistic estimates of the behavior of structures. While the probabilistic approach has long been utilized as an essential tool for the quantitative mathematical representation of uncertainty, a common criticism is that the approach often involves unsubstantiated subjective assumptions because of the scarcity or imprecision of available information. To avoid the inclusion of subjectivity, the concepts of imprecise probabilities have been developed, and the distributional probability-box (p-box) has gained the most attention among the various types of imprecise probability models since it straightforwardly provides a clear separation between aleatory and epistemic uncertainty. This thesis concerns the realistic consideration and numerically efficient calibration and propagation of aleatory and epistemic uncertainties (hybrid uncertainties) based on the distributional p-box. Recent developments, including the Bhattacharyya distance-based approximate Bayesian computation (ABC) and non-intrusive imprecise stochastic simulation (NISS) methods, have strengthened the subjective-assumption-free approach to uncertainty calibration and propagation. However, these methods based on the distributional p-box rely on the availability of prior knowledge determining a specific distribution family for the p-box. The target of this thesis is hence to develop a distribution-free approach for the calibration and propagation of hybrid uncertainties, strengthening the subjective-assumption-free UQ approach. To achieve this target, the thesis presents five main developments to improve the Bhattacharyya distance-based ABC and NISS frameworks.
The first development improves the scope of application and efficiency of the Bhattacharyya distance-based ABC. A dimension reduction procedure is proposed to evaluate the Bhattacharyya distance when the system under investigation is described by time-domain sequences. Moreover, an efficient Bayesian inference method within the Bayesian updating with structural reliability methods (BUS) framework is developed by combining BUS with the adaptive Kriging-based reliability method, namely AK-MCMC. The second development is a distribution-free stochastic model updating framework based on the combined application of staircase density functions and the Bhattacharyya distance. Staircase density functions can approximate a wide range of distributions arbitrarily closely; hence this development makes it possible to perform the Bhattacharyya distance-based ABC without limiting hypotheses on the distribution families of the parameters to be updated. These two developments are then integrated in the third development to provide a solution to the latest edition (2019) of the NASA UQ challenge problem. The model updating tasks under very challenging conditions, where prior information on the aleatory parameters is extremely limited beyond a common boundary, are successfully addressed with the above distribution-free stochastic model updating framework. Moreover, a NISS approach that simplifies the high-dimensional optimization to a set of one-dimensional searches, via a first-order high-dimensional model representation (HDMR) decomposition with respect to each design parameter, is developed to efficiently solve the reliability-based design optimization tasks. This challenge, at the same time, elucidates the limitations of the current developments; hence the fourth development addresses the limitation that staircase density functions are designed for univariate random variables and cannot account for parameter dependencies.
In order to calibrate the joint distribution of correlated parameters, the distribution-free stochastic model updating framework is extended by characterizing the aleatory parameters using Gaussian copula functions whose marginal distributions are staircase density functions. This further strengthens the assumption-free approach to uncertainty calibration, in which no prior information on the parameter dependencies is required. Finally, the fifth development, a distribution-free uncertainty propagation framework, is based on another application of the staircase density functions to the NISS class of methods, and it is applied to efficiently solve the reliability analysis subproblem of the NASA UQ challenge 2019. These five developments have successfully strengthened the assumption-free approach for both uncertainty calibration and propagation, thanks to the ability of staircase density functions to approximate arbitrary distributions. The efficiency and effectiveness of the developments are demonstrated on real-world applications, including the NASA UQ challenge 2019.
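The Bhattacharyya distance at the core of the ABC metric described above compares two sample sets through their binned probability masses. A minimal sketch (the bin count, sample sizes, and test distributions are illustrative assumptions, not the thesis implementation):

```python
import numpy as np

def bhattacharyya_distance(samples_a, samples_b, n_bins=20):
    """Bhattacharyya distance between two sample sets via shared-bin histograms."""
    lo = min(samples_a.min(), samples_b.min())
    hi = max(samples_a.max(), samples_b.max())
    edges = np.linspace(lo, hi, n_bins + 1)
    p, _ = np.histogram(samples_a, bins=edges)
    q, _ = np.histogram(samples_b, bins=edges)
    p = p / p.sum()                      # empirical probability masses
    q = q / q.sum()
    bc = np.sum(np.sqrt(p * q))          # Bhattacharyya coefficient in [0, 1]
    return -np.log(max(bc, 1e-12))       # distance is 0 for identical distributions

rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, 5000)
d_same = bhattacharyya_distance(a, rng.normal(0.0, 1.0, 5000))
d_diff = bhattacharyya_distance(a, rng.normal(2.0, 1.0, 5000))
```

In an ABC loop, this distance between simulated and measured features serves as the acceptance metric: smaller values indicate better agreement between the full distributions, not just their means.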

    Quantification of uncertainty in aerodynamic heating of a reentry vehicle due to uncertain wall and freestream conditions

    The primary focus of this study is to demonstrate an efficient approach for uncertainty quantification of the surface heat flux to the spherical non-ablating heatshield of a generic reentry vehicle due to epistemic and aleatory uncertainties that may exist in various parameters used in the numerical solution of hypersonic, viscous, laminar blunt-body flows with thermo-chemical non-equilibrium. Two main uncertainty sources were treated in the computational fluid dynamics (CFD) simulations: (1) aleatory uncertainty in the freestream velocity and (2) epistemic uncertainty in the recombination efficiency for a partially catalytic wall boundary condition. Second-Order Probability, utilizing a stochastic response surface obtained with Point-Collocation Non-Intrusive Polynomial Chaos, was used for the propagation of mixed (aleatory and epistemic) uncertainties. The uncertainty quantification approach was validated on a stochastic model problem with mixed uncertainties for the prediction of stagnation-point heat transfer with the Fay-Riddell relation, which included a comparison with direct Monte Carlo sampling results. In the stochastic CFD problem, the uncertainty in surface heat transfer was obtained in terms of intervals at different probability levels at various locations, including the stagnation point and the shoulder region. The mixed-uncertainty results were compared to the results obtained with a purely aleatory uncertainty analysis to show the difference between the two uncertainty quantification approaches. A global sensitivity analysis indicated that the velocity has a stronger contribution to the overall uncertainty in the stagnation-point heat transfer for the range of input uncertainties considered in this study --Abstract, page iii
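Second-Order Probability propagates mixed uncertainty with a double loop: an outer loop samples the epistemic variable over its interval, and an inner loop performs aleatory Monte Carlo sampling, yielding interval-valued statistics. A minimal sketch with a hypothetical stand-in for the heat-flux model (the actual study uses CFD and the Fay-Riddell relation, and typically a polynomial chaos surrogate in the inner loop):

```python
import numpy as np

def heat_flux(velocity, gamma_cat):
    # hypothetical stand-in for the stagnation-point heat-flux model
    return 1e-4 * velocity**1.5 * (1.0 + 0.5 * gamma_cat)

rng = np.random.default_rng(1)
outer_lower, outer_upper = [], []
for gamma in np.linspace(0.0, 1.0, 50):      # epistemic: recombination efficiency interval
    v = rng.normal(5000.0, 100.0, 2000)      # aleatory: freestream velocity (assumed moments)
    q = heat_flux(v, gamma)
    lo, hi = np.quantile(q, [0.025, 0.975])  # inner-loop aleatory statistics
    outer_lower.append(lo)
    outer_upper.append(hi)

# the epistemic loop widens the 95% aleatory range into an interval
interval = (min(outer_lower), max(outer_upper))
```

At each probability level, the outer loop produces a band of possible values rather than a single quantile, which is how the interval-valued heat-transfer results mentioned above arise.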

    Stochastic Model Updating with Uncertainty Quantification: An Overview and Tutorial

    This paper presents an overview of the theoretic framework of stochastic model updating, including critical aspects of model parameterisation, sensitivity analysis, surrogate modelling, test-analysis correlation, parameter calibration, etc. Special attention is paid to uncertainty analysis, which extends model updating from the deterministic domain to the stochastic domain. This extension is significantly promoted by uncertainty quantification metrics, which no longer describe the model parameters as unknown-but-fixed constants but as random variables with uncertain distributions, i.e. imprecise probabilities. As a result, stochastic model updating no longer aims at a single model prediction with maximum fidelity to a single experiment, but rather at a reduced uncertainty space of the simulation enveloping the complete scatter of multiple experiment data. Quantification of such an imprecise probability requires a dedicated uncertainty propagation process to investigate how the uncertainty space of the input is propagated via the model to the uncertainty space of the output. The two key aspects, forward uncertainty propagation and inverse parameter calibration, along with key techniques such as P-box propagation, statistical distance-based metrics, Markov chain Monte Carlo sampling, and Bayesian updating, are elaborated in this tutorial. The overall technical framework is demonstrated by solving the NASA Multidisciplinary UQ Challenge 2014, with the purpose of encouraging readers to reproduce the results following this tutorial. A second practical demonstration is performed on a newly designed benchmark testbed, where a series of lab-scale aeroplane models are manufactured with varying geometry sizes, following pre-defined probabilistic distributions, and tested in terms of their natural frequencies and mode shapes. Such a measurement database naturally contains not only measurement errors but also, more importantly, controllable uncertainties from the pre-defined distributions of the structure geometry. Finally, open questions are discussed to fulfil the motivation of this tutorial: providing researchers, especially beginners, with further directions on stochastic model updating from an uncertainty treatment perspective.
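The Markov chain Monte Carlo and Bayesian updating techniques listed in the tutorial can be illustrated with a random-walk Metropolis sampler for a single parameter; the likelihood, prior range, proposal width, and synthetic data below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def log_posterior(theta, data):
    # Gaussian likelihood with unit noise and a flat prior on an assumed range
    if not (0.0 < theta < 10.0):
        return -np.inf
    return -0.5 * np.sum((data - theta) ** 2)

rng = np.random.default_rng(2)
data = rng.normal(3.0, 1.0, 50)              # synthetic "measurements"
theta = 5.0                                  # starting point of the chain
lp = log_posterior(theta, data)
chain = []
for _ in range(5000):
    prop = theta + rng.normal(0.0, 0.5)      # random-walk proposal
    lp_prop = log_posterior(prop, data)
    if np.log(rng.uniform()) < lp_prop - lp: # Metropolis acceptance rule
        theta, lp = prop, lp_prop
    chain.append(theta)

posterior_mean = np.mean(chain[1000:])       # discard burn-in samples
```

In stochastic model updating the scalar likelihood above is replaced by a statistical-distance-based metric (e.g. the Bhattacharyya distance) between simulated and measured feature distributions, but the sampling machinery is the same.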

    Efficient Uncertainty Quantification & Sensitivity Analysis for Hypersonic Flow and Material Response Simulations Under Inherent and Model-Form Uncertainties

    Accurate numerical prediction of coupled hypersonic flow fields and ablative TPS material response is challenging due to the complex nature of the physics. The uncertainties associated with the various physical models used in high-enthalpy hypersonic flow and material response simulations can have significant effects on the accuracy of the results, including the heat-flux and temperature distributions in various layers of the ablating TPS material. These uncertainties can arise from a lack of knowledge in physical modeling (model-form or epistemic uncertainty) or from inherent variations in the model inputs (aleatory or probabilistic uncertainty). It is important to include both types of uncertainty in the simulations to properly assess the accuracy of the results and to design robust and reliable TPS for reentry or hypersonic cruise vehicles. In addition to the quantification of uncertainties, global sensitivity information for the output quantities of interest plays an important role in ranking the contribution of each uncertainty source to the overall uncertainty, which may be used for the proper allocation of resources in improving the physical models or for reducing the number of uncertain variables to be considered in the uncertainty analysis. Uncertainty quantification for coupled high-fidelity hypersonic flow and material response predictions can be challenging due to the computational expense of the simulations, the existence of both model-form and inherent uncertainty sources, the large number of uncertain variables, and the highly non-linear relations between the uncertain variables and the output response variables. The objective of this talk will be to introduce a computationally efficient and accurate uncertainty quantification (UQ) and global sensitivity analysis approach for potential application to coupled aerothermodynamics and material response simulations, which is being developed to address the aforementioned challenges.
The UQ approach to be described is based on second-order uncertainty quantification theory utilizing a stochastic response surface obtained with non-intrusive polynomial chaos, and it is capable of efficiently propagating both the inherent and the model-form uncertainties in the physical models. The non-intrusive nature of the UQ approach requires no modification to the deterministic codes, which is a significant benefit for the complex numerical simulation considered in this problem. The global non-linear sensitivity analysis to be introduced is based on variance decomposition, which again utilizes the polynomial chaos expansions. In addition to the description of the UQ approach, the talk will also include the presentation of UQ results from a recent demonstration of the methodology, which included the uncertainty quantification and sensitivity analysis of surface heat flux on the spherical heat shield of a reentry vehicle (a case selected from the CUBRC experimental database). This study involved the use of the NASA DPLR code and the treatment of the free-stream velocity (inherent uncertainty), the collision integrals for the transport coefficients (model-form uncertainty), and the surface catalysis (model-form uncertainty) as uncertain variables. The talk will also include the description of an adaptive UQ framework being developed as part of a NASA JPL STTR project to quantify the uncertainty in multi-physics spacecraft simulations with a large number of uncertain variables.
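The variance-decomposition sensitivity analysis described above ranks inputs by their contribution to the output variance via first-order Sobol indices. The talk computes these from polynomial chaos coefficients, but the same indices can be sketched with a Monte Carlo pick-freeze estimator on a hypothetical linear model (where the exact indices are 0.9 and 0.1):

```python
import numpy as np

def model(x1, x2):
    return 3.0 * x1 + 1.0 * x2   # hypothetical response: x1 dominates the variance

rng = np.random.default_rng(3)
n = 100_000
a1, a2 = rng.normal(size=n), rng.normal(size=n)
b1, b2 = rng.normal(size=n), rng.normal(size=n)

y = model(a1, a2)
# pick-freeze: re-evaluate with one input frozen and the other resampled
y_keep1 = model(a1, b2)
y_keep2 = model(b1, a2)

var_y = y.var()
s1 = (np.mean(y * y_keep1) - y.mean() * y_keep1.mean()) / var_y  # first-order index of x1
s2 = (np.mean(y * y_keep2) - y.mean() * y_keep2.mean()) / var_y  # first-order index of x2
```

A ranking such as s1 >> s2 is what justifies concentrating modeling effort on the dominant input, as argued in the abstract.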

    Uncertainty management in multidisciplinary design of critical safety systems

    Managing the uncertainty in the multidisciplinary design of safety-critical systems requires not a single approach or methodology for dealing with uncertainty but a set of different strategies and scalable computational tools (for instance, exploiting the computational power of cluster and grid computing). The availability of multiple tools and approaches for dealing with uncertainties allows cross-validation of the results and increases confidence in the performed analysis. This paper presents a unified theory and an integrated, open, general-purpose computational framework to deal with scarce data and with aleatory and epistemic uncertainties. It allows solving the different tasks necessary to manage uncertainty, such as uncertainty characterization, sensitivity analysis, uncertainty quantification, and robust design. The proposed computational framework is generally applicable to problems in different fields and is numerically efficient and scalable, allowing a significant reduction of the computational time required for uncertainty management and robust design. The applicability of the proposed approach is demonstrated by solving the multidisciplinary design of a critical system proposed by the NASA Langley Research Center in its multidisciplinary uncertainty quantification challenge problem.

    Efficient uncertainty quantification in aerospace analysis and design

    The main purpose of this study is to apply a computationally efficient uncertainty quantification approach, Non-Intrusive Polynomial Chaos (NIPC) based stochastic expansions, to robust aerospace analysis and design under mixed (aleatory and epistemic) uncertainties and to demonstrate this technique on model problems and robust aerodynamic optimization. The proposed optimization approach utilizes stochastic response surfaces obtained with NIPC methods to approximate the objective function and the constraints in the optimization formulation. The objective function includes the stochastic measures, which are minimized simultaneously to ensure the robustness of the final design to both aleatory and epistemic uncertainties. For model problems with mixed uncertainties, Quadrature-Based and Point-Collocation NIPC methods were used to create the response surfaces used in the optimization process. For the robust airfoil optimization under aleatory (Mach number) and epistemic (turbulence model) uncertainties, a combined Point-Collocation NIPC approach was utilized to create the response surfaces used as the surrogates in the optimization process. Two stochastic optimization formulations were studied: optimization under pure aleatory uncertainty and optimization under mixed uncertainty. As shown in this work for various problems, the NIPC method is computationally more efficient than Monte Carlo methods for a moderate number of uncertain variables and can give highly accurate estimates of the various metrics used in robust design optimization under mixed uncertainties. This study also introduces a new adaptive sampling approach to refine the Point-Collocation NIPC method for further improvement of the computational efficiency. Two numerical problems demonstrated that the adaptive approach can produce the same accuracy level as the response surface obtained with an oversampling ratio of 2, using fewer function evaluations. --Abstract, page iii
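The Point-Collocation NIPC idea can be sketched in one dimension: evaluate the model at random collocation points, oversample relative to the number of basis terms (a ratio of 2, as in the abstract), and fit the Hermite-chaos coefficients by least squares. The response function below is a hypothetical stand-in, not one of the study's model problems:

```python
import numpy as np

def hermite_basis(xi):
    # probabilists' Hermite polynomials He_0..He_3, orthogonal under N(0, 1)
    return np.column_stack([np.ones_like(xi), xi, xi**2 - 1.0, xi**3 - 3.0 * xi])

def response(xi):
    return np.exp(0.3 * xi)   # hypothetical stand-in for the deterministic model

rng = np.random.default_rng(4)
n_terms = 4
n_pts = 2 * n_terms                       # oversampling ratio of 2
xi = rng.normal(size=n_pts)               # random collocation points
coeffs, *_ = np.linalg.lstsq(hermite_basis(xi), response(xi), rcond=None)

# chaos statistics follow from orthogonality: mean is c_0; variance is sum c_k^2 * k!
mean_pc = coeffs[0]
var_pc = coeffs[1]**2 * 1 + coeffs[2]**2 * 2 + coeffs[3]**2 * 6
```

Once the coefficients are fitted, the surrogate's statistical moments come essentially for free, which is what makes the stochastic response surface cheap to embed inside an optimization loop.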