    Distribution-free stochastic simulation methodology for model updating under hybrid uncertainties

    In the real world, a significant challenge faced in the safe operation and maintenance of infrastructures is the lack of available information or data. This results in a large degree of uncertainty and the requirement for robust and efficient uncertainty quantification (UQ) tools in order to derive the most realistic estimates of the behavior of structures. While the probabilistic approach has long been utilized as an essential tool for the quantitative mathematical representation of uncertainty, a common criticism is that the approach often involves unsubstantiated subjective assumptions because of the scarcity or imprecision of available information. To avoid the inclusion of subjectivity, the concepts of imprecise probabilities have been developed, and the distributional probability-box (p-box) has gained the most attention among the various types of imprecise probability models since it straightforwardly provides a clear separation between aleatory and epistemic uncertainty. This thesis concerns the realistic consideration and numerically efficient calibration and propagation of aleatory and epistemic uncertainties (hybrid uncertainties) based on the distributional p-box. Recent developments, including the Bhattacharyya distance-based approximate Bayesian computation (ABC) and non-intrusive imprecise stochastic simulation (NISS) methods, have strengthened the subjective assumption-free approach to uncertainty calibration and propagation. However, these methods based on the distributional p-box rely on the availability of prior knowledge determining a specific distribution family for the p-box. The target of this thesis is hence to develop a distribution-free approach for the calibration and propagation of hybrid uncertainties, strengthening the subjective assumption-free UQ approach. To achieve this target, the thesis presents five main developments that improve the Bhattacharyya distance-based ABC and NISS frameworks. The first development improves the scope of application and efficiency of the Bhattacharyya distance-based ABC. A dimension reduction procedure is proposed to evaluate the Bhattacharyya distance when the system under investigation is described by time-domain sequences. Moreover, an efficient Bayesian inference method within the Bayesian updating with structural reliability methods (BUS) framework is developed by combining BUS with the adaptive Kriging-based reliability method AK-MCMC. The second development is a distribution-free stochastic model updating framework based on the combined application of staircase density functions and the Bhattacharyya distance. Staircase density functions can approximate a wide range of distributions arbitrarily closely; this development therefore makes it possible to perform the Bhattacharyya distance-based ABC without limiting hypotheses on the distribution families of the parameters to be updated. These two developments are then integrated in the third development to provide a solution to the latest edition (2019) of the NASA UQ challenge problem. The model updating tasks under very challenging conditions, where prior information on the aleatory parameters is extremely limited beyond a common boundary, are successfully addressed with the above distribution-free stochastic model updating framework.
Moreover, the NISS approach, which simplifies the high-dimensional optimization into a set of one-dimensional searches through a first-order high-dimensional model representation (HDMR) decomposition with respect to each design parameter, is developed to efficiently solve reliability-based design optimization tasks. This challenge, at the same time, elucidates the limitations of the current developments; hence the fourth development addresses the limitation that staircase density functions are designed for univariate random variables and cannot account for parameter dependencies. In order to calibrate the joint distribution of correlated parameters, the distribution-free stochastic model updating framework is extended by characterizing the aleatory parameters using Gaussian copula functions with staircase density functions as their marginal distributions. This further strengthens the assumption-free approach to uncertainty calibration, in which no prior information on the parameter dependencies is required. Finally, the fifth development is a distribution-free uncertainty propagation framework based on another application of the staircase density functions to the NISS class of methods, and it is applied to efficiently solve the reliability analysis subproblem of the NASA UQ challenge 2019. The above five developments have successfully strengthened the assumption-free approach to both uncertainty calibration and propagation thanks to the ability of the staircase density functions to approximate arbitrary distributions. The efficiency and effectiveness of these developments are demonstrated on real-world applications, including the NASA UQ challenge 2019.
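    As a purely illustrative sketch (not code from the thesis), the snippet below computes a binned Bhattacharyya distance between observed and simulated sample sets, the kind of statistical distance used as the ABC metric above; the bin count, sample sizes, and toy distributions are assumptions.

```python
import numpy as np

def bhattacharyya_distance(obs, sim, n_bins=20):
    """Binned Bhattacharyya distance between two univariate sample sets."""
    edges = np.linspace(min(obs.min(), sim.min()),
                        max(obs.max(), sim.max()), n_bins + 1)
    p, _ = np.histogram(obs, bins=edges)
    q, _ = np.histogram(sim, bins=edges)
    p = p / p.sum()                      # empirical PMF of the observations
    q = q / q.sum()                      # empirical PMF of the simulation
    bc = np.sum(np.sqrt(p * q))          # Bhattacharyya coefficient in [0, 1]
    return -np.log(max(bc, 1e-12))       # guard against log(0)

# The distance shrinks as the simulated output approaches the data, which is
# what an ABC sampler exploits when accepting or rejecting parameter draws.
rng = np.random.default_rng(0)
obs = rng.normal(0.0, 1.0, 500)
print(bhattacharyya_distance(obs, rng.normal(2.0, 1.0, 500)))  # large
print(bhattacharyya_distance(obs, rng.normal(0.0, 1.0, 500)))  # near zero
```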

    Uncertainty Quantification for Polynomial Systems via Bernstein Expansions

    This paper presents a unifying framework for uncertainty quantification in systems whose polynomial response metrics depend on both aleatory and epistemic uncertainties. The proposed approach, which is based on the Bernstein expansions of polynomials, enables bounding the range of moments and failure probabilities of response metrics, as well as finding supersets of the extreme epistemic realizations where the limits of such ranges occur. These bounds and supersets, whose analytical structure renders them free of approximation error, can be made arbitrarily tight with additional computational effort. Furthermore, this framework enables determining the importance of particular uncertain parameters according to the extent to which they affect the first two moments of response metrics and failure probabilities. This analysis identifies the parameters that should be considered uncertain as well as those that can be assumed constant without incurring significant error. The analytical nature of the approach eliminates the numerical error that characterizes the sampling-based techniques commonly used to propagate aleatory uncertainties, as well as the possibility of underpredicting the range of the statistic of interest when searching for the best- and worst-case epistemic values via nonlinear optimization or sampling.
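    To illustrate the core bounding property the paper builds on (a hypothetical example, not the authors' code): on [0, 1], the Bernstein coefficients of a polynomial enclose its range, and the enclosure tightens as the representation degree is elevated.

```python
import numpy as np
from math import comb

def bernstein_coeffs(a, n=None):
    """Bernstein coefficients on [0, 1] of p(x) = sum_i a[i] * x**i.

    Using x**i = sum_{j>=i} C(j,i)/C(n,i) * B_{j,n}(x), the degree-n
    coefficients satisfy min(b) <= p(x) <= max(b) for all x in [0, 1].
    """
    deg = len(a) - 1
    n = deg if n is None else n          # n > deg elevates the degree
    return np.array([sum(comb(j, i) / comb(n, i) * a[i]
                         for i in range(min(j, deg) + 1))
                     for j in range(n + 1)])

a = [1.0, -3.0, 2.0]   # p(x) = 2x^2 - 3x + 1; exact range on [0,1] is [-0.125, 1]
for n in (2, 8, 32):
    b = bernstein_coeffs(a, n)
    print(n, b.min(), b.max())           # enclosure tightens towards [-0.125, 1.0]
```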

    Development and Use of Engineering Standards for Computational Fluid Dynamics for Complex Aerospace Systems

    Computational fluid dynamics (CFD) and other advanced modeling and simulation (M&S) methods are increasingly relied on for predictive performance, reliability and safety of engineering systems. Analysts, designers, decision makers, and project managers, who must depend on simulation, need practical techniques and methods for assessing simulation credibility. The AIAA Guide for Verification and Validation of Computational Fluid Dynamics Simulations (AIAA G-077-1998 (2002)), originally published in 1998, was the first engineering standards document available to the engineering community for verification and validation (V&V) of simulations. Much progress has been made in these areas since 1998. The AIAA Committee on Standards for CFD is currently updating this Guide to incorporate the important developments that have taken place in V&V concepts, methods, and practices, particularly with regard to the broader context of predictive capability and uncertainty quantification (UQ) methods and approaches. This paper provides an overview of the changes and extensions currently underway to update the AIAA Guide. Specifically, a framework for predictive capability is described for incorporating a wide range of error and uncertainty sources identified during the modeling, verification, and validation processes, with the goal of estimating the total prediction uncertainty of the simulation. The Guide's goal is to provide a foundation for understanding and addressing major issues and concepts in predictive CFD; however, it will not recommend specific approaches in these areas, as the field is rapidly evolving. It is hoped that the guidelines provided in this paper, and explained in more detail in the Guide, will aid in the research, development, and use of CFD in engineering decision-making.
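    The Guide deliberately stops short of prescribing specific approaches, so the following is only a schematic illustration of the aggregation idea: one common simplification in the V&V literature treats independent error and uncertainty sources as standard uncertainties and combines them in quadrature. The component values are invented for the example.

```python
import math

# Hypothetical standard-uncertainty estimates for one predicted quantity,
# in the same units as the prediction (all values invented for illustration).
u_num = 0.8     # numerical error, e.g. from a grid-convergence study
u_input = 1.5   # propagated input/parameter uncertainty
u_model = 2.0   # model-form uncertainty, e.g. from validation comparisons

# Root-sum-square combination, valid only if the sources are independent.
u_total = math.sqrt(u_num**2 + u_input**2 + u_model**2)
print(f"total prediction uncertainty ~ {u_total:.2f}")   # ~2.62
```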

    Seismic risk of infrastructure systems with treatment of and sensitivity to epistemic uncertainty

    Modern society’s very existence is tied to the proper and reliable functioning of its Critical Infrastructure (CI) systems. In the seismic risk assessment of an infrastructure, taking into account all the relevant uncertainties affecting the problem is crucial. While both aleatory and epistemic uncertainties affect the estimate of seismic risk to an infrastructure and should be considered, the focus herein is on the latter. After providing an up-to-date literature review on the treatment of and sensitivity to epistemic uncertainty, this paper presents a comprehensive framework for seismic risk assessment of interdependent, spatially distributed infrastructure systems that accounts for both aleatory and epistemic uncertainties and provides confidence in the estimate, as well as the sensitivity of uncertainty in the output to the components of epistemic uncertainty in the input. The logic tree approach is used for the treatment of epistemic uncertainty and for the sensitivity analysis, whose results are presented through tornado diagrams. Sensitivity is also evaluated by elaborating the logic tree results through weighted ANOVA. The formulation is general and can be applied to risk assessment problems involving not only infrastructural but also structural systems. The presented methodology was implemented in the open-source software OOFIMS and applied to a synthetic city composed of buildings and a gas network and subjected to seismic hazard. The gas system’s performance is assessed through a flow-based analysis. The seismic hazard, the vulnerability assessment and the evaluation of the gas system’s operational state are addressed with a simulation-based approach. The presence of two systems (buildings and gas network) demonstrates the capability to handle system interdependencies and highlights that uncertainty in models/parameters related to one system can affect uncertainty in the output related to dependent systems.
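    As a toy illustration of the logic-tree treatment and the tornado-diagram sensitivity described above (the tree, branch weights, and loss model are invented; the paper's actual analysis is simulation-based):

```python
import numpy as np

# Illustrative only: a toy logic tree with two epistemic nodes, each holding
# weighted branch options; risk_model stands in for the full simulation.
nodes = {
    "gmpe":      {"options": [0.8, 1.0, 1.3], "weights": [0.2, 0.5, 0.3]},
    "fragility": {"options": [0.9, 1.0, 1.2], "weights": [0.3, 0.4, 0.3]},
}

def risk_model(gmpe, fragility, base_loss=100.0):
    # Placeholder for the simulation-based loss estimate of the infrastructure.
    return base_loss * gmpe * fragility

# Weighted-mean risk over all logic-tree branch combinations.
mean_risk = sum(wg * wf * risk_model(g, f)
                for g, wg in zip(*nodes["gmpe"].values())
                for f, wf in zip(*nodes["fragility"].values()))

# Tornado diagram: swing of each node with the others held at their base option.
base = {k: v["options"][1] for k, v in nodes.items()}
swings = {}
for name, node in nodes.items():
    vals = [risk_model(**{**base, name: opt}) for opt in node["options"]]
    swings[name] = max(vals) - min(vals)

for name, swing in sorted(swings.items(), key=lambda kv: -kv[1]):
    print(f"{name}: swing {swing:.1f} (weighted mean risk {mean_risk:.1f})")
```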

    ESD Reviews: Model Dependence in Multi-Model Climate Ensembles: Weighting, Sub-Selection and Out-Of-Sample Testing

    The rationale for using multi-model ensembles in climate change projections and impacts research is often based on the expectation that different models constitute independent estimates; therefore, a range of models allows a better characterisation of the uncertainties in the representation of the climate system than a single model. However, it is known that research groups share literature, ideas for representations of processes, parameterisations, evaluation data sets and even sections of model code. Thus, nominally different models might have similar biases because of similarities in the way they represent a subset of processes, or even be near-duplicates of others, weakening the assumption that they constitute independent estimates. If there are near-replicates of some models, then treating all models equally is likely to bias the inferences made using these ensembles. The challenge is to establish the degree to which this might be true for any given application. While this issue is recognised by many in the community, quantifying and accounting for model dependence in anything other than an ad hoc way is challenging. Here we present a synthesis of the range of disparate attempts to define, quantify and address model dependence in multi-model climate ensembles in a common conceptual framework, and provide guidance on how users can test the efficacy of approaches that move beyond the equally weighted ensemble. In the upcoming Coupled Model Intercomparison Project phase 6 (CMIP6), several new models that are closely related to existing models are anticipated, as well as large ensembles from some models. We argue that quantitatively accounting for dependence in addition to model performance, and thoroughly testing the effectiveness of the approach used, will be key to a sound interpretation of the CMIP ensembles in future scientific studies.
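    A minimal sketch of the kind of similarity-based "independence" weighting discussed above, in which near-duplicate models share their weight; the synthetic model output fields and the similarity radius D are assumptions, not any published scheme's calibrated values.

```python
import numpy as np

# Three synthetic "model output" fields: A and B are near-duplicates,
# C is constructed independently.
rng = np.random.default_rng(1)
base = rng.normal(size=50)
models = np.stack([base + 0.05 * rng.normal(size=50),   # model A
                   base + 0.05 * rng.normal(size=50),   # model B ~ A
                   rng.normal(size=50)])                # model C

# Pairwise RMS distances between the model output fields.
d = np.sqrt(((models[:, None, :] - models[None, :, :]) ** 2).mean(-1))

D = 0.5 * d[d > 0].mean()                # similarity radius (a tuning choice)
S = np.exp(-(d / D) ** 2)                # similarity matrix, S_ii = 1
w = 1.0 / S.sum(axis=1)                  # down-weight models with close neighbours
w /= w.sum()
print(w.round(3))                        # A and B share weight; C keeps more
```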

    Advancements in uncertainty quantification with stochastic expansions applied to supersonic and hypersonic flows

    The primary objective of this study was to develop improved methodologies for efficient and accurate uncertainty quantification with stochastic expansions and to apply them to problems in supersonic and hypersonic flows. The methods introduced included approaches for efficient dimension reduction, sensitivity analysis, and sparse approximations. These methods and procedures were demonstrated on multiple stochastic models of hypersonic, planetary entry flows, including high-fidelity computational fluid dynamics models of radiative heating on the surface of hypersonic inflatable aerodynamic decelerators during Mars and Titan entry. For these stochastic problems, an accurate surrogate model was constructed with as few as 10% of the model evaluations needed to construct a full-dimension, total-order expansion. Another objective of this work was to introduce methodologies for further advancement of a quantification of margins and uncertainties framework. First, stochastic expansions were introduced to efficiently quantify the uncertainty in system design performance metrics and performance boundaries. Then, procedures were defined to measure margin and uncertainty metrics for systems subject to multiple types of uncertainty in operating conditions and physical models. To demonstrate the new quantification of margins and uncertainties methodologies, two multi-system, multi-physics stochastic models were investigated: (1) a model for reentry dynamics, control, and convective heating, and (2) a model of ground noise prediction for low-boom, supersonic aircraft configurations. Overall, the methods and results of this work outline many effective approaches to uncertainty quantification of large-scale, high-dimensional aerospace problems containing both epistemic and inherent uncertainty. The methods presented showed significant improvement in the efficiency and accuracy of uncertainty analysis capability when stochastic expansions were used for uncertainty quantification.
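    A minimal sketch of a stochastic expansion of the kind used in this work: a one-dimensional polynomial chaos surrogate fitted by least squares, with the mean and variance read directly off the coefficients. The stand-in model, expansion order, and sample sizes are assumptions.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander
from math import factorial

def model(x):
    return np.exp(0.3 * x) + 0.1 * x**2   # stand-in for an expensive simulation

rng = np.random.default_rng(2)
p = 6                                     # expansion order
x = rng.standard_normal(200)              # training samples of the input
y = model(x)

# Probabilists' Hermite polynomials, orthonormal under the standard normal.
norms = np.sqrt([factorial(k) for k in range(p + 1)])
Psi = hermevander(x, p) / norms           # design matrix of basis evaluations
c, *_ = np.linalg.lstsq(Psi, y, rcond=None)

mean, var = c[0], np.sum(c[1:] ** 2)      # moments read off the coefficients
print(f"PCE mean {mean:.4f}, variance {var:.4f}")

# Monte Carlo check that needs far more model runs than the surrogate fit.
xx = rng.standard_normal(200_000)
print(f"MC  mean {model(xx).mean():.4f}, variance {model(xx).var():.4f}")
```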

    Stochastic Model Updating with Uncertainty Quantification: An Overview and Tutorial

    This paper presents an overview of the theoretical framework of stochastic model updating, including critical aspects of model parameterisation, sensitivity analysis, surrogate modelling, test-analysis correlation, parameter calibration, etc. Special attention is paid to uncertainty analysis, which extends model updating from the deterministic domain to the stochastic domain. This extension is significantly promoted by uncertainty quantification metrics, which no longer describe the model parameters as unknown-but-fixed constants but as random variables with uncertain distributions, i.e. imprecise probabilities. As a result, stochastic model updating no longer aims at a single model prediction with maximum fidelity to a single experiment, but rather at a reduced uncertainty space of the simulation enveloping the complete scatter of multiple experimental data sets. Quantification of such an imprecise probability requires a dedicated uncertainty propagation process to investigate how the uncertainty space of the input is propagated via the model to the uncertainty space of the output. The two key aspects, forward uncertainty propagation and inverse parameter calibration, along with key techniques such as p-box propagation, statistical distance-based metrics, Markov chain Monte Carlo sampling, and Bayesian updating, are elaborated in this tutorial. The overall technical framework is demonstrated by solving the NASA Multidisciplinary UQ Challenge 2014, with the purpose of encouraging readers to reproduce the results following this tutorial. A second practical demonstration is performed on a newly designed benchmark testbed, where a series of lab-scale aeroplane models are manufactured with varying geometry sizes, following pre-defined probabilistic distributions, and tested in terms of their natural frequencies and mode shapes. Such a measurement database naturally contains not only measurement errors but also, more importantly, controllable uncertainties from the pre-defined distributions of the structure geometry. Finally, open questions are discussed to fulfil the motivation of this tutorial in providing researchers, especially beginners, with further directions on stochastic model updating from an uncertainty treatment perspective.
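    A minimal sketch of the double-loop p-box propagation elaborated in the tutorial: the outer loop scans the epistemic interval of a distribution parameter, the inner loop performs aleatory Monte Carlo, and the envelope of the resulting output CDFs is the propagated p-box. The response function, intervals, and sample sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def model(x):
    return x**2 + x                       # placeholder response function

mu_interval = (0.5, 1.5)                  # epistemic: mean known only to an interval
sigma = 0.2                               # aleatory scatter, assumed known
grid = np.linspace(-1.0, 6.0, 200)        # response values at which CDFs are compared

cdfs = []
for mu in np.linspace(*mu_interval, 11):              # outer (epistemic) loop
    y = model(rng.normal(mu, sigma, 5000))            # inner (aleatory) loop
    cdfs.append((y[:, None] <= grid).mean(axis=0))    # empirical CDF on the grid

cdfs = np.array(cdfs)
lower, upper = cdfs.min(axis=0), cdfs.max(axis=0)     # p-box bounds on the output CDF
pf_lo, pf_hi = 1 - upper, 1 - lower                   # bounds on P(Y > threshold)
i = np.searchsorted(grid, 3.0)
print(f"P(Y > 3) in [{pf_lo[i]:.3f}, {pf_hi[i]:.3f}]")
```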

    Uncertainty Quantification in Machine Learning for Engineering Design and Health Prognostics: A Tutorial

    On top of machine learning (ML) models, uncertainty quantification (UQ) functions as an essential layer of safety assurance that can lead to more principled decision making by enabling sound risk assessment and management. The improvement in safety and reliability of ML models empowered by UQ has the potential to significantly facilitate the broad adoption of ML solutions in high-stakes decision settings such as healthcare, manufacturing, and aviation, to name a few. In this tutorial, we aim to provide a holistic lens on emerging UQ methods for ML models, with a particular focus on neural networks and on the applications of these UQ methods to engineering design as well as prognostics and health management problems. Toward this goal, we start with a comprehensive classification of uncertainty types, sources, and causes pertaining to UQ of ML models. Next, we provide a tutorial-style description of several state-of-the-art UQ methods: Gaussian process regression, Bayesian neural networks, neural network ensembles, and deterministic UQ methods focusing on the spectral-normalized neural Gaussian process. Building on the mathematical formulations, we then examine the soundness of these UQ methods quantitatively and qualitatively (via a toy regression example) to reveal their strengths and shortcomings along different dimensions. We then review quantitative metrics commonly used to assess the quality of predictive uncertainty in classification and regression problems. Afterward, we discuss the increasingly important role of UQ of ML models in solving challenging problems in engineering design and health prognostics. Two case studies, with source code available on GitHub, are used to demonstrate these UQ methods and compare their performance in the early-stage life prediction of lithium-ion batteries and the remaining useful life prediction of turbofan engines.
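    A minimal from-scratch sketch of one of the surveyed methods, Gaussian process regression, showing how the posterior predictive mean and variance are obtained; the kernel, hyperparameters, and toy data are assumptions (in practice hyperparameters are tuned, e.g. against the marginal likelihood).

```python
import numpy as np

def rbf(A, B, ls=0.5, var=1.0):
    """Squared-exponential kernel between two 1-D input arrays."""
    d2 = (A[:, None] - B[None, :]) ** 2
    return var * np.exp(-0.5 * d2 / ls**2)

rng = np.random.default_rng(4)
X = rng.uniform(-3, 3, 12)                     # training inputs
y = np.sin(X) + 0.1 * rng.standard_normal(12)  # noisy observations
Xs = np.linspace(-4, 4, 5)                     # test inputs

noise = 0.1**2                                 # assumed observation-noise variance
K = rbf(X, X) + noise * np.eye(len(X))
Ks = rbf(Xs, X)

alpha = np.linalg.solve(K, y)
mean = Ks @ alpha                              # posterior predictive mean
cov = rbf(Xs, Xs) - Ks @ np.linalg.solve(K, Ks.T)
std = np.sqrt(np.clip(np.diag(cov), 0, None))  # predictive standard deviation

for xs, m, s in zip(Xs, mean, std):
    print(f"x={xs:+.1f}: {m:+.3f} +/- {2*s:.3f}")  # bands widen outside the data
```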