
    Epistemic Uncertainty Quantification in Scientific Models

    In the field of uncertainty quantification (UQ), epistemic uncertainty often refers to the kind of uncertainty whose complete probabilistic description is not available, largely due to our lack of knowledge about the uncertainty. Quantifying the impact of epistemic uncertainty is naturally difficult, because most existing stochastic tools rely on the specification of probability distributions and thus do not readily apply to epistemic uncertainty, and few studies and methods exist for dealing with it. A recent work can be found in [J. Jakeman, M. Eldred, D. Xiu, Numerical approach for quantification of epistemic uncertainty, J. Comput. Phys. 229 (2010) 4648-4663], where a framework for the numerical treatment of epistemic uncertainty was proposed. In this paper, we first present a new method, similar to that of Jakeman et al. but significantly extending its capabilities. Most notably, the new method (1) does not require the encapsulation problem to be posed on a bounded domain such as a hypercube; and (2) does not require the solution of the encapsulation problem to converge point-wise. In the current formulation, the encapsulation problem may reside in an unbounded domain, and more importantly, its numerical approximation may be sought in an Lp norm. These features make the new approach more flexible and amenable to practical implementation. Both the mathematical framework and numerical analysis are presented to demonstrate the effectiveness of the new approach. We then apply this method to one of the more restrictive uncertainty models, i.e., fuzzy logic, where the p-distance, the weighted expected value and the variance are defined to assess the accuracy of the solutions. Finally, we give a brief introduction to our future work on epistemic uncertainty quantification using evidence theory.
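The core idea above can be illustrated with a minimal sketch: sample the epistemic parameter over an unbounded encapsulation domain (here with a standard-normal weight, with no claim about the parameter's true law), fit a polynomial surrogate by least squares (an L2-norm approximation), and only afterwards post-process the surrogate under any candidate distribution. The model function and all numerical values are purely illustrative, not taken from the paper.

```python
import numpy as np

# Hypothetical model response depending on an epistemic parameter z
# whose distribution is unknown; purely illustrative.
def model(z):
    return np.exp(-0.5 * z) * np.sin(z)

rng = np.random.default_rng(0)

# Encapsulation samples: draws over an unbounded domain with a
# standard-normal weight -- no hypercube truncation is needed.
z = rng.standard_normal(2000)
y = model(z)

# Least-squares (L2-norm) polynomial surrogate: fit once, then
# re-evaluate under any candidate epistemic distribution afterwards.
coeffs = np.polynomial.polynomial.polyfit(z, y, deg=8)
surrogate = lambda t: np.polynomial.polynomial.polyval(t, coeffs)

# Post-processing: response means under two candidate laws for z.
for dist in (rng.normal(0.0, 1.0, 10_000), rng.uniform(-2.0, 2.0, 10_000)):
    print(surrogate(dist).mean())
```

Because the fit is in the L2 sense rather than point-wise, the surrogate only needs to be accurate on average with respect to the sampling weight, which is what makes the unbounded formulation workable.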

    Distribution-free stochastic simulation methodology for model updating under hybrid uncertainties

    In the real world, a significant challenge faced in the safe operation and maintenance of infrastructures is the lack of available information or data. This results in a large degree of uncertainty and the requirement for robust and efficient uncertainty quantification (UQ) tools in order to derive the most realistic estimates of the behavior of structures. While the probabilistic approach has long been utilized as an essential tool for the quantitative mathematical representation of uncertainty, a common criticism is that the approach often involves unsubstantiated subjective assumptions because of the scarcity or imprecision of available information. To avoid the inclusion of subjectivity, the concepts of imprecise probabilities have been developed, and the distributional probability-box (p-box) has gained the most attention among various types of imprecise probability models, since it straightforwardly provides a clear separation between aleatory and epistemic uncertainty. This thesis concerns the realistic consideration and numerically efficient calibration and propagation of aleatory and epistemic uncertainties (hybrid uncertainties) based on the distributional p-box. Recent developments, including the Bhattacharyya distance-based approximate Bayesian computation (ABC) and non-intrusive imprecise stochastic simulation (NISS) methods, have strengthened the subjective-assumption-free approach for uncertainty calibration and propagation. However, these methods based on the distributional p-box rely on the availability of prior knowledge determining a specific distribution family for the p-box. The target of this thesis is hence to develop a distribution-free approach for the calibration and propagation of hybrid uncertainties, strengthening the subjective-assumption-free UQ approach. To achieve the above target, this thesis presents five main developments to improve the Bhattacharyya distance-based ABC and NISS frameworks.
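A distributional p-box of the kind discussed above can be sketched in a few lines: a parametric family (here a normal, with illustrative interval bounds on its mean and standard deviation) whose CDF envelope over the epistemic parameter box gives the lower and upper bounding curves. A coarse grid scan stands in for a proper optimization over the box.

```python
import numpy as np
from math import erf, sqrt

# A distributional p-box: a normal family whose mean and standard
# deviation are only known to lie in intervals (values illustrative).
mu_lo, mu_hi = 9.0, 11.0
sd_lo, sd_hi = 0.5, 1.5

def normal_cdf(x, mu, sd):
    return 0.5 * (1.0 + erf((x - mu) / (sd * sqrt(2.0))))

# P-box bounds at each x: the envelope of the CDF over the epistemic
# parameter box (a coarse grid stands in for an optimization).
xs = np.linspace(5.0, 15.0, 101)
mus = np.linspace(mu_lo, mu_hi, 21)
sds = np.linspace(sd_lo, sd_hi, 21)
cdfs = np.array([[normal_cdf(x, m, s) for x in xs]
                 for m in mus for s in sds])
lower, upper = cdfs.min(axis=0), cdfs.max(axis=0)

# The aleatory spread (the family's randomness) lies inside the band;
# the epistemic uncertainty is the width of the band itself.
print(float(lower[50]), float(upper[50]))  # bounds on P(X <= 10)
```

This is exactly the "clear separation" the abstract refers to: tightening the intervals narrows the band (reducible, epistemic), while the slope of each curve reflects the irreducible aleatory scatter.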
The first development is on improving the scope of application and efficiency of the Bhattacharyya distance-based ABC. A dimension-reduction procedure is proposed to evaluate the Bhattacharyya distance when the system under investigation is described by time-domain sequences. Moreover, an efficient Bayesian inference method within the Bayesian updating with structural reliability methods (BUS) framework is developed by combining BUS with the adaptive Kriging-based reliability method, namely AK-MCMC. The second development, a distribution-free stochastic model updating framework, is based on the combined application of the staircase density functions and the Bhattacharyya distance. The staircase density functions can approximate a wide range of distributions arbitrarily closely; hence this development makes it possible to perform the Bhattacharyya distance-based ABC without limiting hypotheses on the distribution families of the parameters to be updated. The aforementioned two developments are then integrated in the third development to provide a solution to the latest edition (2019) of the NASA UQ challenge problem. The model updating tasks under very challenging conditions, where prior information on the aleatory parameters is extremely limited other than a common boundary, are successfully addressed based on the above distribution-free stochastic model updating framework. Moreover, the NISS approach, which simplifies the high-dimensional optimization to a set of one-dimensional searches through a first-order high-dimensional model representation (HDMR) decomposition with respect to each design parameter, is developed to efficiently solve the reliability-based design optimization tasks. This challenge, at the same time, elucidates the limitations of the current developments; hence the fourth development aims at addressing the limitation that the staircase density functions are designed for univariate random variables and cannot account for parameter dependencies.
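The Bhattacharyya distance used as the ABC metric above can be computed between two sample sets by binning them on a shared grid. The following is a minimal sketch of that computation; the bin count and test distributions are illustrative.

```python
import numpy as np

def bhattacharyya_distance(a, b, bins=30):
    # Bin both sample sets on a shared grid and normalize each
    # histogram to a probability mass function.
    lo = min(a.min(), b.min())
    hi = max(a.max(), b.max())
    edges = np.linspace(lo, hi, bins + 1)
    p, _ = np.histogram(a, bins=edges)
    q, _ = np.histogram(b, bins=edges)
    p = p / p.sum()
    q = q / q.sum()
    # Bhattacharyya coefficient and distance; the distance is near 0
    # when the two binned distributions coincide.
    bc = np.sqrt(p * q).sum()
    return -np.log(max(bc, 1e-12))

rng = np.random.default_rng(1)
same = bhattacharyya_distance(rng.normal(0, 1, 5000), rng.normal(0, 1, 5000))
diff = bhattacharyya_distance(rng.normal(0, 1, 5000), rng.normal(3, 1, 5000))
print(same, diff)  # the shifted pair scores a larger distance
```

Unlike a plain Euclidean distance between summary statistics, this metric compares whole distributional shapes, which is why it suits stochastic model updating where both mean and dispersion information must be matched.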
In order to calibrate the joint distribution of correlated parameters, the distribution-free stochastic model updating framework is extended by characterizing the aleatory parameters using Gaussian copula functions with staircase density functions as marginal distributions. This further strengthens the assumption-free approach to uncertainty calibration, in which no prior information on the parameter dependencies is required. Finally, the fifth development, a distribution-free uncertainty propagation framework, is based on another application of the staircase density functions to the NISS class of methods, and it is applied to efficiently solve the reliability analysis subproblem of the NASA UQ challenge 2019. The above five developments have successfully strengthened the assumption-free approach for both uncertainty calibration and propagation, thanks to the ability of the staircase density functions to approximate arbitrary distributions. The efficiency and effectiveness of these developments are demonstrated on real-world applications including the NASA UQ challenge 2019.
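A staircase density is piecewise-constant on a partition of the support, which is what lets it approximate arbitrary shapes. Sampling from one reduces to picking a bin by its probability mass and then drawing uniformly inside it, as in this minimal sketch (the partition and heights are illustrative, not moment-matched to any data set).

```python
import numpy as np

# A staircase density: piecewise-constant heights on a partition of
# the support (heights illustrative; they integrate to 1 over the
# unit-width bins).
edges = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
heights = np.array([0.1, 0.4, 0.35, 0.15])

def sample_staircase(n, rng):
    # Inverse-CDF sampling: choose a bin by its probability mass,
    # then draw uniformly within it (the density is flat per bin).
    widths = np.diff(edges)
    mass = heights * widths
    bins = rng.choice(len(heights), size=n, p=mass / mass.sum())
    return rng.uniform(edges[bins], edges[bins + 1])

rng = np.random.default_rng(2)
x = sample_staircase(100_000, rng)
print(x.mean())  # the bin midpoints weighted by bin mass
```

In a calibration loop, the heights (subject to normalization and moment constraints) become the parameters being updated, so no distribution family is ever imposed.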

    Stochastic simulation methods for structural reliability under mixed uncertainties

    Uncertainty quantification (UQ) has been widely recognized as one of the most important, yet challenging, tasks in both structural engineering and system engineering, and current research mainly concerns the proper treatment of different types of uncertainties, resulting from either natural randomness or lack of information, in all related sub-problems of UQ such as uncertainty characterization, uncertainty propagation, sensitivity analysis, model updating, model validation, and risk and reliability analysis. It has been widely accepted that these uncertainties can be grouped as either aleatory or epistemic, depending on whether they are reducible or not. To deal with the above challenge, many non-traditional uncertainty characterization models have been developed, and these models can be grouped as either imprecise probability models (e.g., the probability-box model, evidence theory, the second-order probability model and the fuzzy probability model) or non-probabilistic models (e.g., the interval/convex model and fuzzy set theory). This thesis concerns the efficient numerical propagation of three kinds of uncertainty characterization models, and for simplicity, the precise probability model, the distributional probability-box model, and the interval model are taken as examples. The target is to develop efficient numerical algorithms for learning the functional behavior of the probabilistic responses (e.g., response moments and failure probability) with respect to the epistemic parameters of the model inputs, which is especially useful for making reliable decisions even when the available information on the model inputs is imperfect. To achieve the above target, this thesis presents three main developments for improving Non-intrusive Imprecise Stochastic Simulation (NISS), which is a general methodological framework for propagating imprecise probability models with only one stochastic simulation.
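The "one stochastic simulation" idea behind NISS can be sketched with likelihood-ratio reweighting: run Monte Carlo once at a reference distribution, then recover the response mean as a function of the epistemic parameter by reweighting the same samples. This is a simplified stand-in for the actual NISS estimators; the response function and reference distribution are illustrative assumptions.

```python
import numpy as np

# Hypothetical response function; its input X is normal with an
# epistemic mean theta known only up to an interval.
g = lambda x: x ** 2 + np.sin(x)

rng = np.random.default_rng(3)

# One stochastic simulation at a reference distribution N(0, 1)...
x = rng.normal(0.0, 1.0, 200_000)
y = g(x)

def normal_pdf(x, mu, sd=1.0):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

def mean_response(theta):
    # ...reweighted to any other mean via the likelihood ratio, so the
    # response mean is learned as a function of theta without ever
    # re-evaluating g.
    w = normal_pdf(x, theta) / normal_pdf(x, 0.0)
    return np.average(y, weights=w)

for theta in (0.0, 0.25, 0.5):
    print(theta, mean_response(theta))
```

The payoff is that the expensive model evaluations `g(x)` are computed exactly once, while the sweep over the epistemic parameter is pure post-processing.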
The first development generalizes the NISS methods to problems with inputs including both imprecise probability models and non-probabilistic models. The algorithm is established by combining Bayes' rule and kernel density estimation. The sensitivity indices of the epistemic parameters are produced as by-products. The NASA Langley UQ challenge is then successfully solved using the generalized NISS method. The second development injects classical line sampling into the NISS framework so as to substantially improve the efficiency of the algorithm for rare failure event analysis, and two strategies, based on different interpretations of line sampling, are developed. The first strategy is based on hyperplane approximations, while the second strategy is derived from one-dimensional integrals. Both strategies can be regarded as post-processing of classical line sampling, yet the results show that their resultant NISS estimators perform differently. The third development aims at further improving the efficiency of line sampling and its suitability for highly nonlinear problems, targeting complex structures and systems where one deterministic simulation may take hours. To this end, an active learning strategy based on Gaussian process regression is embedded into the line sampling procedure for accurately estimating the intersection point on each sample line with only a small number of deterministic simulations. The above three developments have largely improved the suitability and efficiency of the NISS methods, especially for real-world engineering applications. The efficiency and effectiveness of these developments are clearly interpreted with toy examples and sufficiently demonstrated by real-world test examples in system engineering, civil engineering, and mechanical engineering.
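Classical line sampling, which the second and third developments build on, can be sketched for a linear limit state in standard normal space: project each random sample off the important direction, search along that direction for the limit state, and average the resulting one-dimensional tail probabilities. The limit state, direction, and simple bisection search are illustrative stand-ins (the thesis replaces the root search with a Gaussian-process-assisted one).

```python
import numpy as np
from math import erf, sqrt

def std_normal_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Illustrative linear limit state in standard normal space:
# failure when g(u) <= 0; the exact failure probability is Phi(-beta).
beta = 3.0
alpha = np.array([0.6, 0.8])          # important direction (unit vector)
g = lambda u: beta - u @ alpha

rng = np.random.default_rng(4)
n_lines = 100
pf_terms = []
for _ in range(n_lines):
    u = rng.standard_normal(2)
    u_perp = u - (u @ alpha) * alpha  # project off the important direction
    # Bisection along the line u_perp + c * alpha for the root of g;
    # this is where an active-learning surrogate would save model calls.
    lo, hi = 0.0, 10.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if g(u_perp + mid * alpha) > 0:
            lo = mid
        else:
            hi = mid
    pf_terms.append(std_normal_cdf(-0.5 * (lo + hi)))

pf = float(np.mean(pf_terms))
print(pf)  # = Phi(-3) ~ 1.35e-3 for this linear case
```

Each line contributes a near-exact one-dimensional integral, which is why line sampling handles rare events far more efficiently than crude Monte Carlo at the same sample budget.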

    Numerical treatment of imprecise random fields in non-linear solid mechanics

    The quantification and propagation of mixed uncertain material parameters in the context of solid mechanical finite element simulations is studied. While aleatory uncertainties appear in terms of spatially varying parameters, i.e. random fields, the epistemic character is induced by a lack of knowledge regarding the correlation length, which is therefore described by interval values. The concept and description of the resulting imprecise random fields are introduced in detail. The challenges arising from interval-valued correlation lengths are clarified. These mainly include the stochastic dimension, which can become very high under some circumstances, as well as the comparability of different correlation length scenarios with regard to the underlying truncation error of the applied Karhunen-Loève expansion. Additionally, the computation time can increase drastically if the straightforward and robust double-loop approach is applied. The sparse stochastic collocation method and sparse polynomial chaos expansion are studied to reduce the number of required sample evaluations, i.e. the computational cost. To keep the stochastic dimension as low as possible, the random fields are described by a Karhunen-Loève expansion using a modified exponential correlation kernel, which is advantageous in terms of fast convergence while providing an analytic solution. Still, for small correlation lengths, the investigated approaches are limited by the curse of dimensionality. Furthermore, they turn out to be unsuited for non-linear material models. As a straightforward alternative, a decoupled interpolation approach is proposed, offering a practical engineering estimate. For this purpose, the uncertain quantities only need to be propagated as a random variable and deterministically in terms of the mean values.
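The Karhunen-Loève machinery referred to above can be sketched for a 1D Gaussian field discretized on a grid: eigendecompose the covariance matrix, truncate by retained variance, and synthesize realizations from independent standard normals. A plain exponential kernel and all numerical values are illustrative (the thesis uses a modified exponential kernel); the sketch also makes visible why short correlation lengths inflate the stochastic dimension.

```python
import numpy as np

# Karhunen-Loeve expansion of a 1D Gaussian random field on [0, 1],
# discretized on a grid; the exponential kernel is illustrative.
n, corr_len = 200, 0.3
xs = np.linspace(0.0, 1.0, n)
cov = np.exp(-np.abs(xs[:, None] - xs[None, :]) / corr_len)

# Spectral decomposition of the covariance; the eigenvalue decay
# governs how many terms the truncation needs (slower decay, hence a
# higher stochastic dimension, for shorter correlation lengths).
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Truncate so that, say, 95 % of the field variance is retained.
m = int(np.searchsorted(np.cumsum(eigvals) / eigvals.sum(), 0.95) + 1)

def realization(rng):
    # One field realization from m independent standard normals.
    xi = rng.standard_normal(m)
    return eigvecs[:, :m] @ (np.sqrt(eigvals[:m]) * xi)

rng = np.random.default_rng(5)
field = realization(rng)
print(m, field.shape)
```

Rerunning this with a smaller `corr_len` visibly increases `m`, which is exactly the comparability and truncation-error issue the abstract raises for interval-valued correlation lengths.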
From these results, the so-called absolutely no idea probability box (ani-p-box) can be obtained, bounding the results for interval-valued correlation lengths between zero and infinity. The idea is to interpolate the result of any arbitrary correlation length within this ani-p-box, exploiting prior knowledge about the statistical behaviour of the input random field corresponding to the correlation length. The new approach is studied for one- and two-dimensional random fields. Furthermore, linear and non-linear finite element models are used in terms of linear-elastic or elasto-plastic material laws, the latter including linear hardening. It appears that the approach only works satisfactorily for sufficiently smooth responses, but an improvement by also considering higher-order statistics is motivated for future research.

DFG/SPP 1886/NA330/12-1/E

    Uncertainty management in multidisciplinary design of critical safety systems

    Managing the uncertainty in the multidisciplinary design of safety-critical systems requires not only the availability of a single approach or methodology to deal with uncertainty, but a set of different strategies and scalable computational tools (that is, making use of the computational power of cluster and grid computing). The availability of multiple tools and approaches for dealing with uncertainties allows cross-validation of the results and increases the confidence in the performed analysis. This paper presents a unified theory and an integrated, open, general-purpose computational framework to deal with scarce data and with aleatory and epistemic uncertainties. It allows solving the different tasks necessary to manage the uncertainty, such as uncertainty characterization, sensitivity analysis, uncertainty quantification, and robust design. The proposed computational framework is generally applicable to different problems in different fields, and it is numerically efficient and scalable, allowing for a significant reduction of the computational time required for uncertainty management and robust design. The applicability of the proposed approach is demonstrated by solving the multidisciplinary design of a critical system proposed by NASA Langley Research Center in the multidisciplinary uncertainty quantification challenge problem.
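The basic task such a framework automates can be sketched as a double loop over mixed uncertainties: an inner Monte Carlo loop over the aleatory variable and an outer scan over an interval-valued (epistemic) parameter, yielding bounds on a failure probability that a robust design would then guard against. The capacity, load model, and interval are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Double-loop sketch of mixed-uncertainty propagation: the load's
# standard deviation is epistemic (interval-valued), the load itself
# is aleatory; capacity and all values are illustrative.
capacity = 10.0
sd_interval = (1.0, 2.0)

rng = np.random.default_rng(6)
z = rng.standard_normal(100_000)       # one set of standard-normal draws

def failure_probability(sd, mean=6.0):
    # Inner (aleatory) Monte Carlo loop, reusing the common draws so
    # the outer scan compares scenarios on identical random numbers.
    load = mean + sd * z
    return float(np.mean(load > capacity))

# Outer (epistemic) loop: scan the interval to bound the failure
# probability; a robust design would then size `capacity` against p_hi.
pfs = [failure_probability(sd) for sd in np.linspace(*sd_interval, 11)]
p_lo, p_hi = min(pfs), max(pfs)
print(p_lo, p_hi)
```

The spread between `p_lo` and `p_hi` is what makes the epistemic uncertainty actionable: gathering data that narrows `sd_interval` tightens the bounds, while the inner scatter can only be designed around.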