    The Bhattacharyya distance: Enriching the P-box in stochastic sensitivity analysis

    The trend in uncertainty analysis has driven the transformation of sensitivity analysis from the deterministic sense to the stochastic sense. This work proposes a stochastic sensitivity analysis framework using the Bhattacharyya distance as a novel uncertainty quantification metric. The Bhattacharyya distance is utilised to provide a quantitative description of the P-box in a two-level procedure covering both aleatory and epistemic uncertainties. In the first level, the aleatory uncertainty is quantified by a Monte Carlo process within the probability space of the cumulative distribution function. For each sample of the Monte Carlo simulation, the second level propagates the epistemic uncertainty by solving an optimisation problem. Subsequently, three sensitivity indices are defined based on the Bhattacharyya distance, making it possible to rank the significance of the parameters according to the reduction and dispersion of the uncertainty space of the system outputs. A tutorial case study is provided in the first part of the example to give a clear understanding of the principle of the approach, with reproducible results. The second case study is the NASA Langley challenge problem, which demonstrates the feasibility of the proposed approach, and of the Bhattacharyya distance metric, in solving such a large-scale, strongly nonlinear, and complex problem.
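    As a rough, self-contained illustration of the metric itself (a binned empirical estimator in Python; the sample sets, bin count, and variable names below are assumptions for illustration, not the authors' implementation), the Bhattacharyya distance between two univariate sample sets can be estimated as follows:

        import numpy as np

        def bhattacharyya_distance(samples_p, samples_q, n_bins=30):
            # Bin both sample sets onto a common grid and form empirical probabilities.
            lo = min(samples_p.min(), samples_q.min())
            hi = max(samples_p.max(), samples_q.max())
            edges = np.linspace(lo, hi, n_bins + 1)
            p, _ = np.histogram(samples_p, bins=edges)
            q, _ = np.histogram(samples_q, bins=edges)
            p = p / p.sum()
            q = q / q.sum()
            bc = np.sum(np.sqrt(p * q))      # Bhattacharyya coefficient, in [0, 1]
            return -np.log(max(bc, 1e-12))   # distance; 0 only when the histograms coincide

        # Toy usage: compare "observed" data against a perturbed model prediction.
        rng = np.random.default_rng(0)
        observed = rng.normal(0.0, 1.0, size=5000)
        predicted = rng.normal(0.3, 1.2, size=5000)
        print(bhattacharyya_distance(observed, predicted))

    The coefficient reaches 1 (distance 0) only when the two binned distributions coincide, which is what makes the distance a convenient scalar summary of how far a perturbed output distribution has moved.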

    Uncertainty management in multidisciplinary design of critical safety systems

    Managing uncertainty in the multidisciplinary design of safety-critical systems requires not a single approach or methodology but a set of different strategies and scalable computational tools (for example, exploiting the computational power of cluster and grid computing). The availability of multiple tools and approaches for dealing with uncertainties allows cross-validation of the results and increases confidence in the performed analysis. This paper presents a unified theory and an integrated, open, general-purpose computational framework for dealing with scarce data and with aleatory and epistemic uncertainties. It allows the different tasks necessary for uncertainty management to be solved, such as uncertainty characterization, sensitivity analysis, uncertainty quantification, and robust design. The proposed computational framework is generally applicable to problems in different fields and is numerically efficient and scalable, allowing a significant reduction of the computational time required for uncertainty management and robust design. The applicability of the proposed approach is demonstrated by solving the multidisciplinary design of a critical system proposed by the NASA Langley Research Center in its multidisciplinary uncertainty quantification challenge problem.
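    As a minimal sketch of the kind of double-loop, embarrassingly parallel propagation such a framework orchestrates (Python with the standard-library process pool; the model, distributions, and epistemic grid below are hypothetical placeholders, not the framework's actual API), independent epistemic realisations can be farmed out across cores or cluster nodes:

        import numpy as np
        from concurrent.futures import ProcessPoolExecutor

        def model(x):
            # Placeholder system model (hypothetical); replace with the real simulator.
            return x[0] ** 2 + np.sin(x[1])

        def aleatory_monte_carlo(mean, n_samples=1000, seed=0):
            # Inner loop: propagate aleatory uncertainty for one fixed epistemic realisation.
            rng = np.random.default_rng(seed)
            samples = rng.normal(loc=mean, scale=[0.1, 0.2], size=(n_samples, 2))
            outputs = np.array([model(s) for s in samples])
            return outputs.mean(), outputs.std()

        if __name__ == "__main__":
            # Outer loop: sweep epistemic (interval-valued) parameters; the evaluations
            # are independent, so they can be distributed across workers.
            epistemic_means = [np.array([m1, m2]) for m1 in (0.8, 1.0, 1.2)
                                                  for m2 in (0.4, 0.6)]
            with ProcessPoolExecutor() as pool:
                stats = list(pool.map(aleatory_monte_carlo, epistemic_means))
            for mean, (mu, sigma) in zip(epistemic_means, stats):
                print(f"epistemic point {mean}: output mean={mu:.3f}, std={sigma:.3f}")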