
    Risk-informed decision-making in the presence of epistemic uncertainty

    An important issue in risk analysis is the distinction between epistemic and aleatory uncertainties. In this paper, the use of distinct representation formats for aleatory and epistemic uncertainties is advocated, the latter being modelled by sets of possible values. Modern uncertainty theories based on convex sets of probabilities are known to be instrumental for hybrid representations where the aleatory and epistemic components of uncertainty remain distinct. Simple uncertainty representation techniques based on fuzzy intervals and p-boxes are used in practice. This paper outlines a risk analysis methodology from elicitation of knowledge about parameters to decision. It proposes an elicitation methodology where the chosen representation format depends on the nature and the amount of available information. Uncertainty propagation methods then blend Monte Carlo simulation and interval analysis techniques. Nevertheless, the results provided by these techniques, often in terms of probability intervals, may be too complex for a decision-maker to interpret, and we therefore propose to compute a unique indicator of the likelihood of risk, called the confidence index, which explicitly accounts for the decision-maker's attitude in the face of ambiguity. This step takes place at the end of the risk analysis process, when no further collection of evidence is possible that might reduce the ambiguity due to epistemic uncertainty. This last feature stands in contrast with the Bayesian methodology, where epistemic uncertainties on input parameters are modelled by single subjective probabilities at the beginning of the risk analysis process.
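
    A minimal sketch of the hybrid propagation idea described above, assuming a toy risk model g(x, e) = x·e with an aleatory input x and an epistemic input e known only up to an interval; the threshold and the Hurwicz-style aggregation used for the confidence index are illustrative assumptions, not the paper's exact formulation:

```python
# A toy hybrid propagation: Monte Carlo over the aleatory input x,
# interval analysis over the epistemic input e. All values are illustrative.
import numpy as np

rng = np.random.default_rng(42)

THRESHOLD = 1.3          # risk occurs when the model output exceeds this
E_LO, E_HI = 0.8, 1.2    # epistemic input e, known only to lie in [E_LO, E_HI]

def g(x, e):
    """Toy risk model; linear in e, so its extrema over [E_LO, E_HI]
    are attained at the interval endpoints."""
    return x * e

x = rng.normal(loc=1.0, scale=0.2, size=100_000)   # aleatory samples

# Per-sample output interval, then the induced probability interval.
ends = np.stack([g(x, E_LO), g(x, E_HI)])
p_lo = np.mean(ends.min(axis=0) > THRESHOLD)   # lower probability bound
p_hi = np.mean(ends.max(axis=0) > THRESHOLD)   # upper probability bound

# Hurwicz-style confidence index (an assumption, not the paper's formula):
# alpha encodes the decision-maker's attitude toward ambiguity,
# with alpha = 1 fully pessimistic.
alpha = 0.7
index = alpha * p_hi + (1 - alpha) * p_lo
print(f"P(risk) in [{p_lo:.4f}, {p_hi:.4f}], confidence index = {index:.4f}")
```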

    Stochastic and epistemic uncertainty propagation in LCA

    Purpose: When performing uncertainty propagation, most LCA practitioners choose to represent uncertainties by single probability distributions and to propagate them using stochastic methods. However, the selection of single probability distributions often appears arbitrary when faced with scarce information or expert judgement (epistemic uncertainty). Possibility theory has been developed over the last decades to address this problem. The objective of this study is to present a methodology that combines probability and possibility theories to represent stochastic and epistemic uncertainties in a consistent manner, and to apply it to LCA. A case study is used to show the uncertainty propagation performed with the proposed method and to compare it to propagation performed using probability and possibility theories alone. Methods: Basic knowledge of probability theory is first recalled, followed by a detailed description of epistemic uncertainty representation using fuzzy intervals. The propagation methods used are Monte Carlo analysis for probability distributions and optimisation on alpha-cuts for fuzzy intervals. The proposed method (denoted IRS) generalizes the process of random sampling to probability distributions as well as fuzzy intervals, thus making the simultaneous use of both representations possible.
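
    A minimal sketch of propagation by optimisation on alpha-cuts, assuming a triangular fuzzy input and a toy non-monotone model (both illustrative); each alpha level yields an input interval, and the output interval is obtained by minimising and maximising the model over it, here with a crude grid search:

```python
# Alpha-cut propagation of a triangular fuzzy interval through a toy model.
# The fuzzy number (support [1, 3], core 2) and the model are illustrative.
import numpy as np

A, M, B = 1.0, 2.0, 3.0   # left support end, core, right support end

def alpha_cut(alpha):
    """Interval [lo, hi] of the triangular fuzzy number at level alpha."""
    return A + alpha * (M - A), B - alpha * (B - M)

def f(x):
    """Toy non-monotone model to propagate."""
    return (x - 1.8) ** 2 + 0.5

# For each alpha level, the output cut is [min f, max f] over the input cut;
# a dense grid stands in for a proper global optimiser.
for alpha in np.linspace(0.0, 1.0, 5):
    lo, hi = alpha_cut(alpha)
    y = f(np.linspace(lo, hi, 1001))
    print(f"alpha={alpha:.2f}: input [{lo:.2f}, {hi:.2f}] "
          f"-> output [{y.min():.3f}, {y.max():.3f}]")
```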

    Distribution-free stochastic simulation methodology for model updating under hybrid uncertainties

    In the real world, a significant challenge in the safe operation and maintenance of infrastructures is the lack of available information or data. This results in a large degree of uncertainty and the need for robust and efficient uncertainty quantification (UQ) tools in order to derive the most realistic estimates of the behavior of structures. While the probabilistic approach has long been utilized as an essential tool for the quantitative mathematical representation of uncertainty, a common criticism is that the approach often involves unsubstantiated subjective assumptions because of the scarcity or imprecision of available information. To avoid the inclusion of subjectivity, the concepts of imprecise probabilities have been developed, and the distributional probability-box (p-box) has gained the most attention among the various types of imprecise probability models since it straightforwardly provides a clear separation between aleatory and epistemic uncertainty. This thesis concerns the realistic consideration and numerically efficient calibration and propagation of aleatory and epistemic uncertainties (hybrid uncertainties) based on the distributional p-box. Recent developments, including the Bhattacharyya distance-based approximate Bayesian computation (ABC) and non-intrusive imprecise stochastic simulation (NISS) methods, have strengthened the subjective assumption-free approach for uncertainty calibration and propagation. However, these methods based on the distributional p-box rely on the availability of prior knowledge determining a specific distribution family for the p-box. The target of this thesis is hence to develop a distribution-free approach for the calibration and propagation of hybrid uncertainties, strengthening the subjective assumption-free UQ approach. To achieve this target, the thesis presents five main developments to improve the Bhattacharyya distance-based ABC and NISS frameworks. The first development improves the scope of application and efficiency of the Bhattacharyya distance-based ABC. A dimension reduction procedure is proposed to evaluate the Bhattacharyya distance when the system under investigation is described by time-domain sequences. Moreover, an efficient Bayesian inference method within the Bayesian updating with structural reliability methods (BUS) framework is developed by combining BUS with the adaptive Kriging-based reliability method AK-MCMC. The second development, a distribution-free stochastic model updating framework, is based on the combined application of staircase density functions and the Bhattacharyya distance. Staircase density functions can approximate a wide range of distributions arbitrarily closely; this development therefore makes it possible to perform the Bhattacharyya distance-based ABC without limiting hypotheses on the distribution families of the parameters to be updated. The aforementioned two developments are then integrated in the third development to provide a solution to the latest edition (2019) of the NASA UQ challenge problem. The model updating tasks under very challenging conditions, where prior information on the aleatory parameters is limited to a common boundary, are successfully addressed with the above distribution-free stochastic model updating framework.
    Moreover, the NISS approach, which simplifies the high-dimensional optimization to a set of one-dimensional searches by a first-order high-dimensional model representation (HDMR) decomposition with respect to each design parameter, is developed to efficiently solve the reliability-based design optimization tasks. This challenge also exposes the limitations of the preceding developments; hence the fourth development addresses the limitation that staircase density functions are designed for univariate random variables and cannot account for parameter dependencies. In order to calibrate the joint distribution of correlated parameters, the distribution-free stochastic model updating framework is extended by characterizing the aleatory parameters using Gaussian copula functions whose marginal distributions are staircase density functions. This further strengthens the assumption-free approach for uncertainty calibration, in which no prior information on the parameter dependencies is required. Finally, the fifth development, a distribution-free uncertainty propagation framework, is based on another application of the staircase density functions to the NISS class of methods, and it is applied to efficiently solve the reliability analysis subproblem of the NASA UQ challenge 2019. The above five developments have successfully strengthened the assumption-free approach for both uncertainty calibration and propagation, thanks to the ability of the staircase density functions to approximate arbitrary distributions. The efficiency and effectiveness of these developments are demonstrated on real-world applications, including the NASA UQ challenge 2019.
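
    As a concrete illustration of one ingredient named above, the following is a minimal sketch of the Bhattacharyya distance between an observed and a simulated sample set, estimated over shared histogram bins; the bin count and data are illustrative, and the ABC loop in which the thesis embeds this distance is not shown:

```python
# Bhattacharyya distance between two sample sets via shared histogram bins.
# Bin count and data are illustrative; the thesis's ABC loop is not shown.
import numpy as np

def bhattacharyya_distance(samples_a, samples_b, n_bins=30):
    """-ln(BC), where BC = sum_i sqrt(p_i * q_i) over shared bins."""
    lo = min(samples_a.min(), samples_b.min())
    hi = max(samples_a.max(), samples_b.max())
    edges = np.linspace(lo, hi, n_bins + 1)
    p, _ = np.histogram(samples_a, bins=edges)
    q, _ = np.histogram(samples_b, bins=edges)
    p = p / p.sum()                      # empirical bin probabilities
    q = q / q.sum()
    bc = np.sum(np.sqrt(p * q))          # Bhattacharyya coefficient
    return -np.log(max(bc, 1e-12))       # guard against log(0)

rng = np.random.default_rng(0)
obs = rng.normal(0.0, 1.0, 5000)         # "observed" data
sim = rng.normal(0.3, 1.2, 5000)         # model output for one parameter set
print(f"Bhattacharyya distance: {bhattacharyya_distance(obs, sim):.4f}")
```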

    Uncertainty in Quantitative Risk Analysis - Characterisation and Methods of Treatment

    The fundamental problems related to uncertainty in quantitative risk analyses, as used for decision making in safety-related issues (for instance, in land use planning and licensing procedures for hazardous establishments and activities), are presented and discussed, together with the different types of uncertainty that are introduced in the various stages of an analysis. A survey of methods for the practical treatment of uncertainty is also presented, with emphasis on the kind of information each method needs and the kind of results it produces. Furthermore, a thorough discussion is given of the arguments for and against each of the methods, and of the different levels of treatment appropriate to the problem under consideration. Recommendations for future research and standardisation efforts are proposed.

    Quantification of uncertainty in probabilistic safety analysis

    This thesis develops methods for quantification and interpretation of uncertainty in probabilistic safety analysis, focussing on fault trees. The output of a fault tree analysis is usually the probability of occurrence of an undesirable event (top event), calculated using the failure probabilities of identified basic events. The standard method for evaluating the uncertainty distribution is Monte Carlo simulation, but this is a computationally intensive approach to uncertainty estimation and does not readily reveal the dominant sources of the uncertainty. A closed-form approximation for the fault tree top event uncertainty distribution, for models using only lognormal distributions for model inputs, is developed in this thesis. Its output is compared with the output from two sampling-based approximation methods: standard Monte Carlo analysis, and Wilks' method, which is based on order statistics and uses small sample sizes. Wilks' method can be used to provide an upper bound for the percentiles of the top event distribution, and is computationally cheap. The combination of the lognormal approximation and Wilks' method can be used to give, respectively, the overall shape and high confidence on particular percentiles of interest. This is an attractive, practical option for evaluation of uncertainty in fault trees and, more generally, uncertainty in certain multilinear models. A new practical method of ranking uncertainty contributors in lognormal models is developed which can be evaluated in closed form, based on cutset uncertainty. The method is demonstrated via examples, including a simple fault tree model and a model the size of a commercial PSA model for a nuclear power plant. Finally, quantification of "hidden uncertainties" is considered; hidden uncertainties are those which are not typically considered in PSA models, but may contribute considerable uncertainty to the overall results if included. A specific example of the inclusion of a missing uncertainty is explained in detail, and the effects on PSA quantification are considered. It is demonstrated that the effect on the PSA results can be significant, potentially permuting the order of the most important cutsets, which is of practical concern for the interpretation of PSA models. Suggestions are also made for the identification and inclusion of further hidden uncertainties.
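
    A minimal sketch of Wilks' method alongside Monte Carlo sampling of a toy fault tree, assuming a top event TOP = (A AND B) OR C with independent basic events and illustrative lognormal parameters; the 95%/95% first-order sample size follows from requiring that the maximum of N samples exceed the p-quantile with confidence gamma, i.e. 1 - p^N >= gamma:

```python
# Wilks' one-sided tolerance bound for a toy fault tree TOP = (A AND B) OR C.
# Lognormal parameters are illustrative placeholders.
import numpy as np
from math import ceil, log

def wilks_sample_size(p=0.95, gamma=0.95):
    """Smallest N such that the max of N samples exceeds the p-quantile
    with confidence gamma: 1 - p**N >= gamma."""
    return ceil(log(1.0 - gamma) / log(p))

N = wilks_sample_size()                  # 59 for the 95%/95% case
rng = np.random.default_rng(1)

# Sample basic-event probabilities from lognormal epistemic distributions.
pA = rng.lognormal(mean=np.log(1e-3), sigma=0.5, size=N)
pB = rng.lognormal(mean=np.log(2e-3), sigma=0.5, size=N)
pC = rng.lognormal(mean=np.log(1e-4), sigma=0.7, size=N)

# Exact OR of independent events: P((A and B) or C) = pA*pB + pC - pA*pB*pC.
top = pA * pB + pC - pA * pB * pC

print(f"Wilks sample size N = {N}")
print(f"95/95 upper bound on top-event probability: {top.max():.3e}")
```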

    Seismic risk of infrastructure systems with treatment of and sensitivity to epistemic uncertainty

    Modern society's very existence is tied to the proper and reliable functioning of its Critical Infrastructure (CI) systems. In the seismic risk assessment of an infrastructure, taking into account all the relevant uncertainties affecting the problem is crucial. While both aleatory and epistemic uncertainties affect the estimate of seismic risk to an infrastructure and should be considered, the focus herein is on the latter. After providing an up-to-date literature review on the treatment of and sensitivity to epistemic uncertainty, this paper presents a comprehensive framework for seismic risk assessment of interdependent spatially distributed infrastructure systems that accounts for both aleatory and epistemic uncertainties and provides confidence in the estimate, as well as the sensitivity of uncertainty in the output to the components of epistemic uncertainty in the input. The logic tree approach is used for the treatment of epistemic uncertainty and for the sensitivity analysis, whose results are presented through tornado diagrams. Sensitivity is also evaluated by elaborating the logic tree results through weighted ANOVA. The formulation is general and can be applied to risk assessment problems involving not only infrastructural but also structural systems. The presented methodology was implemented in the open-source software OOFIMS and applied to a synthetic city composed of buildings and a gas network, subjected to seismic hazard. The gas system's performance is assessed through a flow-based analysis. The seismic hazard, the vulnerability assessment and the evaluation of the gas system's operational state are addressed with a simulation-based approach. The presence of two systems (buildings and gas network) demonstrates the capability to handle system interdependencies and highlights that uncertainty in models and parameters related to one system can affect uncertainty in the output related to dependent systems.
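
    A minimal sketch of logic tree enumeration with a tornado-style swing for one epistemic node, assuming two branch sets (labelled as ground-motion and fragility model choices here purely for illustration) with subjective weights; the per-branch loss values stand in for a full seismic risk analysis:

```python
# Logic tree enumeration with a tornado-style swing for one epistemic node.
# Branch names, weights, and per-branch losses are illustrative placeholders.
import itertools
import numpy as np

gmpe = [("GMPE-1", 0.6), ("GMPE-2", 0.4)]          # ground-motion models
fragility = [("Frag-A", 0.5), ("Frag-B", 0.5)]     # fragility models

def run_risk_model(gmpe_name, frag_name):
    """Stands in for a full seismic risk analysis of one branch combination."""
    table = {("GMPE-1", "Frag-A"): 1.0, ("GMPE-1", "Frag-B"): 1.4,
             ("GMPE-2", "Frag-A"): 2.1, ("GMPE-2", "Frag-B"): 2.6}
    return table[(gmpe_name, frag_name)]

# Enumerate end branches; each weight is the product of its branch weights.
losses, weights = zip(*[(run_risk_model(g, f), wg * wf)
                        for (g, wg), (f, wf) in itertools.product(gmpe, fragility)])
mean_loss = np.average(losses, weights=weights)

# Tornado-style swing of the GMPE node: spread of its conditional means.
cond = [np.average([run_risk_model(g, f) for f, _ in fragility],
                   weights=[wf for _, wf in fragility]) for g, _ in gmpe]
print(f"weighted mean loss = {mean_loss:.3f}, "
      f"GMPE swing = {max(cond) - min(cond):.3f}")
```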

    NASA Risk-Informed Decision Making Handbook

    This handbook provides guidance for conducting risk-informed decision making in the context of NASA risk management (RM), with a focus on the types of direction-setting key decisions that are characteristic of the NASA program and project life cycles, and which produce derived requirements in accordance with existing systems engineering practices that flow down through the NASA organizational hierarchy. The guidance in this handbook is not meant to be prescriptive. Instead, it is meant to be general enough, and to contain a sufficient diversity of examples, to enable the reader to adapt the methods as needed to the particular decision problems that he or she faces. The handbook highlights major issues to consider when making decisions in the presence of potentially significant uncertainty, so that the user is better able to recognize and avoid pitfalls that might otherwise be experienced.
