
    Imprecise probabilistic evaluation of sewer flooding in urban drainage systems using random set theory

    Copyright © 2011 American Geophysical Union. Uncertainty analysis is widely applied in water system modeling to quantify prediction uncertainty from models and data. Conventional methods typically handle various kinds of uncertainty using a single characterizing approach, be it probability theory or fuzzy set theory. However, using a single approach may not be appropriate, particularly when uncertainties are of different types. For example, in sewer flood estimation problems, random rainfall variables are used as model inputs and imprecise or subjective information is used to define model parameters. This paper presents a general framework for sewer flood estimation that enables simultaneous consideration of two types of uncertainty: randomness from rainfall data represented using imprecise probabilities and imprecision from model parameters represented by fuzzy numbers. These two types of uncertainty are combined using random set theory and then propagated through a hydrodynamic urban drainage model. Two propagation methods, i.e., discretization and Monte Carlo based methods, are presented and compared, with the latter shown to be much more computationally efficient and hence recommended for high-dimensional problems. The model output (flood depth) is generated in the form of lower and upper cumulative probabilities, which are best estimates given the various stochastic and epistemic uncertainties considered and which embrace the unknown true cumulative probability. The distance between the cumulative probabilities represents the extent of imprecise, incomplete, or conflicting information and can be reduced only when more knowledge is available. The proposed methodology provides a more complete and thus more accurate representation of uncertainty in data and models and can effectively handle different uncertainty characterizations in a single, integrated framework for sewer flood estimation.
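
    The propagation idea can be illustrated in a few lines. The sketch below is not the paper's hydrodynamic model: it assumes a toy flood-depth function, a gamma-distributed rainfall depth, and a single interval (one alpha-cut) for a fuzzy roughness-like parameter; each random rainfall sample then carries an interval of outputs, from which lower and upper cumulative probabilities are built.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy stand-in for the hydrodynamic drainage model (hypothetical): flood depth
    # grows with rainfall depth and with a roughness-like epistemic parameter.
    def flood_depth(rain, roughness):
        return 0.02 * rain ** 1.3 * roughness

    # Aleatory input: rainfall depth (mm), here simply gamma-distributed (assumed).
    rain = rng.gamma(shape=2.0, scale=15.0, size=2000)

    # Epistemic input: one alpha-cut of the fuzzy parameter, i.e. an interval
    # [lo, hi]; a full analysis would repeat this for several alpha-cuts.
    rough_lo, rough_hi = 0.9, 1.4

    # Random-set propagation: every rainfall sample is mapped to an interval of
    # flood depths by min/max over the epistemic interval. The model is monotone
    # in roughness here, so the endpoints suffice; a non-monotone model would
    # need an optimisation over the interval instead.
    low = flood_depth(rain, rough_lo)
    high = flood_depth(rain, rough_hi)

    # Lower/upper cumulative probabilities (a p-box) for the flood depth.
    thresholds = np.linspace(0.0, high.max(), 50)
    upper_cdf = np.array([(low <= t).mean() for t in thresholds])    # plausibility of {depth <= t}
    lower_cdf = np.array([(high <= t).mean() for t in thresholds])   # belief of {depth <= t}
    # The band between lower_cdf and upper_cdf encloses the unknown true CDF.
    ```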

    Uncertainty management in multidisciplinary design of critical safety systems

    Managing uncertainty in the multidisciplinary design of safety-critical systems requires not a single approach or methodology but a set of different strategies and scalable computational tools (for instance, tools that exploit the computational power of cluster and grid computing). The availability of multiple tools and approaches for dealing with uncertainties allows cross validation of the results and increases confidence in the performed analysis. This paper presents a unified theory and an integrated, open, general-purpose computational framework to deal with scarce data and with aleatory and epistemic uncertainties. It supports the different tasks necessary to manage uncertainty, such as uncertainty characterization, sensitivity analysis, uncertainty quantification, and robust design. The proposed computational framework is applicable to problems in different fields and is numerically efficient and scalable, allowing for a significant reduction of the computational time required for uncertainty management and robust design. The applicability of the proposed approach is demonstrated by solving the multidisciplinary design of a critical system proposed by NASA Langley Research Center in its multidisciplinary uncertainty quantification challenge problem.

    Special Cases

    This chapter reviews special cases of lower previsions that are instrumental in practical applications. We emphasize their various advantages and drawbacks, as well as the kinds of problems in which they can be most useful.

    Flood analysis of urban drainage systems: probabilistic dependence structure of rainfall characteristics and fuzzy model parameters

    Copyright © IWA Publishing 2013. The definitive peer-reviewed and edited version of this article is published in Journal of Hydroinformatics Vol. 15 No. 3 pp. 687–699 (2013), DOI: 10.2166/hydro.2012.160, and is available at www.iwapublishing.com. Flood analysis of urban drainage systems plays a crucial role in flood risk management in urban areas. Rainfall characteristics, including the dependence between rainfall variables, have a significant influence on flood frequency. This paper considers the use of copulas to represent the probabilistic dependence structure between rainfall depth and duration in the synthetic rainfall generation process, and the Gumbel copula is fitted to the rainfall data in a case study of sewer networks. The probabilistic representation of rainfall uncertainty is combined with a fuzzy representation of model parameters in a unified framework based on the Dempster–Shafer theory of evidence. The Monte Carlo simulation method is used for uncertainty propagation to calculate the exceedance probabilities of flood quantities (depth and volume) for the case study sewer network. This study demonstrates the suitability of the Gumbel copula for simulating the dependence of rainfall depth and duration, and also shows that the unified framework can effectively integrate the copula-based probabilistic representation of random variables and the fuzzy representation of model parameters for flood analysis.
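
    A minimal sketch of the copula step is given below. The marginal distributions, parameter values, and the conditional-inversion sampler are illustrative assumptions, not the paper's actual choices; the relation between Kendall's tau and the Gumbel parameter theta (tau = 1 - 1/theta) and the closed-form copula CDF are, however, standard.

    ```python
    import numpy as np
    from scipy import stats, optimize

    rng = np.random.default_rng(1)

    # Gumbel copula CDF: C(u, v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta)).
    def gumbel_cdf(u, v, theta):
        a = (-np.log(u)) ** theta + (-np.log(v)) ** theta
        return np.exp(-a ** (1.0 / theta))

    # Conditional CDF of V given U = u (the partial derivative of C with respect
    # to u), used below for sampling by conditional inversion.
    def h(v, u, theta):
        a = (-np.log(u)) ** theta + (-np.log(v)) ** theta
        return gumbel_cdf(u, v, theta) * a ** (1.0 / theta - 1.0) * (-np.log(u)) ** (theta - 1.0) / u

    # Fitting theta from observed depth/duration pairs by Kendall's tau inversion
    # (for the Gumbel copula, tau = 1 - 1/theta):
    #   tau, _ = stats.kendalltau(depth_obs, duration_obs)
    #   theta = 1.0 / (1.0 - tau)
    theta = 2.0   # placeholder value for this sketch

    # Dependent uniform pairs (u, v) generated by conditional inversion.
    n = 1000
    u = rng.uniform(size=n)
    q = rng.uniform(size=n)
    v = np.array([optimize.brentq(lambda x: h(x, ui, theta) - qi, 1e-12, 1.0 - 1e-12)
                  for ui, qi in zip(u, q)])

    # Map the uniforms to physical scales through (assumed) gamma marginals.
    depth = stats.gamma(a=2.0, scale=15.0).ppf(u)       # rainfall depth, mm
    duration = stats.gamma(a=1.5, scale=4.0).ppf(v)     # rainfall duration, h
    ```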

    The Bhattacharyya distance: Enriching the P-box in stochastic sensitivity analysis

    © 2019 Elsevier Ltd. Recent trends in uncertainty analysis have promoted the transformation of sensitivity analysis from the deterministic sense to the stochastic sense. This work proposes a stochastic sensitivity analysis framework using the Bhattacharyya distance as a novel uncertainty quantification metric. The Bhattacharyya distance is utilised to provide a quantitative description of the P-box in a two-level procedure for both aleatory and epistemic uncertainties. In the first level, the aleatory uncertainty is quantified by a Monte Carlo process within the probability space of the cumulative distribution function. For each sample of the Monte Carlo simulation, the second level is performed to propagate the epistemic uncertainty by solving an optimisation problem. Subsequently, three sensitivity indices are defined based on the Bhattacharyya distance, making it possible to rank the significance of the parameters according to the reduction and dispersion of the uncertainty space of the system outputs. A tutorial case study is provided in the first part of the example to give a clear understanding of the principle of the approach, with reproducible results. The second case study is the NASA Langley challenge problem, which demonstrates the feasibility of the proposed approach, as well as of the Bhattacharyya distance metric, in solving such a large-scale, strongly nonlinear, and complex problem.
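
    The metric itself is easy to state: for two sample sets binned over common edges, the Bhattacharyya distance is -ln of the sum of sqrt(p_i * q_i). Below is a minimal, self-contained sketch of that computation; the paper's two-level P-box procedure and its sensitivity indices are not reproduced here.

    ```python
    import numpy as np

    def bhattacharyya_distance(x, y, bins=30):
        """Binned Bhattacharyya distance BD = -ln(sum_i sqrt(p_i * q_i)) between
        two univariate sample sets, with p and q the normalised bin masses over
        a common set of bin edges."""
        edges = np.linspace(min(x.min(), y.min()), max(x.max(), y.max()), bins + 1)
        p, _ = np.histogram(x, bins=edges)
        q, _ = np.histogram(y, bins=edges)
        p = p / p.sum()
        q = q / q.sum()
        bc = np.sqrt(p * q).sum()            # Bhattacharyya coefficient, in (0, 1]
        return -np.log(bc)

    rng = np.random.default_rng(2)
    reference = rng.normal(0.0, 1.0, 5000)   # e.g. observed / reference outputs
    simulated = rng.normal(0.3, 1.2, 5000)   # e.g. model outputs for one parameter set
    print(bhattacharyya_distance(reference, simulated))   # larger value = larger mismatch
    ```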

    A geometric and game-theoretic study of the conjunction of possibility measures

    In this paper, we study the conjunction of possibility measures when they are interpreted as coherent upper probabilities, that is, as upper bounds for some set of probability measures. We identify conditions under which the minimum of two possibility measures remains a possibility measure. We provide a graphical way to check these conditions, by means of a zero-sum game formulation of the problem. This also gives us a convenient way to adjust the initial possibility measures so that their minimum is guaranteed to be a possibility measure. Finally, we identify conditions under which the minimum of two possibility measures is a coherent upper probability, or in other words, conditions under which the minimum of two possibility measures is an exact upper bound for the intersection of the credal sets of those two possibility measures.
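
    On a finite possibility space the question can be examined directly: build each possibility measure from its distribution, take the event-wise minimum, and test whether the result is still maxitive and normalised, i.e. still a possibility measure. The small sketch below uses arbitrary example distributions (not taken from the paper); for these particular distributions the check fails, illustrating that the minimum of two possibility measures is not automatically a possibility measure.

    ```python
    from itertools import chain, combinations

    space = ("a", "b", "c")

    # Possibility distributions on the singletons (each reaches 1 somewhere).
    pi1 = {"a": 1.0, "b": 0.6, "c": 0.3}
    pi2 = {"a": 0.4, "b": 1.0, "c": 0.8}

    # All events (subsets) of the finite possibility space.
    EVENTS = [frozenset(s) for s in chain.from_iterable(
        combinations(space, r) for r in range(len(space) + 1))]

    def possibility_measure(pi):
        # Pi(A) = max_{x in A} pi(x), with Pi(empty set) = 0.
        return {A: max((pi[x] for x in A), default=0.0) for A in EVENTS}

    P1 = possibility_measure(pi1)
    P2 = possibility_measure(pi2)
    Q = {A: min(P1[A], P2[A]) for A in EVENTS}   # event-wise conjunction (minimum)

    # Q is a possibility measure iff it is maxitive, Q(A ∪ B) = max(Q(A), Q(B)),
    # and normalised, Q(space) = 1.
    maxitive = all(abs(Q[A | B] - max(Q[A], Q[B])) < 1e-12 for A in EVENTS for B in EVENTS)
    normalised = abs(Q[frozenset(space)] - 1.0) < 1e-12
    print("minimum is still a possibility measure:", maxitive and normalised)
    ```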

    Efficient random set uncertainty quantification by means of advanced sampling techniques

    In this dissertation, Random Sets and Advanced Sampling techniques are combined for general and efficient uncertainty quantification. Random Sets extend the traditional probabilistic framework, as they also comprise imprecision to account for scarce data, lack of knowledge, vagueness, subjectivity, etc. The generality with which Random Sets accommodate different kinds of uncertainty comes at a very high computational price: Random Sets require a min-max convolution for each sample drawn by the Monte Carlo method. The min-max convolution can be computed much faster when the system response relationship is known in analytical form. However, in a general multidisciplinary design context, the system response is very often treated as a “black box”; thus, the convolution requires evolutionary or stochastic algorithms, which need to be deployed for each Monte Carlo sample. Therefore, the availability of very efficient sampling techniques is paramount to allow Random Sets to be applied to engineering problems. In this dissertation, Advanced Line Sampling methods have been generalised and extended to include Random Sets. Advanced Sampling techniques make the estimation of quantiles at relevant probability levels extremely efficient, requiring significantly fewer samples than standard Monte Carlo methods. In particular, the Line Sampling method has been enhanced to link well to the Random Set representation. These developments comprise line search, line selection, direction adaptation, and data buffering. The enhanced efficiency of Line Sampling is demonstrated by means of numerical and large-scale finite element examples. With the enhanced algorithm, Line Sampling has been connected to the generalised uncertainty model, both in a Double Loop and in a Random Set approach. The presented computational strategies have been implemented in OpenCossan, an open-source general-purpose software for uncertainty quantification. The general reach of the proposed strategy is demonstrated by means of applications to the structural reliability of a finite element model, to preventive maintenance, and to the NASA Langley multidisciplinary uncertainty quantification challenge.
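
    For orientation, a minimal sketch of plain Line Sampling for a failure probability in standard normal space is given below, with a toy linear limit state; the dissertation's enhanced algorithm (line search, line selection, direction adaptation, data buffering) and its coupling to Random Sets are not reproduced.

    ```python
    import numpy as np
    from scipy import optimize, stats

    rng = np.random.default_rng(3)

    # Toy limit-state function in standard normal space: failure when g(x) <= 0.
    # (Hypothetical; a real application would wrap a finite element model here.)
    def g(x):
        return 3.0 - x[0] - 0.5 * x[1]

    dim = 2
    alpha = np.array([1.0, 0.5])
    alpha /= np.linalg.norm(alpha)          # important direction (here from the gradient of g)

    n_lines = 50
    pf_parts = []
    for _ in range(n_lines):
        # Draw a standard normal point and remove its component along alpha,
        # so each line starts on the hyperplane orthogonal to the important direction.
        x = rng.standard_normal(dim)
        x_perp = x - np.dot(x, alpha) * alpha
        # Distance beta along alpha at which the limit state g = 0 is crossed.
        beta = optimize.brentq(lambda c: g(x_perp + c * alpha), -10.0, 10.0)
        # Conditional failure probability contributed by this line.
        pf_parts.append(stats.norm.sf(beta))

    print("estimated failure probability:", np.mean(pf_parts))
    ```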

    Sklar's theorem in an imprecise setting

    Sklar's theorem is an important tool that connects bidimensional distribution functions with their marginals by means of a copula. When there is imprecision about the marginals, we can model the available information by means of p-boxes, which are pairs of ordered distribution functions. Similarly, we can consider a set of copulas instead of a single one. We study the extension of Sklar's theorem under these conditions, and link the obtained results to stochastic ordering with imprecision.
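
    The flavour of the result can be sketched numerically: because a copula is non-decreasing in each argument, plugging the marginal p-box bounds into a fixed copula yields bounds on the joint distribution, and when the copula itself is unknown the Fréchet-Hoeffding bounds bracket any choice. The marginal bounds and the copula below are arbitrary illustrative assumptions, not taken from the paper.

    ```python
    import numpy as np
    from scipy import stats

    # Marginal p-boxes: (lower CDF, upper CDF) for X and Y, the lower CDF being
    # the one that is pointwise smaller (arbitrary illustrative choices).
    FX = (stats.norm(0.5, 1.0).cdf, stats.norm(-0.5, 1.0).cdf)     # (F_low, F_high)
    FY = (stats.expon(scale=2.0).cdf, stats.expon(scale=1.0).cdf)  # (G_low, G_high)

    # A fixed copula; the product (independence) copula is used for simplicity.
    def C(u, v):
        return u * v

    def joint_bounds(x, y):
        """Bounds on the joint CDF H(x, y) = C(F(x), G(y)) when F and G lie in the p-boxes."""
        # A copula is non-decreasing in each argument, so marginal bounds map to joint bounds.
        return C(FX[0](x), FY[0](y)), C(FX[1](x), FY[1](y))

    # If the copula is only known to lie between bounds, the Fréchet-Hoeffding bounds
    # W(u, v) = max(u + v - 1, 0) and M(u, v) = min(u, v) bracket any copula.
    def W(u, v):
        return np.maximum(u + v - 1.0, 0.0)

    def M(u, v):
        return np.minimum(u, v)

    print(joint_bounds(0.0, 1.5))
    ```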

    Sensitivity analysis of parametric uncertainties in geotechnical hazard assessments

    Epistemic uncertainty can be reduced via additional laboratory or in situ measurements or additional numerical simulations. We focus here on parameter uncertainty: the incomplete knowledge of the correct setting of the input parameters (such as values of soil properties) of the model supporting the geo-hazard assessment. A possible option to manage it is sensitivity analysis, which aims at identifying the contribution (i.e. the importance) of the different input parameters to the uncertainty in the final hazard outcome. For this purpose, advanced techniques exist, namely variance-based global sensitivity analysis. Yet, their practical implementation faces three major limitations related to the specificities of the geo-hazard domain: 1. the large computational cost (several hours if not days) of numerical models; 2. the parameters are complex functions of time and space; 3. the data are often scarce, limited, if not vague. In the present PhD thesis, statistical approaches were developed, tested and adapted to overcome those limits. Special attention was paid to testing the feasibility of those statistical tools by confronting them with real cases (natural hazards related to earthquakes, cavities and landslides).
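
    The variance-based analysis mentioned above can be illustrated with a pick-freeze (Saltelli-type) estimator of first-order Sobol' indices on a cheap test function; a real geo-hazard model would typically require a surrogate because of the run-time constraint noted in the abstract. The test function and sample sizes below are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Cheap stand-in for the geotechnical model (hypothetical test function):
    # the output depends strongly on x1, weakly on x2, and not at all on x3.
    def model(x):
        return 2.0 * x[:, 0] + 0.5 * x[:, 1] ** 2 + 0.0 * x[:, 2]

    n, d = 20000, 3
    A = rng.uniform(size=(n, d))
    B = rng.uniform(size=(n, d))
    fA, fB = model(A), model(B)
    total_var = np.var(np.concatenate([fA, fB]))

    first_order = []
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                  # replace only the i-th column ("pick-freeze")
        fABi = model(ABi)
        # A common estimator of the first-order Sobol' index S_i.
        first_order.append(np.mean(fB * (fABi - fA)) / total_var)

    print(dict(zip(["x1", "x2", "x3"], np.round(first_order, 3))))
    ```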