3 research outputs found

    New statistical methodology for second-level global sensitivity analysis of numerical simulators

    Global sensitivity analysis (GSA) of numerical simulators aims at studying the global impact of input uncertainties on the output. To perform GSA, statistical tools based on input–output dependence measures are commonly used. We focus here on dependence measures based on reproducing kernel Hilbert spaces, namely the Hilbert-Schmidt Independence Criterion (HSIC). Sometimes, the probability distributions modeling the uncertainty of the inputs may themselves be uncertain, and it is important to quantify the global impact of this uncertainty on GSA results. We refer to this as second-level global sensitivity analysis (GSA2). However, GSA2, when performed with a double Monte Carlo loop, requires a large number of model evaluations, which is intractable for CPU-time-expensive simulators. To cope with this limitation, we propose a new statistical methodology based on a single Monte Carlo loop with a limited calculation budget. First, we build a unique sample of inputs from a well-chosen probability distribution and compute the associated code outputs. From this input–output sample, we perform GSA for various assumed probability distributions of the inputs by using weighted estimators of the HSIC measures. Statistical properties of these weighted estimators are demonstrated. We then define second-level HSIC-based measures between the probability distributions of the inputs and the GSA results, which constitute the GSA2 indices. The efficiency of our GSA2 methodology is illustrated on an analytical example, comparing several technical options. Finally, an application to a test case simulating a severe accidental scenario in a nuclear reactor is provided.
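The HSIC dependence measure central to this abstract can be sketched with a minimal estimator. This is a toy illustration, not the authors' implementation: the RBF kernel, the standard-deviation bandwidth rule, and the biased V-statistic form are all assumed choices.

```python
import numpy as np

def rbf_gram(x, bw):
    """Gaussian (RBF) Gram matrix for a 1-D sample."""
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2.0 * bw ** 2))

def hsic(x, y):
    """Biased V-statistic HSIC estimator: (1/n^2) tr(K H L H),
    where H = I - (1/n) 11^T is the centering matrix."""
    n = x.shape[0]
    K = rbf_gram(x, x.std() + 1e-12)   # bandwidth = sample std (one common choice)
    L = rbf_gram(y, y.std() + 1e-12)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / n ** 2

rng = np.random.default_rng(0)
x = rng.normal(size=500)
y_dep = x + 0.1 * rng.normal(size=500)   # output strongly dependent on x
y_ind = rng.normal(size=500)             # output independent of x
# hsic(x, y_dep) should be markedly larger than hsic(x, y_ind)
```

A larger HSIC value indicates stronger input–output dependence; in a GSA setting the values for each input are typically normalized or rank-compared to screen influential variables.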

    Methodology based on HSIC dependence measures for second-level sensitivity analysis

    We are interested in the sensitivity analysis of numerical simulators in the case where the probability distributions of the uncertain input variables are themselves (even partially) unknown. The objective is to quantify the impact of these uncertainties on the sensitivity analysis results. To achieve this, we propose a single-Monte-Carlo-loop methodology based on Hilbert-Schmidt dependence measures and importance sampling techniques. This approach significantly limits the number of simulator evaluations. A numerical application is provided to illustrate the whole methodology while comparing its different options.
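The single-loop idea described above can be sketched as follows: draw one input sample from a wide reference density, run the simulator once per point, then re-weight the HSIC estimator by likelihood ratios to evaluate sensitivity under other candidate input distributions without new runs. All distribution choices and the toy "simulator" below are illustrative assumptions, not the authors' code.

```python
import numpy as np

def rbf_gram(x, bw):
    """Gaussian Gram matrix for a 1-D sample."""
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2.0 * bw ** 2))

def weighted_hsic(x, y, w):
    """Importance-weighted (V-statistic) HSIC estimator.

    The sample was drawn from a reference density f0; weights
    w_i = f1(x_i) / f0(x_i) retarget the estimate to an assumed
    input density f1 without re-running the simulator."""
    p = w / w.sum()                          # normalised importance weights
    K = rbf_gram(x, x.std() + 1e-12)
    L = rbf_gram(y, y.std() + 1e-12)
    def center(M):                           # weighted double-centering
        m = M @ p
        return M - m[:, None] - m[None, :] + p @ m
    Kc, Lc = center(K), center(L)
    return p @ (Kc * Lc) @ p

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# One sample from a wide reference law N(0, 2^2); toy "simulator" y = x^2.
rng = np.random.default_rng(1)
x = rng.normal(0.0, 2.0, size=800)
y = x ** 2
y_ind = rng.normal(size=800)             # independent output, for contrast

# Re-weight the same sample to an assumed input law N(0, 1^2).
w = normal_pdf(x, 0.0, 1.0) / normal_pdf(x, 0.0, 2.0)
```

With uniform weights this reduces to the standard biased HSIC estimator; the spread of the weights (e.g. their effective sample size) controls the accuracy of the re-targeted estimate.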

    Advanced Studies and Statistical Treatment for Sodium-Cooled Fast Reactor Pin Failures During Unprotected Transient Overpower Accident

    Usually, simulation tools are validated on experimental data by considering a best-estimate simulation case, and this validation is not quantified: it remains based on rough expert judgment. This paper presents an advanced validation treatment of the simulation tool OCARINa, devoted to Unprotected Transient OverPower (UTOP) accidents, on two CABRI tests, this time considering a Best Estimate Plus Uncertainties (BEPU) approach. The output results of interest are both scalar physical quantities, such as the time and location of the pin failure and the associated molten mass, and vector data, such as the axial temperature distribution or the temperature evolution versus time. This approach is a first step toward quantifying the degree of agreement between the calculation results and the experimental results. It is of great interest for the VV&UQ (Verification, Validation and Uncertainty Quantification) approach, which leads to the qualification of scientific calculation tools.
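One elementary way to quantify calculation–experiment agreement in a BEPU setting is a coverage check: propagate input uncertainties through an ensemble of simulations, form a pointwise uncertainty band, and report the fraction of experimental points inside it. The sketch below uses entirely synthetic data (not the OCARINa/CABRI results) and a 95% percentile band as an assumed convention.

```python
import numpy as np

# Toy BEPU-style agreement check on a time-dependent output.
rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 50)

# Ensemble of 200 simulations under input uncertainty (synthetic stand-in).
ensemble = np.sin(2.0 * np.pi * t)[None, :] + 0.1 * rng.normal(size=(200, 50))

# Pointwise 95% uncertainty band from the ensemble.
lo, hi = np.percentile(ensemble, [2.5, 97.5], axis=0)

# Synthetic "experimental" measurements with their own noise.
exp_data = np.sin(2.0 * np.pi * t) + 0.05 * rng.normal(size=50)

# Fraction of experimental points falling inside the simulation band.
coverage = np.mean((exp_data >= lo) & (exp_data <= hi))
```

A coverage close to the nominal band level suggests the simulated uncertainty is consistent with the experiment; systematically low coverage flags model bias or underestimated input uncertainty.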