53 research outputs found

    Comparison of bootstrapped artificial neural networks and quadratic response surfaces for the estimation of the functional failure probability of a thermal-hydraulic passive system

    In this work, bootstrapped artificial neural network (ANN) and quadratic response surface (RS) empirical regression models are used as fast-running surrogates of a thermal-hydraulic (T-H) system code, to reduce the computational burden associated with estimating the functional failure probability of a T-H passive system. The ANN and quadratic RS models are built on a few data points representative of the input/output nonlinear relationships underlying the T-H code. Once built, these models are used to perform, in reasonable computational time, the numerous system response calculations required for failure probability estimation. A bootstrap of the regression models is implemented to quantify, in terms of confidence intervals, the uncertainties associated with the estimates provided by the ANNs and RSs. The alternative empirical models are compared on a case study of an emergency passive decay heat removal system of a gas-cooled fast reactor (GFR).
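    The bootstrap idea described above can be sketched in a few lines: resample the training set, refit the surrogate, and re-estimate the failure probability each time. This is a minimal illustration only; the cheap analytic `th_code` response, the linear least-squares "surrogate" standing in for the ANN/RS, the threshold, and all numbers are assumptions of the sketch, not values from the paper.

```python
import random
import statistics

random.seed(42)

# Hypothetical stand-in for the long-running T-H code (assumed, for illustration).
def th_code(x):
    return 2.0 * x + 0.5 * x * x

def fit_linear(xs, ys):
    """Closed-form least-squares line: the cheap 'surrogate' of this sketch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return lambda x: a + b * x

THRESHOLD = 5.0            # assumed safety threshold on the response
N_TRAIN, B, N_MC = 30, 200, 2000

# A small training set from the expensive code: the only costly step.
x_train = [random.uniform(0.0, 2.0) for _ in range(N_TRAIN)]
y_train = [th_code(x) for x in x_train]

# Bootstrap loop: refit the surrogate on resampled data, re-estimate P(failure).
x_mc = [random.uniform(0.0, 2.0) for _ in range(N_MC)]   # fixed MC inputs
estimates = []
for _ in range(B):
    idx = [random.randrange(N_TRAIN) for _ in range(N_TRAIN)]
    model = fit_linear([x_train[i] for i in idx], [y_train[i] for i in idx])
    estimates.append(sum(model(x) > THRESHOLD for x in x_mc) / N_MC)

estimates.sort()
point = statistics.mean(estimates)
lo, hi = estimates[int(0.025 * B)], estimates[int(0.975 * B) - 1]
print(f"P(failure) ~ {point:.3f}, 95% bootstrap CI [{lo:.3f}, {hi:.3f}]")
```

The percentile interval over the B refitted surrogates is what quantifies the model uncertainty; with a real ANN, only `fit_linear` would change.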

    How to effectively compute the reliability of a thermal-hydraulic nuclear passive system

    The reliability of a thermal-hydraulic (T-H) passive system of a nuclear power plant can be computed by (i) Monte Carlo (MC) sampling of the uncertainties of the system model and parameters, (ii) computing, for each sample, the system response by a mechanistic T-H code and (iii) comparing the system response with pre-established safety thresholds, which define the success or failure of the safety function. The computational effort involved can be prohibitive because of the large number of (typically long) T-H code simulations that must be performed (one for each sample) for the statistical estimation of the probability of success or failure. The objective of this work is to provide operative guidelines for effectively handling the computation of the reliability of a nuclear passive system. Two directions of computational efficiency are considered: on one side, efficient Monte Carlo Simulation (MCS) techniques are indicated as a means of performing robust estimations with a limited number of samples; in particular, the Subset Simulation (SS) and Line Sampling (LS) methods are identified as the most valuable. On the other side, fast-running surrogate regression models (also called response surfaces or meta-models) are indicated as a valid replacement for the long-running T-H model codes; in particular, the use of bootstrapped Artificial Neural Networks (ANNs) is shown to have interesting potential, including for uncertainty propagation. The recommendations drawn are supported by the results obtained in an illustrative application from the literature.
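    The three-step procedure (i)-(iii) can be sketched with crude Monte Carlo. The "peak cladding temperature" response, the input distributions and the safety limit below are invented for illustration; they are not the paper's case study, and a real application would replace `th_response` with the mechanistic T-H code (or a surrogate of it).

```python
import math
import random

random.seed(0)

# (ii) A cheap analytic stand-in for the mechanistic T-H code (assumed).
def th_response(power, pressure):
    """Hypothetical peak cladding temperature [K]."""
    return 600.0 + 0.08 * power + 25.0 * math.sqrt(pressure)

T_LIMIT = 1300.0    # (iii) assumed safety threshold [K]
N = 10_000

failures = 0
for _ in range(N):
    # (i) Sample the uncertain inputs (assumed distributions).
    power = random.gauss(5000.0, 400.0)    # power [kW]
    pressure = random.gauss(70.0, 5.0)     # pressure [bar]
    if th_response(power, pressure) > T_LIMIT:
        failures += 1

p_hat = failures / N
# Standard error of the crude MC estimator: sqrt(p(1-p)/N).
se = math.sqrt(p_hat * (1.0 - p_hat) / N)
print(f"failure probability ~ {p_hat:.4f} +/- {1.96 * se:.4f}")
```

For small failure probabilities the standard error shows why crude MC is expensive, which is exactly the motivation for the Subset Simulation and Line Sampling variants recommended in the paper.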

    Quantitative functional failure analysis of a thermal-hydraulic passive system by means of bootstrapped Artificial Neural Networks

    The functional failure probability of a thermal-hydraulic (T-H) passive system can be estimated by Monte Carlo (MC) sampling of the epistemic uncertainties affecting the system model and the numerical values of its parameters, followed by the computation of the system response by a mechanistic T-H code for each sample. The computational effort associated with this approach can be prohibitive because a large number of lengthy T-H code simulations must be performed (one for each sample) for accurate quantification of the functional failure probability and the related statistics. In this paper, the computational burden is reduced by replacing the long-running, original T-H code with a fast-running, empirical regression model: in particular, an Artificial Neural Network (ANN) model is considered. It is constructed on the basis of a limited-size set of data representing examples of the input/output nonlinear relationships underlying the original T-H code; once built, the model is used to perform, in an acceptable computational time, the numerous system response calculations needed for accurate failure probability estimation, uncertainty propagation and sensitivity analysis. The empirical approximation of the system response provided by the ANN model introduces an additional source of (model) uncertainty, which needs to be evaluated and accounted for. A bootstrapped ensemble of ANN regression models is built here to quantify, in terms of confidence intervals, the (model) uncertainties associated with the estimates provided by the ANNs. For demonstration purposes, an application to the functional failure analysis of an emergency passive decay heat removal system in a simple steady-state model of a Gas-cooled Fast Reactor (GFR) is presented. The functional failure probability of the system is estimated together with global Sobol sensitivity indices. The bootstrapped ANN regression model, built with low computational time on few (e.g., 100) data examples, is shown to be capable of providing point estimates that are reliable (very near the true values of the quantities of interest) and robust (the confidence intervals are satisfactorily narrow around the true values).
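    The global Sobol sensitivity indices mentioned above can be estimated with a pick-freeze Monte Carlo scheme once a fast surrogate is available. The toy additive model below is an assumption of this sketch (chosen because its exact first-order indices are S1 = 0.9 and S2 = 0.1), not the GFR case study.

```python
import random

random.seed(1)

# Assumed toy model: Y = 3*X1 + X2 with X1, X2 ~ U(0,1).
# Var(Y) = 9/12 + 1/12, so exactly S1 = 0.9 and S2 = 0.1.
def model(x1, x2):
    return 3.0 * x1 + x2

N = 20_000
A = [(random.random(), random.random()) for _ in range(N)]
Bm = [(random.random(), random.random()) for _ in range(N)]

yA = [model(*a) for a in A]
mean_A = sum(yA) / N
var_A = sum((y - mean_A) ** 2 for y in yA) / N

def first_order(i):
    """Pick-freeze estimator: the B-sample with column i 'frozen' from A."""
    yABi = [model(a[0] if i == 0 else b[0], a[1] if i == 1 else b[1])
            for a, b in zip(A, Bm)]
    cov = sum(ya * yab for ya, yab in zip(yA, yABi)) / N \
        - mean_A * (sum(yABi) / N)
    return cov / var_A

S1, S2 = first_order(0), first_order(1)
print(f"S1 ~ {S1:.3f} (exact 0.9), S2 ~ {S2:.3f} (exact 0.1)")
```

The cost is (d + 2) * N model evaluations for d inputs, which is affordable only because each evaluation hits the surrogate rather than the T-H code.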

    Uncertainty and sensitivity analysis for long-running computer codes : a critical review

    Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Nuclear Science and Engineering, 2010. "February 2010." Cataloged from PDF version of thesis. Includes bibliographical references (p. 137-146). This thesis presents a critical review of existing methods for performing probabilistic uncertainty and sensitivity analysis for complex, computationally expensive simulation models. Uncertainty analysis (UA) methods reviewed include standard Monte Carlo simulation, Latin Hypercube sampling, importance sampling, line sampling, and subset simulation. Sensitivity analysis (SA) methods include scatter plots, Monte Carlo filtering, regression analysis, variance-based methods (Sobol' sensitivity indices and Sobol' Monte Carlo algorithms), and Fourier amplitude sensitivity tests. In addition, this thesis reviews several existing metamodeling techniques that are intended to provide quick-running approximations to the computer models being studied. Because stochastic simulation-based UA and SA rely on a large number (e.g., several thousands) of simulations, metamodels are recognized as a necessary compromise when UA and SA must be performed with long-running (i.e., several hours or days per simulation) computational models. This thesis discusses the use of polynomial Response Surfaces (RS), Artificial Neural Networks (ANN), and Kriging/Gaussian Processes (GP) for metamodeling. Moreover, two methods are discussed for estimating the uncertainty introduced by the metamodel. The first of these methods is based on a bootstrap sampling procedure and can be utilized with any metamodeling technique. The second method is specific to GP models and is based on a Bayesian interpretation of the underlying stochastic process. Finally, to demonstrate the use of these methods, the results from two case studies involving the reliability assessment of passive nuclear safety systems are presented.
    The general conclusions of this work are that polynomial RSs are frequently incapable of adequately representing the complex input/output behavior exhibited by many mechanistic models. In addition, the goodness-of-fit of the RS should not be misinterpreted as a measure of the predictive capability of the metamodel, since RSs are necessarily biased predictors for deterministic computer models. Furthermore, the extent of this bias is not measured by standard goodness-of-fit metrics (e.g., the coefficient of determination, R²), so these methods tend to provide overly optimistic indications of the quality of the metamodel. The bootstrap procedure does provide an indication of the extent of this bias, with the bootstrap confidence intervals for the RS estimates generally being significantly wider than those of the alternative metamodeling methods. It has been found that the added flexibility afforded by ANNs and GPs can make these methods superior for approximating complex models. In addition, GPs are exact interpolators, which is an important feature when the underlying computer model is deterministic (i.e., when there is no justification for including a random error component in the metamodel). On the other hand, when the number of observations from the computer model is sufficiently large, all three methods appear to perform comparably, indicating that in such cases RSs can still provide useful approximations. By Dustin R. Langewisch. S.M.

    INTEGRATED DETERMINISTIC AND PROBABILISTIC SAFETY ANALYSIS: CONCEPTS, CHALLENGES, RESEARCH DIRECTIONS

    Integrated deterministic and probabilistic safety analysis (IDPSA) is conceived as a way to analyze the evolution of accident scenarios in complex dynamic systems, such as nuclear, aerospace and process systems, accounting for the mutual interactions between the failure and recovery of system components, the evolving physical processes, the control and operator actions, and the software and firmware. In spite of the potential offered by IDPSA, several challenges need to be effectively addressed for its development and practical deployment. In this paper, we give an overview of these challenges and discuss the related implications in terms of research perspectives.

    Robust artificial neural network for reliability and sensitivity analyses of complex non-linear systems

    Artificial Neural Networks (ANNs) are commonly used in place of expensive models to reduce the computational burden required for uncertainty quantification, reliability and sensitivity analysis. An ANN with a selected architecture is trained with the back-propagation algorithm on a few data points representative of the input/output relationship of the underlying model of interest. However, ANNs with different performance may be obtained from the same training data as a result of the random initialization of the weight parameters in each network, leading to uncertainty in selecting the best performing ANN. On the other hand, using cross-validation to select the ANN with the highest R² value can lead to bias in the prediction, because R² cannot determine whether the prediction made by an ANN is biased. Additionally, R² does not indicate whether a model is adequate, as it is possible to have a low R² for a good model and a high R² for a bad model. Hence, in this paper, we propose an approach to improve the robustness of predictions made by ANNs. The approach is based on a systematic combination of identically trained ANNs, coupling the Bayesian framework with model averaging. Additionally, the uncertainties of the robust prediction derived from the approach are quantified in terms of confidence intervals. To demonstrate the applicability of the proposed approach, two synthetic numerical examples are presented. Finally, the proposed approach is used to perform a reliability and sensitivity analysis on a process simulation model of a UK nuclear effluent treatment plant developed by the National Nuclear Laboratory (NNL), treated in this study as a black box employing a set of training data. This model has been extensively validated against plant and experimental data and used to support the UK effluent discharge strategy.
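    The combination step can be sketched with likelihood-based weights, a crude stand-in for the Bayesian model averaging named above. Everything here is an assumption of the sketch: the "identically trained networks" are replaced by simple linear models whose slopes are randomly perturbed, mimicking the spread caused by random weight initialization, and the weighting rule is illustrative rather than the paper's exact posterior.

```python
import math
import random

random.seed(7)

# Assumed true input/output relationship.
def true_model(x):
    return 1.5 * x

# Small validation set (assumed).
x_val = [random.uniform(0.0, 1.0) for _ in range(20)]
y_val = [true_model(x) for x in x_val]

# K "identically trained" members: perturbed slopes stand in for networks
# that differ only through random weight initialization.
K = 10
slopes = [1.5 + random.gauss(0.0, 0.2) for _ in range(K)]

# Weight each member by exp(-SSE/2) on the validation set, then normalize:
# a Bayesian-style weighting, not the paper's exact formulation.
sses = [sum((y - s * x) ** 2 for x, y in zip(x_val, y_val)) for s in slopes]
raw = [math.exp(-0.5 * sse) for sse in sses]
weights = [r / sum(raw) for r in raw]

# Robust prediction at a new input: weighted average of the members,
# with a weighted spread as an uncertainty indicator.
x_new = 0.8
preds = [s * x_new for s in slopes]
robust = sum(w * p for w, p in zip(weights, preds))
spread = math.sqrt(sum(w * (p - robust) ** 2 for w, p in zip(weights, preds)))
print(f"ensemble prediction at x=0.8: {robust:.3f} +/- {spread:.3f}")
```

Because the combined prediction is a convex combination of the members, it can never be worse than the worst member at a point, and the weighted spread gives the confidence measure that a single cross-validated ANN cannot.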

    Large space structures and systems in the space station era: A bibliography with indexes (supplement 04)

    Bibliographies and abstracts are listed for 1211 reports, articles, and other documents introduced into the NASA scientific and technical information system between 1 Jul. and 30 Dec. 1991. The purpose is to provide helpful information to the researcher, manager, and designer in technology development and mission design in the areas of systems, interactive analysis and design, structural concepts and control systems, electronics, advanced materials, assembly concepts, propulsion, and solar power satellite systems.

    Mining Safety and Sustainability I

    Safety and sustainability are becoming ever bigger challenges for the mining industry as mining goes deeper. It is of great significance to reduce the risk of mining accidents, enhance the safety of mining operations, and improve the efficiency and sustainability of mineral resource development. This book provides a platform to present new research and recent advances in the safety and sustainability of mining. More specifically, Mining Safety and Sustainability presents recent theoretical and experimental studies with a focus on safe mining, green mining, intelligent mining and mines, sustainable development, risk management of mines, ecological restoration of mines, mining methods and technologies, and damage monitoring and prediction. It should further provide theoretical and technical support for guiding the normative, green, safe, and sustainable development of the mining industry.

    ESSE 2017. Proceedings of the International Conference on Environmental Science and Sustainable Energy

    Environmental science is an interdisciplinary academic field that integrates the physical, biological, and information sciences to study and solve environmental problems. ESSE - The International Conference on Environmental Science and Sustainable Energy provides a platform for experts, professionals, and researchers to share updated information and stimulate communication with each other. The 2017 edition was held in Suzhou, China, on June 23-25, 2017.