
    Development of a New Monte Carlo Code for High-Fidelity Power Reactor Analysis

    High-fidelity multiphysics simulation using transport codes is becoming mainstream in the reactor physics community. The Monte Carlo neutron transport method is one of the most promising candidates for the neutronics component of multiphysics simulation, since it uses continuous-energy cross sections and explicit geometry modeling. Most of the methods required for Monte Carlo multiphysics simulation have been developed and studied individually. However, the Monte Carlo method has not yet been applied to large-scale multiphysics simulation such as pressurized water reactor (PWR) analysis because of limited computing power, limited memory storage and, especially, the lack of Monte Carlo codes adapted for large-scale power reactor simulation. Developing a Monte Carlo multiphysics code is challenging in two respects: implementing various state-of-the-art techniques in a single code system, and making simulations feasible on practical computing machines. A new Monte Carlo multiphysics code named MCS was developed for large-scale power reactor analysis. Various state-of-the-art techniques were implemented to make it a practical tool for multiphysics simulation, including thermal hydraulics, depletion, equilibrium xenon, eigenvalue search, on-the-fly cross section generation, hash indexing, and a parallel fission bank. The high performance of MCS was achieved and demonstrated: test results confirmed that the overhead of a massive number of tallies is only about one percent up to 13 million tally bins, and that parallel efficiency remains above 90% up to 1,120 processors when solving power reactor problems. A fundamental study of power reactor analysis was performed to determine calculation conditions, including burnup, mesh, and history sensitivities, against the pressurized water reactor benchmark problem BEAVRS Cycle 1. Finally, the power reactor analysis capability was demonstrated against BEAVRS Cycles 1 and 2.
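
    A minimal sketch of the fission-bank mechanism behind the k-eigenvalue power iteration described above; it is not the MCS implementation, and the one-group cross sections, slab geometry, and batch sizes are invented for illustration only.

# Minimal serial sketch: power iteration with a fission bank for a
# mono-energetic, homogeneous slab (hypothetical constants, not MCS).
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical one-group macroscopic cross sections (per cm)
SIG_T, SIG_A, SIG_F, NU = 1.0, 0.4, 0.15, 2.43
SLAB_WIDTH = 50.0          # cm, vacuum boundaries on both sides

def run_cycle(sites, n_hist):
    """Track n_hist neutrons started from 'sites'; return new fission sites."""
    new_sites = []
    starts = rng.choice(sites, size=n_hist)
    for x in starts:
        while True:
            # Isotropic direction (1-D cosine) and free flight to next collision
            mu = rng.uniform(-1.0, 1.0)
            x = x + mu * rng.exponential(1.0 / SIG_T)
            if x < 0.0 or x > SLAB_WIDTH:        # leaked out of the slab
                break
            if rng.random() < SIG_A / SIG_T:     # absorbed at the collision
                # Bank the expected number of fission neutrons at this site
                n_fiss = int(NU * SIG_F / SIG_A + rng.random())
                new_sites += [x] * n_fiss
                break
            # otherwise scattered: pick a new direction and keep flying
    return new_sites

sites = list(rng.uniform(0, SLAB_WIDTH, 1000))   # initial source guess
for cycle in range(30):
    bank = run_cycle(sites, n_hist=5000)
    k_est = len(bank) / 5000                     # fission neutrons per source neutron
    sites = bank if bank else sites
    print(f"cycle {cycle:2d}  k ~ {k_est:.4f}")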

    Risk Analysis and Reliability Improvement of Mechanistic-Empirical Pavement Design

    Reliability as used in the Mechanistic-Empirical Pavement Design Guide (MEPDG) is an aggregated indicator, defined as the probability that each of the key distress types and smoothness will be less than a selected critical level over the design period. For a system as complex as the MEPDG, which has no closed-form design equations, classical reliability methods are not applicable; a robust reliability analysis can instead rely on Monte Carlo simulation (MCS). The ultimate goal of this study was to improve the reliability model of the MEPDG using surrogate modeling techniques and Monte Carlo simulation. To achieve this goal, four tasks were accomplished. First, local calibration using 38 pavement sections was completed to reduce the bias and dispersion of the nationally calibrated MEPDG. Second, uncertainty and risk in the MEPDG were identified using Hierarchical Holographic Modeling (HHM); to determine the critical factors affecting pavement performance, this study applied not only traditional sensitivity analysis but also a risk assessment method based on the Analytic Hierarchy Process (AHP). Third, response surface models were built to provide rapid distress predictions for alligator cracking, rutting, and smoothness. Fourth, a new reliability model based on Monte Carlo simulation was proposed: using the surrogate models, 10,000 Monte Carlo simulations were completed in minutes to develop the output ensemble, from which the predicted distresses at any reliability level are readily available. The method, including all data and algorithms, was packaged in a user-friendly software tool named ReliME. A comparison between the AASHTO 1993 Guide, the MEPDG, and ReliME was presented in three case studies. It was found that the smoothness model in the MEPDG had an extremely high level of variation. The product of this study is a consistent reliability model specific to local conditions, construction practices, and specifications. The framework also demonstrates the feasibility of adopting Monte Carlo simulation for reliability analysis in future mechanistic-empirical pavement design software.
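
    A hedged sketch of the surrogate-plus-Monte-Carlo reliability idea, not the ReliME code: fit a cheap response surface to a handful of expensive "design runs", push 10,000 random input draws through the surrogate, and read the distress at a chosen reliability level as a percentile of the output ensemble. The stand-in distress function and input distributions below are invented.

# Surrogate-based Monte Carlo reliability sketch (invented distress model)
import numpy as np

rng = np.random.default_rng(1)

def expensive_model(x):
    # Stand-in for an MEPDG distress prediction (e.g., rutting in inches)
    thickness, traffic = x[..., 0], x[..., 1]
    return 0.6 - 0.03 * thickness + 0.002 * traffic + 0.0005 * traffic**2

# 1) A small number of design runs of the "expensive" model
X = rng.uniform([6, 5], [14, 30], size=(30, 2))     # thickness (in), traffic (million ESALs)
y = expensive_model(X)

# 2) Quadratic response surface fitted by least squares
def features(X):
    t, q = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(t), t, q, t * q, t**2, q**2])

coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)

# 3) 10,000-sample Monte Carlo ensemble pushed through the surrogate
Xmc = np.column_stack([rng.normal(10, 1.0, 10_000),   # as-built thickness
                       rng.normal(18, 4.0, 10_000)])  # traffic forecast
rut = features(Xmc) @ coef

# 4) Distress at a 90% reliability level = 90th percentile of the ensemble
print("rutting @ 90% reliability:", round(np.percentile(rut, 90), 3))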

    A Robust Study of Regression Methods for Crop Yield Data

    The objective of this study is to evaluate robust regression methods for detrending crop yield data. Using a Monte Carlo simulation, the performance of the proposed Time-Varying Beta method is compared with the OLS, M-estimator, and MM-estimator approaches of previous studies in an application to crop yield modeling. We analyze the properties of these estimators for outlier-contaminated data in both the symmetric and the skewed distribution cases. The application of these estimation methods is illustrated with an agricultural insurance analysis. A more accurate detrending method offers the potential to improve the accuracy of the models used in rating crop insurance contracts.
    Keywords: Research Methods/Statistical Methods, Risk and Uncertainty
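
    A hedged sketch of the kind of Monte Carlo comparison described above, not the paper's code: detrend a synthetic yield series with OLS and with a Huber M-estimator (fitted by iteratively reweighted least squares) after contaminating a few years with large negative outliers. The trend, noise levels, and contamination scheme are invented.

# OLS vs. Huber M-estimator detrending under outlier contamination
import numpy as np

rng = np.random.default_rng(2)

def huber_irls(X, y, c=1.345, n_iter=50):
    """Huber M-estimator via iteratively reweighted least squares."""
    beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)      # OLS starting point
    for _ in range(n_iter):
        r = y - X @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745   # robust scale (MAD)
        u = r / (c * s)
        w = np.where(np.abs(u) <= 1.0, 1.0, 1.0 / np.abs(u))  # Huber weights
        beta, _, _, _ = np.linalg.lstsq(np.sqrt(w)[:, None] * X, np.sqrt(w) * y, rcond=None)
    return beta

n_reps, n_years = 500, 30
t = np.arange(n_years, dtype=float)
X = np.column_stack([np.ones(n_years), t])
true_slope = 1.5                                   # bu/acre per year (invented)

err_ols, err_m = [], []
for _ in range(n_reps):
    y = 80 + true_slope * t + rng.normal(0, 5, n_years)
    bad = rng.choice(n_years, size=3, replace=False)
    y[bad] -= rng.uniform(20, 40, size=3)          # drought-like outlier years
    err_ols.append(np.linalg.lstsq(X, y, rcond=None)[0][1] - true_slope)
    err_m.append(huber_irls(X, y)[1] - true_slope)

print("RMSE of trend slope  OLS:", round(np.sqrt(np.mean(np.square(err_ols))), 3),
      " Huber-M:", round(np.sqrt(np.mean(np.square(err_m))), 3))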

    Flexible shrinkage in high-dimensional Bayesian spatial autoregressive models

    This article introduces two absolutely continuous global-local shrinkage priors to enable stochastic variable selection in the context of high-dimensional matrix exponential spatial specifications. Existing approaches to dealing with overparameterization in spatial autoregressive specifications typically rely on computationally demanding Bayesian model-averaging techniques. The proposed shrinkage priors can be implemented with Markov chain Monte Carlo methods in a flexible and efficient way. A simulation study is conducted to evaluate the performance of each shrinkage prior. The results suggest that they perform particularly well in high-dimensional environments, especially when the number of parameters to estimate exceeds the number of observations. For an empirical illustration we use pan-European regional economic growth data.
    Keywords: Matrix exponential spatial specification, model selection, shrinkage priors, hierarchical modeling; JEL: C11, C21, C5
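
    A hedged sketch of the matrix exponential spatial specification (MESS) itself, S(alpha) y = X beta + eps with S(alpha) = expm(alpha W), not the paper's Bayesian shrinkage sampler. Because diag(W) = 0, det(S) = 1 and the Gaussian likelihood concentrates into a least-squares criterion over alpha alone; the weight matrix and parameter values below are invented.

# MESS data generation and concentrated least-squares recovery (toy example)
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(3)
n, k = 200, 3

# Row-normalized nearest-neighbour weights on a ring of n regions
W = np.zeros((n, n))
for i in range(n):
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 0.5

X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
beta_true, alpha_true = np.array([1.0, 0.5, -0.3]), -0.7
y = np.linalg.solve(expm(alpha_true * W), X @ beta_true + rng.normal(0, 0.5, n))

def sse(alpha):
    # For fixed alpha, beta_hat is plain OLS on the transformed response S(alpha) y
    Sy = expm(alpha * W) @ y
    beta_hat, *_ = np.linalg.lstsq(X, Sy, rcond=None)
    return np.sum((Sy - X @ beta_hat) ** 2), beta_hat

grid = np.linspace(-2, 2, 81)
best_alpha = min(grid, key=lambda a: sse(a)[0])
print("alpha_hat:", round(best_alpha, 2), " beta_hat:", np.round(sse(best_alpha)[1], 2))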

    Sensor Analysis, Modeling, and Test for Robust Propulsion System Autonomy

    An approach is presented supporting analysis, modeling, and test validation of the operational flight instrumentation (OFI) that facilitates critical functions for the Space Launch System (SLS) main propulsion system (MPS). Certain types of OFI sensors were shown to exhibit highly nonlinear and non-Gaussian noise characteristics during acceptance testing, motivating the development of advanced modeling and simulation (M&S) capability to support algorithm verification and flight certification. Hardware model and algorithm simulation fidelity was informed by a risk-scoring metric; redesign of high-risk algorithms using test-validated sensor models significantly improved their expected performance, as evaluated using Monte Carlo acceptance sampling methods. Autonomous functions include closed-loop ullage pressure regulation, pressurant leak detection, and fault isolation for automated safing and crew caution and warning (C&W).
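
    A hedged sketch of Monte Carlo acceptance sampling against a requirement, not the SLS/MPS flight algorithms: a simple leak-detection rule is exercised against a heavy-tailed (non-Gaussian) sensor-noise model many times, and the estimated detection probability is checked against a required level with a confidence bound. The noise model, detection rule, thresholds, and requirement are all invented.

# Monte Carlo acceptance sampling of a toy leak-detection rule
import numpy as np

rng = np.random.default_rng(4)

def pressure_trace(leak, n=600):
    """Simulated ullage pressure (psi) with Student-t (non-Gaussian) sensor noise."""
    t = np.arange(n)
    drift = -0.02 * t if leak else 0.0 * t
    return 300.0 + drift + 1.5 * rng.standard_t(df=3, size=n)

def detects_leak(p, window=100, threshold=-0.01):
    """Flag a leak if the fitted slope over any sliding window is too negative."""
    for start in range(0, len(p) - window, window // 2):
        slope = np.polyfit(np.arange(window), p[start:start + window], 1)[0]
        if slope < threshold:
            return True
    return False

n_trials = 2000
hits = sum(detects_leak(pressure_trace(leak=True)) for _ in range(n_trials))
p_hat = hits / n_trials
# One-sided normal-approximation lower confidence bound (95%)
lcb = p_hat - 1.645 * np.sqrt(p_hat * (1 - p_hat) / n_trials)
print(f"detection rate {p_hat:.3f}, 95% lower bound {lcb:.3f},",
      "ACCEPT" if lcb >= 0.95 else "REJECT", "(requirement: >= 0.95)")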

    Bayesian Subset Simulation: a kriging-based subset simulation algorithm for the estimation of small probabilities of failure

    The estimation of small probabilities of failure from computer simulations is a classical problem in engineering, and the Subset Simulation algorithm proposed by Au & Beck (Prob. Eng. Mech., 2001) has become one of the most popular methods for solving it. Subset Simulation has been shown to provide significant savings in the number of simulations needed to achieve a given estimation accuracy relative to many other Monte Carlo approaches. The number of simulations remains quite high, however, and the method can be impractical for applications in which an expensive-to-evaluate computer model is involved. We propose a new algorithm, called Bayesian Subset Simulation, that takes the best from the Subset Simulation algorithm and from sequential Bayesian methods based on kriging (also known as Gaussian process modeling). The performance of the new algorithm is illustrated on a test case from the literature, with promising results. In addition, we provide a numerical study of the statistical properties of the estimator.
    Comment: 11th International Probabilistic Assessment and Management Conference (PSAM11) and the Annual European Safety and Reliability Conference (ESREL 2012), Helsinki, Finland (2012).
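
    A hedged sketch of plain Subset Simulation (Au & Beck, 2001), not the kriging-based Bayesian variant proposed above: the failure domain is reached through a sequence of nested intermediate events, each conditional level being populated by component-wise Metropolis chains. The limit-state function, dimension, and tuning constants are invented, and the true failure probability of this toy case is known analytically.

# Plain Subset Simulation for a small failure probability (toy limit state)
import numpy as np

rng = np.random.default_rng(5)

def g(x):
    """Toy limit state: failure when the sum of d standard normals exceeds b."""
    return x.sum(axis=-1)

d, b = 10, 12.0            # true p_f = 1 - Phi(b / sqrt(d)) ~ 7e-5
n_per_level, p0 = 2000, 0.1

# Level 0: crude Monte Carlo
x = rng.normal(size=(n_per_level, d))
y = g(x)
p_f, level = 1.0, 0

while True:
    # Intermediate threshold: the (1 - p0) quantile of the current responses
    thresh = np.quantile(y, 1.0 - p0)
    if thresh >= b:                       # final level reached
        p_f *= np.mean(y >= b)
        break
    p_f *= p0
    level += 1
    # Seeds for the next level: samples already beyond the intermediate threshold
    seeds = x[y >= thresh]
    n_steps = int(np.ceil(n_per_level / len(seeds)))
    xs, ys = [], []
    for x0 in seeds:
        xc = x0.copy()
        for _ in range(n_steps):
            cand = xc + 0.8 * rng.normal(size=d)
            # Component-wise accept w.p. the ratio of standard normal densities
            acc = rng.random(d) < np.exp(0.5 * (xc**2 - cand**2))
            prop = np.where(acc, cand, xc)
            if g(prop) >= thresh:         # keep the move only if still in the subset
                xc = prop
            xs.append(xc.copy())
            ys.append(g(xc))
    x, y = np.array(xs)[:n_per_level], np.array(ys)[:n_per_level]

print(f"subset simulation estimate of P(g >= {b}): {p_f:.2e}  (levels: {level})")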