
    Dynamic Bayesian Networks as a Probabilistic Metamodel for Combat Simulations

    Simulation modeling is used in many situations. Simulation metamodeling estimates a simulation model's results by representing the space of simulation model responses, and is particularly useful when the simulation model itself is not suited to real-time or near-real-time use. Most metamodeling methods provide expected-value responses, while some situations need probabilistic responses. This research establishes the viability of Dynamic Bayesian Networks (DBNs) for simulation metamodeling in those situations that need probabilistic responses. A bootstrapping method is introduced to reduce the simulation data requirements of a DBN, and experimental design is shown to benefit a DBN used to represent a multi-dimensional response space. An improved interpolation method is developed and shown to be beneficial in DBN metamodeling applications. These contributions are employed in a military modeling case study to fully demonstrate the viability of DBN metamodeling for defense analysis applications.
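
    The abstract describes the approach at a high level only. As a rough, hedged illustration of a DBN-style probabilistic metamodel, the sketch below estimates a discrete state-transition table from logged simulation trajectories and rolls it forward to return a distribution over outcomes rather than a single expected value. The state discretization, variable names, and data sizes are assumptions for illustration, not the author's implementation.

# Minimal sketch (not the author's method): a discrete DBN-style metamodel
# estimated from simulation trajectories, returning a *distribution* over the
# future state instead of a single expected value.
import numpy as np

rng = np.random.default_rng(0)
N_STATES = 4          # assumed discretization of the simulation response
N_RUNS, T = 500, 20   # hypothetical number of logged runs and time steps

# Fake "simulation" trajectories standing in for combat-simulation output.
trajectories = rng.integers(0, N_STATES, size=(N_RUNS, T))

# Estimate the transition model P(s_{t+1} | s_t) with Laplace smoothing.
counts = np.ones((N_STATES, N_STATES))          # smoothing prior
for run in trajectories:
    for s_now, s_next in zip(run[:-1], run[1:]):
        counts[s_now, s_next] += 1
transition = counts / counts.sum(axis=1, keepdims=True)

def predict_distribution(initial_state: int, horizon: int) -> np.ndarray:
    """Roll the DBN forward: return P(state at t = horizon | initial state)."""
    belief = np.zeros(N_STATES)
    belief[initial_state] = 1.0
    for _ in range(horizon):
        belief = belief @ transition
    return belief

print(predict_distribution(initial_state=0, horizon=10))

    Replacing the synthetic trajectories with real simulation output, and conditioning the transition table on scenario inputs, would be the natural next step in such a sketch.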

    Simulation Modeling to Optimize Personalized Oncology


    Simulation Optimization through Regression or Kriging Metamodels


    Primary Healthcare Delivery Network Simulation using Stochastic Metamodels

    This is the author accepted manuscript; the final version is available from IEEE via the DOI in this record. A discrete-event simulation (DES) of the network of primary health centers (PHCs) in a region can be used to evaluate the effect of changes in patient flow on operational outcomes across the network, and can also form the base simulation to which simulations of secondary and tertiary care facilities can be added. We present a DES of a network of PHCs using stochastic metamodels developed from more detailed DES models of PHCs (‘parent’ simulations), which were developed separately for comprehensive analysis of individual PHC operations. The stochastic metamodels are DESs in their own right: simplified versions of the parent simulation with full-featured representations of only those components relevant to the analysis at hand. We show that the outputs of interest from the metamodels and the parent simulations (including the network simulations) are statistically similar, and that our metamodel-based network simulation yields reductions of up to 80% in runtimes.
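
    As a hedged illustration of what a "stochastic metamodel that is a DES in its own right" might look like, the sketch below models a single, drastically simplified PHC using the SimPy library: only the doctor queue is represented, and the interarrival and consultation-time distributions are assumed to be fitted to a parent simulation's output. The distributions, parameter values, and station names are placeholders, not the paper's calibrated model.

# Sketch of a simplified, stochastic DES metamodel of one primary health
# center (PHC): only the doctor consultation is modeled, with arrival and
# service distributions assumed to be fitted to the parent simulation.
import random
import simpy

ARRIVAL_MEAN_MIN = 10.0   # placeholder interarrival mean (minutes)
CONSULT_MEAN_MIN = 8.0    # placeholder consultation mean (minutes)
waits = []

def patient(env, doctor):
    arrived = env.now
    with doctor.request() as req:          # queue for the single doctor
        yield req
        waits.append(env.now - arrived)    # record waiting time
        yield env.timeout(random.expovariate(1.0 / CONSULT_MEAN_MIN))

def arrivals(env, doctor):
    while True:
        yield env.timeout(random.expovariate(1.0 / ARRIVAL_MEAN_MIN))
        env.process(patient(env, doctor))

random.seed(1)
env = simpy.Environment()
doctor = simpy.Resource(env, capacity=1)
env.process(arrivals(env, doctor))
env.run(until=8 * 60)                      # one 8-hour clinic day
print(f"mean wait: {sum(waits) / len(waits):.1f} min over {len(waits)} patients")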

    A comparison of eight metamodeling techniques for the simulation of N2O fluxes and N leaching from corn crops

    The environmental costs of intensive farming activities are often underestimated or not traded by the market, even though they play an important role in addressing future society's needs. The estimation of nitrogen (N) dynamics is thus an important issue which demands detailed simulation-based methods, and their integrated use, to correctly represent complex and non-linear interactions within cropping systems. To calculate the N2O flux and N leaching from European arable lands, a modeling framework has been developed by linking the CAPRI agro-economic dataset with the DNDC-EUROPE bio-geo-chemical model. However, despite the great power of modern computers, running such detailed models at continental scale is often too computationally costly. By comparing several statistical methods, this paper aims to design a metamodel able to approximate the expensive code of the detailed modeling approach, seeking the best compromise between estimation performance and simulation speed. We describe the use of two parametric (linear) models and six non-parametric approaches: two methods based on splines (ACOSSO and SDR), one method based on kriging (DACE), a neural network method (multilayer perceptron, MLP), a support vector machine (SVM) method, and a bagging method (random forest, RF). This analysis shows that when few data are available to train the model, spline approaches lead to the best results, while as the size of the training dataset increases, SVM and RF provide faster and more accurate solutions.
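
    For readers wanting a concrete starting point, the sketch below compares three of the metamodel families named above (kriging via a Gaussian process, SVM regression, and random forest) on a synthetic response surface using scikit-learn. The test function, sample sizes, and hyperparameters are illustrative assumptions, not the study's DNDC-EUROPE setup.

# Toy comparison of three metamodel families from the paper's list
# (kriging/GP, SVM regression, random forest) on a synthetic surface.
# Not the paper's setup: the function, sample size, and settings are assumed.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(400, 3))                      # 3 input factors
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=400)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

metamodels = {
    "kriging (GP)": GaussianProcessRegressor(alpha=1e-2, normalize_y=True),
    "SVM (SVR)": SVR(C=10.0, epsilon=0.01),
    "random forest": RandomForestRegressor(n_estimators=200, random_state=0),
}
for name, model in metamodels.items():
    model.fit(X_tr, y_tr)                                  # fit the metamodel
    print(f"{name:>15}: R^2 = {r2_score(y_te, model.predict(X_te)):.3f}")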

    Characterization and valuation of uncertainty of calibrated parameters in stochastic decision models

    We evaluated the implications of different approaches to characterizing the uncertainty of calibrated parameters of stochastic decision models (DMs) for the quantified value of such uncertainty in decision making. We used a microsimulation DM of colorectal cancer (CRC) screening to conduct a cost-effectiveness analysis (CEA) of a 10-year colonoscopy screening strategy. We calibrated the natural history model of CRC to epidemiological data with different degrees of uncertainty and obtained the joint posterior distribution of the parameters using a Bayesian approach. We conducted a probabilistic sensitivity analysis (PSA) on all the model parameters with different characterizations of the uncertainty of the calibrated parameters and estimated the value of uncertainty of the different characterizations with a value-of-information analysis. All analyses were conducted using high-performance computing resources running the Extreme-scale Model Exploration with Swift (EMEWS) framework. The posterior distribution showed high correlation among some parameters; the parameters of the Weibull hazard function for the age of onset of adenomas had the highest posterior correlation, of -0.958. Considering either the full posterior distribution or the maximum-a-posteriori estimate of the calibrated parameters, there is little difference in the spread of the distribution of the CEA outcomes, with similar expected values of perfect information (EVPI) of $653 and $685, respectively, at a willingness-to-pay (WTP) threshold of $66,000/QALY. Ignoring correlation in the posterior distribution of the calibrated parameters produced the widest distribution of CEA outcomes and the highest EVPI, of $809 at the same WTP. Different characterizations of the uncertainty of calibrated parameters have implications for the expected value of reducing uncertainty in the CEA: ignoring the inherent correlation among calibrated parameters in a PSA overestimates the value of uncertainty.
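
    To make the value-of-information step concrete, the sketch below computes EVPI from PSA samples using the standard net-monetary-benefit formulation, EVPI = E[max over strategies of NMB] - max over strategies of E[NMB]. The simulated costs, QALYs, and strategy labels are synthetic placeholders, not the paper's CRC model outputs; only the WTP threshold is taken from the abstract.

# Standard EVPI calculation from probabilistic sensitivity analysis (PSA)
# samples: EVPI = E[max_d NMB_d] - max_d E[NMB_d], with NMB = WTP*QALY - cost.
# The PSA samples below are synthetic; the paper's come from a CRC microsim.
import numpy as np

rng = np.random.default_rng(42)
n_psa, wtp = 10_000, 66_000                 # WTP in $/QALY, as in the abstract

# Two strategies: no screening vs. colonoscopy screening (made-up distributions).
costs = np.column_stack([
    rng.normal(2_000, 300, n_psa),          # strategy 0: no screening
    rng.normal(3_500, 400, n_psa),          # strategy 1: screening
])
qalys = np.column_stack([
    rng.normal(20.00, 0.05, n_psa),
    rng.normal(20.03, 0.05, n_psa),
])

nmb = wtp * qalys - costs                   # net monetary benefit per sample
evpi = nmb.max(axis=1).mean() - nmb.mean(axis=0).max()
print(f"EVPI per person at ${wtp:,}/QALY: ${evpi:,.0f}")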

    Uncertainty and sensitivity analysis for long-running computer codes: a critical review

    Thesis (S.M.) by Dustin R. Langewisch, Massachusetts Institute of Technology, Dept. of Nuclear Science and Engineering, February 2010. Includes bibliographical references (p. 137-146). This thesis presents a critical review of existing methods for performing probabilistic uncertainty and sensitivity analysis for complex, computationally expensive simulation models. Uncertainty analysis (UA) methods reviewed include standard Monte Carlo simulation, Latin Hypercube sampling, importance sampling, line sampling, and subset simulation. Sensitivity analysis (SA) methods include scatter plots, Monte Carlo filtering, regression analysis, variance-based methods (Sobol' sensitivity indices and Sobol' Monte Carlo algorithms), and Fourier amplitude sensitivity tests. In addition, this thesis reviews several existing metamodeling techniques that are intended to provide quick-running approximations to the computer models being studied. Because stochastic simulation-based UA and SA rely on a large number (e.g., several thousands) of simulations, metamodels are recognized as a necessary compromise when UA and SA must be performed with long-running (i.e., several hours or days per simulation) computational models. This thesis discusses the use of polynomial Response Surfaces (RS), Artificial Neural Networks (ANN), and Kriging/Gaussian Processes (GP) for metamodeling. Moreover, two methods are discussed for estimating the uncertainty introduced by the metamodel. The first of these methods is based on a bootstrap sampling procedure and can be utilized for any metamodeling technique. The second method is specific to GP models and is based on a Bayesian interpretation of the underlying stochastic process. Finally, to demonstrate the use of these methods, the results from two case studies involving the reliability assessment of passive nuclear safety systems are presented.

    The general conclusions of this work are that polynomial RSs are frequently incapable of adequately representing the complex input/output behavior exhibited by many mechanistic models. In addition, the goodness-of-fit of the RS should not be misinterpreted as a measure of the predictive capability of the metamodel, since RSs are necessarily biased predictors for deterministic computer models. Furthermore, the extent of this bias is not measured by standard goodness-of-fit metrics (e.g., the coefficient of determination, R²), so these methods tend to provide overly optimistic indications of the quality of the metamodel. The bootstrap procedure does provide an indication of the extent of this bias, with the bootstrap confidence intervals for the RS estimates generally being significantly wider than those of the alternative metamodeling methods. It has been found that the added flexibility afforded by ANNs and GPs can make these methods superior for approximating complex models. In addition, GPs are exact interpolators, which is an important feature when the underlying computer model is deterministic (i.e., when there is no justification for including a random error component in the metamodel). On the other hand, when the number of observations from the computer model is sufficiently large, all three methods appear to perform comparably, indicating that in such cases RSs can still provide useful approximations.
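
    A small sketch of the thesis's first, model-agnostic idea (bootstrap uncertainty for a metamodel) follows: a polynomial response surface is refit on resampled design points, and the spread of its predictions gives an interval around each new prediction. The toy function, polynomial degree, and number of resamples are assumptions for illustration only.

# Sketch of bootstrap uncertainty for a metamodel: refit a polynomial
# response surface on resampled (x, y) design points and read the spread
# of predictions as an uncertainty band. Toy function and sizes are assumed.
import numpy as np

rng = np.random.default_rng(0)

def expensive_code(x):                 # stand-in for a long-running simulator
    return np.sin(2 * np.pi * x) + 0.5 * x

x_design = rng.uniform(0, 1, 30)       # design points (simulator runs)
y_design = expensive_code(x_design)

x_new = np.linspace(0, 1, 5)           # points where predictions are wanted
boot_preds = []
for _ in range(500):                   # bootstrap resamples of the design
    idx = rng.integers(0, len(x_design), len(x_design))
    coeffs = np.polyfit(x_design[idx], y_design[idx], deg=3)
    boot_preds.append(np.polyval(coeffs, x_new))
boot_preds = np.array(boot_preds)

lo, hi = np.percentile(boot_preds, [2.5, 97.5], axis=0)
for xv, l, h in zip(x_new, lo, hi):
    print(f"x={xv:.2f}: 95% bootstrap interval [{l:+.2f}, {h:+.2f}]")

    For a deterministic code, wide bootstrap intervals of this kind flag the response-surface bias the thesis warns that goodness-of-fit metrics alone do not reveal.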

    Design and analysis of computer experiments for stochastic systems

    Ph.D. thesis (Doctor of Philosophy).

    Framework for emulation and uncertainty quantification of a stochastic building performance simulator

    A good framework for the quantification and decomposition of uncertainties in dynamic building performance simulation should: (i) simulate the principal deterministic processes influencing heat flows and the stochastic perturbations to them, (ii) quantify and decompose the total uncertainty into its respective sources, and the interactions between them, and (iii) achieve this in a computationally efficient manner. In this paper we introduce a new framework which, for the first time, does just that. We present the detailed development of this framework for emulating the mean and the variance in the response of a stochastic building performance simulator (EnergyPlus co-simulated with a multi-agent stochastic simulator called No-MASS) for heating and cooling load predictions. We demonstrate and evaluate the effectiveness of these emulators, applied to a monozone office building. For heating loads, the epistemic uncertainty due to envelope parameters, with a range of 25–50 kWh/m2, dominates over the aleatory uncertainty relating to occupants' interactions, which ranges from 6 to 8 kWh/m2. The converse is observed for cooling loads, which vary by just 3 kWh/m2 with the envelope parameters, compared with 8–22 kWh/m2 for their aleatory counterparts. This is due to the larger stimuli provoking occupants' interactions. Sensitivity indices corroborate this result, with wall insulation thickness (0.97) and occupants' behaviours (0.83) having the highest impacts on heating and cooling load predictions, respectively. This new emulator framework (including training and subsequent deployment) achieves an approximately 30-fold reduction in the total computational budget, whilst overwhelmingly maintaining predictions within a 95% confidence interval, and successfully decomposing prediction uncertainties.
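
    As an illustration of emulating both the mean and the variance of a stochastic simulator's response, the sketch below fits two Gaussian-process emulators with scikit-learn: one to the per-design-point mean of replicated runs and one to the log of their variance. The toy simulator, replicate count, and kernel choice are assumptions, not the EnergyPlus/No-MASS setup used in the paper.

# Sketch: emulate the mean and the variance of a stochastic simulator with
# two Gaussian processes, one fit to replicate means and one to log-variances.
# The toy simulator and settings stand in for EnergyPlus/No-MASS.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

def stochastic_simulator(x, n_reps=20):
    """Toy stochastic response: mean and noise level both depend on x."""
    return np.sin(3 * x) + (0.1 + 0.3 * x) * rng.normal(size=n_reps)

x_design = np.linspace(0, 1, 15)
reps = np.array([stochastic_simulator(x) for x in x_design])   # 15 x 20 runs
means = reps.mean(axis=1)
log_vars = np.log(reps.var(axis=1))

kernel = RBF(length_scale=0.2) + WhiteKernel(noise_level=1e-3)
mean_gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(
    x_design[:, None], means)
var_gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(
    x_design[:, None], log_vars)

x_new = np.array([[0.25], [0.75]])
print("emulated mean:", mean_gp.predict(x_new))
print("emulated std :", np.sqrt(np.exp(var_gp.predict(x_new))))

    Emulating the variance on the log scale keeps the predicted variance positive; it is one reasonable design choice, not necessarily the one made in the paper.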

    Simulation-Based Optimization of Transportation Systems: Theory, Surrogate Models, and Applications

    The construction of new highway infrastructure has not kept pace with the growth of travel, mainly due to the limited availability of land and funding. To improve the mobility, safety, reliability, and sustainability of the transportation system, various transportation planning and traffic operations policies have been developed over the past few decades. Simulation is widely used to evaluate the impacts of those policies, owing to its advantages in capturing network and behavioral details and its capability to analyze various combinations of policies. A simulation-based optimization (SBO) method, which combines the strength of simulation evaluation and mathematical optimization, is imperative for supporting decision making in practice. The objective of this dissertation is to develop SBO methods that can be efficiently applied to transportation planning and operations problems. Surrogate-based methods are selected as the research focus after reviewing various existing SBO methods. A systematic framework for applying surrogate-based optimization methods in transportation research is then developed. The performance of different forms of surrogate models is compared through a numerical example, and regressing Kriging is identified as the best model for approximating the unknown response surface when no information regarding the simulation noise is available. Combined with an expected-improvement global infill strategy, regressing Kriging is successfully applied in a real-world application of optimizing the dynamic pricing for a toll road in the Inter-County Connector (ICC) regional network in the State of Maryland. To further explore its capability for problems that are of more interest to planners and operators of the transportation system, this method is then extended to solve constrained and multi-objective optimization problems. Because heteroscedasticity is observed in transportation simulation outputs, two surrogate models that can be adapted to heteroscedastic data are developed: a heteroscedastic support vector regression (SVR) model and a Bayesian stochastic Kriging model. These two models deal with the heteroscedasticity in simulation noise in different ways, and their superiority over regressing Kriging in approximating the response surface of simulations with heteroscedastic noise is verified through both numerical studies and real-world applications. Furthermore, a distribution-based SVR model which takes into account the statistical distribution of the simulation noise is developed; by utilizing the bootstrapping method, a global search scheme can be incorporated into this model. The value of taking the statistical distribution of simulation noise into account in improving the convergence rate of the optimization is then verified through numerical examples and a real-world application of integrated corridor traffic management.

    This research is one of the first to introduce simulation-based optimization methods into large-scale transportation network research. Various types of practical problems (single-objective, multi-objective, or with complex constraints) can be resolved, and the developed optimization methods are general and can be applied to analyze all types of policies using any simulator. Methodological improvements to the surrogate models are made to take into account the statistical characteristics of simulation noise. These improvements are shown to enhance the prediction accuracy of the surrogate models and, in turn, the efficiency of optimization: compared with traditional surrogate models, fewer simulation evaluations are generally needed to find the optimal solution when these improved models are applied.
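
    To ground the surrogate-based loop described above, the sketch below runs a minimal Kriging-plus-expected-improvement optimization on a noisy toy objective using scikit-learn's Gaussian process (a regressing surrogate via a white-noise kernel term). The objective function, candidate grid, and evaluation budget are illustrative assumptions, not the ICC network or corridor-management applications.

# Minimal surrogate-based optimization loop: a regressing GP ("kriging")
# surrogate plus an expected-improvement (EI) infill rule on a noisy toy
# objective. Illustrative only; not the dissertation's ICC/corridor setup.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

def noisy_objective(x):                       # stand-in for a traffic simulator
    return (x - 0.3) ** 2 + 0.05 * rng.normal()

X = list(rng.uniform(0, 1, 5))                # initial design
Y = [noisy_objective(x) for x in X]
candidates = np.linspace(0, 1, 201)

kernel = RBF(length_scale=0.2) + WhiteKernel(noise_level=1e-2)
for _ in range(15):                           # infill iterations (budget)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(np.array(X)[:, None], np.array(Y))
    mu, sd = gp.predict(candidates[:, None], return_std=True)
    best = min(Y)
    imp = best - mu                           # improvement over current best
    z = imp / np.maximum(sd, 1e-12)
    ei = imp * norm.cdf(z) + sd * norm.pdf(z) # expected improvement (minimize)
    x_next = candidates[np.argmax(ei)]
    X.append(x_next)
    Y.append(noisy_objective(x_next))

best_i = int(np.argmin(Y))
print(f"best observed x ~ {X[best_i]:.3f}, objective {Y[best_i]:.4f}")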