
    Testing the Assumptions of Sequential Bifurcation for Factor Screening (revision of CentER DP 2015-034)

    Sequential bifurcation (or SB) is an efficient and effective factor-screening method; i.e., SB quickly identifies the important factors (inputs) in experiments with simulation models that have very many factors—provided the SB assumptions are valid. The specific SB assumptions are: (i) a second-order polynomial is an adequate approximation (a valid metamodel) of the implicit input/output function of the underlying simulation model; (ii) the directions (signs) of the first-order effects are known (so the first-order polynomial approximation is monotonic); (iii) so-called “heredity” applies; i.e., if an input has no important first-order effect, then this input has no important second-order effects. Moreover—like many other statistical methods—SB assumes Gaussian simulation outputs if the simulation model is stochastic (random). A generalization of SB called “multiresponse SB” (or MSB) uses the same assumptions, but allows for simulation models with multiple types of responses (outputs). To test whether these assumptions hold, we develop new methods. We evaluate these methods through Monte Carlo experiments and a case study.

    Finding the Important Factors in Large Discrete-Event Simulation: Sequential Bifurcation and its Applications

    This contribution discusses experiments with many factors: the case study includes a simulation model with 92 factors. The experiments are guided by sequential bifurcation. This method is most efficient and effective if the true input/output behavior of the simulation model can be approximated through a first-order polynomial, possibly augmented with two-factor interactions. The method is explained and illustrated through three related discrete-event simulation models. These models represent three supply chain configurations, studied for an Ericsson factory in Sweden. After simulating 21 scenarios (factor combinations), each replicated five times to account for noise, a shortlist with the 11 most important factors is identified for the biggest of the three simulation models.
    Keywords: simulation; bifurcation; supply; Sweden

    Sensitivity Analysis of Simulation Models

    This contribution presents an overview of sensitivity analysis of simulation models, including the estimation of gradients. It covers classic designs and their corresponding (meta)models; namely, resolution-III designs including fractional-factorial two-level designs for first-order polynomial metamodels, resolution-IV and resolution-V designs for metamodels augmented with two-factor interactions, and designs for second-degree polynomial metamodels including central composite designs. It also reviews factor screening for simulation models with very many factors, focusing on the so-called "sequential bifurcation" method. Furthermore, it reviews Kriging metamodels and their designs. It mentions that sensitivity analysis may also aim at the optimization of the simulated system, allowing multiple random simulation outputs.
    Keywords: simulation; sensitivity analysis; gradients; screening; Kriging; optimization; Response Surface Methodology; Taguchi
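    As an illustration of the resolution-III idea, the sketch below builds a saturated two-level fractional-factorial design for seven factors in eight runs and recovers the first-order effects by least squares. The simulator and its effect values are hypothetical, assumed purely for illustration.

    ```python
    import itertools
    import numpy as np

    # Hypothetical black-box simulator: a first-order polynomial in 7 factors,
    # with only factors 0, 2, and 5 active (assumed values for illustration).
    def simulate(x):
        betas = np.array([2.0, 0.0, -1.5, 0.0, 0.0, 3.0, 0.0])
        return 10.0 + x @ betas

    # Saturated 2^(7-4) resolution-III design: start from the 2^3 full
    # factorial and generate four extra columns as products of the base columns.
    base = np.array(list(itertools.product([-1, 1], repeat=3)))
    a, b, c = base.T
    X = np.column_stack([a, b, c, a * b, a * c, b * c, a * b * c])  # 8 runs, 7 factors

    y = np.array([simulate(row) for row in X])
    effects = X.T @ y / len(y)   # columns are orthogonal, so OLS reduces to this
    print(np.round(effects, 2))  # recovers [2, 0, -1.5, 0, 0, 3, 0]
    ```

    Because the seven columns are mutually orthogonal, all first-order effects are estimated from only eight runs; the price of resolution III is that first-order effects are confounded with two-factor interactions.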

    Screening Experiments for Simulation: A Review

    This article reviews so-called screening in simulation; i.e., it examines the search for the really important factors in experiments with simulation models that have very many factors (or inputs). The article focuses on a most efficient and effective screening method, namely Sequential Bifurcation. It ends with a discussion of possible topics for future research, and forty references for further study.
    Keywords: Screening; Metamodel; Response Surface; Design

    Identifying the important factors in simulation models with many factors

    Simulation models may have many parameters and input variables (together called factors), while only a few factors are really important (parsimony principle). For such models this paper presents an effective and efficient screening technique to identify and estimate those important factors. The technique extends the classical binary search technique to situations with more than a single important factor. The technique uses a low-order polynomial approximation to the input/output behavior of the simulation model. This approximation may account for interactions among factors. The technique is demonstrated by applying it to a complicated ecological simulation that models the increase of temperatures worldwide.
    Keywords: Simulation Models; econometrics
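    The binary-search extension can be sketched as follows. This is a minimal deterministic sketch: the simulator, its effect values, and the importance threshold are all assumed for illustration, and all effect signs are taken as known and positive, so effects within a group cannot cancel.

    ```python
    # Minimal sequential-bifurcation sketch for a deterministic simulator with
    # K = 10 factors, each set low (0) or high (1). Assumed for illustration:
    # a first-order simulator in which only factors 2 and 7 matter.
    K = 10

    def simulate(levels):
        betas = [0.0] * K
        betas[2], betas[7] = 5.0, 3.0
        return sum(b * x for b, x in zip(betas, levels))

    def group_effect(lo, hi):
        """Output change when factors lo..hi-1 switch from low to high."""
        before = simulate([1] * lo + [0] * (K - lo))
        after = simulate([1] * hi + [0] * (K - hi))
        return after - before

    def bifurcate(lo, hi, threshold=1.0):
        """Recursively split a factor group until important factors are isolated."""
        if group_effect(lo, hi) < threshold:
            return []        # aggregated effect small: whole group eliminated
        if hi - lo == 1:
            return [lo]      # single important factor isolated
        mid = (lo + hi) // 2
        return bifurcate(lo, mid, threshold) + bifurcate(mid, hi, threshold)

    print(bifurcate(0, K))   # → [2, 7]
    ```

    The known-signs assumption is what makes a small group effect a safe reason to discard the whole group: with positive effects, a group's aggregated effect is an upper bound on each member's individual effect.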

    Design of Experiments: An Overview

    Design Of Experiments (DOE) is needed for experiments with real-life systems, and with either deterministic or random simulation models. This contribution discusses the different types of DOE for these three domains, but focuses on random simulation. DOE may have two goals: sensitivity analysis (including factor screening) and optimization. This contribution starts with classic DOE, including 2^(k-p) and central composite designs. Next, it discusses factor screening through Sequential Bifurcation. Then it discusses Kriging, including Latin Hypercube Sampling and sequential designs. It ends with optimization through Generalized Response Surface Methodology and Kriging combined with Mathematical Programming, including Taguchian robust optimization.
    Keywords: simulation; sensitivity analysis; optimization; factor screening; Kriging; RSM; Taguchi
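    The space-filling idea behind Latin Hypercube Sampling can be sketched as follows; the sample size and dimension here are arbitrary choices for illustration.

    ```python
    import numpy as np

    # Latin Hypercube Sampling sketch: split each factor's [0, 1) range into n
    # equal strata, sample each stratum exactly once per factor, and pair the
    # strata across factors by independent random permutations.
    def lhs(n_samples, n_factors, rng):
        jitter = rng.uniform(size=(n_samples, n_factors))  # position in stratum
        strata = np.column_stack(
            [rng.permutation(n_samples) for _ in range(n_factors)]
        )
        return (strata + jitter) / n_samples               # points in [0, 1)^d

    rng = np.random.default_rng(0)
    pts = lhs(5, 2, rng)
    # projected onto either axis, the 5 points occupy 5 distinct strata
    print(np.sort((pts * 5).astype(int), axis=0))  # each column is 0,1,2,3,4
    ```

    This one-point-per-stratum property is what makes an LHS design fill the input space more evenly than plain random sampling with the same budget.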

    Experimental Design for Sensitivity Analysis, Optimization and Validation of Simulation Models

    This chapter gives a survey on the use of statistical designs for what-if analysis in simulation, including sensitivity analysis, optimization, and validation/verification. Sensitivity analysis is divided into two phases. The first phase is a pilot stage, which consists of screening or searching for the important factors among (say) hundreds of potentially important factors. A novel screening technique is presented, namely sequential bifurcation. The second phase uses regression analysis to approximate the input/output transformation that is implied by the simulation model; the resulting regression model is also known as a metamodel or a response surface. Regression analysis gives better results when the simulation experiment is well designed, using either classical statistical designs (such as fractional factorials) or optimal designs (such as those pioneered by Fedorov, Kiefer, and Wolfowitz). To optimize the simulated system, the analysts may apply Response Surface Methodology (RSM); RSM combines regression analysis, statistical designs, and steepest-ascent hill-climbing. To validate a simulation model, again regression analysis and statistical designs may be applied. Several numerical examples and case studies illustrate how statistical techniques can reduce the ad hoc character of simulation; that is, these statistical techniques can make simulation studies give more general results, in less time. Appendix 1 summarizes confidence intervals for expected values, proportions, and quantiles, in terminating and steady-state simulations. Appendix 2 gives details on four variance reduction techniques, namely common pseudorandom numbers, antithetic numbers, control variates or regression sampling, and importance sampling. Appendix 3 describes jackknifing, which may give robust confidence intervals.
    Keywords: least squares; distribution-free; non-parametric; stopping rule; run-length; Von Neumann; median; seed; likelihood ratio
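    The RSM loop of local regression plus steepest-ascent hill-climbing can be sketched as follows. The quadratic objective, the local design, and the step sizes are all assumed for illustration.

    ```python
    import numpy as np

    # Hypothetical objective to maximize; its optimum sits at (3, -1).
    def simulate(x):
        return -(x[0] - 3.0) ** 2 - (x[1] + 1.0) ** 2

    design = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1]])  # local 2^2 factorial
    center = np.array([0.0, 0.0])
    h, step = 0.1, 0.3                   # design half-width, ascent step size

    for _ in range(25):
        # fit a local first-order polynomial: orthogonal columns make the
        # gradient estimate a simple signed average of the four runs
        y = np.array([simulate(center + h * d) for d in design])
        grad = design.T @ y / len(y) / h
        center = center + step * grad    # steepest-ascent move

    print(np.round(center, 2))           # close to the optimum (3, -1)
    ```

    In a full RSM study the first-order phase would stop once the estimated gradient becomes insignificant, switching to a second-degree polynomial (e.g., via a central composite design) near the optimum.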

    Sensitivity analysis and related analysis: A survey of statistical techniques

    This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied, and which statistical techniques may then be used? This paper distinguishes the following five stages in the analysis of a simulation model. 1) Validation: the availability of data on the real system determines which type of statistical technique to use for validation. 2) Screening: in the simulation's pilot phase the really important inputs can be identified through a novel technique, called sequential bifurcation, which uses aggregation and sequential experimentation. 3) Sensitivity analysis: the really important inputs should be This approach with its five stages implies that sensitivity analysis should precede uncertainty analysis. This paper briefly discusses several case studies for each phase.
    Keywords: Experimental Design; Statistical Methods; Regression Analysis; Risk Analysis; Least Squares; Sensitivity Analysis; Optimization; Perturbation; statistics