1,738 research outputs found

    Experimental Design for Sensitivity Analysis, Optimization and Validation of Simulation Models

    This chapter gives a survey on the use of statistical designs for what-if analysis in simulation, including sensitivity analysis, optimization, and validation/verification. Sensitivity analysis is divided into two phases. The first phase is a pilot stage, which consists of screening or searching for the important factors among (say) hundreds of potentially important factors. A novel screening technique is presented, namely sequential bifurcation. The second phase uses regression analysis to approximate the input/output transformation that is implied by the simulation model; the resulting regression model is also known as a metamodel or a response surface. Regression analysis gives better results when the simulation experiment is well designed, using either classical statistical designs (such as fractional factorials) or optimal designs (such as those pioneered by Fedorov, Kiefer, and Wolfowitz). To optimize the simulated system, the analysts may apply Response Surface Methodology (RSM); RSM combines regression analysis, statistical designs, and steepest-ascent hill-climbing. To validate a simulation model, again regression analysis and statistical designs may be applied. Several numerical examples and case studies illustrate how statistical techniques can reduce the ad hoc character of simulation; that is, these statistical techniques can make simulation studies give more general results, in less time. Appendix 1 summarizes confidence intervals for expected values, proportions, and quantiles, in terminating and steady-state simulations. Appendix 2 gives details on four variance reduction techniques, namely common pseudorandom numbers, antithetic numbers, control variates or regression sampling, and importance sampling. Appendix 3 describes jackknifing, which may give robust confidence intervals.
    Keywords: least squares; distribution-free; non-parametric; stopping rule; run-length; Von Neumann; median; seed; likelihood ratio
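The regression-metamodel idea in the abstract above can be sketched in a few lines: simulate a toy model at the points of a 2^2 factorial design in coded units, then fit a first-order polynomial with interaction by least squares. The `simulate` function and its coefficients below are hypothetical stand-ins, not taken from the chapter.

```python
import numpy as np

# Hypothetical simulation model: a noisy response of two factors.
# The coefficients (10, 3.0, -1.5, 0.5) are made up for illustration.
def simulate(x1, x2, rng):
    return 10.0 + 3.0 * x1 - 1.5 * x2 + 0.5 * x1 * x2 + rng.normal(0.0, 0.1)

rng = np.random.default_rng(42)

# 2^2 full factorial design in coded units (-1, +1).
design = np.array([[-1.0, -1.0], [-1.0, 1.0], [1.0, -1.0], [1.0, 1.0]])
y = np.array([simulate(x1, x2, rng) for x1, x2 in design])

# First-order polynomial metamodel with interaction:
#   y ~ b0 + b1*x1 + b2*x2 + b12*x1*x2, fitted by least squares.
X = np.column_stack([np.ones(4), design[:, 0], design[:, 1],
                     design[:, 0] * design[:, 1]])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # estimated (b0, b1, b2, b12), close to the true coefficients
```

Because the factorial design is orthogonal, each estimated coefficient depends only on its own column of the design, which is what makes such designs efficient for sensitivity analysis.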

    Statistical aspects of simulation: An updated survey


    The behavior of transient period of nonterminating simulations: an experimental analysis

    The design and control of many industrial and service systems require the analysts to account for uncertainty. Computer simulation is a frequently used technique for analyzing uncertain (or stochastic) systems. One disadvantage of simulation modeling is that simulation results are only estimates of model performance measures. Therefore, to obtain better estimates, the outputs of a simulation run should undergo a careful statistical analysis. Simulation studies can be classified as terminating or nonterminating according to the output-analysis techniques used. One of the major problems in the output analysis of nonterminating simulations is the problem of the initial transient, which arises from initializing simulation runs in a state that is unrepresentative of steady-state conditions. Many techniques have been proposed in the literature to deal with the initial transient; however, existing studies aim to improve the efficiency and effectiveness of these techniques, and no research has been encountered that analyzes the behavior of the transient period itself. In this thesis, we investigate the factors affecting the length of the transient period for nonterminating manufacturing simulations, particularly for serial production lines and job-shop production systems. Factors such as variability of processing times, system size, existence of a bottleneck, reliability of the system, system load level, and buffer capacity are investigated.
    Sandıkçı, Burhaneddin (M.S. thesis)
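A standard remedy for the initial-transient problem discussed above, not specific to this thesis, is Welch's graphical method: average the output across replications, smooth with a moving window, and truncate where the smoothed curve levels off. A minimal sketch, using a hypothetical AR(1)-style output process as a stand-in for a manufacturing simulation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical output process: an AR(1)-style series started empty, so
# early observations lie below the steady-state mean (about 5.0 here).
def replication(n, rng):
    y = np.empty(n)
    level = 0.0
    for t in range(n):
        level = 0.9 * level + rng.exponential(0.5)
        y[t] = level
    return y

reps = np.array([replication(500, rng) for _ in range(10)])
mean_path = reps.mean(axis=0)  # average the 10 replications pointwise

# Welch's method: smooth the averaged path with a moving window and
# truncate where the smoothed curve approaches its final (steady) value.
w = 25
window = np.ones(2 * w + 1) / (2 * w + 1)
smooth = np.convolve(mean_path, window, mode="valid")
steady = smooth[-1]
warmup = int(np.argmax(smooth > 0.95 * steady))
print(warmup)  # suggested truncation point (observation index)
```

Observations before `warmup` would be deleted before estimating steady-state quantities; the window width and the 95% threshold are illustrative choices, normally judged from the plot.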

    Geometry of the ergodic quotient reveals coherent structures in flows

    Dynamical systems that exhibit diverse behaviors can rarely be completely understood using a single approach. However, by identifying coherent structures in their state spaces, i.e., regions of uniform and simpler behavior, we could hope to study each of the structures separately and then form an understanding of the system as a whole. The method we present in this paper uses trajectory averages of scalar functions on the state space to: (a) identify invariant sets in the state space, and (b) form coherent structures by aggregating invariant sets that are similar across multiple spatial scales. First, we construct the ergodic quotient, the object obtained by mapping trajectories to the space of trajectory averages of a function basis on the state space. Second, we endow the ergodic quotient with a metric structure that successfully captures how similar the invariant sets are in the state space. Finally, we parametrize the ergodic quotient using intrinsic diffusion modes on it. By segmenting the ergodic quotient based on the diffusion modes, we extract coherent features in the state space of the dynamical system. The algorithm is validated by analyzing the Arnold-Beltrami-Childress flow, which was the test-bed for alternative approaches: Ulam's approximation of the transfer operator and the computation of Lagrangian Coherent Structures. Furthermore, we explain how the method extends the Poincaré map analysis for periodic flows. As a demonstration, we apply the method to a periodically-driven three-dimensional Hill's vortex flow, discovering unknown coherent structures in its state space. In the end, we discuss differences between the ergodic quotient and alternatives, propose a generalization to the analysis of (quasi-)periodic structures, and lay out future research directions.
    Comment: Submitted to Elsevier Physica D: Nonlinear Phenomena
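The core construction in the abstract above, mapping each trajectory to a vector of time averages of observables, can be illustrated on a toy flow whose invariant sets are known in advance. The flow (a rigid rotation of the plane) and the small observable basis below are illustrative assumptions, not the paper's choices:

```python
import numpy as np

# Toy flow with known invariant sets: rigid rotation of the plane,
# whose trajectories are circles of constant radius.
def trajectory(x0, y0, n=2000, dt=0.05):
    t = dt * np.arange(n)
    r = np.hypot(x0, y0)
    th = np.arctan2(y0, x0) + t
    return r * np.cos(th), r * np.sin(th)

# Trajectory averages of a small (assumed) observable basis; each
# trajectory is mapped to one point of the ergodic quotient.
def averages(x, y):
    observables = [x, y, x * y, x**2, y**2]
    return np.array([f.mean() for f in observables])

# Two initial conditions on the same circle land (nearly) on the same
# point of the quotient; one on a different circle does not.
a = averages(*trajectory(1.0, 0.0))
b = averages(*trajectory(0.0, 1.0))
c = averages(*trajectory(2.0, 0.0))
print(np.linalg.norm(a - b), np.linalg.norm(a - c))
```

Distances between these average vectors are what the paper's metric structure and diffusion-mode embedding then organize into coherent structures; this sketch only shows the first step, the quotient map itself.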

    Kriging metamodeling for simulation

    Many scientific disciplines use mathematical models to describe complicated real systems. Often, analytical methods are inadequate, so simulation is applied. This thesis focuses on computer-intensive simulation experiments in Operations Research/Management Science. For such experiments it is necessary to apply interpolation. In this thesis, Kriging interpolation for random simulation is proposed and a novel type of Kriging, called Detrended Kriging, is developed. Kriging turns out to give better predictions in random simulation than classic low-order polynomial regression. Kriging is not sensitive to variance heterogeneity; i.e., Kriging is a robust method. Moreover, the thesis develops a novel method to select experimental designs for expensive simulation. This method is sequential, and accounts for the specific input/output function implied by the underlying simulation model. For deterministic simulation the designs are constructed through cross-validation and jackknifing, whereas for random simulation the customization is achieved through bootstrapping. The novel method simulates relatively more input combinations in the interesting parts of the input/output function, and gives better predictions than traditional Latin Hypercube Sample designs with prefixed sample sizes.
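A minimal sketch of the Kriging predictor behind this thesis: simple Kriging with a known zero mean and a Gaussian correlation function. The test function and hyperparameters are illustrative assumptions; the thesis's Detrended Kriging and sequential designs go well beyond this.

```python
import numpy as np

# Gaussian correlation function; theta is an illustrative hyperparameter.
def kernel(a, b, theta=10.0):
    d = a[:, None] - b[None, :]
    return np.exp(-theta * d**2)

# Simple Kriging: the optimal linear predictor under an assumed
# zero-mean Gaussian-process model of the simulation output.
def simple_kriging(x_train, y_train, x_new, nugget=1e-8):
    K = kernel(x_train, x_train) + nugget * np.eye(len(x_train))
    k = kernel(x_train, x_new)
    weights = np.linalg.solve(K, k)  # Kriging weights
    return weights.T @ y_train

# Deterministic test function standing in for an expensive simulation.
x = np.linspace(0.0, 1.0, 8)
y = np.sin(2.0 * np.pi * x)
print(simple_kriging(x, y, np.array([0.37])))  # prediction at an untried point
```

Unlike a fitted low-order polynomial, this predictor interpolates the observed outputs exactly (up to the small nugget), which is one reason Kriging tends to predict better in simulation experiments.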