The role of statistical methodology in simulation
statistical methods;simulation;operations research
Experimental Design for Sensitivity Analysis, Optimization and Validation of Simulation Models
This chapter gives a survey on the use of statistical designs for what-if analysis in simulation, including sensitivity analysis, optimization, and validation/verification. Sensitivity analysis is divided into two phases. The first phase is a pilot stage, which consists of screening or searching for the important factors among (say) hundreds of potentially important factors. A novel screening technique is presented, namely sequential bifurcation. The second phase uses regression analysis to approximate the input/output transformation that is implied by the simulation model; the resulting regression model is also known as a metamodel or a response surface. Regression analysis gives better results when the simulation experiment is well designed, using either classical statistical designs (such as fractional factorials) or optimal designs (such as pioneered by Fedorov, Kiefer, and Wolfowitz). To optimize the simulated system, the analysts may apply Response Surface Methodology (RSM); RSM combines regression analysis, statistical designs, and steepest-ascent hill-climbing. To validate a simulation model, again regression analysis and statistical designs may be applied. Several numerical examples and case studies illustrate how statistical techniques can reduce the ad hoc character of simulation; that is, these statistical techniques can make simulation studies give more general results, in less time. Appendix 1 summarizes confidence intervals for expected values, proportions, and quantiles, in terminating and steady-state simulations. Appendix 2 gives details on four variance reduction techniques, namely common pseudorandom numbers, antithetic numbers, control variates or regression sampling, and importance sampling. Appendix 3 describes jackknifing, which may give robust confidence intervals.
least squares;distribution-free;non-parametric;stopping rule;run-length;Von Neumann;median;seed;likelihood ratio
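As a toy illustration of one of the variance reduction techniques the abstract lists in Appendix 2 (antithetic numbers), the following sketch is our own and not taken from the chapter; the example integrand E[exp(U)] and the function name are assumptions chosen only for illustration:

```python
import math
import random

# Toy sketch (ours, not from the chapter): antithetic pseudorandom numbers.
# To estimate E[exp(U)] with U ~ Uniform(0, 1), each draw u is paired with
# its antithetic counterpart 1 - u; since exp is monotone, the paired
# responses are negatively correlated, which lowers the variance of the
# averaged estimator relative to independent sampling.
def antithetic_estimate(n_pairs, seed=42):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_pairs):
        u = rng.random()
        # average the monotone response at u and at its antithetic mate 1 - u
        total += 0.5 * (math.exp(u) + math.exp(1.0 - u))
    return total / n_pairs

# The exact value is e - 1 ≈ 1.71828.
print(antithetic_estimate(10_000))
```

The same pairing idea carries over to simulation experiments driven by streams of pseudorandom numbers: one run uses the stream (u_1, u_2, ...), its antithetic partner uses (1 - u_1, 1 - u_2, ...).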
Efficient rare-event simulation for the maximum of heavy-tailed random walks
Let (X_n : n ≥ 1) be a sequence of i.i.d. r.v.'s with negative mean. Set
S_0 = 0, S_n = X_1 + ⋯ + X_n, and define M = max{S_n : n ≥ 0}. We propose an
importance sampling algorithm to estimate the tail probability P(M > b) that is strongly
efficient for both light and heavy-tailed increment distributions. Moreover, in
the case of heavy-tailed increments and under additional technical assumptions,
our estimator can be shown to have asymptotically vanishing relative variance
in the sense that its coefficient of variation vanishes as the tail parameter
increases. A key feature of our algorithm is that it is state-dependent. In the
presence of light tails, our procedure leads to Siegmund's (1979) algorithm.
The rigorous analysis of efficiency requires new Lyapunov-type inequalities
that can be useful in the study of more general importance sampling algorithms.
Comment: Published at http://dx.doi.org/10.1214/07-AAP485 in the Annals of
Applied Probability (http://www.imstat.org/aap/) by the Institute of
Mathematical Statistics (http://www.imstat.org).
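The abstract notes that in the light-tailed case the procedure reduces to Siegmund's (1979) algorithm. The sketch below is our own minimal version of that light-tailed baseline, not the paper's state-dependent estimator; the Gaussian increment distribution N(-mu, sigma^2), the parameter values, and the function name are all assumptions for illustration:

```python
import math
import random

# Hedged sketch (ours): Siegmund-style exponentially tilted importance
# sampling for P(M > b), where M is the maximum of a random walk with
# N(-mu, sigma^2) increments. The tilting parameter theta is the nonzero
# root of the increment cumulant; for Gaussian increments theta =
# 2*mu/sigma^2, and the tilted law is N(+mu, sigma^2), a positive-drift
# walk under which the level b is crossed with probability one.
def siegmund_tail_estimate(b, mu=1.0, sigma=1.0, n_runs=10_000, seed=1):
    rng = random.Random(seed)
    theta = 2.0 * mu / sigma ** 2
    total = 0.0
    for _ in range(n_runs):
        s = 0.0
        while s <= b:                     # simulate under the tilted measure
            s += rng.gauss(mu, sigma)
        total += math.exp(-theta * s)     # likelihood-ratio weight at crossing
    return total / n_runs

# Each weight is below exp(-theta*b), the classical Lundberg-type bound.
print(siegmund_tail_estimate(5.0))
```

Because every replication contributes a weight of the same exponential order as P(M > b) itself, the relative error stays bounded as b grows; this is the strong-efficiency property that the paper extends to heavy-tailed increments via state-dependent tilting.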
Stochastic approximation of symmetric Nash equilibria in queueing games
We suggest a novel stochastic-approximation algorithm to compute a symmetric
Nash-equilibrium strategy in a general queueing game with a finite action
space. The algorithm involves a single simulation of the queueing process with
dynamic updating of the strategy at regeneration times. Under mild assumptions
on the utility function and on the regenerative structure of the queueing
process, the algorithm converges to a symmetric equilibrium strategy almost
surely. This yields a powerful tool that can be used to approximate equilibrium
strategies in a broad range of strategic queueing models in which direct
analysis is impracticable.