
    Modified Selection Mechanisms Designed to Help Evolution Strategies Cope with Noisy Response Surfaces

    With the rise in the application of evolution strategies to simulation optimization, a better understanding is needed of how these algorithms are affected by the stochastic output produced by simulation models. At very high levels of stochastic variance in the output, evolution strategies in their standard form have difficulty locating the optimum. This degradation in performance can be attributed to a decrease in the proportion of solutions correctly selected as the parents from which offspring are generated. That proportion can be increased by conducting additional replications for each solution. However, experimental evaluation suggests that a very high proportion of correctly selected parents is not required; a proportion of around 0.75 appears sufficient for evolution strategies to perform adequately. Integrating statistical techniques into the algorithm's selection process does help evolution strategies cope with high levels of noise. These techniques fall into four categories: statistical ranking and selection techniques, multiple comparison procedures, clustering techniques, and other techniques. An experimental comparison, under similar conditions, of the indifference-zone selection procedure of Dudewicz and Dalal (1975), the sequential procedure of Kim and Nelson (2001), Tukey's procedure, the clustering procedure of Calinski and Corsten (1985), and Scheffé's procedure suggests that the sequential ranking and selection procedure of Kim and Nelson (2001) helps evolution strategies cope with noise using the smallest number of replications. However, all of the techniques required a rather large number of replications, which suggests that better methods are needed. Experimental results also indicate that a statistical procedure is especially needed during later generations, when solutions are spaced closely together in the search space (response surface).
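
    As an illustration of the basic idea, the following minimal sketch (not the thesis implementation) shows a (mu, lambda) evolution strategy in which each offspring is re-evaluated several times and parents are selected by the mean of those replications. The noisy sphere function and all names are hypothetical stand-ins; averaging replications is only a crude substitute for the ranking-and-selection procedures compared above.

        import random
        import statistics

        def noisy_sphere(x, noise_sd=1.0):
            # Hypothetical noisy objective: sphere function plus Gaussian noise.
            return sum(xi * xi for xi in x) + random.gauss(0.0, noise_sd)

        def select_parents(offspring, mu, replications=10):
            # Rank candidates by the mean of repeated noisy evaluations and keep
            # the mu best as parents (minimization); more replications raise the
            # proportion of correctly selected parents.
            scored = [(statistics.mean(noisy_sphere(x) for _ in range(replications)), x)
                      for x in offspring]
            scored.sort(key=lambda pair: pair[0])
            return [x for _, x in scored[:mu]]

        def evolve(dim=5, mu=5, lam=25, generations=50, sigma=0.3):
            parents = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(mu)]
            for _ in range(generations):
                offspring = [[xi + random.gauss(0.0, sigma) for xi in random.choice(parents)]
                             for _ in range(lam)]
                parents = select_parents(offspring, mu)
            return parents[0]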

    Optimization under uncertainty with application to data clustering

    A new technique for optimization under uncertainty that extends the pure nested partition (NP) algorithm is presented in this thesis. The method is called nested partition with inheritance. The basic idea of an NP algorithm is simple: at each iteration the most promising region is partitioned, and the performance of each resulting region is evaluated using sampling. Based on this evaluation, the most promising region for the next iteration is chosen, and the procedure is repeated until a termination condition is satisfied. Even though the pure NP method guarantees convergence to the optimal solution, it has several shortcomings, and two extensions to the pure NP method are suggested to address them. To rigorously determine the required sampling effort, several statistical selection methods are implemented, including the Nelson-Matejcik procedure, the Rinott procedure, and the Dudewicz and Dalal procedure, as well as a subset procedure. In addition, genetic algorithms (GAs) are used to speed convergence and to overcome the difficulty in the backtracking stage of the NP algorithm. As an application of the new methodology, this work also shows how the methods can be applied to a data clustering problem. This is a very hard problem, two of its main difficulties being lack of scalability with respect to the amount of data and high dimensionality. The new algorithms are found to be effective for this problem: random sampling enhances scalability, and the iterative partitioning addresses the high dimensionality.
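
    The sketch below illustrates the basic NP iteration on a one-dimensional toy problem with a hypothetical noisy objective: the most promising interval is split, each subregion and the surrounding region are sampled, and the search backtracks to the whole space when the surrounding region looks best. It is a simplification that omits the inheritance, statistical selection, and GA components described above.

        import random

        def noisy_objective(x, noise_sd=0.5):
            # Hypothetical noisy performance measure (minimization).
            return (x - 3.0) ** 2 + random.gauss(0.0, noise_sd)

        def best_of_samples(intervals, samples=20):
            # Estimate a region's promise by the best of a few random samples
            # drawn from its (possibly disjoint) intervals.
            points = [random.uniform(*random.choice(intervals)) for _ in range(samples)]
            return min(noisy_objective(x) for x in points)

        def nested_partition(lo=0.0, hi=10.0, iterations=30):
            region = (lo, hi)
            for _ in range(iterations):
                a, b = region
                mid = (a + b) / 2.0
                subregions = [[(a, mid)], [(mid, b)]]
                surrounding = [iv for iv in [(lo, a), (b, hi)] if iv[0] < iv[1]]
                candidates = subregions + ([surrounding] if surrounding else [])
                scores = [best_of_samples(c) for c in candidates]
                best = candidates[scores.index(min(scores))]
                # Backtrack to the full space when the surrounding region wins.
                region = (lo, hi) if best == surrounding else best[0]
            return region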

    Pattern Search Ranking and Selection Algorithms for Mixed-Variable Optimization of Stochastic Systems

    A new class of algorithms is introduced and analyzed for bound and linearly constrained optimization problems with stochastic objective functions and a mixture of design variable types. The generalized pattern search (GPS) class of algorithms is extended to a new problem setting in which objective function evaluations require sampling from a model of a stochastic system. The approach combines GPS with ranking and selection (R&S) statistical procedures to select new iterates. The derivative-free algorithms require only black-box simulation responses and are applicable over domains with mixed variables (continuous, discrete numeric, and discrete categorical), including bound and linear constraints on the continuous variables. A convergence analysis for the general class of algorithms establishes almost sure convergence of an iteration subsequence to stationary points appropriately defined in the mixed-variable domain. Additionally, specific algorithm instances are implemented that provide computational enhancements to the basic algorithm. Implementation alternatives include the use of modern R&S procedures designed to provide efficient sampling strategies and the use of surrogate functions that augment the search by approximating the unknown objective function with nonparametric response surfaces. In a computational evaluation, six variants of the algorithm are tested along with four competing methods on 26 standardized test problems. The numerical results validate the use of advanced implementations as a means to improve algorithm performance.
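
    A minimal sketch of the underlying idea, assuming a hypothetical noisy simulation with several continuous variables and one categorical variable: at each iteration a mixed-variable poll set is generated around the incumbent, each trial point is estimated from several replications, and a crude indifference-zone rule stands in for the modern R&S procedures used in the thesis; if no trial point is clearly better, the step size (mesh) is refined. All function and parameter names are illustrative.

        import random
        import statistics

        def simulate(x, cat, noise_sd=0.5):
            # Hypothetical noisy simulation response (minimization).
            offsets = {"A": 0.0, "B": 1.0, "C": 2.0}
            return sum(xi * xi for xi in x) + offsets[cat] + random.gauss(0.0, noise_sd)

        def mean_response(x, cat, replications=15):
            return statistics.mean(simulate(x, cat) for _ in range(replications))

        def poll_set(x, cat, step, categories=("A", "B", "C")):
            # Mixed-variable poll: coordinate steps on the continuous part plus
            # discrete neighbours of the categorical variable.
            points = []
            for i in range(len(x)):
                for sign in (+1, -1):
                    y = list(x)
                    y[i] += sign * step
                    points.append((y, cat))
            points += [(list(x), c) for c in categories if c != cat]
            return points

        def gps_rs(x0, cat0, step=1.0, delta=0.2, iterations=40):
            x, cat = list(x0), cat0
            incumbent = mean_response(x, cat)
            for _ in range(iterations):
                trials = poll_set(x, cat, step)
                means = [mean_response(*t) for t in trials]
                best = min(range(len(trials)), key=means.__getitem__)
                # Accept only a clearly better point (crude indifference-zone
                # rule); otherwise refine the mesh, as in pattern search.
                if means[best] < incumbent - delta:
                    (x, cat), incumbent = trials[best], means[best]
                else:
                    step *= 0.5
            return x, cat, incumbent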

    Tabu search with fully sequential procedure for simulation optimization

    Simulation is a descriptive technique used to understand the behaviour of both conceptual and real systems. Most real-life systems are dynamic and stochastic, so it can be very difficult to derive an analytical representation of them; simulation can be used to model and analyze such systems. Although simulation provides insightful information about system behaviour, it cannot by itself be used to optimize system performance. With the development of metaheuristics, simulation optimization has become practical in recent years. A simulation optimization technique uses simulation as an evaluator and tries to optimize system performance by setting appropriate values of the simulation inputs. Statistical ranking and selection procedures, on the other hand, are used to find the best system design among a set of alternatives with a desired confidence level. In this study, we combine these two methodologies and investigate the performance of the hybrid procedure: the Tabu Search (TS) heuristic is combined with the Fully Sequential Procedure (FSP) in a simulation optimization context. The performance of the combined procedure is examined on four different systems, and the effectiveness of the FSP is assessed in terms of computational effort and convergence to the best (near-optimal) solution.
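
    The following sketch, built on a hypothetical noisy objective over an integer lattice, shows how such a combination can be organized: tabu search proposes a neighbourhood of candidate moves, and a simplified sequential screening loop, a crude stand-in for the FSP, adds replications one at a time and eliminates clearly inferior candidates before the move is made. Names and tolerances are illustrative assumptions, not the thesis implementation.

        import random
        import statistics
        from collections import deque

        def simulate(x, noise_sd=1.0):
            # Hypothetical noisy simulation response (minimization).
            return (x[0] - 2) ** 2 + (x[1] + 1) ** 2 + random.gauss(0.0, noise_sd)

        def mean_estimate(x, replications=10):
            return statistics.mean(simulate(x) for _ in range(replications))

        def neighbours(x):
            # Unit moves on an integer lattice.
            return [(x[0] + dx, x[1] + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]

        def screen_best(candidates, n0=5, max_reps=50, tolerance=1.0):
            # Crude stand-in for a fully sequential selection procedure: add one
            # replication per surviving candidate at a time and drop those whose
            # running mean trails the current leader by more than a tolerance.
            obs = {c: [simulate(c) for _ in range(n0)] for c in candidates}
            survivors, reps = list(candidates), n0
            while len(survivors) > 1 and reps < max_reps:
                for c in survivors:
                    obs[c].append(simulate(c))
                reps += 1
                means = {c: statistics.mean(obs[c]) for c in survivors}
                leader = min(means.values())
                survivors = [c for c in survivors if means[c] <= leader + tolerance]
            return min(survivors, key=lambda c: statistics.mean(obs[c]))

        def tabu_search(start=(0, 0), iterations=30, tabu_size=7):
            current, best = start, start
            tabu = deque(maxlen=tabu_size)
            for _ in range(iterations):
                moves = [n for n in neighbours(current) if n not in tabu] or neighbours(current)
                current = screen_best(moves)
                tabu.append(current)
                if mean_estimate(current) < mean_estimate(best):
                    best = current
            return best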