2 research outputs found

    Uncertainty And Evolutionary Optimization: A Novel Approach

    Evolutionary algorithms (EA) have been widely accepted as efficient solvers for complex real-world optimization problems, including engineering optimization. However, real-world optimization problems often involve uncertain environments, including noisy and/or dynamic ones, which pose major challenges to EA-based optimization. The presence of noise interferes with the evaluation and selection processes of an EA and thus adversely affects its performance. Moreover, because noise complicates the evaluation of the fitness function, the fitness may need to be estimated rather than evaluated directly. Several existing approaches attempt to address this problem, such as the introduction of diversity (hypermutation, random immigrants, special operators) or the incorporation of memory of the past (diploidy, case-based memory). However, these approaches fail to address the problem adequately. In this paper we propose a Distributed Population Switching Evolutionary Algorithm (DPSEA) that addresses optimization of functions with noisy fitness using a distributed population switching architecture to simulate a distributed, self-adaptive memory of the solution space. Local regression is used in the pseudo-populations to estimate the fitness. Applications to benchmark test problems demonstrate the proposed method's superior performance in terms of both robustness and accuracy.

    Comment: In Proceedings of the 9th IEEE Conference on Industrial Electronics and Applications (ICIEA 2014), IEEE Press, pp. 988-983, 201
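
    A minimal sketch of the local-regression fitness estimation mentioned in the abstract, written in Python with NumPy: a candidate's noisy fitness is smoothed by fitting a weighted linear model over its nearest previously evaluated neighbours. The function name, the tricube weighting, and the fixed neighbourhood size k are illustrative assumptions; the actual DPSEA architecture (distributed population switching, pseudo-populations) is not reproduced here.

        import numpy as np

        def local_regression_fitness(x, archive_X, archive_y, k=10):
            """Estimate the fitness of candidate `x` from an archive of noisy evaluations.

            A weighted linear model is fitted over the k nearest archived points and
            its prediction at `x` is returned.  The archive plays the role of a
            memory of already-evaluated solutions; the weighting scheme and names
            here are assumptions for illustration, not DPSEA itself.
            """
            X = np.asarray(archive_X, dtype=float)
            y = np.asarray(archive_y, dtype=float)
            x = np.asarray(x, dtype=float)

            # Pick the k nearest archived evaluations of the candidate.
            d = np.linalg.norm(X - x, axis=1)
            idx = np.argsort(d)[:k]
            Xk, yk, dk = X[idx], y[idx], d[idx]

            # Tricube weights: closer points influence the local model more.
            h = dk.max() + 1e-12
            w = (1.0 - (dk / h) ** 3) ** 3

            # Weighted least squares on a [1, x] design matrix.
            A = np.hstack([np.ones((len(Xk), 1)), Xk])
            W = np.diag(w)
            beta, *_ = np.linalg.lstsq(W @ A, W @ yk, rcond=None)

            # Predicted (smoothed) fitness at the candidate point.
            return float(np.concatenate(([1.0], x)) @ beta)

    In a scheme like the one the abstract describes, each pseudo-population would maintain its own archive, so the estimate adapts as the local memory of the solution space is updated.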

    Modified Selection Mechanisms Designed to Help Evolution Strategies Cope with Noisy Response Surfaces

    With the rise in the application of evolution strategies for simulation optimization, a better understanding is needed of how these algorithms are affected by the stochastic output produced by simulation models. At very high levels of stochastic variance in the output, evolution strategies in their standard form have difficulty locating the optimum. This degradation in performance can be attributed to the decrease in the proportion of solutions correctly selected as parents, from which offspring solutions are generated. The proportion of correctly selected parents can be increased by conducting additional replications for each solution; however, experimental evaluation suggests that a very high proportion is not required. A proportion of correctly selected solutions of around 0.75 appears sufficient for evolution strategies to perform adequately. Integrating statistical techniques into the algorithm's selection process does help evolution strategies cope with high levels of noise. These techniques fall into four categories: statistical ranking and selection techniques, multiple comparison procedures, clustering techniques, and other techniques. An experimental comparison of the indifference-zone selection procedure of Dudewicz and Dalal (1975), the sequential procedure of Kim and Nelson (2001), Tukey's procedure, the clustering procedure of Calinski and Corsten (1985), and Scheffé's procedure (1985) under similar conditions suggests that the sequential ranking and selection procedure of Kim and Nelson (2001) helps evolution strategies cope with noise using the smallest number of replications. However, all of the techniques required a rather large number of replications, which suggests that better methods are needed. Experimental results also indicate that a statistical procedure is especially needed during later generations, when solutions are spaced closely together in the search space (response surface).
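
    The core variance-reduction idea discussed above, averaging several replications per candidate before truncation selection in a (mu, lambda) evolution strategy, can be sketched in Python as follows. The helper names and the noisy sphere objective are assumptions for illustration; the ranking-and-selection procedures compared in the abstract (e.g. Kim and Nelson 2001) would replace the fixed replication count with a sequential sampling rule.

        import numpy as np

        def select_parents(offspring, noisy_eval, mu=15, replications=5, rng=None):
            """Truncation selection for a (mu, lambda)-ES under noisy evaluation.

            Each offspring is evaluated `replications` times and ranked by its mean
            observed fitness (minimization).  Averaging replications raises the
            proportion of correctly selected parents at the cost of extra
            simulation runs.
            """
            rng = rng or np.random.default_rng()
            means = []
            for x in offspring:
                samples = [noisy_eval(x, rng) for _ in range(replications)]
                means.append(np.mean(samples))
            order = np.argsort(means)            # ascending: best (lowest) first
            return [offspring[i] for i in order[:mu]]

        # Illustrative noisy objective (an assumption, not taken from the papers).
        def noisy_sphere(x, rng, sigma=1.0):
            return float(np.sum(np.asarray(x) ** 2) + rng.normal(0.0, sigma))

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            offspring = [rng.normal(size=5) for _ in range(100)]   # lambda = 100
            parents = select_parents(offspring, noisy_sphere, mu=15,
                                     replications=5, rng=rng)
            print(len(parents), "parents selected")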