
Benchmark generator for CEC 2009 competition on dynamic optimization

Abstract

Evolutionary algorithms (EAs) have been widely applied to solve stationary optimization problems. However, many real-world applications are actually dynamic. In order to study the performance of EAs in dynamic environments, one important task is to develop proper dynamic benchmark problems. Over the years, researchers have applied a number of dynamic test problems to compare the performance of EAs in dynamic environments, e.g., the "moving peaks" benchmark (MPB) proposed by Branke [1], the DF1 generator introduced by Morrison and De Jong [6], the single- and multi-objective dynamic test problem generator suggested by Jin and Sendhoff [2], which dynamically combines different objective functions of existing stationary multi-objective benchmark problems, Yang and Yao's exclusive-or (XOR) operator [10, 11, 12], and Kang's dynamic traveling salesman problem (DTSP) [3] and dynamic multi-knapsack problem (DKP). Although a number of DOP generators exist in the literature, there is so far no unified approach to constructing dynamic problems across the binary, real, and combinatorial spaces. This report uses the generalized dynamic benchmark generator (GDBG) proposed in [4], which constructs dynamic environments for all three solution spaces. Especially, in the rea
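To make the idea of turning a stationary problem into a dynamic benchmark concrete, the sketch below illustrates the XOR-operator approach mentioned above for the binary space: a stationary fitness function is evaluated on the candidate solution XORed with a mask that is partially flipped at every environmental change. This is only an illustrative sketch of the general idea, not the GDBG or the competition code; the function name `make_xor_dop` and the severity parameter `rho` are chosen here for illustration.

```python
import random

def make_xor_dop(f, n_bits, rho, seed=0):
    """Wrap a stationary binary fitness function f into a dynamic one
    using an XOR mask, in the spirit of the XOR DOP generator.
    rho is the change severity: the fraction of mask bits flipped
    at every environmental change."""
    rng = random.Random(seed)
    mask = [0] * n_bits  # all-zero mask: the initial environment equals the stationary problem

    def change():
        # Flip round(rho * n_bits) randomly chosen bits of the mask.
        for i in rng.sample(range(n_bits), round(rho * n_bits)):
            mask[i] ^= 1

    def dynamic_f(x):
        # Evaluate the stationary function on x XOR mask.
        shifted = [xi ^ mi for xi, mi in zip(x, mask)]
        return f(shifted)

    return dynamic_f, change

# Usage example: a dynamic OneMax problem under the XOR operator.
onemax = sum
dyn_f, change = make_xor_dop(onemax, n_bits=10, rho=0.3)
x = [1] * 10
print(dyn_f(x))  # 10 in the initial environment
change()         # environmental change: 3 of the 10 mask bits flip
print(dyn_f(x))  # 7 after the change
```

The severity parameter plays the same role as the change-severity settings in the other generators cited above: larger values of `rho` move the optimum further per change, so an algorithm's tracking ability can be studied under controlled degrees of environmental change.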