
    Differential evolution with two-level parameter adaptation

    The performance of differential evolution (DE) largely depends on its mutation strategy and control parameters. In this paper, we propose an adaptive DE (ADE) algorithm with a new mutation strategy, DE/lbest/1, and a two-level adaptive parameter control scheme. The DE/lbest/1 strategy is a variant of the greedy DE/best/1 strategy; however, the population is mutated under the guidance of multiple locally best individuals rather than the single globally best individual used in DE/best/1. This strategy helps balance fast convergence against population diversity. The two-level adaptive parameter control scheme is implemented in two steps. In the first step, the population-level parameters F_p and CR_p for the whole population are adaptively controlled according to the optimization state, namely the exploration state or the exploitation state, in each generation. These states are estimated by measuring the population distribution. Then, the individual-level parameters F_i and CR_i for each individual are generated by adjusting the population-level parameters, taking into account the individual's fitness value and its distance from the globally best individual. In this way, the parameters adapt not only to the overall state of the population but also to the characteristics of individual solutions. The performance of the proposed ADE is evaluated on a suite of benchmark functions. Experimental results show that ADE generally outperforms four state-of-the-art DE variants on different kinds of optimization problems. The effects of the ADE components, the properties of its parameters, its search behavior, and its parameter sensitivity are also studied. Finally, we investigate the capability of ADE for solving three real-world optimization problems.
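
    The abstract describes DE/lbest/1 only at a high level and does not give the exact neighborhood definition. The Python sketch below illustrates one plausible reading, assuming an index-ring neighborhood around each target vector and a minimization problem; the neighborhood size and the specific random-index handling are illustrative assumptions, not the paper's exact rules.

```python
import numpy as np

def de_lbest1_mutation(pop, fitness, F, n_neighbors=5, rng=None):
    """Sketch of a DE/lbest/1-style mutation: each target vector is mutated
    around the best individual in its local (index-ring) neighborhood instead
    of the single globally best individual used by DE/best/1.
    Assumptions: minimization, ring neighborhood of size n_neighbors."""
    rng = np.random.default_rng() if rng is None else rng
    NP, _ = pop.shape
    mutants = np.empty_like(pop)
    half = n_neighbors // 2
    for i in range(NP):
        neigh = [(i + k) % NP for k in range(-half, half + 1)]   # assumed neighborhood model
        lbest = neigh[int(np.argmin(fitness[neigh]))]            # locally best individual
        candidates = [j for j in range(NP) if j not in (i, lbest)]
        r1, r2 = rng.choice(candidates, size=2, replace=False)
        # DE/lbest/1: v_i = x_lbest + F * (x_r1 - x_r2)
        mutants[i] = pop[lbest] + F * (pop[r1] - pop[r2])
    return mutants
```

    An individual-level F_i could then be obtained, for instance, by scaling the population-level F_p according to the individual's fitness and its distance from the global best; the paper's concrete adjustment rule is not reproduced here.
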

    SQG-Differential Evolution for difficult optimization problems under a tight function evaluation budget

    In the context of industrial engineering, it is important to integrate efficient computational optimization methods into the product development process. Some of the most challenging simulation-based engineering design optimization problems are characterized by a large number of design variables, the absence of analytical gradients, highly non-linear objectives, and a limited function evaluation budget. Although a wide variety of optimization algorithms is available, the development and selection of efficient algorithms for problems with these industrially relevant characteristics remains a challenge. In this communication, a hybrid variant of Differential Evolution (DE) is introduced that combines aspects of Stochastic Quasi-Gradient (SQG) methods within the framework of DE in order to improve optimization efficiency on problems with the aforementioned characteristics. The performance of the resulting derivative-free algorithm is compared with other state-of-the-art DE variants on 25 commonly used benchmark functions under a tight function evaluation budget of 1000 evaluations. The experimental results indicate that the new algorithm performs excellently on the 'difficult' (high-dimensional, multi-modal, inseparable) test functions. The operations used in the proposed mutation scheme are computationally inexpensive and can be implemented in existing differential evolution variants, or other population-based optimization algorithms, with a few lines of program code as a non-invasive optional setting. Beyond the applicability of the presented algorithm by itself, the described concepts can serve as a useful addition to the algorithmic operators used in heuristic and evolutionary optimization frameworks.
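
    The abstract does not spell out the SQG-based mutation operator itself; the sketch below only illustrates the general flavour of biasing a DE/rand/1 mutant with a stochastic quasi-gradient estimate built from already-evaluated population members, under minimization. The pairing used for the finite-difference estimate and the weight beta are hypothetical choices, not the operators from the paper.

```python
import numpy as np

def sqg_biased_mutation(pop, fitness, F, beta=0.5, rng=None):
    """Hypothetical sketch: nudge a DE/rand/1 mutant along a stochastic
    quasi-gradient (descent) direction estimated from two population members.
    beta is an assumed weight; minimization is assumed."""
    rng = np.random.default_rng() if rng is None else rng
    NP, _ = pop.shape
    mutants = np.empty_like(pop)
    for i in range(NP):
        others = [j for j in range(NP) if j != i]
        r0, r1, r2, r3, r4 = rng.choice(others, size=5, replace=False)
        # crude finite-difference estimate between x_r3 and x_r4:
        # points with better (lower) fitness pull the step in their direction
        df = fitness[r4] - fitness[r3]
        diff = pop[r3] - pop[r4]
        g_desc = df * diff / (np.dot(diff, diff) + 1e-12)   # estimated descent direction
        # standard DE/rand/1 mutant plus a small quasi-gradient step
        mutants[i] = pop[r0] + F * (pop[r1] - pop[r2]) + beta * g_desc
    return mutants
```
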

    Differential evolution with an evolution path: a DEEP evolutionary algorithm

    Utilizing the cumulative correlation information already present in an evolutionary process, this paper proposes a predictive approach to the reproduction of new individuals in differential evolution (DE) algorithms. DE uses a distributed model (DM) to generate new individuals, which is relatively explorative, whilst evolution strategies (ES) use a centralized model (CM) to generate offspring, which through adaptation retains convergence momentum. This paper adopts a key feature of the CM of covariance matrix adaptation ES, the cumulatively learned evolution path (EP), to formulate a new evolutionary algorithm (EA) framework termed DEEP, standing for DE with an EP. Rather than mechanistically combining a CM-based and a DM-based algorithm, the DEEP framework offers the advantages of both a DM and a CM and hence substantially enhances performance. Under this architecture, a self-adaptation mechanism can be built into a DEEP algorithm, easing the task of predetermining algorithm control parameters. Two DEEP variants are developed and illustrated in the paper. Experiments on the CEC'13 test suite and two practical problems demonstrate that the DEEP algorithms offer promising results compared with the original DEs and other relevant state-of-the-art EAs.
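
    The abstract's exact path-accumulation and reproduction rules are not reproduced here; the sketch below only assumes a CMA-ES-style exponentially smoothed evolution path built from successive shifts of the population mean, injected into a DE/rand/1 mutant. The smoothing constant c and the path weight gamma are illustrative assumptions.

```python
import numpy as np

class EvolutionPath:
    """Cumulative evolution path in the CMA-ES spirit: an exponentially
    smoothed record of how the population mean moves between generations.
    c is an assumed smoothing constant, not the paper's setting."""
    def __init__(self, dim, c=0.2):
        self.c = c
        self.path = np.zeros(dim)
        self.prev_mean = None

    def update(self, pop):
        mean = pop.mean(axis=0)
        if self.prev_mean is not None:
            self.path = (1.0 - self.c) * self.path + self.c * (mean - self.prev_mean)
        self.prev_mean = mean
        return self.path

def deep_style_mutation(pop, F, path, gamma=0.5, rng=None):
    """DE/rand/1 mutant nudged along the accumulated evolution path;
    gamma is an assumed weighting for illustration only."""
    rng = np.random.default_rng() if rng is None else rng
    NP = len(pop)
    mutants = np.empty_like(pop)
    for i in range(NP):
        r1, r2, r3 = rng.choice([j for j in range(NP) if j != i], size=3, replace=False)
        mutants[i] = pop[r1] + F * (pop[r2] - pop[r3]) + gamma * path
    return mutants
```
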

    A MOS-based Dynamic Memetic Differential Evolution Algorithm for Continuous Optimization: A Scalability Test

    Continuous optimization is one of the most active areas in the field of heuristic optimization. Many algorithms have been proposed and compared on several function benchmarks, with performance varying across problems. For this reason, combining different search strategies seems desirable in order to obtain the best of each approach. This contribution explores the use of a hybrid memetic algorithm based on the Multiple Offspring Sampling (MOS) framework. The proposed algorithm combines the explorative/exploitative strengths of two heuristic search methods that separately obtain very competitive results. The algorithm has been tested on the benchmark problems and under the conditions defined for the special issue of the Soft Computing Journal on Scalability of Evolutionary Algorithms and other Metaheuristics for Large Scale Continuous Optimization Problems. The proposed algorithm obtained the best results compared with both its constituent algorithms and a set of reference algorithms proposed for the special issue.
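
    The abstract does not detail how the MOS framework shares the evaluation budget between the two constituent techniques; the snippet below sketches one generic participation-update rule in which each technique's share of the next generation's offspring is proportional to its recent number of improving offspring. The minimum share p_min and the proportional rule itself are assumptions for illustration.

```python
def update_participation(shares, improvements, p_min=0.05):
    """shares: dict mapping technique name -> current fraction of the offspring budget
    improvements: dict mapping technique name -> improving offspring produced this generation
    Returns the new normalized shares; p_min keeps every technique minimally active."""
    total = sum(improvements.values())
    if total == 0:
        return dict(shares)                                     # no signal: keep previous shares
    raw = {t: improvements[t] / total for t in shares}
    floored = {t: max(p, p_min) for t, p in raw.items()}        # never switch a technique off
    norm = sum(floored.values())
    return {t: p / norm for t, p in floored.items()}
```
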