1,241 research outputs found

    A multi-cycled sequential memetic computing approach for constrained optimisation

    In this paper, we propose a multi-cycled sequential memetic computing structure for constrained optimisation. The structure is composed of multiple evolutionary cycles. At each cycle, an evolutionary algorithm is treated as an operator and coupled with a local optimiser. This structure enables useful knowledge learned in previous cycles to be transferred to facilitate the search in later cycles. Specifically, we propose to apply an estimation of distribution algorithm (EDA) to explore the search space until convergence at each cycle. A local optimiser, DONLP2, is then applied to improve the best solution found by the EDA. A new cycle starts after the local improvement if the computation budget has not been exceeded. In the developed EDA, an adaptive fully-factorised multivariate probability model is proposed. A learning mechanism, implemented as a guided mutation operator, is adopted to learn useful knowledge from previous cycles. The developed algorithm was experimentally studied on the benchmark problems of the CEC 2006 and CEC 2010 competitions. The experimental studies show that the developed probability model exhibits excellent exploration capability and that the learning mechanism can significantly improve search efficiency under certain conditions. A comparison against several well-known algorithms shows the superiority of the developed algorithm in terms of both the number of fitness evaluations consumed and the solution quality.
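    The cycle structure described above lends itself to a compact sketch. The following is a minimal, hypothetical Python illustration, not the authors' implementation: a fully-factorised Gaussian EDA runs to rough convergence, scipy.optimize.minimize stands in for the DONLP2 local optimiser, and the next cycle's model is biased toward the incumbent solution in a guided-mutation fashion. Constraint handling and the adaptive probability model of the paper are omitted for brevity; all parameter values are illustrative.

```python
import numpy as np
from scipy.optimize import minimize  # stand-in local optimiser; the paper uses DONLP2


def memetic_cycles(f, bounds, budget=20000, pop_size=50, beta=0.3, seed=None):
    """Hypothetical multi-cycled memetic loop: each cycle runs a simple
    fully-factorised Gaussian EDA to rough convergence, refines the best
    sample with a local optimiser, and seeds the next cycle (guided-mutation
    style) with the refined incumbent."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T          # bounds as [(lo, hi), ...]
    dim = lo.size
    evals, best_x, best_f = 0, None, np.inf

    while evals < budget:
        # Start each cycle from a broad factorised Gaussian over the box.
        mean, std = (lo + hi) / 2.0, (hi - lo) / 4.0
        if best_x is not None:
            # Guided-mutation-style bias toward the incumbent solution.
            mean = beta * mean + (1.0 - beta) * best_x
        for _ in range(100):                             # run until crude convergence
            pop = np.clip(rng.normal(mean, std, (pop_size, dim)), lo, hi)
            fit = np.array([f(x) for x in pop])
            evals += pop_size
            elite = pop[np.argsort(fit)[: pop_size // 5]]
            mean, std = elite.mean(axis=0), elite.std(axis=0) + 1e-12
            if std.max() < 1e-8 or evals >= budget:
                break
        # Local refinement of the best solution found in this cycle.
        res = minimize(f, pop[np.argmin(fit)], bounds=list(zip(lo, hi)))
        evals += res.nfev
        if res.fun < best_f:
            best_x, best_f = res.x, res.fun
    return best_x, best_f
```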

    The SOS Platform: Designing, Tuning and Statistically Benchmarking Optimisation Algorithms

    We present Stochastic Optimisation Software (SOS), a Java platform facilitating the algorithmic design process and the evaluation of metaheuristic optimisation algorithms. SOS reduces the burden of coding miscellaneous methods for dealing with several bothersome and time-consuming tasks, such as parameter tuning, implementation of comparison algorithms and testbed problems, collecting and processing data to display results, and measuring algorithmic overhead. SOS provides numerous off-the-shelf methods, including: (1) customised implementations of statistical tests, such as the Wilcoxon rank-sum test and the Holm–Bonferroni procedure, for comparing the performances of optimisation algorithms and automatically generating result tables in PDF and LaTeX formats; (2) an original advanced statistical routine for accurately comparing pairs of stochastic optimisation algorithms; (3) a novel testbed suite for continuous optimisation, derived from the IEEE CEC 2014 benchmark, allowing for controlled activation of the rotation on each testbed function. Moreover, we briefly comment on the current state of the literature in stochastic optimisation and highlight similarities shared by modern metaheuristics inspired by nature. We argue that the vast majority of these algorithms are simply reformulations of the same methods and that metaheuristics for optimisation should be treated as stochastic processes, with less emphasis on the inspiring metaphor behind them.
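    As a rough illustration of the kind of statistical comparison SOS automates (SOS itself is a Java platform; the snippet below is an independent Python sketch using scipy, not the SOS API): end-of-run results of each algorithm are compared against a reference algorithm with the Wilcoxon rank-sum test, and the resulting p-values are adjusted with the Holm–Bonferroni step-down procedure. Algorithm names and the sample data are made up.

```python
import numpy as np
from scipy.stats import ranksums


def compare_to_reference(results, reference, alpha=0.05):
    """Wilcoxon rank-sum tests of each algorithm against a reference,
    with Holm-Bonferroni step-down correction of the p-values."""
    ref = np.asarray(results[reference])
    names = [n for n in results if n != reference]
    pvals = [ranksums(ref, np.asarray(results[n])).pvalue for n in names]

    # Holm-Bonferroni: sort p-values ascending; the i-th smallest is compared
    # to alpha / (m - i). Once one test fails, all larger p-values also fail.
    order = np.argsort(pvals)
    m, decisions, stop = len(pvals), {}, False
    for rank, idx in enumerate(order):
        if stop or pvals[idx] > alpha / (m - rank):
            stop = True
            decisions[names[idx]] = ("not significant", pvals[idx])
        else:
            decisions[names[idx]] = ("significant", pvals[idx])
    return decisions


# Usage with made-up data: 30 independent runs per algorithm (minimisation).
rng = np.random.default_rng(0)
runs = {"A1": rng.normal(1.00, 0.1, 30),
        "A2": rng.normal(1.20, 0.1, 30),
        "A3": rng.normal(1.01, 0.1, 30)}
print(compare_to_reference(runs, reference="A1"))
```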

    A MOS-based Dynamic Memetic Differential Evolution Algorithm for Continuous Optimization: A Scalability Test

    Continuous optimization is one of the most active areas in the field of heuristic optimization. Many algorithms have been proposed and compared on several function benchmarks, with performance varying across problems. For this reason, combining different search strategies seems desirable in order to obtain the best of each approach. This contribution explores the use of a hybrid memetic algorithm based on the Multiple Offspring Sampling (MOS) framework. The proposed algorithm combines the explorative/exploitative strengths of two heuristic search methods that separately obtain very competitive results. The algorithm has been tested on the benchmark problems and under the conditions defined for the special issue of the Soft Computing journal on the Scalability of Evolutionary Algorithms and other Metaheuristics for Large Scale Continuous Optimization Problems. The proposed algorithm obtained the best results compared with both its component algorithms and a set of reference algorithms proposed for the special issue.
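    A hedged sketch of one MOS-style dynamic hybrid step, assuming two component techniques (a DE/rand/1 mutation and a Gaussian perturbation as a stand-in local search) whose participation shares are reallocated according to each technique's recent success rate. All names, operators, and parameter values below are illustrative and not taken from the paper.

```python
import numpy as np


def mos_step(pop, fit, f, bounds, shares, rng, min_share=0.1):
    """One hybrid step: each technique produces offspring in proportion to its
    current participation share; shares are then updated from the fraction of
    improvements each technique achieved (a simple dynamic-allocation rule)."""
    lo, hi = bounds
    n, dim = pop.shape
    counts = np.maximum((shares * n).astype(int), 1)
    improvements = np.zeros(2)

    for tech in (0, 1):
        for i in rng.choice(n, counts[tech], replace=False):
            if tech == 0:   # DE/rand/1 mutation
                a, b, c = pop[rng.choice(n, 3, replace=False)]
                child = np.clip(a + 0.5 * (b - c), lo, hi)
            else:           # stand-in local search: small Gaussian perturbation
                child = np.clip(pop[i] + rng.normal(0, 0.05 * (hi - lo), dim), lo, hi)
            cf = f(child)
            if cf < fit[i]:                      # greedy replacement (minimisation)
                pop[i], fit[i] = child, cf
                improvements[tech] += 1

    # Dynamic participation: shift shares toward the more successful technique,
    # while keeping a minimum share so neither technique is switched off.
    quality = improvements / counts
    if quality.sum() > 0:
        shares = np.maximum(quality / quality.sum(), min_share)
        shares /= shares.sum()
    return pop, fit, shares
```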