
    Benchmarking the Local Metamodel CMA-ES on the Noiseless BBOB'2013 Test Bed

    This paper evaluates the performance of a variant of the local meta-model CMA-ES (lmm-CMA) in the BBOB 2013 expensive setting. The lmm-CMA is a surrogate variant of the CMA-ES algorithm. Function evaluations are saved by building, with weighted regression, full quadratic meta-models to estimate the candidate solutions' function values. The quality of the approximation is appraised by checking how much the predicted ranking changes when a fraction of the candidate solutions is evaluated on the original objective function. The results are compared with the CMA-ES without meta-modeling and with previously benchmarked algorithms, namely BFGS, NEWUOA and saACM. The additional meta-modeling improves the performance of CMA-ES on almost all BBOB functions, giving significantly worse results only on the attractive sector function. Over all functions, the performance is comparable with saACM, and lmm-CMA often outperforms NEWUOA and BFGS starting from about 2D^2 function evaluations, where D is the search space dimension.
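
    The approximate-ranking test described above can be sketched in a few lines of Python. This is a minimal illustration, not the paper's implementation: the helper names, the uniform weights, and the fixed re-evaluation fraction are assumptions (lmm-CMA uses distance-based weights and adapts the fraction).

    import numpy as np

    def quadratic_features(X):
        # Full quadratic basis: 1, x_i, and x_i * x_j for i <= j.
        n, d = X.shape
        cols = [np.ones(n)] + [X[:, i] for i in range(d)]
        cols += [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
        return np.column_stack(cols)

    def fit_metamodel(X, y, w):
        # Weighted least-squares fit of a full quadratic meta-model.
        sw = np.sqrt(w)
        coef, *_ = np.linalg.lstsq(quadratic_features(X) * sw[:, None],
                                   y * sw, rcond=None)
        return lambda Z: quadratic_features(Z) @ coef

    def ranking_accepted(model, f, candidates, X_arch, y_arch, frac=0.25):
        # Evaluate a fraction of the best-predicted candidates exactly,
        # refit, and accept the model if their predicted ranking is stable.
        order = np.argsort(model(candidates))
        k = max(1, int(frac * len(candidates)))
        X_new = candidates[order[:k]]
        y_new = np.array([f(x) for x in X_new])
        X = np.vstack([X_arch, X_new])
        y = np.concatenate([y_arch, y_new])
        refit = fit_metamodel(X, y, np.ones(len(y)))  # uniform weights here
        return np.array_equal(order[:k], np.argsort(refit(candidates))[:k])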

    Self-Adaptive Surrogate-Assisted Covariance Matrix Adaptation Evolution Strategy

    This paper presents a novel mechanism to adapt surrogate-assisted population-based algorithms. The mechanism is applied to ACM-ES, a recently proposed surrogate-assisted variant of CMA-ES. The resulting algorithm, saACM-ES, adjusts online the lifelength of the current surrogate model (the number of CMA-ES generations before a new surrogate is learned) and the surrogate hyper-parameters. Both heuristics significantly improve the quality of the surrogate model, yielding a significant speed-up of saACM-ES compared to the ACM-ES and CMA-ES baselines. The empirical validation of saACM-ES on the BBOB-2012 noiseless testbed demonstrates the efficiency of the proposed approach and its scalability with respect to the problem dimension and the population size, reaching new best results on some of the benchmark problems. (Genetic and Evolutionary Computation Conference, GECCO 2012.)
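
    The lifelength-adaptation idea can be illustrated roughly as follows. This is a hedged sketch under simplifying assumptions: the control law mapping rank agreement to a lifelength and the cap n_max are invented here for illustration; the actual saACM-ES update differs in detail.

    import numpy as np
    from scipy.stats import kendalltau

    def adapt_lifelength(surrogate, f_true, candidates, n_max=20):
        # Measure how well the surrogate ranking agrees with the true one
        # on the current population (Kendall's tau in [-1, 1]).
        y_true = np.array([f_true(x) for x in candidates])
        tau, _ = kendalltau(surrogate(candidates), y_true)
        err = 0.5 * (1.0 - tau)                  # agreement -> error in [0, 1]
        n_hat = int(round((1.0 - err) * n_max))  # trust a good surrogate longer
        return max(0, n_hat)                     # 0 means: relearn immediately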

    Towards Dynamic Algorithm Selection for Numerical Black-Box Optimization: Investigating BBOB as a Use Case

    One of the most challenging problems in evolutionary computation is to select from its family of diverse solvers one that performs well on a given problem. This algorithm selection problem is complicated by the fact that different phases of the optimization process require different search behavior. While this can partly be controlled by the algorithm itself, there remain large performance differences between algorithms. It can therefore be beneficial to swap the configuration or even the entire algorithm during the run. Long deemed impractical, recent advances in machine learning and in exploratory landscape analysis give hope that this dynamic algorithm configuration (dynAC) can eventually be solved by automatically trained configuration schedules. With this work we aim at promoting research on dynAC by introducing a simpler variant that focuses only on switching between different algorithms, not configurations. Using the rich data from the Black Box Optimization Benchmark (BBOB) platform, we show that even single-switch dynamic algorithm selection (dynAS) can potentially result in significant performance gains. We also discuss key challenges in dynAS and argue that the BBOB framework can become a useful tool in overcoming these.
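
    As a toy illustration of a single switch, the sketch below spends half the budget on global exploration and hands the incumbent to a local solver. The two component algorithms (random search, Nelder-Mead), the 50% switch point, and the budget are placeholder choices, not the paper's setup.

    import numpy as np
    from scipy.optimize import minimize

    def single_switch(f, bounds, total_budget=2000, switch_frac=0.5):
        lo, hi = np.array(bounds, dtype=float).T
        n1 = int(switch_frac * total_budget)
        # Phase 1: global exploration (here, plain random search).
        X = lo + (hi - lo) * np.random.rand(n1, len(lo))
        y = np.apply_along_axis(f, 1, X)
        x0 = X[np.argmin(y)]
        # Phase 2: switch to a local solver, warm-started at the incumbent.
        res = minimize(f, x0, method="Nelder-Mead",
                       options={"maxfev": total_budget - n1})
        return res.x, res.fun

    # Example: single_switch(lambda x: np.sum(x**2), [(-5, 5)] * 10)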

    Maximum Likelihood-based Online Adaptation of Hyper-parameters in CMA-ES

    The Covariance Matrix Adaptation Evolution Strategy (CMA-ES) is widely accepted as a robust derivative-free continuous optimization algorithm for non-linear and non-convex optimization problems. CMA-ES is well known to be almost parameterless, meaning that only one hyper-parameter, the population size, is proposed to be tuned by the user. In this paper, we propose a principled approach called self-CMA-ES to achieve the online adaptation of CMA-ES hyper-parameters in order to improve its overall performance. Experimental results show that for larger-than-default population sizes, the default hyper-parameter settings of CMA-ES are far from optimal, and that self-CMA-ES allows the algorithm to dynamically approach optimal settings. (13th International Conference on Parallel Problem Solving from Nature, PPSN 2014.)
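
    A rough sketch of the maximum-likelihood criterion: a candidate hyper-parameter value is scored by replaying a covariance update and measuring the log-likelihood that the resulting sampling distribution assigns to the best offspring of the last generation. The rank-mu replay below is a heavily simplified toy stand-in for the actual CMA-ES update, and all names are illustrative.

    import numpy as np

    def gaussian_loglik(points, mean, cov):
        # Log-density of each row of `points` under N(mean, cov).
        d = mean.size
        diff = points - mean
        maha = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(cov), diff)
        _, logdet = np.linalg.slogdet(cov)
        return -0.5 * (maha + logdet + d * np.log(2.0 * np.pi))

    def score_c_mu(c_mu, prev_mean, prev_cov, steps, best_offspring):
        # Toy rank-mu replay: blend the old covariance with the empirical
        # covariance of the selected steps, weighted by the candidate c_mu.
        rank_mu = steps.T @ steps / len(steps)
        cov = (1.0 - c_mu) * prev_cov + c_mu * rank_mu
        # Higher is better: the setting that makes the observed good
        # offspring most likely wins the hyper-parameter comparison.
        return gaussian_loglik(best_offspring, prev_mean, cov).sum()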

    Black-box optimization benchmarking of IPOP-saACM-ES and BIPOP-saACM-ES on the BBOB-2012 noiseless testbed

    In this paper, we study the performance of IPOP-saACM-ES and BIPOP-saACM-ES, recently proposed self-adaptive surrogate-assisted Covariance Matrix Adaptation Evolution Strategies. Both algorithms were tested using restarts until a total number of function evaluations of 10^6 D was reached, where D is the dimension of the function search space. We compared the surrogate-assisted algorithms with their surrogate-less versions, IPOP-aCMA-ES and BIPOP-CMA-ES, two of the best-performing algorithms observed during BBOB-2009 and BBOB-2010. The comparison shows that the surrogate-assisted versions outperform the original CMA-ES algorithms by a factor of 2 to 4 on 8 out of 24 noiseless benchmark problems, showing the best results among all algorithms of BBOB-2009 and BBOB-2010 on the Ellipsoid, Discus, Bent Cigar, Sharp Ridge and Sum of Different Powers functions.
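
    The restart protocol can be sketched as follows. Here run_cmaes is a placeholder for any optimizer that returns its best value, best point and the evaluations it used; the IPOP-style doubling of the population size at each restart is one of the restart rules the benchmarked variants employ, shown here in simplified form.

    import numpy as np

    def ipop_restarts(run_cmaes, dim):
        budget = int(1e6) * dim                      # 10^6 * D evaluations
        popsize = 4 + int(3 * np.log(dim))           # CMA-ES default lambda
        used, best = 0, (np.inf, None)
        while used < budget:
            x0 = np.random.uniform(-5, 5, dim)       # BBOB search domain
            fbest, xbest, evals = run_cmaes(x0, popsize, budget - used)
            used += evals
            best = min(best, (fbest, xbest), key=lambda t: t[0])
            popsize *= 2                             # IPOP: double at restart
        return best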