Online Selection of CMA-ES Variants
In the field of evolutionary computation, one of the most challenging topics
is algorithm selection. Knowing which heuristics to use for which optimization
problem is key to obtaining high-quality solutions. We aim to extend this
research topic by taking a first step towards a selection method for adaptive
CMA-ES algorithms. We build upon the theoretical work done by van Rijn
\textit{et al.} [PPSN'18], in which the potential of switching between
different CMA-ES variants was quantified in the context of a modular CMA-ES
framework.
We demonstrate in this work that their proposed approach is not very
reliable, in that implementing the suggested adaptive configurations does not
yield the predicted performance gains. We propose a revised approach, which
results in a more robust fit between predicted and actual performance. The
adaptive CMA-ES approach obtains performance gains on 18 out of 24 tested
functions of the BBOB benchmark, with stable advantages of up to 23\%. An
analysis of module activation indicates which modules are most crucial for the
different phases of optimizing each of the 24 benchmark problems. The module
activation also suggests that additional gains are possible when including the
(B)IPOP modules, which we have excluded from the present work.
Comment: To appear at the Genetic and Evolutionary Computation Conference
(GECCO'19). Appendix will be added in due time.
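To make the switching idea concrete, the following Python sketch runs a simplified evolution strategy that changes its selection variant (non-elitist to elitist) at a fixed point in the budget. The toy variants, the switch point, and all parameter values are illustrative assumptions; the paper itself switches between configurations of the modular CMA-ES framework, not between these toy rules.

# Minimal sketch of online switching between two ES "variants" mid-run.
# The variants (non-elitist vs. elitist selection) and the switch point
# are illustrative stand-ins, not the paper's modular CMA-ES setup.
import numpy as np

def sphere(x):
    return float(np.dot(x, x))

def es_run(f, dim=10, budget=2000, switch_at=1000, seed=0):
    rng = np.random.default_rng(seed)
    mean = rng.standard_normal(dim)
    sigma = 1.0
    lam, mu = 12, 6
    best_x, best_f = mean.copy(), f(mean)
    evals = 1
    while evals < budget:
        elitist = evals >= switch_at               # switch variant mid-run
        pop = mean + sigma * rng.standard_normal((lam, dim))
        fits = np.array([f(x) for x in pop])
        evals += lam
        order = np.argsort(fits)
        if fits[order[0]] < best_f:
            best_x, best_f = pop[order[0]].copy(), fits[order[0]]
        parents = pop[order[:mu]]
        if elitist:                                # variant B: re-inject best-so-far
            parents[-1] = best_x
        new_mean = parents.mean(axis=0)
        # crude step-size control: expand on improvement, shrink otherwise
        sigma *= 1.2 if f(new_mean) < f(mean) else 0.85
        evals += 2
        mean = new_mean
    return best_f

print(es_run(sphere))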
An Asynchronous Implementation of the Limited Memory CMA-ES
We present our asynchronous implementation of the LM-CMA-ES algorithm, which
is a modern evolution strategy for solving complex large-scale continuous
optimization problems. Our implementation performs best when the number of
available cores is relatively high and the fitness function is
computationally expensive. Experiments with benchmark functions show that
the asynchronous version overtakes the original algorithm on the Sphere
function, reaches certain target thresholds faster on the Rosenbrock and
Ellipsoid functions, and, surprisingly, performs much better than the
original version on the Rastrigin function.
Comment: 9 pages, 4 figures, 4 tables; this is a full version of a paper
which has been accepted as a poster at the IEEE ICMLA conference 201
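The core idea of asynchronous evaluation can be sketched as follows: rather than waiting for a full generation of candidates (the synchronous scheme), each worker result is consumed as soon as it arrives and a replacement candidate is dispatched immediately. This is a minimal illustration of the general scheme under stated assumptions, not the authors' LM-CMA-ES code; slow_sphere and the simple mean update are hypothetical stand-ins.

# Asynchronous fitness evaluation sketch: results are processed as they
# complete (FIRST_COMPLETED) and every finished worker is immediately
# given a new candidate, so no core idles between generations.
import concurrent.futures as cf
import time
import numpy as np

def slow_sphere(x):
    time.sleep(0.01)                 # stand-in for an expensive fitness function
    return float(np.dot(x, x))

def async_es(dim=5, budget=200, workers=4, seed=0):
    rng = np.random.default_rng(seed)
    mean, sigma = rng.standard_normal(dim), 1.0
    best = (np.inf, None)
    with cf.ThreadPoolExecutor(max_workers=workers) as pool:
        def submit():
            x = mean + sigma * rng.standard_normal(dim)
            return pool.submit(lambda c: (slow_sphere(c), c), x)
        pending = {submit() for _ in range(workers)}
        done_evals = 0
        while done_evals < budget:
            ready, pending = cf.wait(pending, return_when=cf.FIRST_COMPLETED)
            for fut in ready:
                fval, x = fut.result()
                done_evals += 1
                if fval < best[0]:
                    best = (fval, x)
                    mean = 0.9 * mean + 0.1 * x   # drift toward improvements
                pending.add(submit())             # keep all workers busy
    return best[0]

print(async_es())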
Landscape-Aware Fixed-Budget Performance Regression and Algorithm Selection for Modular CMA-ES Variants
Automated algorithm selection promises to support the user in the decisive
task of selecting a most suitable algorithm for a given problem. A common
component of these machine-trained techniques are regression models which
predict the performance of a given algorithm on a previously unseen problem
instance. In the context of numerical black-box optimization, such regression
models typically build on exploratory landscape analysis (ELA), which
quantifies several characteristics of the problem. These measures can be used
to train a supervised performance regression model.
First steps towards ELA-based performance regression have been made in the
context of a fixed-target setting. In many applications, however, the user
needs to select an algorithm that performs best within a given budget of
function evaluations. Adopting this fixed-budget setting, we demonstrate that
it is possible to achieve high-quality performance predictions with
off-the-shelf supervised learning approaches, by suitably combining two
differently trained regression models. We test this approach on a very
challenging problem: algorithm selection on a portfolio of very similar
algorithms, which we choose from the family of modular CMA-ES algorithms.
Comment: To appear in Proc. of the Genetic and Evolutionary Computation
Conference (GECCO'20).
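One plausible instantiation of combining two differently trained regressors is sketched below: both are fit on the same ELA feature vectors, one on raw target precision and one on log-precision, with the log-scale model used where the raw prediction loses resolution. The model choice (random forests), the synthetic data, and the combination rule are illustrative assumptions, not the paper's exact setup.

# Hedged sketch of fixed-budget performance regression with two models:
# a raw-precision regressor and a log-precision regressor, combined by a
# simple threshold rule. Features X stand in for ELA feature vectors.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.random((300, 10))                    # stand-in ELA feature vectors
y = 10 ** rng.uniform(-8, 2, size=300)      # reached precisions, spanning decades

raw_model = RandomForestRegressor(random_state=0).fit(X, y)
log_model = RandomForestRegressor(random_state=0).fit(X, np.log10(y))

def predict_precision(x, threshold=1e-2):
    x = np.asarray(x).reshape(1, -1)
    raw = raw_model.predict(x)[0]
    # fall back to the log-scale model for tiny predictions, where
    # relative rather than absolute errors matter
    return raw if raw > threshold else 10 ** log_model.predict(x)[0]

print(predict_precision(rng.random(10)))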
Sequential vs. Integrated Algorithm Selection and Configuration: A Case Study for the Modular CMA-ES
When faced with a specific optimization problem, choosing which algorithm to
use is always a tough task. Not only is there a vast variety of algorithms to
select from, but these algorithms are often controlled by many
hyperparameters, which need to be tuned to achieve the best possible
performance.
Usually, this problem is separated into two parts: algorithm selection and
algorithm configuration. With the significant advances made in Machine
Learning, however, these problems can be integrated into a combined algorithm
selection and hyperparameter optimization task, commonly known as the CASH
problem. In this work we compare sequential and integrated algorithm selection
and configuration approaches for the case of selecting and tuning the best out
of 4608 variants of the Covariance Matrix Adaptation Evolution Strategy
(CMA-ES) tested on the Black Box Optimization Benchmark (BBOB) suite. We first
show that the ranking of the modular CMA-ES variants depends to a large extent
on the quality of the hyperparameters. This implies that even a sequential
approach based on complete enumeration of the algorithm space will likely
result in sub-optimal solutions. In fact, we show that the integrated approach
manages to provide competitive results at a much smaller computational cost. We
also compare two different mixed-integer algorithm configuration techniques,
called irace and Mixed-Integer Parallel Efficient Global Optimization
(MIP-EGO). While we show that the two methods differ significantly in their
treatment of the exploration-exploitation balance, their overall performances
are very similar.
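The CASH formulation itself is easy to illustrate: the algorithm choice is treated as one more (categorical) hyperparameter, so a single search runs over the joint space of algorithms and their parameters. The sketch below uses plain random search over two hypothetical algorithms as a stand-in for the configurators compared in the paper (irace and MIP-EGO).

# Toy CASH illustration: one search over the joint (algorithm, step) space.
# The two candidate "algorithms" and their cost landscapes are hypothetical.
import numpy as np

rng = np.random.default_rng(1)

def algo_a(step):                 # hypothetical algorithm A
    return abs(step - 0.3) + 0.10

def algo_b(step):                 # hypothetical algorithm B: better if well tuned
    return abs(step - 0.7) + 0.05

SPACE = {"algorithm": ["A", "B"], "step": (0.0, 1.0)}

def sample_config():
    return {"algorithm": rng.choice(SPACE["algorithm"]),
            "step": rng.uniform(*SPACE["step"])}

def evaluate(cfg):
    f = algo_a if cfg["algorithm"] == "A" else algo_b
    return f(cfg["step"])

# random search stands in for irace / MIP-EGO over the combined space
best = min((sample_config() for _ in range(200)), key=evaluate)
print(best, evaluate(best))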