When faced with a specific optimization problem, choosing which algorithm to
use is a challenging task. Not only is there a vast variety of algorithms to
select from, but these algorithms are often controlled by many hyperparameters,
which need to be tuned in order to achieve the best possible performance.
Usually, this problem is separated into two parts: algorithm selection and
algorithm configuration. With the significant advances made in Machine
Learning, however, these two tasks can be integrated into a combined algorithm
selection and hyperparameter optimization task, commonly known as the CASH
problem. In this work we compare sequential and integrated algorithm selection
and configuration approaches for the case of selecting and tuning the best out
of 4608 variants of the Covariance Matrix Adaptation Evolution Strategy
(CMA-ES) tested on the Black Box Optimization Benchmark (BBOB) suite. We first
show that the ranking of the modular CMA-ES variants depends to a large extent
on the quality of the hyperparameters. This implies that even a sequential
approach based on complete enumeration of the algorithm space will likely
result in sub-optimal solutions. In fact, we show that the integrated approach
manages to provide competitive results at a much smaller computational cost. We
also compare two different mixed-integer algorithm configuration techniques,
called irace and Mixed-Integer Parallel Efficient Global Optimization
(MIP-EGO). While we show that the two methods differ significantly in their
treatment of the exploration-exploitation balance, their overall performances
are very similar.
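
To make the notion of a combined search space concrete, the following is a minimal, self-contained sketch of a CASH-style search: a categorical algorithm choice is optimized jointly with that algorithm's own hyperparameters. The two toy solvers, their hyperparameters, and the random-search loop are all illustrative assumptions, not the modular CMA-ES variants or the irace/MIP-EGO configurators studied in this work.

```python
import random

def sphere(x):
    """Toy objective: sum of squares, minimum 0 at the origin."""
    return sum(v * v for v in x)

def solver_a(step_size, budget, dim=5):
    """Hypothetical solver A: simple (1+1)-style local search."""
    best = [random.uniform(-5, 5) for _ in range(dim)]
    for _ in range(budget):
        cand = [v + random.gauss(0, step_size) for v in best]
        if sphere(cand) < sphere(best):
            best = cand
    return sphere(best)

def solver_b(pop_size, budget, dim=5):
    """Hypothetical solver B: pure random sampling in batches."""
    best = float("inf")
    for _ in range(budget // pop_size):
        pop = [[random.uniform(-5, 5) for _ in range(dim)]
               for _ in range(pop_size)]
        best = min(best, min(sphere(p) for p in pop))
    return best

# Joint (mixed-integer) search space: the algorithm choice is categorical,
# and each choice activates its own hyperparameter(s).
SPACE = {
    "A": {"step_size": [0.1, 0.5, 1.0]},
    "B": {"pop_size": [5, 10, 20]},
}

def random_cash_search(iterations=30, budget=200):
    """Sample (algorithm, hyperparameters) pairs jointly; keep the best."""
    best_cfg, best_val = None, float("inf")
    for _ in range(iterations):
        algo = random.choice(list(SPACE))
        if algo == "A":
            cfg = {"step_size": random.choice(SPACE["A"]["step_size"])}
            val = solver_a(cfg["step_size"], budget)
        else:
            cfg = {"pop_size": random.choice(SPACE["B"]["pop_size"])}
            val = solver_b(cfg["pop_size"], budget)
        if val < best_val:
            best_cfg, best_val = (algo, cfg), val
    return best_cfg, best_val

cfg, val = random_cash_search()
print(cfg, val)
```

A sequential approach would instead fix the algorithm first (e.g., run each solver with default hyperparameters, pick the winner) and only then tune its hyperparameters; the integrated search above explores both dimensions at once, which is the setting compared in this work.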