We apply a state-of-the-art, local derivative-free solver, Py-BOBYQA, to
global optimization problems, and propose an algorithmic improvement that is
beneficial in this context. Our numerical findings are illustrated on a
commonly used but small-scale test set of global optimization problems and
associated noisy variants, and on hyperparameter tuning for the machine
learning test set MNIST. As Py-BOBYQA is a model-based trust-region method, we
compare mostly (but not exclusively) with other global optimization methods for
which (global) models are important, such as Bayesian optimization and response
surface methods; we also consider representative state-of-the-art deterministic
and stochastic codes, such as DIRECT and CMA-ES. We find numerically that, as a
heuristic for escaping local minima, Py-BOBYQA is competitive with global
optimization solvers across all accuracy/budget regimes, in both smooth and noisy
settings. In particular, Py-BOBYQA variants perform best on smooth and
multiplicative-noise problems in high-accuracy regimes. As a by-product, some
preliminary conclusions can be drawn on the relative performance of the global
solvers we have tested with default settings.
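For context, below is a minimal sketch of how such a run might be set up through the
pybobyqa package, based on its documented solve interface; the seek_global_minimum
flag is the package's switch for the restart heuristic, and the objective shown is an
illustrative multimodal test function chosen here, not one of the paper's benchmark
problems.

```python
# Minimal sketch: invoking Py-BOBYQA's global-optimization (restart) mode.
# Assumes the released pybobyqa package; seek_global_minimum enables the
# multi-restart heuristic for escaping local minima.
import numpy as np
import pybobyqa

def objfun(x):
    # Illustrative multimodal objective (Rastrigin-style); chosen for this
    # sketch, not taken from the paper's test set.
    return 10.0 * len(x) + np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x))

x0 = np.array([2.5, -1.5])        # starting point
lower = np.full(2, -5.12)         # bound constraints
upper = np.full(2, 5.12)

soln = pybobyqa.solve(objfun, x0, bounds=(lower, upper),
                      maxfun=500,                # evaluation budget
                      seek_global_minimum=True)  # restart heuristic on
print(soln)  # reports the best point found, its objective value, and exit flag
```

With seek_global_minimum=False the same call reduces to a single local trust-region
run, which is the baseline the paper's comparisons start from.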