A dimensionality reduction technique for unconstrained global optimization of functions with low effective dimensionality
We investigate the unconstrained global optimization of functions with low
effective dimensionality, which are constant along certain (unknown) linear
subspaces. Extending the technique of random subspace embeddings in [Wang et
al., Bayesian optimization in a billion dimensions via random embeddings. JAIR,
55(1): 361--387, 2016], we study a generic Random Embeddings for Global
Optimization (REGO) framework that is compatible with any global minimization
algorithm. Instead of the original, potentially large-scale optimization
problem, within REGO, a Gaussian random, low-dimensional problem with bound
constraints is formulated and solved in a reduced space. We provide novel
probabilistic bounds for the success of REGO in solving the original, low
effective-dimensionality problem, which show its independence of the
(potentially large) ambient dimension and its precise dependence on the
dimensions of the effective and random embedding subspaces. These results
significantly improve existing theoretical analyses by providing the exact
distribution of a reduced minimizer and its Euclidean norm, and by requiring
only general assumptions on the problem. We validate our theoretical findings by
extensive numerical testing of REGO with three types of global optimization
solvers, illustrating the improved scalability of REGO compared to the
full-dimensional application of the respective solvers.

Comment: 32 pages, 10 figures, submitted to Information and Inference: A Journal of the IMA; also submitted to the optimization-online repository
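As a concrete illustration of the framework described in this abstract, here is a minimal Python sketch of the REGO idea: draw a Gaussian random embedding matrix, minimize the composed objective over a low-dimensional box with an off-the-shelf global solver, and map the reduced minimizer back to the ambient space. The toy objective, the bound half-width delta, and the choice of SciPy's differential_evolution as the inner solver are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from scipy.optimize import differential_evolution  # stands in for any global solver

D, d = 1000, 5        # ambient dimension and embedding-subspace dimension
rng = np.random.default_rng(0)
A = rng.standard_normal((D, d))   # Gaussian random embedding matrix

def f_high(x):
    # Toy objective with effective dimensionality 2: it is constant along
    # every direction orthogonal to the first two coordinates.
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

def reduced_objective(y):
    # The reduced problem: minimize f(A y) over the low-dimensional variable y.
    return f_high(A @ y)

delta = 5.0                        # half-width of the bound constraints
bounds = [(-delta, delta)] * d
res = differential_evolution(reduced_objective, bounds, seed=0)
x_star = A @ res.x                 # reduced minimizer mapped back to R^D
print(res.fun, x_star[:2])
```

Note that the inner solver only ever sees a d-dimensional box, so its cost is independent of the ambient dimension D, which is the point of the framework.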
BOCK: Bayesian Optimization with Cylindrical Kernels
A major challenge in Bayesian Optimization is the boundary issue (Swersky,
2017), where an algorithm spends too many evaluations near the boundary of its
search space. In this paper, we propose BOCK, Bayesian Optimization with
Cylindrical Kernels, whose basic idea is to transform the ball geometry of the
search space using a cylindrical transformation. Because of the transformed
geometry, the Gaussian Process-based surrogate model spends less budget
searching near the boundary, while concentrating its efforts relatively more
near the center of the search region, where we expect the solution to be
located. We evaluate BOCK extensively, showing that it is not only more
accurate and efficient, but it also scales successfully to problems with a
dimensionality as high as 500. We show that the improved accuracy and scalability
of BOCK even allow optimizing modestly sized neural network layers, as well as
neural network hyperparameters.

Comment: 10 pages, 5 figures, 5 tables, 1 algorithm
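The core device described in this abstract is to represent a point of the ball-shaped search space by its radius and direction and to define the Gaussian-process kernel on that cylinder, so that all directions at a given radius are treated symmetrically. The sketch below shows one way such a cylindrical kernel can be assembled; the specific radial and angular factors (an RBF kernel on radii and a simple dot-product kernel on the sphere) are illustrative stand-ins, not the paper's exact choices.

```python
import numpy as np

def to_cylinder(x, eps=1e-12):
    # Cylindrical representation of a point in the ball centered at the
    # origin: radius r = ||x|| and direction a = x / ||x|| on the unit sphere.
    r = np.linalg.norm(x)
    return r, x / max(r, eps)

def cylindrical_kernel(x1, x2, lengthscale=0.5):
    # Product of a radial kernel and an angular kernel. Because the angular
    # factor depends only on directions, points at the same radius are
    # treated symmetrically, which discourages the surrogate from spending
    # its evaluation budget near the boundary of the search space.
    r1, a1 = to_cylinder(x1)
    r2, a2 = to_cylinder(x2)
    k_radial = np.exp(-((r1 - r2) ** 2) / (2.0 * lengthscale ** 2))  # RBF on radii
    k_angular = 0.5 * (1.0 + a1 @ a2)  # positive-definite kernel on the sphere
    return k_radial * k_angular

# Example: kernel value between two points in the 2-dimensional unit ball.
print(cylindrical_kernel(np.array([0.3, 0.4]), np.array([-0.4, 0.3])))
```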