5,280 research outputs found
A warped kernel improving robustness in Bayesian optimization via random embeddings
This work extends the Random Embedding Bayesian Optimization approach by
integrating a warping of the high-dimensional subspace within the covariance
kernel. The proposed warping, which relies on elementary geometric
considerations, mitigates the drawbacks of the high extrinsic dimensionality
while preventing the algorithm from evaluating points that yield redundant
information. It also alleviates constraints on bound selection for the embedded
domain, thus improving robustness, as illustrated on a test case with 25
variables and intrinsic dimension 6.
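The random-embedding idea behind this line of work can be sketched in a few lines: a low-dimensional point y is mapped into the high-dimensional search space as x = A·y with a random Gaussian matrix A, and the mapped point is clipped to the box domain. This is a minimal illustration, not the paper's warped kernel; the matrix construction, bounds, and dimensions (25 and 6, taken from the test case above) are illustrative.

```python
import random

def random_embedding_matrix(high_dim, low_dim, seed=0):
    """Gaussian random matrix A mapping R^low_dim into R^high_dim."""
    rng = random.Random(seed)
    return [[rng.gauss(0.0, 1.0) for _ in range(low_dim)]
            for _ in range(high_dim)]

def embed(A, y, bound=1.0):
    """Map an embedded point y to x = A y, clipped to the box [-bound, bound]^D."""
    return [max(-bound, min(bound, sum(a_ij * y_j for a_ij, y_j in zip(row, y))))
            for row in A]

# Optimization then runs over the 6-dimensional embedded domain,
# while evaluations happen in the 25-dimensional original space.
A = random_embedding_matrix(high_dim=25, low_dim=6)
x = embed(A, [0.1] * 6)
```

The clipping step is where the drawbacks mentioned in the abstract arise: distinct embedded points can map to the same clipped boundary point, producing the redundant evaluations that the proposed warping is designed to avoid.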
Hyperparameter Learning via Distributional Transfer
Bayesian optimisation is a popular technique for hyperparameter learning but
typically requires initial exploration even in cases where similar prior tasks
have been solved. We propose to transfer information across tasks using learnt
representations of training datasets used in those tasks. This results in a
joint Gaussian process model on hyperparameters and data representations.
Representations make use of the framework of distribution embeddings into
reproducing kernel Hilbert spaces. The developed method converges faster than
existing baselines, in some cases requiring only a few evaluations of the
target objective.
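The dataset representations mentioned above can be illustrated with empirical kernel mean embeddings: a dataset is represented by the average of its samples' feature maps in the RKHS, and similarity between two datasets reduces to an average of pairwise kernel evaluations. This is a minimal sketch of the embedding framework, not the paper's exact construction; the RBF kernel, scalar data, and bandwidth are illustrative assumptions.

```python
import math

def rbf(x, y, bandwidth=1.0):
    """RBF kernel between two scalar samples."""
    return math.exp(-((x - y) ** 2) / (2.0 * bandwidth ** 2))

def mean_embedding_inner(data_p, data_q, bandwidth=1.0):
    """Inner product <mu_P, mu_Q> of the empirical mean embeddings of two
    datasets: the average kernel value over all cross pairs of samples."""
    total = sum(rbf(x, y, bandwidth) for x in data_p for y in data_q)
    return total / (len(data_p) * len(data_q))

# Datasets drawn from similar distributions have a large inner product;
# dissimilar ones a small one, which is what lets prior tasks inform new ones.
close = mean_embedding_inner([0.0, 0.1, -0.1], [0.05, -0.05])
far = mean_embedding_inner([0.0, 0.1, -0.1], [5.0, 5.1])
```

A transfer method can then treat such inner products as a kernel over datasets, combining it with a kernel over hyperparameters in a joint Gaussian process.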
ZOOpt: Toolbox for Derivative-Free Optimization
Recent advances in derivative-free optimization make it possible to efficiently
approximate the globally optimal solutions of sophisticated functions, such as
functions with many local optima and non-differentiable or non-continuous
functions. This article describes the ZOOpt
(https://github.com/eyounx/ZOOpt) toolbox, which provides efficient
derivative-free solvers and is designed to be easy to use. ZOOpt provides a
Python package for single-thread optimization, and a lightweight distributed
version, implemented with the help of the Julia language, for Python-described
functions. The ZOOpt toolbox particularly focuses on optimization problems in
machine learning, addressing high-dimensional, noisy, and large-scale problems.
The toolbox is being maintained toward a ready-to-use tool for real-world
machine learning tasks.
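As an illustration of the class of solvers such a toolbox targets, a non-differentiable function with many local optima can be approximately minimized using only function evaluations. The sketch below is a generic random-search baseline, not ZOOpt's actual API; the objective and budget are illustrative.

```python
import math
import random

def multimodal(x):
    """Non-differentiable objective with many local optima; minimum 0 at the origin."""
    return sum(abs(xi) + 1.0 - math.cos(4.0 * xi) for xi in x)

def random_search(f, dim, lower, upper, budget, seed=0):
    """Minimize f over a box using nothing but function evaluations
    (no gradients), keeping the best point seen within the budget."""
    rng = random.Random(seed)
    best_x, best_val = None, float("inf")
    for _ in range(budget):
        x = [rng.uniform(lower, upper) for _ in range(dim)]
        val = f(x)
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val

best_x, best_val = random_search(multimodal, dim=2, lower=-5.0, upper=5.0,
                                 budget=2000)
```

Dedicated derivative-free solvers of the kind the toolbox provides replace the uniform sampling with model-guided or classification-guided sampling, which is what makes them practical on high-dimensional and noisy machine-learning problems.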
- …