Batch Bayesian Optimization via Local Penalization
The popularity of Bayesian optimization methods for efficient exploration of
parameter spaces has led to a series of papers applying Gaussian processes as
surrogates in the optimization of functions. However, most proposed approaches
only allow the exploration of the parameter space to occur sequentially. Often,
it is desirable to simultaneously propose batches of parameter values to
explore. This is particularly the case when large parallel processing
facilities are available. These facilities could be computational or physical
facets of the process being optimized. For example, in biological experiments
many experimental set-ups allow several samples to be processed simultaneously.
Batch methods, however, require modeling of the interaction between the
evaluations in the batch, which can be expensive in complex scenarios. We
investigate a simple heuristic based on an estimate of the Lipschitz constant
that captures the most important aspect of this interaction (i.e. local
repulsion) at negligible computational overhead. The resulting algorithm
compares well, in running time, with much more elaborate alternatives. The
approach assumes that the function of interest is Lipschitz continuous. A
wrapper loop around the acquisition function is used to collect batches of
points of a given size while minimizing the non-parallelizable computational
effort. The speed-up of our method with respect to previous approaches is
significant in a set of computationally expensive experiments.

Comment: 11 pages, 10 figures
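A minimal sketch of the heuristic, assuming a toy 1-D objective and a simple distance-based stand-in for the GP acquisition function (the names, the plug-in exclusion radius, and the simplified smooth penalizer are illustrative, not the authors' code, which uses a Gaussian-CDF penalizer):

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    """Toy 1-D objective to be minimized (Lipschitz continuous)."""
    return np.sin(3.0 * x) + 0.5 * x

# A handful of initial observations.
X = rng.uniform(0.0, 3.0, size=5)
y = f(X)

def estimate_lipschitz(X, y):
    """Crude Lipschitz-constant estimate: max |y_i - y_j| / |x_i - x_j|
    over all observed pairs."""
    L = 0.0
    for i in range(len(X)):
        for j in range(i + 1, len(X)):
            d = abs(X[i] - X[j])
            if d > 1e-12:
                L = max(L, abs(y[i] - y[j]) / d)
    return max(L, 1e-8)

def acquisition(x):
    """Stand-in acquisition: distance to the nearest observed point,
    favouring unexplored regions (the paper uses GP-based acquisitions)."""
    return np.min(np.abs(x - X[:, None]), axis=0)

def penalizer(x, center, radius):
    """Smooth local 'repulsion': near 0 inside the exclusion ball of the
    given radius around a pending point, near 1 far away."""
    return 1.0 - np.exp(-((x - center) / radius) ** 2)

# Greedily collect a batch of 3 points, penalizing the acquisition around
# each point already chosen so the batch spreads out (local repulsion).
grid = np.linspace(0.0, 3.0, 301)
L = estimate_lipschitz(X, y)
# Exclusion radius from the Lipschitz bound, using the mean observed value
# as a plug-in for the (unknown) value at the pending point.
radius = max((y.mean() - y.min()) / L, 1e-8)
alpha = acquisition(grid)
batch = []
for _ in range(3):
    x_next = grid[np.argmax(alpha)]
    batch.append(x_next)
    alpha = alpha * penalizer(grid, x_next, radius)
```

Because the penalizer vanishes at each pending point, successive maximizers of the penalized acquisition are pushed apart, which is the "local repulsion" the abstract refers to.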
Bayesian single- and multi- objective optimisation with nonparametric priors
Optimisation is integral to all sorts of processes in science and economics, and arguably underpins the fruition of human intelligence through millions of years of optimisation, or evolution. Scarce resources make it crucial to maximise their efficient usage. In this thesis, we consider the task of maximising unknown functions which we are able to query point-wise. The function is deemed to be expensive to evaluate, e.g. through long run times or financial expense, requiring a judicious querying strategy given previous observations.
We adopt a probabilistic framework for modelling the unknown function, using Bayesian non-parametric modelling. In particular, we focus on the Gaussian process (GP), a popular non-parametric Bayesian prior on functions. We motivate these choices and give an overview of the Gaussian process in the introduction, along with its application to Bayesian optimisation.
A GP's behaviour is intimately controlled by the choice of kernel, or covariance function, typically chosen to be a parametric function. In chapter 2 we instead place a non-parametric Bayesian prior, known as an Inverse Wishart process prior, over a GP kernel function, and show that this may be marginalised analytically, leading to a Student-t process (TP). Furthermore, we explore a larger class of elliptical processes, and show that the TP is the most general for which analytic calculation is possible, and apply it successfully to Bayesian optimisation.
The remainder of the thesis focusses on various Bayesian optimisation settings.
In chapter 3, we consider a setting where we are able to evaluate a function at multiple locations in parallel. Our approach is to use a measure of information gain to decide which batch of points to evaluate the function at next. We similarly apply information gain to Bayesian optimisation in chapter 4. Here, one wishes to find a Pareto front of efficient settings with respect to several different objectives through sequential evaluation. Finally, in chapter 5 we exploit the idea that in a multi-objective setting the objective functions are correlated, incorporating this belief in our choice of prior distribution over the multiple objectives.
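The multi-objective chapters revolve around Pareto efficiency: a setting is efficient if no other setting is at least as good in every objective and strictly better in one. A generic sketch (plain NumPy, not the thesis code) of identifying the efficient set from a finite set of evaluations:

```python
import numpy as np

def pareto_efficient(costs):
    """Return a boolean mask marking the Pareto-efficient rows of `costs`,
    where every column is an objective to be minimized."""
    n = costs.shape[0]
    efficient = np.ones(n, dtype=bool)
    for i in range(n):
        # Row i is dominated if some other row is <= in all objectives
        # and < in at least one (a row never dominates itself, since the
        # strict-inequality check fails there).
        dominated = np.all(costs <= costs[i], axis=1) & np.any(costs < costs[i], axis=1)
        if np.any(dominated):
            efficient[i] = False
    return efficient

# Two objectives: (1,4), (2,2), (3,1) trade off against each other,
# while (3,3) is dominated by (2,2).
pts = np.array([[1.0, 4.0], [2.0, 2.0], [3.0, 1.0], [3.0, 3.0]])
mask = pareto_efficient(pts)
# mask → [True, True, True, False]
```

In sequential multi-objective Bayesian optimisation, an acquisition function scores candidate settings by how much they are expected to improve this efficient set.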
GPflowOpt: A Bayesian Optimization Library using TensorFlow
A novel Python framework for Bayesian optimization known as GPflowOpt is
introduced. The package is based on the popular GPflow library for Gaussian
processes, leveraging the benefits of TensorFlow including automatic
differentiation, parallelization and GPU computations for Bayesian
optimization. Design goals focus on a framework that is easy to extend with
custom acquisition functions and models. The framework is thoroughly tested and
well documented, and provides scalability. The current released version of
GPflowOpt includes some standard single-objective acquisition functions, the
state-of-the-art max-value entropy search, as well as a Bayesian
multi-objective approach. Finally, it permits easy use of custom modeling
strategies implemented in GPflow.
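As an illustration of the kind of standard single-objective acquisition function such libraries ship, expected improvement for minimization can be written in a few lines of plain NumPy (a generic sketch of the textbook formula, not GPflowOpt's implementation):

```python
import math
import numpy as np

def expected_improvement(mu, sigma, y_best):
    """Expected improvement over y_best for a minimization problem, given
    a GP posterior mean `mu` and standard deviation `sigma` at candidate
    points: EI = (y_best - mu) * Phi(z) + sigma * phi(z),
    with z = (y_best - mu) / sigma."""
    mu, sigma = np.asarray(mu, float), np.asarray(sigma, float)
    ei = np.zeros_like(mu)           # EI is 0 wherever sigma == 0
    positive = sigma > 0
    z = (y_best - mu[positive]) / sigma[positive]
    Phi = 0.5 * (1.0 + np.vectorize(math.erf)(z / math.sqrt(2.0)))  # normal CDF
    phi = np.exp(-0.5 * z ** 2) / math.sqrt(2.0 * math.pi)          # normal PDF
    ei[positive] = (y_best - mu[positive]) * Phi + sigma[positive] * phi
    return ei
```

In a library built on TensorFlow, the same expression would be written with TensorFlow ops so that its gradient with respect to the candidate location comes for free via automatic differentiation.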