Predictive Entropy Search for Bayesian optimization with unknown constraints
Unknown constraints arise in many types of expensive black-box optimization
problems. Several methods have been proposed recently for performing Bayesian
optimization with constraints, based on the expected improvement (EI)
heuristic. However, EI can lead to pathologies when used with constraints. For
example, in the case of decoupled constraints---i.e., when one can
independently evaluate the objective or the constraints---EI can encounter a
pathology that prevents exploration. Additionally, computing EI requires a
current best solution, which may not exist if none of the data collected so far
satisfy the constraints. By contrast, information-based approaches do not
suffer from these failure modes. In this paper, we present a new
information-based method called Predictive Entropy Search with Constraints
(PESC). We analyze the performance of PESC and show that it compares favorably
to EI-based approaches on synthetic and benchmark problems, as well as several
real-world examples. We demonstrate that PESC is an effective algorithm that
provides a promising direction towards a unified solution for constrained
Bayesian optimization.

José Miguel Hernández-Lobato acknowledges support from the Rafael del Pino Foundation. Zoubin Ghahramani acknowledges support from a Google Focused Research Award and EPSRC grant EP/I036575/1. Matthew W. Hoffman acknowledges support from EPSRC grant EP/J012300/1.

This is the final published version. It first appeared at http://jmlr.org/proceedings/papers/v37/hernandez-lobatob15.html
Bayesian Optimization with Unknown Constraints
Recent work on Bayesian optimization has shown its effectiveness in global
optimization of difficult black-box objective functions. Many real-world
optimization problems of interest also have constraints which are unknown a
priori. In this paper, we study Bayesian optimization for constrained problems
in the general case that noise may be present in the constraint functions, and
the objective and constraints may be evaluated independently. We provide
motivating practical examples, and present a general framework to solve such
problems. We demonstrate the effectiveness of our approach on optimizing the
performance of online latent Dirichlet allocation subject to topic sparsity
constraints, tuning a neural network given test-time memory constraints, and
optimizing Hamiltonian Monte Carlo to achieve maximal effectiveness in a fixed
time, subject to passing standard convergence diagnostics.

Comment: 14 pages, 3 figures
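For the noisy-constraint setting the abstract describes, a standard building block is the probability that a latent constraint value is non-positive under a Gaussian surrogate posterior, which is available in closed form. A small sketch under that Gaussian-marginal assumption (names are illustrative, not the paper's API):

```python
from scipy.stats import norm

def prob_feasible(mu_c, sigma_c):
    """Probability that the latent constraint value c(x) <= 0, given a
    Gaussian posterior marginal N(mu_c, sigma_c**2) from the constraint
    model; noisy constraint observations are absorbed into this posterior."""
    return norm.cdf(-mu_c / sigma_c)
```

In the decoupled setting considered here, this quantity can be sharpened by evaluating the constraint alone, without spending an evaluation of the objective.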
Predictive Entropy Search for Efficient Global Optimization of Black-box Functions
We propose a novel information-theoretic approach for Bayesian optimization
called Predictive Entropy Search (PES). At each iteration, PES selects the next
evaluation point that maximizes the expected information gained with respect to
the global maximum. PES codifies this intractable acquisition function in terms
of the expected reduction in the differential entropy of the predictive
distribution. This reformulation allows PES to obtain approximations that are
both more accurate and efficient than other alternatives such as Entropy Search
(ES). Furthermore, PES can easily perform a fully Bayesian treatment of the
model hyperparameters while ES cannot. We evaluate PES in both synthetic and
real-world applications, including optimization problems in machine learning,
finance, biotechnology, and robotics. We show that the increased accuracy of
PES leads to significant gains in optimization performance.
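The "expected information gained with respect to the global maximum" can be written as the mutual information between the next observation and the location of the maximizer; PES's reformulation expresses it as an expected reduction in the differential entropy of the predictive distribution. In notation common to this line of work (symbols here are standard usage, not quoted from the paper):

```latex
\alpha_{\mathrm{PES}}(\mathbf{x})
  = H\!\left[\, p(y \mid \mathcal{D}, \mathbf{x}) \,\right]
  - \mathbb{E}_{p(\mathbf{x}_{\star} \mid \mathcal{D})}
    \left[ H\!\left[\, p(y \mid \mathcal{D}, \mathbf{x}, \mathbf{x}_{\star}) \,\right] \right]
```

The second term conditions the predictive distribution on a hypothesized maximizer location $\mathbf{x}_{\star}$, which is the quantity PES approximates; this is also the form that extends naturally to the constrained case (PESC) above.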