An adaptive radial basis algorithm (ARBF) for expensive black-box global optimization

Abstract

Response surface methods based on kriging and radial basis function (RBF) interpolation have been successfully applied to solve expensive, i.e. computationally costly, global black-box nonconvex optimization problems. In this paper we describe extensions of these methods to handle linear, nonlinear, and integer constraints. In particular, algorithms for standard RBF and the new adaptive RBF (ARBF) are described. Note, however, that while the objective function may be expensive, we assume that any nonlinear constraints are either inexpensive or are incorporated into the objective function via penalty terms. Test results are presented on standard test problems, both nonconvex problems with linear and nonlinear constraints and mixed-integer nonlinear problems (MINLP). Solvers in the TOMLAB Optimization Environment (http://tomopt.com/tomlab/) are compared: the three deterministic derivative-free solvers rbfSolve, ARBFMIP, and EGO; three derivative-based mixed-integer nonlinear solvers, OQNLP, MINLPBB, and MISQP; and the GENO solver, which implements a stochastic genetic algorithm. Results show that the deterministic derivative-free methods compare well with the derivative-based ones, while the stochastic genetic algorithm solver is several orders of magnitude too slow for practical use. When the objective function for the test problems is costly to evaluate, the performance of the ARBF algorithm proves to be superior.
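As a rough illustration of the surrogate idea summarized above, and not of the TOMLAB rbfSolve or ARBFMIP implementations themselves, the sketch below fits a cubic RBF with a linear polynomial tail to penalized samples of a stand-in "expensive" objective and selects the surrogate minimizer from a random candidate set as the next point to evaluate. The function names, the quadratic penalty weight, and the candidate-set search are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of one RBF surrogate step under the assumptions stated above.
import numpy as np

def penalized_f(x, f, constraints, mu=1e3):
    """Expensive objective plus an assumed quadratic penalty on g(x) <= 0."""
    g = np.array([c(x) for c in constraints])
    return f(x) + mu * np.sum(np.maximum(g, 0.0) ** 2)

def fit_cubic_rbf(X, y):
    """Solve the augmented linear system for a cubic RBF phi(r) = r^3 with linear tail."""
    n, d = X.shape
    r = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    Phi = r ** 3
    P = np.hstack([np.ones((n, 1)), X])                 # linear polynomial tail
    A = np.block([[Phi, P], [P.T, np.zeros((d + 1, d + 1))]])
    rhs = np.concatenate([y, np.zeros(d + 1)])
    coef = np.linalg.solve(A, rhs)
    lam, c = coef[:n], coef[n:]
    def surrogate(x):
        rx = np.linalg.norm(X - x, axis=1)
        return rx ** 3 @ lam + c[0] + x @ c[1:]
    return surrogate

# Usage: propose the next expensive evaluation from a random candidate set.
rng = np.random.default_rng(0)
f = lambda x: (x[0] - 1) ** 2 + (x[1] + 0.5) ** 2       # stand-in "expensive" objective
cons = [lambda x: x[0] + x[1] - 1.0]                    # nonlinear constraint g(x) <= 0
X = rng.uniform(-2, 2, size=(8, 2))                     # initial experimental design
y = np.array([penalized_f(x, f, cons) for x in X])
s = fit_cubic_rbf(X, y)
cand = rng.uniform(-2, 2, size=(500, 2))
x_next = cand[np.argmin([s(x) for x in cand])]
print("next point to evaluate:", x_next)
```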
