MATSuMoTo: The MATLAB Surrogate Model Toolbox For Computationally Expensive Black-Box Global Optimization Problems
MATSuMoTo is the MATLAB Surrogate Model Toolbox for computationally
expensive, black-box, global optimization problems that may have continuous,
mixed-integer, or pure integer variables. Due to the black-box nature of the
objective function, derivatives are not available. Hence, surrogate models are
used as computationally cheap approximations of the expensive objective
function in order to guide the search for improved solutions. Due to the
computational expense of doing a single function evaluation, the goal is to
find optimal solutions within very few expensive evaluations. The multimodality
of the expensive black-box function requires an algorithm that is able to
search locally as well as globally. MATSuMoTo is able to address these
challenges. MATSuMoTo offers various choices for surrogate models and surrogate
model mixtures, initial experimental design strategies, and sampling
strategies. MATSuMoTo is able to do several function evaluations in parallel by
exploiting MATLAB's Parallel Computing Toolbox.
Comment: 13 pages, 7 figures
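The toolbox itself is written in MATLAB, but the surrogate-assisted loop it describes (initial design, cheap surrogate fit, candidate sampling, evaluate, refit) is generic. Below is a minimal Python sketch of that loop using a cubic RBF surrogate with a linear tail, one of the surrogate choices MATSuMoTo offers; all function names, the plain-random initial design, and the candidate-sampling parameters are my own illustrative choices, not the toolbox's API.

```python
import numpy as np

def rbf_fit(X, y):
    # Cubic RBF interpolant with a linear polynomial tail:
    #   s(x) = sum_i w_i * ||x - x_i||^3 + b.x + a
    n, d = X.shape
    Phi = np.linalg.norm(X[:, None] - X[None, :], axis=-1) ** 3
    Phi += 1e-10 * np.eye(n)          # tiny ridge for numerical stability
    P = np.hstack([X, np.ones((n, 1))])
    A = np.block([[Phi, P], [P.T, np.zeros((d + 1, d + 1))]])
    coef = np.linalg.solve(A, np.concatenate([y, np.zeros(d + 1)]))
    return coef[:n], coef[n:]

def rbf_predict(X, w, tail, Z):
    R = np.linalg.norm(Z[:, None] - X[None, :], axis=-1) ** 3
    return R @ w + np.hstack([Z, np.ones((len(Z), 1))]) @ tail

def surrogate_optimize(f, lb, ub, n_init=6, budget=30, seed=0):
    """Minimize an expensive black-box f over the box [lb, ub]
    using very few true function evaluations."""
    rng = np.random.default_rng(seed)
    d = len(lb)
    # initial experimental design (plain random here; a Latin hypercube is typical)
    X = lb + (ub - lb) * rng.random((n_init, d))
    y = np.array([f(x) for x in X])
    while len(y) < budget:
        w, tail = rbf_fit(X, y)
        # candidate sampling: local perturbations of the current best plus
        # uniform points, so the search is both local and global
        best = X[np.argmin(y)]
        cand = np.vstack([
            np.clip(best + 0.1 * (ub - lb) * rng.standard_normal((50, d)), lb, ub),
            lb + (ub - lb) * rng.random((50, d)),
        ])
        # only the cheap surrogate is evaluated on the candidates;
        # the expensive f is called once per iteration
        x_new = cand[np.argmin(rbf_predict(X, w, tail, cand))]
        X = np.vstack([X, x_new])
        y = np.append(y, f(x_new))
    i = int(np.argmin(y))
    return X[i], y[i]
```

The key design point the abstract emphasizes appears in the candidate set: mixing perturbations of the incumbent with uniform samples lets one loop search locally and globally, which matters for multimodal objectives.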
Solving the G-problems in less than 500 iterations: Improved efficient constrained optimization by surrogate modeling and adaptive parameter control
Constrained optimization of high-dimensional numerical problems plays an
important role in many scientific and industrial applications. Function
evaluations in many industrial applications are severely limited and no
analytical information about objective function and constraint functions is
available. For such expensive black-box optimization tasks, the constrained
optimization algorithm COBRA was proposed, which uses RBF surrogate models
for both the objective and the constraint functions. COBRA has shown
remarkable success in reliably solving complex benchmark problems in fewer
than 500 function evaluations. Unfortunately, COBRA requires careful
parameters in order to do so.
In this work we present SACOBRA, a new self-adjusting algorithm based on
COBRA that achieves high-quality results with very few function evaluations
and no parameter tuning. Performance profiles on a set of benchmark problems
(G-problems, MOPTA08) show that SACOBRA consistently outperforms any COBRA
variant with a fixed parameter setting. We analyze the importance of the
several new elements in SACOBRA and find that each of them contributes to the
overall optimization performance. We discuss the reasons behind this and
thereby gain a better understanding of high-quality RBF surrogate modeling.
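The core idea of COBRA-style constrained surrogate optimization is to fit one RBF surrogate per function (objective and each constraint) and then search the cheap surrogates, admitting only points whose predicted constraint values clear a safety margin. The sketch below illustrates one such step; it is a strong simplification (COBRA additionally uses distance requirements, repair mechanisms, and, in SACOBRA, self-adjusting margins), and all names and parameters here are illustrative, not the published algorithm's interface.

```python
import numpy as np

def fit_cubic_rbf(X, y):
    # Cubic RBF with linear tail: s(x) = sum_i w_i ||x - x_i||^3 + b.x + a
    n, d = X.shape
    Phi = np.linalg.norm(X[:, None] - X[None, :], axis=-1) ** 3
    Phi += 1e-10 * np.eye(n)          # tiny ridge for numerical stability
    P = np.hstack([X, np.ones((n, 1))])
    A = np.block([[Phi, P], [P.T, np.zeros((d + 1, d + 1))]])
    c = np.linalg.solve(A, np.concatenate([y, np.zeros(d + 1)]))
    return lambda Z: (np.linalg.norm(Z[:, None] - X[None, :], axis=-1) ** 3 @ c[:n]
                      + np.hstack([Z, np.ones((len(Z), 1))]) @ c[n:])

def cobra_step(X, fvals, gvals, lb, ub, rng, margin=0.01, n_cand=2000):
    """One surrogate-assisted step for  min f(x)  s.t.  g_j(x) <= 0.
    gvals has shape (n, m): one column per constraint."""
    f_hat = fit_cubic_rbf(X, fvals)
    g_hats = [fit_cubic_rbf(X, gvals[:, j]) for j in range(gvals.shape[1])]
    cand = lb + (ub - lb) * rng.random((n_cand, len(lb)))
    G = np.stack([g(cand) for g in g_hats], axis=1)
    # the margin keeps surrogate-feasible points away from the predicted
    # constraint boundary, compensating for surrogate error
    viol = np.maximum(G + margin, 0.0).sum(axis=1)
    score = np.where(viol == 0, f_hat(cand), np.inf)
    if np.isinf(score).all():
        # no predicted-feasible candidate: move toward feasibility instead
        return cand[np.argmin(viol)]
    return cand[np.argmin(score)]
```

In an outer loop, the selected point is evaluated on the true (expensive) objective and constraints, appended to the data, and the surrogates are refit; the parameters that SACOBRA adjusts automatically (e.g. the feasibility margin) are fixed constants in this sketch.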