Adaptive Ranking Based Constraint Handling for Explicitly Constrained Black-Box Optimization
A novel explicit constraint handling technique for the covariance matrix
adaptation evolution strategy (CMA-ES) is proposed. The proposed constraint
handling exhibits two invariance properties. One is the invariance to arbitrary
element-wise increasing transformation of the objective and constraint
functions. The other is the invariance to arbitrary affine transformation of
the search space. The proposed technique virtually transforms a constrained
optimization problem into an unconstrained optimization problem by considering
an adaptive weighted sum of the ranking of the objective function values and
the ranking of the constraint violations, measured by the Mahalanobis
distance from each candidate solution to its projection onto the boundary of
the constraints. Simulation results show that the CMA-ES with the proposed
constraint handling exhibits the affine invariance and performs similarly to
the CMA-ES on the unconstrained counterparts of the test problems.
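The core idea of combining two rankings into a single fitness can be sketched as follows. This is an illustrative simplification, not the paper's implementation: the weight `alpha` is fixed here rather than adapted, and the violation measure is taken as given rather than computed from Mahalanobis projections.

```python
import numpy as np

def combined_ranking(f_vals, violations, alpha=0.5):
    """Weighted sum of objective ranking and constraint-violation ranking.

    f_vals     : objective values of the candidate solutions
    violations : aggregated constraint violations per candidate
                 (in the paper, Mahalanobis distances to the
                 projection onto the constraint boundary)
    alpha      : weight on the violation ranking (adaptive in the
                 paper; fixed here for illustration)
    """
    # argsort of argsort yields the rank of each element (0 = best)
    rank_f = np.argsort(np.argsort(f_vals))
    rank_g = np.argsort(np.argsort(violations))
    # lower combined rank = better candidate
    return (1 - alpha) * rank_f + alpha * rank_g
```

Because only rankings enter the combined fitness, the result is unchanged under any strictly increasing transformation of the objective or constraint values, which is the first invariance property claimed in the abstract.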
Solving the G-problems in less than 500 iterations: Improved efficient constrained optimization by surrogate modeling and adaptive parameter control
Constrained optimization of high-dimensional numerical problems plays an
important role in many scientific and industrial applications. Function
evaluations in many industrial applications are severely limited and no
analytical information about objective function and constraint functions is
available. For such expensive black-box optimization tasks, the constrained
optimization algorithm COBRA was proposed, making use of RBF surrogate modeling
for both the objective and the constraint functions. COBRA has shown remarkable
success in solving reliably complex benchmark problems in less than 500
function evaluations. Unfortunately, COBRA requires careful adjustment of
parameters in order to do so.
In this work we present a new self-adjusting algorithm, SACOBRA, which is
based on COBRA and capable of achieving high-quality results with very few
function evaluations and no parameter tuning. It is shown with the help of
performance profiles on a set of benchmark problems (G-problems, MOPTA08) that
SACOBRA consistently outperforms any COBRA algorithm with fixed parameter
setting. We analyze the importance of the several new elements in SACOBRA and
find that each of them contributes to the overall optimization performance. We
discuss the reasons behind this and thereby gain a better understanding of
high-quality RBF surrogate modeling.
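The RBF surrogate idea at the heart of COBRA/SACOBRA can be sketched with a minimal interpolant. This is a hedged illustration only: a Gaussian kernel is used here for simplicity, whereas the actual algorithms use cubic RBFs with a polynomial tail and careful input scaling, and the function name is hypothetical.

```python
import numpy as np

def fit_gaussian_rbf(X, y, eps=1.0):
    """Fit a Gaussian RBF interpolant to expensive function samples.

    X   : (n, d) array of evaluated points
    y   : (n,) array of observed function values
    eps : kernel width (would need tuning in practice)

    Returns a cheap-to-evaluate predictor that interpolates (X, y),
    the kind of surrogate used in place of expensive evaluations.
    """
    # pairwise distances between training points
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    Phi = np.exp(-(d / eps) ** 2)      # Gaussian kernel Gram matrix
    w = np.linalg.solve(Phi, y)        # interpolation weights

    def predict(x):
        r = np.linalg.norm(X - x, axis=-1)
        return np.exp(-(r / eps) ** 2) @ w

    return predict
```

Once such surrogates are fitted to both the objective and the constraint functions, the inner optimization runs entirely on the cheap models, which is how COBRA keeps the budget of true function evaluations below a few hundred.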