Differential evolution algorithms for constrained global optimization
In this thesis we propose four new methods for solving constrained global optimization problems.
The first proposed algorithm is a differential evolution (DE) algorithm using penalty
functions for constraint handling. The second algorithm is based on the first DE algorithm
but also incorporates a filter set as a diversification mechanism. The third algorithm is also
based on DE but includes an additional local refinement process in the form of the pattern
search (PS) technique. The last algorithm incorporates both the filter set and PS into the DE
algorithm for constrained global optimization. The superiority of feasible points (SFP) and
the parameter-free penalty (PFP) schemes are used as constraint handling mechanisms.
The new algorithms were numerically tested on two sets of test problems and the
results were compared with those of a genetic algorithm (GA). The comparison shows
that the new algorithms outperformed the GA. When the new methods are compared
with each other, the last three methods performed better than the first, i.e., the plain DE algorithm.
The new algorithms show promising results with potential for further research.
Keywords: constrained global optimization, differential evolution, pattern search, filter
method, penalty function, superiority of feasible points, parameter-free penalty.
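As a rough illustration of the first approach described above, a DE loop with a static penalty for constraint handling can be sketched as follows. This is a generic sketch, not the thesis's algorithm: the toy problem, parameter values, and function names are our own assumptions.

```python
import numpy as np

def penalized(f, g, x, mu=1e3):
    # static penalty: objective plus mu times the squared violation of the
    # inequality constraints, written in the form g_j(x) <= 0
    viol = np.maximum(g(x), 0.0)
    return f(x) + mu * np.sum(viol ** 2)

def de_penalty(f, g, bounds, pop_size=30, F=0.7, CR=0.9, gens=200, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(lo)
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fit = np.array([penalized(f, g, x) for x in pop])
    for _ in range(gens):
        for i in range(pop_size):
            # DE/rand/1 mutation: three distinct members other than i
            idx = rng.choice([j for j in range(pop_size) if j != i], 3,
                             replace=False)
            a, b, c = pop[idx]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # binomial crossover, forcing at least one mutant component
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            f_trial = penalized(f, g, trial)
            if f_trial <= fit[i]:   # greedy selection on penalized fitness
                pop[i], fit[i] = trial, f_trial
    best = int(np.argmin(fit))
    return pop[best], fit[best]
```

For example, minimizing x0² + x1² subject to x0 + x1 ≥ 1 (optimum at (0.5, 0.5)) would be called with `g = lambda x: np.array([1.0 - x[0] - x[1]])`. Note that the quality of the result depends on the penalty weight `mu`, which is exactly the parameter-selection difficulty that the SFP and PFP schemes are meant to avoid.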
Modified constrained differential evolution for solving nonlinear global optimization problems
Nonlinear optimization problems introduce the possibility of
multiple local optima. The task of global optimization is to find a point
where the objective function obtains its most extreme value while satisfying
the constraints. Some methods try to make the solution feasible
by using penalty function methods, but the performance is not always
satisfactory since the selection of the penalty parameters for the problem
at hand is not a straightforward issue. Differential evolution has been
shown to be very efficient when solving global optimization problems
with simple bounds. In this paper, we propose a modified constrained
differential evolution based on different constraint handling techniques,
namely, feasibility and dominance rules, stochastic ranking and global
competitive ranking and compare their performances on a benchmark
set of problems. A comparison with other solution methods available in
literature is also provided. The convergence behavior of the algorithm to
handle discrete and integer variables is analyzed using four well-known
mixed-integer engineering design problems. It is shown that our method
is rather effective when solving nonlinear optimization problems.
Fundação para a Ciência e a Tecnologia (FCT)
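The feasibility and dominance rules mentioned above reduce to a pairwise comparison used during selection. A minimal sketch of that comparison (the function names and the definition of total violation are our own, not the paper's implementation):

```python
import numpy as np

def total_violation(g_values):
    # sum of the positive parts of inequality constraints g_j(x) <= 0
    return float(np.sum(np.maximum(np.asarray(g_values, float), 0.0)))

def deb_better(f1, v1, f2, v2):
    """Return True if point 1 wins the trial-vs-target comparison:
    (a) a feasible point beats an infeasible one;
    (b) two feasible points are compared by objective value;
    (c) two infeasible points are compared by total violation."""
    if v1 == 0.0 and v2 == 0.0:
        return f1 <= f2          # both feasible: lower objective wins
    if (v1 == 0.0) != (v2 == 0.0):
        return v1 == 0.0         # exactly one feasible: it wins
    return v1 <= v2              # both infeasible: smaller violation wins
```

Stochastic ranking and global competitive ranking, also compared in the paper, soften rule (a) by sometimes ranking points on objective value alone, so good infeasible points are not discarded too early.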
Solving the G-problems in less than 500 iterations: Improved efficient constrained optimization by surrogate modeling and adaptive parameter control
Constrained optimization of high-dimensional numerical problems plays an
important role in many scientific and industrial applications. Function
evaluations in many industrial applications are severely limited and no
analytical information about objective function and constraint functions is
available. For such expensive black-box optimization tasks, the constraint
optimization algorithm COBRA was proposed, making use of RBF surrogate modeling
for both the objective and the constraint functions. COBRA has shown remarkable
success in solving reliably complex benchmark problems in less than 500
function evaluations. Unfortunately, COBRA requires careful adjustment of
parameters in order to do so.
In this work we present a new self-adjusting algorithm, SACOBRA, which is
based on COBRA and capable of achieving high-quality results with very few
function evaluations and no parameter tuning. It is shown with the help of
performance profiles on a set of benchmark problems (G-problems, MOPTA08) that
SACOBRA consistently outperforms any COBRA algorithm with a fixed parameter
setting. We analyze the importance of the several new elements in SACOBRA and
find that each of them contributes to boosting the overall optimization
performance. We discuss the reasons behind this and in this way gain a
better understanding of high-quality RBF surrogate modeling.
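The RBF surrogate modeling at the core of COBRA and SACOBRA can be illustrated with a cubic RBF interpolant augmented by a linear polynomial tail, a standard choice for this kind of surrogate. The following is a generic sketch under that assumption, not the authors' code:

```python
import numpy as np

def fit_cubic_rbf(X, y):
    """Fit s(x) = sum_i w_i * ||x - x_i||^3 + c0 + c^T x through the
    sample points (X, y); X has shape (n, d), y has shape (n,)."""
    n, d = X.shape
    # pairwise distances between sample points -> cubic kernel matrix
    r = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    Phi = r ** 3
    # linear polynomial tail [1, x_1, ..., x_d]
    P = np.hstack([np.ones((n, 1)), X])
    # saddle-point system: interpolation conditions plus the side
    # conditions P^T w = 0 that make the system uniquely solvable
    A = np.block([[Phi, P],
                  [P.T, np.zeros((d + 1, d + 1))]])
    rhs = np.concatenate([y, np.zeros(d + 1)])
    coef = np.linalg.solve(A, rhs)
    w, c = coef[:n], coef[n:]

    def s(x):
        x = np.asarray(x, float)
        dist = np.linalg.norm(X - x, axis=1)
        return float(dist ** 3 @ w + c[0] + x @ c[1:])

    return s
```

In a COBRA-style loop, one such surrogate is fitted per objective and per constraint function after every expensive evaluation, and the next candidate point is found by optimizing the cheap surrogate problem instead of the black box; the interpolant reproduces the sampled values exactly.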