A globally convergent primal-dual interior-point filter method for nonlinear programming
In this paper, the filter technique of Fletcher and Leyffer (1997) is used to globalize the primal-dual interior-point algorithm for nonlinear programming, avoiding the use of merit functions and the updating of penalty parameters. The new algorithm decomposes the primal-dual step obtained from the perturbed first-order necessary conditions into a normal and a tangential step, whose sizes are controlled by a trust-region type parameter. Each entry in the filter is a pair of coordinates: one resulting from feasibility and centrality, associated with the normal step; the other resulting from optimality (complementarity and duality), related to the tangential step. Global convergence to first-order critical points is proved for the new primal-dual interior-point filter algorithm.
Derivative-free optimization and filter methods to solve nonlinear constrained problems
In real optimization problems, the analytical expressions of the objective function and its derivatives are often unknown, or too complex to work with. In these cases it becomes essential to use optimization methods that require neither the calculation of derivatives nor the verification of their existence: direct search methods, also called derivative-free methods, are one solution.
When the problem has constraints, penalty functions are often used. Unfortunately, choosing the penalty parameters is frequently very difficult, because most strategies for choosing them are heuristic. Filter methods appeared as an alternative to penalty functions. A filter algorithm introduces a function that aggregates the constraint violations and constructs a biobjective problem, in which a step is accepted if it reduces either the objective function or the constraint violation. This makes filter methods less parameter dependent than penalty functions.
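The filter acceptance rule described above can be sketched in a few lines. This is a minimal illustration, not the implementation from any of the papers listed here: it assumes inequality constraints written as g_i(x) <= 0, uses their aggregated positive parts as the violation measure h(x), and applies the plain (flat) dominance rule on (h, f) pairs.

```python
def violation(x, constraints):
    # aggregate constraint violation h(x) for constraints g_i(x) <= 0:
    # sum of the positive parts of the g_i values
    return sum(max(0.0, g(x)) for g in constraints)

def acceptable(pair, filter_set):
    # a trial pair (h, f) is acceptable if no filter entry dominates it,
    # i.e. it improves on every entry in violation OR in objective value
    h, f = pair
    return all(h < hk or f < fk for (hk, fk) in filter_set)

def add_to_filter(pair, filter_set):
    # insert an accepted pair and drop entries it dominates
    h, f = pair
    kept = [(hk, fk) for (hk, fk) in filter_set if hk < h or fk < f]
    kept.append(pair)
    return kept
```

A trial point is thus accepted on either of the two biobjective criteria, which is what makes the approach nearly parameter free compared with a penalty function.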
In this work, we present a new direct search method for general constrained optimization, based on simplex methods, that combines the features of the simplex method and filter methods. This method does not compute or approximate any derivatives, penalty constants or Lagrange multipliers. The basic idea of the simplex filter algorithm is to construct an initial simplex and use it to drive the search. We illustrate the behavior of our algorithm through some examples. The proposed methods were implemented in Java.
Combining filter method and dynamically dimensioned search for constrained global optimization
In this work we present an algorithm that combines the filter technique and the dynamically dimensioned search (DDS) for solving nonlinear and nonconvex constrained global optimization problems. The DDS is a stochastic global algorithm for bound constrained problems that in each iteration generates a random trial point by perturbing some coordinates of the current best point. The filter technique controls the progress towards optimality and feasibility by defining a forbidden region of points refused by the algorithm. This region can be given by the flat or the slanting filter rule. The proposed algorithm does not compute or approximate any derivatives of the objective and constraint functions. Preliminary experiments show that the proposed algorithm gives competitive results when compared with other methods.
The first author thanks a scholarship supported by the International Cooperation Program CAPES/COFECUB at the University of Minho. The second and third authors thank the support given by FCT (Fundação para a Ciência e a Tecnologia, Portugal) in the scope of the projects UID/MAT/00013/2013 and UID/CEC/00319/2013. The fourth author was partially supported by CNPq-Brazil grants 308957/2014-8 and 401288/2014-5.
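The DDS perturbation step summarized above can be sketched as follows. This is an illustrative version based on the standard DDS description (Tolson and Shoemaker style), not the code of the paper: each coordinate is perturbed with a probability that decays with the iteration count, using a Gaussian step scaled by the bound range, with reflection back into the bounds.

```python
import math
import random

def dds_trial(best, lower, upper, i, max_iter, r=0.2, rng=random):
    """Generate one DDS trial point from the current best (iterations i >= 1)."""
    # probability of perturbing each coordinate decays as the search progresses
    p = 1.0 - math.log(i) / math.log(max_iter)
    dims = [j for j in range(len(best)) if rng.random() < p]
    if not dims:
        dims = [rng.randrange(len(best))]  # always perturb at least one coordinate
    trial = list(best)
    for j in dims:
        sigma = r * (upper[j] - lower[j])
        trial[j] = best[j] + rng.gauss(0.0, sigma)
        # reflect a violated bound back into the box; clamp if still outside
        if trial[j] < lower[j]:
            trial[j] = 2 * lower[j] - trial[j]
            if trial[j] > upper[j]:
                trial[j] = lower[j]
        elif trial[j] > upper[j]:
            trial[j] = 2 * upper[j] - trial[j]
            if trial[j] < lower[j]:
                trial[j] = upper[j]
    return trial
```

In the combined method of the paper, a trial point produced this way would then be tested against the filter rather than against the objective alone.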
An artificial fish swarm filter-based method for constrained global optimization
Ana Maria A.C. Rocha, M. Fernanda P. Costa and Edite M.G.P. Fernandes, An Artificial Fish Swarm Filter-Based Method for Constrained Global Optimization, in B. Murgante, O. Gervasi, S. Misra, N. Nedjah, A.M. Rocha, D. Taniar, B. Apduhan (Eds.), Lecture Notes in Computer Science, Part III, LNCS 7335, pp. 57-71, Springer, Heidelberg, 2012.
An artificial fish swarm algorithm based on a filter methodology for trial solution acceptance is analyzed for general constrained global optimization problems. The new method uses the filter set concept to accept, at each iteration, a population of trial solutions whenever they improve the constraint violation or the objective function, relative to the current solutions. Preliminary numerical experiments on a well-known benchmark set of engineering design problems show the effectiveness of the proposed method.
Fundação para a Ciência e a Tecnologia (FCT)
A Filter Algorithm with Inexact Line Search
A filter algorithm with inexact line search is proposed for solving nonlinear programming problems. The filter is constructed by employing the norm of the gradient of the Lagrangian function together with the infeasibility measure. Transition to superlinear local convergence is shown for the proposed filter algorithm without second-order correction. Under mild conditions, global convergence can also be derived. Numerical experiments show the efficiency of the algorithm.
A primal-dual interior-point relaxation method with adaptively updating barrier for nonlinear programs
Based on solving an equivalent parametric equality constrained mini-max problem of the classic logarithmic-barrier subproblem, we present a novel primal-dual interior-point relaxation method for nonlinear programs. In the proposed method, the barrier parameter is updated in every step, as is done in interior-point methods for linear programs, which is prominently different from the existing interior-point methods and relaxation methods for nonlinear programs. Since our update of the barrier parameter is autonomous and adaptive, the method has the potential to avoid the difficulties caused by an inappropriate initial choice of the barrier parameter and to speed up convergence to the solution. Moreover, it can circumvent the jamming difficulty in global convergence caused by the interior-point restriction for nonlinear programs, and it improves the ill conditioning of existing primal-dual interior-point methods when the barrier parameter is small. Under suitable assumptions, our method is proved to be globally convergent and locally quadratically convergent. Preliminary numerical results on a well-posed problem for which many line-search interior-point methods fail to find the minimizer, and on a set of test problems from the CUTE collection, show that our method is efficient.
Comment: submitted to SIOPT on April 14, 202
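For context, the classic logarithmic-barrier subproblem referred to in this abstract is standard; assuming a nonlinear program written with inequality constraints $c_i(x) \ge 0$, it reads

$$\min_{x} \; f(x) - \mu \sum_{i=1}^{m} \ln c_i(x), \qquad \mu > 0,$$

where the barrier parameter $\mu$ is driven towards zero over the iterations; the abstract's contribution concerns how $\mu$ is updated at every step rather than held fixed per subproblem.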