An Interior Point-Proximal Method of Multipliers for Convex Quadratic Programming
In this paper we combine an infeasible Interior Point Method (IPM) with the
Proximal Method of Multipliers (PMM). The resulting algorithm (IP-PMM) is
interpreted as a primal-dual regularized IPM, suitable for solving linearly
constrained convex quadratic programming problems. We apply a few iterations of
the interior point method to each sub-problem of the proximal method of
multipliers. Once a satisfactory solution of the PMM sub-problem is found, we
update the PMM parameters, form a new IPM neighbourhood and repeat this
process. Given this framework, we prove polynomial complexity of the algorithm,
under standard assumptions. To our knowledge, this is the first polynomial
complexity result for a primal-dual regularized IPM. The algorithm is guided by
the use of a single penalty parameter, namely that of the logarithmic barrier.
Thus, we show that IP-PMM inherits the polynomial complexity of IPMs, as
well as the strict convexity of the PMM sub-problems. The updates of the
penalty parameter are controlled by the IPM, and hence are well-tuned and do
not depend on the problem being solved. Furthermore, we study the behavior of the method
when it is applied to an infeasible problem, and identify a necessary condition
for infeasibility. The latter is used to construct an infeasibility detection
mechanism. Subsequently, we provide a robust implementation of the presented
algorithm and test it on a set of small- to large-scale linear and convex
quadratic programming problems. The numerical results demonstrate the benefits
of using regularization in IPMs, as well as the reliability of the method.
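The outer proximal loop described above can be sketched for an equality-constrained QP. This is a minimal illustration, not the paper's IP-PMM: the function names and parameter values (rho, mu) are assumptions for illustration, and the inner interior point solve is replaced by an exact linear solve of each sub-problem.

```python
import numpy as np

def pmm_qp(Q, c, A, b, rho=10.0, mu=1.0, iters=100):
    """Proximal method of multipliers for min 1/2 x'Qx + c'x s.t. Ax = b.

    For a QP, each proximal augmented Lagrangian sub-problem reduces to
    one linear system; IP-PMM would instead apply a few interior point
    iterations to it and update an IPM neighbourhood between solves.
    All names and parameter values here are illustrative assumptions.
    """
    n = Q.shape[0]
    m = A.shape[0]
    x, y = np.zeros(n), np.zeros(m)
    M = Q + rho * A.T @ A + mu * np.eye(n)           # sub-problem Hessian
    for _ in range(iters):
        rhs = -c + A.T @ y + rho * A.T @ b + mu * x  # proximal AL stationarity
        x = np.linalg.solve(M, rhs)                  # exact sub-problem solve
        y = y - rho * (A @ x - b)                    # multiplier update
    return x, y
```

On the toy problem min ½‖x‖² subject to x₁ + x₂ = 1, the iterates approach x = (0.5, 0.5) with multiplier y = 0.5.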
A new perspective on the complexity of interior point methods for linear programming
In a dynamical systems paradigm, many optimization algorithms are equivalent to applying the forward Euler method to the system of ordinary differential equations defined by the vector field of the search directions. Thus the stiffness of such vector fields plays an essential role in the complexity of these methods. We first exemplify this point with a theoretical result for general linesearch methods for unconstrained optimization, which we then employ to investigate the complexity of a primal short-step path-following interior point method for linear programming. Our analysis involves showing that the Newton vector field associated to the primal logarithmic barrier is nonstiff in a sufficiently small and shrinking neighbourhood of its minimizer. Thus, by confining the iterates to these neighbourhoods of the primal central path, our algorithm has a nonstiff vector field of search directions, and we can give a worst-case bound on its iteration complexity. Furthermore, due to the generality of our vector field setting, we can perform a similar (global) iteration complexity analysis when the Newton direction of the interior point method is computed only approximately, using some direct method for solving linear systems of equations.
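The stiffness point above can be illustrated on a one-dimensional quadratic (the function and constants below are assumptions for illustration): gradient descent is exactly forward Euler applied to the gradient flow ẋ = −∇f(x), and the admissible step size is stability-limited by the curvature, i.e. the stiffness, of that vector field.

```python
def euler_gradient_flow(lam, h, x0=1.0, steps=100):
    """Forward Euler on dx/dt = -f'(x) for f(x) = 0.5 * lam * x**2.

    The exact flow decays for any lam > 0, but the discretization
    satisfies x_{k+1} = (1 - h*lam) * x_k, so it is stable only when
    |1 - h*lam| < 1, i.e. h < 2/lam: a stiff field (large lam) forces
    a small step size, which drives up iteration complexity.
    """
    x = x0
    for _ in range(steps):
        x -= h * lam * x          # one Euler step of the gradient flow
    return x
```

With lam = 10 the stability threshold is h = 2/10 = 0.2: the iterates contract for h = 0.19 but blow up for h = 0.21.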
Hessian barrier algorithms for linearly constrained optimization problems
In this paper, we propose an interior-point method for linearly constrained
optimization problems (possibly nonconvex). The method - which we call the
Hessian barrier algorithm (HBA) - combines a forward Euler discretization of
Hessian Riemannian gradient flows with an Armijo backtracking step-size policy.
In this way, HBA can be seen as an alternative to mirror descent (MD), and
contains as special cases the affine scaling algorithm, regularized Newton
processes, and several other iterative solution methods. Our main result is
that, modulo a non-degeneracy condition, the algorithm converges to the
problem's set of critical points; hence, in the convex case, the algorithm
converges globally to the problem's minimum set. In the case of linearly
constrained quadratic programs (not necessarily convex), we also show that the
method's convergence rate is $\mathcal{O}(1/k^{\rho})$ for some $\rho \in (0,1]$
that depends only on the choice of kernel function (i.e., not on the problem's
primitives). These theoretical results are validated by numerical experiments
in standard non-convex test functions and large-scale traffic assignment
problems.

Comment: 27 pages, 6 figures
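For the logarithmic kernel h(x) = −Σ log xᵢ on the positive orthant, the Hessian Riemannian Euler step becomes the affine-scaling-type update x ← x − α X²∇f(x). The sketch below pairs that step with Armijo backtracking as the abstract describes; the test problem, safeguards, and constants are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

def hba_log_kernel(grad_f, f, x0, iters=5000, sigma=1e-4):
    """Hessian barrier sketch for the log kernel h(x) = -sum(log x):
    forward Euler of the Hessian Riemannian gradient flow, i.e. the
    direction d = -diag(x**2) @ grad_f(x), with an Armijo backtracking
    step-size policy.  Illustrative sketch, not the full HBA algorithm."""
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        g = grad_f(x)
        d = -x**2 * g                  # inverse kernel-Hessian scaling
        alpha = 1.0
        while True:                    # Armijo backtracking (+ positivity)
            x_new = x + alpha * d
            if np.all(x_new > 0) and f(x_new) <= f(x) + sigma * alpha * (g @ d):
                break
            alpha *= 0.5
        x = x_new
    return x

# Illustrative problem: project t = (1, -0.5) onto the positive orthant,
# i.e. min 0.5*||x - t||^2 over x > 0, with minimizer (1, 0).
t = np.array([1.0, -0.5])
x_star = hba_log_kernel(lambda x: x - t,
                        lambda x: 0.5 * np.sum((x - t)**2),
                        x0=[0.5, 0.5])
```

The interior coordinate converges to 1 rapidly, while the coordinate attracted to the boundary decays only slowly (roughly like 1/k), a sublinear behaviour of the kind quantified in the abstract's rate result.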