Newton-Type Methods for Non-Convex Optimization Under Inexact Hessian Information
We consider variants of trust-region and cubic regularization methods for
non-convex optimization, in which the Hessian matrix is approximated. Under
mild conditions on the inexact Hessian, and using approximate solution of the
corresponding sub-problems, we provide iteration complexity bounds for
achieving ε-approximate second-order optimality, which have been shown to be
tight.
Our Hessian approximation conditions constitute a major relaxation over the
existing ones in the literature. Consequently, we are able to show that such
mild conditions allow for the construction of the approximate Hessian through
various random sampling methods. In this light, we consider the canonical
problem of finite-sum minimization, provide appropriate uniform and non-uniform
sub-sampling strategies to construct such Hessian approximations, and obtain
optimal iteration complexity for the corresponding sub-sampled trust-region and
cubic regularization methods.

Comment: 32 pages
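As a concrete illustration of the sub-sampling idea, here is a minimal sketch, not the paper's algorithm: it specializes to finite-sum least squares, builds the inexact Hessian by uniformly sub-sampling the component Hessians, and solves each cubic-regularized subproblem only approximately by gradient descent on the model. The sample size, regularization weight sigma, step sizes, and iteration counts are all illustrative assumptions.

```python
import numpy as np

def subsampled_hessian(A, sample_size, rng):
    """Uniformly sub-sample the component Hessians a_i a_i^T of the
    least-squares finite sum and average them (inexact Hessian)."""
    idx = rng.choice(A.shape[0], size=sample_size, replace=False)
    As = A[idx]
    return As.T @ As / sample_size

def cubic_step(g, H, sigma, n_model_steps=50, lr=0.1):
    """Approximately minimize the cubic model
    m(s) = g.s + 0.5 s.Hs + (sigma/3)||s||^3
    by gradient descent on m -- one possible inexact subproblem solver."""
    s = np.zeros_like(g)
    for _ in range(n_model_steps):
        grad_m = g + H @ s + sigma * np.linalg.norm(s) * s
        s -= lr * grad_m
    return s

rng = np.random.default_rng(0)
n, d = 2000, 20
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

x = np.zeros(d)
for _ in range(25):
    g = A.T @ (A @ x - b) / n                      # exact gradient
    H = subsampled_hessian(A, sample_size=200, rng=rng)  # inexact Hessian
    x = x + cubic_step(g, H, sigma=1.0)
print(np.linalg.norm(A.T @ (A @ x - b) / n))       # final gradient norm
```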
Distributed Interior-point Method for Loosely Coupled Problems
In this paper, we put forth distributed algorithms for solving loosely
coupled unconstrained and constrained optimization problems. Such problems are
usually solved using algorithms based on a combination of decomposition and
first-order methods. These algorithms are often slow and require many
iterations to converge. To alleviate this issue, we propose algorithms that
combine Newton and interior-point methods with proximal splitting methods for
solving such problems. In particular, the algorithm for solving unconstrained
loosely coupled problems is based on
Newton's method and utilizes proximal splitting to distribute the computations
for calculating the Newton step at each iteration. A combination of this
algorithm and the interior-point method is then used to introduce a distributed
algorithm for solving constrained loosely coupled problems. We also provide
guidelines on how to implement the proposed methods efficiently and briefly
discuss the properties of the resulting solutions.

Comment: Submitted to the 19th IFAC World Congress, 2014
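A minimal sketch of the central primitive, under simplifying assumptions (two agents, a fixed quadratic model, illustrative penalty rho and iteration count): the global Newton step for f = f1 + f2 is computed by consensus ADMM, a standard proximal splitting method, so each agent only ever solves systems with its own local Hessian. This is not the paper's distributed interior-point method itself, only the flavor of distributing a Newton-step computation via splitting.

```python
import numpy as np

def random_spd(rng, d):
    """Random symmetric positive definite matrix (stand-in local Hessian)."""
    M = rng.standard_normal((d, d))
    return M @ M.T + np.eye(d)

rng = np.random.default_rng(1)
d = 5
H = [random_spd(rng, d) for _ in range(2)]      # local Hessians H_i
g = [rng.standard_normal(d) for _ in range(2)]  # local gradients g_i

rho = 1.0                             # ADMM penalty (illustrative)
z = np.zeros(d)                       # consensus copy of the Newton step
u = [np.zeros(d), np.zeros(d)]        # scaled dual variables
for _ in range(200):
    # local proximal steps: solve (H_i + rho I) s_i = -g_i + rho (z - u_i)
    s = [np.linalg.solve(H[i] + rho * np.eye(d), -g[i] + rho * (z - u[i]))
         for i in range(2)]
    z = (s[0] + u[0] + s[1] + u[1]) / 2          # consensus averaging
    u = [u[i] + s[i] - z for i in range(2)]      # dual update

# compare with the centralized Newton step (H1 + H2) s = -(g1 + g2)
newton = np.linalg.solve(H[0] + H[1], -(g[0] + g[1]))
print(np.linalg.norm(z - newton))     # should be close to zero
```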
An optimal subgradient algorithm for large-scale convex optimization in simple domains
This paper shows that the optimal subgradient algorithm, OSGA, proposed in
[NeuO] can be used for solving structured large-scale convex constrained
optimization problems. Only first-order information is required, and the
optimal complexity bounds for both smooth and nonsmooth problems are attained.
More specifically, we consider two classes of problems: (i) a convex objective
with a simple closed convex domain, where the orthogonal projection on this
feasible domain is efficiently available; (ii) a convex objective with a simple
convex functional constraint. If we equip OSGA with an appropriate
prox-function, the OSGA subproblem can be solved either in a closed form or by
a simple iterative scheme, which is especially important for large-scale
problems. We report numerical results for some applications to show the
efficiency of the proposed scheme. A software package implementing OSGA for
the above domains is available.
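The sketch below is not OSGA itself; it only illustrates the projection primitive of case (i): for a simple domain such as a Euclidean ball, the orthogonal projection is available in closed form, which is what keeps subproblems of this kind cheap. The objective, data, and step sizes are illustrative assumptions.

```python
import numpy as np

def project_ball(x, radius=1.0):
    """Orthogonal projection onto the Euclidean ball of the given radius,
    available in closed form -- an example of a 'simple' domain."""
    nrm = np.linalg.norm(x)
    return x if nrm <= radius else x * (radius / nrm)

# Nonsmooth convex objective f(x) = ||Ax - b||_1 with subgradient
# A^T sign(Ax - b); data and step sizes are illustrative.
rng = np.random.default_rng(2)
A = rng.standard_normal((50, 10))
b = rng.standard_normal(50)

x = np.zeros(10)
for k in range(1, 501):
    sg = A.T @ np.sign(A @ x - b)     # subgradient of the l1 loss at x
    step = sg / (np.sqrt(k) * (np.linalg.norm(sg) + 1e-12))
    x = project_ball(x - step)        # projected subgradient step
print(np.linalg.norm(A @ x - b, 1))   # final objective value
```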