Convex optimization based on global lower second-order models
In this paper, we present new second-order algorithms for composite convex
optimization, called Contracting-domain Newton methods. These algorithms are
affine-invariant and based on a global second-order lower approximation for the
smooth component of the objective. Our approach can be interpreted either as a
second-order generalization of the conditional gradient method or as a variant
of the trust-region scheme. Under the assumption that the problem domain is
bounded, we prove a global rate of convergence of the order O(1/k^2) in the
functional residual, where k is the iteration counter, for minimizing convex
functions with Lipschitz continuous Hessian. This significantly improves the
previously known bound for this type of algorithms.
Additionally, we propose a stochastic extension of our method and present
computational results for solving the empirical risk minimization problem.
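For intuition, the Python sketch below shows one plausible contracting-domain second-order step on a ball-constrained problem: at each iteration the quadratic model built at the current point is minimized over a contracted copy of the feasible set. This is an illustrative sketch, not the authors' exact scheme; the contraction schedule 3/(k+3), the Euclidean-ball domain, and the projected-gradient inner solver are assumptions made for this example.

import numpy as np

def project_ball(y, center, radius):
    """Euclidean projection onto the ball {x : ||x - center|| <= radius}."""
    d = y - center
    n = np.linalg.norm(d)
    return y if n <= radius else center + radius * d / n

def contracting_domain_newton(grad, hess, x0, center, radius,
                              iters=30, inner=200):
    """Hypothetical contracting-domain second-order scheme on a Euclidean ball."""
    x = x0.copy()
    for k in range(iters):
        gamma = 3.0 / (k + 3.0)                   # assumed contraction schedule
        c_k = (1.0 - gamma) * x + gamma * center  # contracted ball: center
        r_k = gamma * radius                      # contracted ball: radius
        g, H = grad(x), hess(x)
        # Minimize the quadratic model m(y) = <g, y - x> + 0.5 (y - x)^T H (y - x)
        # over the contracted ball by projected gradient descent (inner loop).
        step = 1.0 / (np.linalg.norm(H, 2) + 1e-12)
        y = project_ball(x, c_k, r_k)
        for _ in range(inner):
            y = project_ball(y - step * (g + H @ (y - x)), c_k, r_k)
        x = y
    return x

# Usage: minimize a convex quadratic 0.5 x^T A x - b^T x over the unit ball.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -2.0])
x_hat = contracting_domain_newton(grad=lambda x: A @ x - b,
                                  hess=lambda x: A,
                                  x0=np.zeros(2),
                                  center=np.zeros(2), radius=1.0)
print("approximate minimizer:", x_hat)

In this reading, the shrinking feasible set plays a role similar to the step size in the classical conditional gradient method, while the quadratic model supplies the second-order information.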
Regularized Newton methods for minimizing functions with Hölder continuous Hessians
In this paper, we study regularized second-order methods for unconstrained minimization of a twice-differentiable (convex or nonconvex) objective function. For the function at hand, these methods automatically achieve the best possible global complexity estimates among different Hölder classes containing the Hessian of the objective. We show that such methods aimed at decreasing the functional residual and those aimed at decreasing the norm of the gradient must be different. For the development of the latter methods, we introduce two new line-search acceptance criteria, which can be seen as generalizations of the Armijo condition.
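As a rough illustration of the algorithmic pattern, the Python sketch below runs an adaptively regularized Newton iteration with a plain Armijo sufficient-decrease test. It does not reproduce the paper's two new acceptance criteria or its Hölder-exponent-adapted regularization; the quadratic (Levenberg-Marquardt-style) regularization (H + sigma*I) d = -g, the doubling/halving update of sigma, and the Rosenbrock test function are assumptions made for this example.

import numpy as np

def regularized_newton(f, grad, hess, x0, iters=200, sigma=1.0,
                       c1=1e-4, tol=1e-8):
    """Generic adaptively regularized Newton method with an Armijo-style test."""
    x = x0.copy()
    for _ in range(iters):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        H = hess(x)
        while True:
            # Regularized Newton direction: (H + sigma I) d = -g.
            try:
                d = np.linalg.solve(H + sigma * np.eye(len(x)), -g)
            except np.linalg.LinAlgError:
                sigma *= 2.0
                continue
            # Armijo-style acceptance: require sufficient decrease along d.
            if g @ d < 0 and f(x + d) <= f(x) + c1 * (g @ d):
                x = x + d
                sigma = max(sigma / 2.0, 1e-12)   # relax regularization
                break
            sigma *= 2.0                          # tighten regularization
    return x

# Usage: a nonconvex two-dimensional test function (Rosenbrock).
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
hess = lambda x: np.array([[2 - 400 * (x[1] - 3 * x[0]**2), -400 * x[0]],
                           [-400 * x[0], 200.0]])
print(regularized_newton(f, grad, hess, np.array([-1.2, 1.0])))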