3 research outputs found

    Convex optimization based on global lower second-order models

    Full text available
    In this paper, we present new second-order algorithms for composite convex optimization, called Contracting-domain Newton methods. These algorithms are affine-invariant and based on a global second-order lower approximation for the smooth component of the objective. Our approach can be interpreted both as a second-order generalization of the conditional gradient method and as a variant of the trust-region scheme. Under the assumption that the problem domain is bounded, we prove a global rate of convergence of $\mathcal{O}(1/k^{2})$ in the functional residual, where $k$ is the iteration counter, for minimizing convex functions with Lipschitz continuous Hessian. This significantly improves the previously known bound of $\mathcal{O}(1/k)$ for this type of algorithm. Additionally, we propose a stochastic extension of our method and present computational results for solving empirical risk minimization problems.
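    As an illustration only (not the paper's exact algorithm), the sketch below shows the general shape of one contracting-domain Newton iteration on a bounded box: a scaled second-order model of the smooth part is minimized over the feasible set, and the iterate moves by a convex combination controlled by a contraction parameter. The schedule gamma_k = 3/(k+3), the box constraints, and the use of SciPy's L-BFGS-B for the model subproblem are all assumptions made for this demo.

```python
import numpy as np
from scipy.optimize import minimize

def contracting_domain_newton(grad, hess, x0, bounds, iters=50):
    """Illustrative contracting-domain-style Newton loop on a box (sketch only)."""
    x = np.asarray(x0, dtype=float)
    for k in range(iters):
        g, H = grad(x), hess(x)
        gamma = 3.0 / (k + 3.0)  # assumed contraction schedule, for illustration

        # Second-order model of the smooth part around x, with the quadratic
        # term scaled by gamma.
        def model(y, x=x, g=g, H=H, gamma=gamma):
            d = y - x
            return g @ d + 0.5 * gamma * d @ (H @ d)

        def model_grad(y, x=x, g=g, H=H, gamma=gamma):
            return g + gamma * (H @ (y - x))

        # Minimize the model over the bounded feasible set (here, a box).
        res = minimize(model, x, jac=model_grad, bounds=bounds, method="L-BFGS-B")
        v = res.x

        # Convex combination keeps the iterate feasible on a convex set.
        x = x + gamma * (v - x)
    return x

# Demo: minimize a convex quadratic over the box [-1, 1]^2.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -2.0])
f = lambda x: 0.5 * x @ (A @ x) + b @ x
grad = lambda x: A @ x + b
hess = lambda x: A

x_star = contracting_domain_newton(grad, hess, np.zeros(2), bounds=[(-1, 1), (-1, 1)])
print("x* =", x_star, "f(x*) =", f(x_star))
```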

    Regularized Newton methods for minimizing functions with Hölder continuous Hessians

    No full text
    In this paper, we study regularized second-order methods for the unconstrained minimization of a twice-differentiable (convex or nonconvex) objective function. For the function at hand, these methods automatically achieve the best possible global complexity estimates among the different Hölder classes containing the Hessian of the objective. We show that the methods attaining such estimates for the functional residual and for the norm of the gradient must be different. For the development of the latter methods, we introduce two new line-search acceptance criteria, which can be seen as generalizations of the Armijo condition.
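    For illustration, the sketch below shows a generic adaptive regularized Newton loop. The classical Armijo sufficient-decrease test and the doubling/halving update of the regularization parameter sigma are stand-ins, not the paper's new acceptance criteria, and the Rosenbrock function is only a demo problem.

```python
import numpy as np

def regularized_newton(f, grad, hess, x0, sigma0=1.0, c1=1e-4, tol=1e-8, max_iters=100):
    """Generic adaptive regularized Newton sketch with a classical Armijo test."""
    x = np.asarray(x0, dtype=float)
    sigma = sigma0
    n = x.size
    for _ in range(max_iters):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        H = hess(x)
        while True:
            # Regularized Newton step: (H + sigma * I) d = -g.
            d = np.linalg.solve(H + sigma * np.eye(n), -g)
            # Classical Armijo sufficient-decrease test on the full step.
            if f(x + d) <= f(x) + c1 * (g @ d):
                x = x + d
                sigma = max(0.5 * sigma, 1e-12)  # relax regularization after success
                break
            sigma *= 2.0  # step rejected: tighten regularization and retry
    return x

# Demo: a smooth nonconvex test function (Rosenbrock).
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
hess = lambda x: np.array([
    [2 - 400 * (x[1] - 3 * x[0]**2), -400 * x[0]],
    [-400 * x[0], 200.0],
])

print(regularized_newton(f, grad, hess, np.array([-1.2, 1.0])))
```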