9 research outputs found

    Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions

    No full text
    In this paper, we study accelerated regularized Newton methods for minimizing objectives formed as a sum of two functions: one is convex and twice differentiable with Hölder-continuous Hessian, and the other is a simple closed convex function. For the case in which the Hölder parameter ν ∈ [0, 1] is known, we propose methods that take at most O(1/ε^{1/(2+ν)}) iterations to reduce the functional residual below a given precision ε > 0. For the general case, in which ν is not known, we propose a universal method that ensures the same precision in at most O(1/ε^{2/[3(1+ν)]}) iterations.
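    The two iteration bounds quoted in the abstract can be compared numerically. A minimal sketch, with the hidden constants in the O(·) bounds set to 1 purely for illustration (the paper does not specify them):

    ```python
    import math

    def known_nu_bound(eps: float, nu: float) -> float:
        """Iteration bound O(1/eps^{1/(2+nu)}) for the accelerated
        method when the Hoelder parameter nu is known.
        Hidden constant taken as 1 for illustration only."""
        return eps ** (-1.0 / (2.0 + nu))

    def universal_bound(eps: float, nu: float) -> float:
        """Iteration bound O(1/eps^{2/[3(1+nu)]}) for the universal
        method, which does not need to know nu."""
        return eps ** (-2.0 / (3.0 * (1.0 + nu)))

    # For nu = 1 (Lipschitz-continuous Hessian) both exponents equal 1/3,
    # so the universal method matches the accelerated rate up to constants.
    for eps in (1e-2, 1e-4, 1e-6):
        print(eps, known_nu_bound(eps, 1.0), universal_bound(eps, 1.0))
    ```

    For ν = 1 and ε = 10⁻⁶ both bounds evaluate to 10⁶^(1/3) = 100 iterations, illustrating that the universal method loses nothing (up to constants) in the smoothest case.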
