An inexact regularized proximal Newton method for nonconvex and nonsmooth optimization

Abstract

This paper focuses on the minimization of the sum of a twice continuously differentiable function $f$ and a nonsmooth convex function. We propose an inexact regularized proximal Newton method with an approximation of the Hessian $\nabla^2 f(x)$ involving the $\varrho$th power of the KKT residual. For $\varrho=0$, we demonstrate the global convergence of the iterate sequence for the KL objective function and its $R$-linear convergence rate for the KL objective function of exponent $1/2$. For $\varrho\in(0,1)$, we establish the global convergence of the iterate sequence and its superlinear convergence rate of order $q(1+\varrho)$ under the assumption that cluster points satisfy a H\"{o}lderian local error bound of order $q\in(\max(\varrho,\frac{1}{1+\varrho}),1]$ on the strong stationary point set; when cluster points satisfy a local error bound of order $q>1+\varrho$ on the common stationary point set, we also obtain the global convergence of the iterate sequence and its superlinear convergence rate of order $\frac{(q-\varrho)^2}{q}$ if $q>\frac{2\varrho+1+\sqrt{4\varrho+1}}{2}$. A dual semismooth Newton augmented Lagrangian method is developed to seek an inexact minimizer of each subproblem. Numerical comparisons with two state-of-the-art methods on $\ell_1$-regularized Student's $t$-regression, group penalized Student's $t$-regression, and nonconvex image restoration confirm the efficiency of the proposed method.
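
To make the Hessian approximation concrete, the display below is a minimal sketch of a typical regularized proximal Newton subproblem, assuming the nonsmooth convex term is denoted $g$, the KKT residual at the iterate $x^k$ is $r(x^k)$, and $c>0$ is a hypothetical regularization constant; the paper's exact construction may differ.
\[
x^{k+1} \approx \operatorname*{arg\,min}_{x}\ \langle \nabla f(x^k),\, x-x^k\rangle
+ \tfrac{1}{2}\langle x-x^k,\, G_k\,(x-x^k)\rangle + g(x),
\qquad
G_k = \nabla^2 f(x^k) + c\, r(x^k)^{\varrho}\, I,
\]
where the subproblem minimizer is computed only inexactly, consistent with the "inexact" qualifier in the title; for $\varrho=0$ the regularization reduces to a constant shift $cI$ of the Hessian.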
