This paper focuses on the minimization of the sum of a twice continuously
differentiable function f and a nonsmooth convex function. We propose an
inexact regularized proximal Newton method based on an approximation of the
Hessian ∇²f(x) involving the ϱ-th power of the KKT residual. For
ϱ=0, we demonstrate the global convergence of the iterate sequence for
the KL objective function and its R-linear convergence rate for the KL
objective function of exponent 1/2. For ϱ∈(0,1), we establish the
global convergence of the iterate sequence and its superlinear convergence rate
of order q(1+ϱ) under an assumption that cluster points satisfy a
local Hölderian error bound of order
q ∈ (max(ϱ, 1/(1+ϱ)), 1] on the strong stationary point set;
and when cluster points satisfy a local error bound of order q>1+ϱ on
the common stationary point set, we also obtain the global convergence of the
iterate sequence, and its superlinear convergence rate of order
(q−ϱ)²/q if q > (2ϱ+1+√(4ϱ+1))/2. A dual
semismooth Newton augmented Lagrangian method is developed for seeking an
inexact minimizer of the subproblem. Numerical comparisons with two
state-of-the-art methods on ℓ1-regularized Student's t-regression,
group penalized Student's t-regression, and nonconvex image restoration
confirm the efficiency of the proposed method.
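As an illustrative sketch of the kind of iteration described above, the following Python code applies a regularized proximal Newton step to ℓ1-regularized least squares, with the Hessian perturbation μ_k = c·r(x_k)^ϱ tied to the ϱ-th power of the KKT residual r(x_k). The choice f(x) = ½‖Ax − b‖², the full step without line search, and the crude proximal-gradient inner solver are simplifying assumptions for illustration; they are not the paper's exact algorithm, which uses a dual semismooth Newton augmented Lagrangian method for the subproblems and additional safeguards.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal map of t*||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def kkt_residual(x, grad, lam):
    # Norm of the natural residual x - prox_{lam*||.||_1}(x - grad f(x));
    # it vanishes exactly at stationary points.
    return np.linalg.norm(x - soft_threshold(x - grad, lam))

def reg_prox_newton(A, b, lam, varrho=0.5, c=1.0, tol=1e-8, max_iter=50):
    """Simplified regularized proximal Newton sketch for
    min 0.5*||Ax - b||^2 + lam*||x||_1, so that hessian H = A^T A.
    The regularization mu_k = c * r(x_k)^varrho couples the Hessian
    perturbation to the varrho-th power of the KKT residual."""
    n = A.shape[1]
    x = np.zeros(n)
    H = A.T @ A
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)
        r = kkt_residual(x, grad, lam)
        if r <= tol:
            break
        G = H + c * r**varrho * np.eye(n)  # regularized Hessian model
        # Inexactly minimize the subproblem
        #   q(d) = grad^T d + 0.5 d^T G d + lam*||x + d||_1
        # by proximal gradient steps on d (a crude inner solver).
        L = np.linalg.norm(G, 2)  # Lipschitz constant of d -> grad + G d
        d = np.zeros(n)
        for _ in range(200):
            g_sub = grad + G @ d
            d_new = soft_threshold(x + d - g_sub / L, lam / L) - x
            if np.linalg.norm(d_new - d) <= 1e-12:
                d = d_new
                break
            d = d_new
        x = x + d  # full step; the actual method adds safeguards
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
b = rng.standard_normal(30)
x = reg_prox_newton(A, b, lam=0.1)
```

Because mu_k shrinks with the KKT residual, the model approaches the pure proximal Newton model near a solution (enabling fast local rates), while staying well conditioned far from stationarity.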