In recent years, the proximal gradient method and its variants have been
generalized to Riemannian manifolds for solving optimization problems with an
additively separable structure, i.e., f+h, where f is continuously
differentiable, and h may be nonsmooth but is convex with a computationally
reasonable proximal mapping. In this paper, we generalize the proximal Newton
method to embedded submanifolds for solving this type of problem with h(x) = μ∥x∥₁. The generalization relies on the Weingarten map and on semismooth
analysis. It is shown that the Riemannian proximal Newton method has a local
superlinear convergence rate under certain reasonable assumptions. Moreover, a
hybrid version is given by concatenating a Riemannian proximal gradient method
and the Riemannian proximal Newton method. It is shown that if the objective
function satisfies the Riemannian Kurdyka–Łojasiewicz (KL) property and the switch parameter is
chosen appropriately, then the hybrid method converges globally and also has a
local superlinear convergence rate. Numerical experiments on random and
synthetic data are used to demonstrate the performance of the proposed methods.
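
For concreteness, the problem class can be stated as follows (our restatement of the abstract; the manifold symbol $\mathcal{M}$ is not fixed by the text above):

```latex
\min_{x \in \mathcal{M}} \; F(x) = f(x) + h(x), \qquad h(x) = \mu \|x\|_1, \quad \mu > 0,
```

where $\mathcal{M}$ is an embedded submanifold of $\mathbb{R}^n$ and $f$ is continuously differentiable.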
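As an illustration of the proximal gradient component of the hybrid method, the sketch below performs one Riemannian proximal gradient step on the unit sphere with h(x) = μ∥x∥₁. It is a minimal sketch, not the paper's implementation: the choice of the sphere as the embedded submanifold, the function names, and the bisection-based tangent-space subproblem solver are our assumptions.

```python
import numpy as np

def soft_threshold(z, tau):
    """Proximal mapping of tau * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def prox_grad_step_sphere(x, egrad, t, mu, alpha=1.0):
    """One Riemannian proximal gradient step on the unit sphere (hypothetical sketch).

    Tangent-space subproblem:
        min_{v : x^T v = 0}  <g, v> + (1/(2t)) ||v||^2 + mu * ||x + v||_1,
    where g is the Euclidean gradient (over tangent v it agrees with the
    Riemannian gradient).  Substituting u = x + v, the optimality conditions give
        u(lam) = soft_threshold(x - t*g - t*lam*x, t*mu),
    with the scalar multiplier lam chosen so that x^T u(lam) = 1.
    """
    g = egrad(x)
    base = x - t * g

    def residual(lam):
        # x^T u(lam) - 1 is piecewise linear and nonincreasing in lam
        return x @ soft_threshold(base - t * lam * x, t * mu) - 1.0

    # Bracket the root by expansion, then bisect.
    lo, hi = -1.0, 1.0
    while residual(lo) < 0.0:
        lo *= 2.0
    while residual(hi) > 0.0:
        hi *= 2.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if residual(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    u = soft_threshold(base - t * 0.5 * (lo + hi) * x, t * mu)
    v = u - x                            # tangent direction (up to tolerance)
    y = x + alpha * v
    return y / np.linalg.norm(y)         # retraction: renormalize to the sphere

if __name__ == "__main__":
    # Demo: sparse least squares on the sphere, f(x) = 0.5*||A x - b||^2.
    rng = np.random.default_rng(0)
    A, b, mu = rng.standard_normal((20, 10)), rng.standard_normal(20), 0.1
    x = rng.standard_normal(10)
    x /= np.linalg.norm(x)
    for _ in range(300):
        x = prox_grad_step_sphere(x, lambda z: A.T @ (A @ z - b), t=0.01, mu=mu)
    print("F(x) =", 0.5 * np.sum((A @ x - b) ** 2) + mu * np.abs(x).sum())
```

In a hybrid scheme of the kind described above, one could run steps like this until a stationarity measure (e.g., ∥v∥) falls below the switch parameter and then hand off to the Riemannian proximal Newton iteration; the switching rule here is our paraphrase, not the paper's exact criterion.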