Proximal methods are known to identify the underlying substructure of
nonsmooth optimization problems. Moreover, in many interesting situations, the
output of a proximity operator comes with its structure at no additional cost,
and convergence is improved once it matches the structure of a minimizer.
However, it is impossible in general to know whether the current structure is
final or not; such highly valuable information has to be exploited adaptively.
To do so, we consider the case where a proximal gradient method can
identify manifolds of differentiability of the nonsmooth objective. Leveraging
this manifold identification, we show that Riemannian Newton-like methods can
be intertwined with the proximal gradient steps to drastically boost the
convergence. We prove superlinear convergence of the algorithm when solving
nondegenerate nonsmooth nonconvex optimization problems. We provide
numerical illustrations on optimization problems regularized by the ℓ1-norm
or the trace norm.
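
To make the interplay concrete, here is a minimal sketch for the ℓ1-regularized case, where the proximity operator is soft-thresholding and the identified manifold is the set of vectors with a fixed support. The names (soft_threshold, prox_grad_newton) and the interface (callables grad and hess for the smooth part of the objective) are illustrative assumptions, not the paper's implementation: a Newton step is attempted on the current support and accepted only if it preserves the identified structure.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximity operator of t * ||.||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_grad_newton(grad, hess, lam, x0, step, n_iter=50):
    """Alternate proximal gradient steps (whose output reveals the support
    of the iterate at no extra cost) with Newton steps restricted to the
    identified manifold {x : supp(x) = S}, on which the objective
    x |-> f(x) + lam * ||x||_1 is smooth. Hypothetical interface: grad and
    hess evaluate the gradient and Hessian of the smooth part f."""
    x = x0.astype(float).copy()
    for _ in range(n_iter):
        # Proximal gradient step: identifies the active structure.
        x = soft_threshold(x - step * grad(x), step * lam)
        S = np.flatnonzero(x)                 # current support
        if S.size == 0:
            continue
        signs = np.sign(x[S])
        # On the fixed-support manifold the objective reduces to the smooth
        # function f(x) + lam * signs @ x[S]; take a Newton step on it.
        g = grad(x)[S] + lam * signs
        H = hess(x)[np.ix_(S, S)]
        try:
            d = np.linalg.solve(H, -g)
        except np.linalg.LinAlgError:
            continue                          # skip if Hessian is singular
        x_trial = x.copy()
        x_trial[S] += d
        # Keep the Newton step only if it preserves the identified structure.
        if np.all(np.sign(x_trial[S]) == signs):
            x = x_trial
    return x
```

For instance, with a least-squares term f(x) = ½‖Ax − b‖², one would pass grad = lambda x: A.T @ (A @ x - b) and hess = lambda x: A.T @ A.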