Low-Rank Tensor Recovery with Euclidean-Norm-Induced Schatten-p Quasi-Norm Regularization
The nuclear norm and Schatten-p quasi-norm of a matrix are popular rank
proxies in low-rank matrix recovery. Unfortunately, computing the nuclear norm
or Schatten-p quasi-norm of a tensor is NP-hard, which hinders low-rank
tensor completion (LRTC) and tensor robust principal component analysis
(TRPCA). In this paper, we propose a new class of rank regularizers based on
the Euclidean norms of the CP component vectors of a tensor and show that these
regularizers are monotonic transformations of the tensor Schatten-p quasi-norm.
This connection enables us to minimize the Schatten-p quasi-norm in LRTC and
TRPCA implicitly. The methods do not use the singular value decomposition and
hence scale to big tensors. Moreover, the methods are not sensitive to the
choice of initial rank and provide an arbitrarily sharper rank proxy for
low-rank tensor recovery compared to the nuclear norm. We provide theoretical
guarantees in terms of recovery error for LRTC and TRPCA, which show that a
smaller Schatten-p quasi-norm leads to tighter error bounds.
Experiments using LRTC and TRPCA on synthetic data and natural images verify
the effectiveness and superiority of our methods compared to baseline methods
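To make the regularizer concrete: the abstract describes penalties built from the Euclidean norms of a tensor's CP component vectors. A minimal NumPy sketch of one such surrogate is below; the product-of-norms form, the function name `cp_euclidean_reg`, and the exponent `p` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def cp_euclidean_reg(factors, p=1.0):
    """Hypothetical CP-based rank surrogate.

    factors: list of mode factor matrices [A (I x R), B (J x R), C (K x R)]
             whose r-th columns form the r-th rank-one CP component.
    Returns the sum over components r of the product of the Euclidean
    norms of the component vectors, raised to the power p. No SVD is
    needed, which is consistent with the scalability claim above.
    """
    # column norms per mode, stacked into a (num_modes, R) array
    norms = np.stack([np.linalg.norm(F, axis=0) for F in factors])
    return float(np.sum(np.prod(norms, axis=0) ** p))

# Example: a single rank-one component with vector norms 5, 1, and 2
A = np.array([[3.0], [4.0]])   # ||a|| = 5
B = np.array([[1.0], [0.0]])   # ||b|| = 1
C = np.array([[0.0], [2.0]])   # ||c|| = 2
print(cp_euclidean_reg([A, B, C], p=1.0))  # → 10.0
```

Such a penalty is cheap to evaluate and differentiate with respect to the factor matrices, which is what allows implicit Schatten-p minimization without forming singular values.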