Tensor train rank minimization with nonlocal self-similarity for tensor completion
The tensor train (TT) rank has received increasing attention in tensor
completion due to its ability to capture the global correlation of high-order
tensors. For third-order visual data, however, direct TT rank
minimization fails to exploit the potential of the TT rank for high-order tensors.
TT rank minimization accompanied by ket augmentation, which
transforms a lower-order tensor (e.g., visual data) into a higher-order tensor,
suffers from serious block artifacts. To tackle this issue, we propose TT
rank minimization with nonlocal self-similarity for tensor completion,
simultaneously exploiting the spatial, temporal/spectral, and nonlocal
redundancy in visual data. More precisely, the TT rank minimization is
performed on a higher-order tensor, called a group, formed by stacking similar
cubes, which naturally and fully exploits the strength of the TT rank for
high-order tensors. Moreover, a perturbation analysis of the TT low-rankness
of each group is established. We develop an alternating direction method of
multipliers (ADMM) algorithm tailored to the specific structure of the proposed model.
Extensive experiments demonstrate that the proposed method outperforms
several existing state-of-the-art methods in terms of both qualitative and
quantitative measures.
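To make the grouping idea concrete, here is a minimal sketch, not the authors' implementation: the function name tt_ranks, the tolerance, and the synthetic 8x8x8 cubes are illustrative assumptions. It shows how stacking nonlocally similar cubes into a fourth-order group yields a tensor whose TT ranks, estimated by sequential SVDs, become small along the stacking mode:

```python
import numpy as np

def tt_ranks(tensor, tol=1e-2):
    """Estimate the TT ranks of a d-way tensor via sequential SVDs (TT-SVD style).

    Each rank is the number of singular values of the corresponding sequential
    unfolding exceeding `tol` times the largest one. Illustrative sketch only.
    """
    dims = tensor.shape
    ranks = []
    mat = tensor.reshape(dims[0], -1)          # first sequential unfolding
    for k in range(len(dims) - 1):
        U, s, Vt = np.linalg.svd(mat, full_matrices=False)
        rk = int(np.sum(s > tol * s[0]))
        ranks.append(rk)
        if k < len(dims) - 2:
            # carry the rank dimension forward into the next unfolding
            mat = (s[:rk, None] * Vt[:rk]).reshape(rk * dims[k + 1], -1)
    return ranks

# Hypothetical "group": 50 nonlocally similar 8x8x8 cubes stacked along a new
# fourth mode. Because the cubes are near-duplicates, the group is TT low-rank.
rng = np.random.default_rng(0)
base = rng.random((8, 8, 8))
group = np.stack([base + 0.01 * rng.random((8, 8, 8)) for _ in range(50)], axis=-1)
print(tt_ranks(group))   # last TT rank is tiny, reflecting the nonlocal redundancy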
Tensor N-tubal rank and its convex relaxation for low-rank tensor recovery
As low-rank modeling has achieved great success in tensor recovery, many
research efforts have been devoted to defining the tensor rank. Among them, the
recently popular tensor tubal rank, defined based on the tensor singular value
decomposition (t-SVD), has obtained promising results. However, the framework of the
t-SVD and the tensor tubal rank are applicable only to three-way tensors and
lack the flexibility to handle different correlations along different modes. To
tackle these two issues, we define a new tensor unfolding operator, named
mode-$k_1k_2$ tensor unfolding, as the process of lexicographically stacking
the mode-$k_1k_2$ slices of an $N$-way tensor into a three-way tensor, which is
a three-way extension of the well-known mode-$k$ tensor matricization. Based on
it, we define a novel tensor rank, the tensor $N$-tubal rank, as a vector whose
elements contain the tubal ranks of all mode-$k_1k_2$ unfolding tensors, to
depict the correlations along different modes. To efficiently minimize the
proposed $N$-tubal rank, we establish its convex relaxation: the weighted sum
of tensor nuclear norm (WSTNN). Then, we apply WSTNN to low-rank tensor
completion (LRTC) and tensor robust principal component analysis (TRPCA). The
corresponding WSTNN-based LRTC and TRPCA models are proposed, and two efficient
alternating direction method of multipliers (ADMM)-based algorithms are
developed to solve the proposed models. Numerical experiments demonstrate that
the proposed models significantly outperform the compared ones.
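As a companion sketch (again illustrative: the helper names, the slice ordering inside the unfolding, and the $1/n_3$ normalization in the tensor nuclear norm are assumptions, not the paper's code), the mode-$k_1k_2$ unfolding, the tubal rank of a three-way tensor via the t-SVD, and the WSTNN surrogate can be prototyped as follows:

```python
import itertools
import numpy as np

def mode_k1k2_unfold(X, k1, k2):
    """Stack the mode-(k1, k2) slices of an N-way tensor into a 3-way tensor.

    Assumed ordering: modes k1 and k2 become the first two modes; the remaining
    modes are merged lexicographically into the third mode.
    """
    rest = [m for m in range(X.ndim) if m not in (k1, k2)]
    return np.transpose(X, (k1, k2, *rest)).reshape(X.shape[k1], X.shape[k2], -1)

def tubal_rank(Y, tol=1e-8):
    """Tubal rank via the t-SVD: FFT along the third mode, then the maximum
    matrix rank over the frontal slices in the Fourier domain."""
    Yf = np.fft.fft(Y, axis=2)
    return max(np.linalg.matrix_rank(Yf[:, :, i], tol=tol)
               for i in range(Y.shape[2]))

def n_tubal_rank(X):
    """N-tubal rank: the vector of tubal ranks of all mode-(k1, k2) unfoldings."""
    return [tubal_rank(mode_k1k2_unfold(X, k1, k2))
            for k1, k2 in itertools.combinations(range(X.ndim), 2)]

def tnn(Y):
    # Tensor nuclear norm of a 3-way tensor: sum of the singular values of the
    # Fourier-domain frontal slices (normalized by n3, one common convention).
    Yf = np.fft.fft(Y, axis=2)
    return sum(np.linalg.svd(Yf[:, :, i], compute_uv=False).sum()
               for i in range(Y.shape[2])) / Y.shape[2]

def wstnn(X, weights):
    # WSTNN: weighted sum of the TNNs of all mode-(k1, k2) unfoldings.
    pairs = itertools.combinations(range(X.ndim), 2)
    return sum(w * tnn(mode_k1k2_unfold(X, k1, k2))
               for w, (k1, k2) in zip(weights, pairs))

X = np.random.default_rng(0).random((5, 6, 7, 8))  # a hypothetical 4-way tensor
print(n_tubal_rank(X))                             # one tubal rank per mode pair
print(wstnn(X, weights=[1 / 6] * 6))               # equal weights over the C(4,2) pairs
```

In a WSTNN-based LRTC or TRPCA model, this quantity would serve as the convex penalty inside an ADMM solver, with the weights balancing the contribution of each mode pair.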