In tensor completion tasks, traditional low-rank tensor decomposition
models suffer from laborious model selection due to their high
sensitivity to the chosen rank. In particular, for tensor ring (TR) decomposition, the
number of model possibilities grows exponentially with the tensor order, which
makes it rather challenging to find the optimal TR decomposition. In this
paper, by exploiting the low-rank structure of the TR latent space, we propose
a novel tensor completion method which is robust to model selection. In
contrast to imposing the low-rank constraint on the data space, we introduce
nuclear norm regularization on the latent TR factors, so that the singular
value decomposition (SVD) required during optimization is performed at a
much smaller scale. By leveraging the alternating direction method of
multipliers (ADMM) scheme, the latent TR factors with optimal rank and the
recovered tensor can be obtained simultaneously. Our proposed algorithm is
shown to effectively alleviate the burden of TR-rank selection, thereby greatly
reducing the computational cost. Extensive experimental results on both
synthetic and real-world data demonstrate the superior performance and
efficiency of the proposed approach compared with state-of-the-art algorithms.
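The core idea above, applying the nuclear-norm proximal step (singular value thresholding) to unfoldings of the small TR cores rather than to unfoldings of the full tensor, can be illustrated with a minimal sketch. The core shapes, TR-ranks, and threshold `tau` below are hypothetical choices, not values from the paper:

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: the proximal operator of
    tau * nuclear norm, used inside each ADMM iteration."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

# Hypothetical TR cores for a 4th-order tensor of shape (20, 20, 20, 20)
# with uniform TR-rank r = 5; each core G_k has shape (r, n_k, r).
rng = np.random.default_rng(0)
r, n, order = 5, 20, 4
cores = [rng.standard_normal((r, n, r)) for _ in range(order)]

# Proximal step on each core's mode-2 unfolding (an n x r^2 matrix):
# SVDs here run on 20 x 25 matrices, whereas a nuclear norm on the data
# space would require SVDs of 20 x 8000 unfoldings of the full tensor.
shrunk = [svt(G.transpose(1, 0, 2).reshape(n, r * r), tau=0.5)
          for G in cores]
print(shrunk[0].shape)  # (20, 25)
```

The scale difference between the two SVD problems is where the claimed computational saving comes from; the full ADMM scheme additionally alternates these thresholding steps with data-fitting updates on the observed entries.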