Completion of High Order Tensor Data with Missing Entries via Tensor-train Decomposition
In this paper, we aim at the completion problem of high order tensor data
with missing entries. The existing tensor factorization and completion methods
suffer from the curse of dimensionality when the tensor order N >> 3. To
overcome this problem, we propose an efficient algorithm called TT-WOPT
(Tensor-train Weighted OPTimization) to find the latent core tensors of tensor
data and recover the missing entries. Tensor-train decomposition, which has the
powerful representation ability with linear scalability to tensor order, is
employed in our algorithm. The experimental results on synthetic data and
natural image completion demonstrate that our method significantly outperforms
the other related methods. Especially when the missing rate of the data is very
high, e.g., 85% to 99%, our algorithm achieves much better performance than
other state-of-the-art algorithms.
Comment: 8 pages, ICONIP 201
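The TT model and the weighted objective described above can be sketched in a few lines of NumPy. This is a minimal illustration under our own naming, not the authors' implementation; TT-WOPT itself optimizes the cores with a gradient-based method.

```python
import numpy as np

def tt_reconstruct(cores):
    """Contract a list of TT cores (core k has shape r_{k-1} x n_k x r_k,
    with boundary ranks r_0 = r_N = 1) into the full tensor."""
    full = cores[0]
    for core in cores[1:]:
        # contract the trailing bond index with the next core's leading one
        full = np.tensordot(full, core, axes=([-1], [0]))
    return full.squeeze(axis=(0, -1))

def tt_wopt_loss(cores, data, weight):
    """Weighted squared error on observed entries only (weight is 1 where
    an entry is observed, 0 where it is missing) -- the objective that
    TT-WOPT minimizes over the cores."""
    return 0.5 * np.sum((weight * (tt_reconstruct(cores) - data)) ** 2)
```

Once the cores are fitted, the missing entries are read off from `tt_reconstruct(cores)` at the unobserved positions.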
High-order Tensor Completion for Data Recovery via Sparse Tensor-train Optimization
In this paper, we aim at the problem of tensor data completion. Tensor-train
decomposition is adopted because of its powerful representation ability and
linear scalability to tensor order. We propose an algorithm named Sparse
Tensor-train Optimization (STTO) which treats incomplete data as a sparse
tensor and uses a first-order optimization method to find the factors of the
tensor-train decomposition. Our algorithm is shown to perform well in
simulation experiments in both low-order and high-order cases. We also
employ a tensorization method to transform data to a higher-order form to
enhance the performance of our algorithm. The results of image recovery
experiments in various cases manifest that our method outperforms other
completion algorithms. Especially when the missing rate is very high, e.g.,
90% to 99%, our method is significantly better than the state-of-the-art
methods.
Comment: 5 pages (including 1 page of references), ICASSP 201
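The sparse-tensor view above rests on evaluating individual TT entries without ever forming the full tensor; a minimal sketch (function names are ours, not the paper's code):

```python
import numpy as np

def tt_entry(cores, idx):
    """Evaluate one tensor entry directly from the TT cores, without
    forming the full tensor -- the access pattern that lets STTO treat
    incomplete data as a sparse set of (index, value) pairs."""
    vec = cores[0][:, idx[0], :]              # shape 1 x r_1
    for core, i in zip(cores[1:], idx[1:]):
        vec = vec @ core[:, i, :]             # propagate the bond rank
    return vec.item()                          # final shape 1 x 1

def observed_residuals(cores, indices, values):
    """Residuals on the observed entries only; a first-order method
    drives these toward zero to fit the cores."""
    return np.array([tt_entry(cores, idx) - v
                     for idx, v in zip(indices, values)])
```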
Efficient tensor completion for color image and video recovery: Low-rank tensor train
This paper proposes a novel approach to tensor completion, which recovers
missing entries of data represented by tensors. The approach is based on the
tensor train (TT) rank, which is able to capture hidden information from
tensors thanks to its definition from a well-balanced matricization scheme.
Accordingly, new optimization formulations for tensor completion are proposed
as well as two new algorithms for their solution. The first, called simple
low-rank tensor completion via tensor train (SiLRTC-TT), is intimately related
to minimizing a nuclear norm based on the TT rank. The second is derived from a
multilinear matrix factorization model to approximate the TT rank of a tensor,
and is called tensor completion by parallel matrix factorization via tensor
train (TMac-TT). A tensor augmentation scheme of transforming a low-order
tensor to higher-orders is also proposed to enhance the effectiveness of
SiLRTC-TT and TMac-TT. Simulation results for color image and video recovery
show the clear advantage of our method over all other methods.
Comment: Submitted to the IEEE Transactions on Image Processing. arXiv admin
note: substantial text overlap with arXiv:1601.0108
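The well-balanced matricization behind the TT rank, and the nuclear-norm surrogate that SiLRTC-TT minimizes, can be sketched as follows (illustrative names; the uniform weighting is our simplification):

```python
import numpy as np

def tt_unfold(tensor, k):
    """Canonical (well-balanced) matricization: group the first k modes as
    rows and the remaining modes as columns.  The rank of this matrix is
    the k-th TT rank of the tensor."""
    return tensor.reshape(int(np.prod(tensor.shape[:k])), -1)

def tt_nuclear_norm(tensor):
    """Sum of nuclear norms of all TT unfoldings -- the convex surrogate
    for the TT rank minimized by SiLRTC-TT (uniform weights here)."""
    return sum(np.linalg.norm(tt_unfold(tensor, k), 'nuc')
               for k in range(1, tensor.ndim))
```

Because k splits the modes near the middle for the balanced unfoldings, these matrices stay far closer to square than the mode-k unfoldings used by Tucker-based methods, which is why the TT rank captures more global correlation.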
Higher-dimension Tensor Completion via Low-rank Tensor Ring Decomposition
The problem of incomplete data is common in signal processing and machine
learning. Tensor completion algorithms aim to recover the incomplete data from
its partially observed entries. In this paper, taking advantage of the high
compressibility and flexibility of the recently proposed tensor ring (TR)
decomposition, we propose a new tensor completion approach named tensor ring
weighted optimization (TR-WOPT). It finds the latent factors of the incomplete
tensor by a gradient descent algorithm; the latent factors are then employed to
predict the missing entries of the tensor. We conduct various tensor completion
experiments on synthetic data and real-world data. The simulation results show
that TR-WOPT performs well on various high-dimensional tensors. Furthermore,
image completion results show that our proposed algorithm outperforms the
state-of-the-art algorithms in many situations. Especially when the missing
rate of the test images is high (e.g., over 0.9), the performance of our
TR-WOPT is significantly better than that of the compared algorithms.
Comment: APSIPA 2018 conference paper. arXiv admin note: substantial text
overlap with arXiv:1805.0846
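The TR model underlying TR-WOPT differs from the tensor train only in that the boundary ranks are equal and the chain closes with a trace; a minimal NumPy sketch (TR-WOPT then fits the cores by gradient descent on a weighted error over the observed entries, as in the weighted TT objective):

```python
import numpy as np

def tr_reconstruct(cores):
    """Contract tensor-ring cores (core k has shape r_{k-1} x n_k x r_k,
    with r_0 = r_N so the chain closes into a ring) into the full tensor."""
    full = cores[0]
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=([-1], [0]))
    # full now has shape (r_0, n_1, ..., n_N, r_0); the trace over the
    # first and last bond indices closes the ring
    return np.trace(full, axis1=0, axis2=-1)
```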
Efficient tensor completion: Low-rank tensor train
This paper proposes a novel formulation of the tensor completion problem to
impute missing entries of data represented by tensors. The formulation is
introduced in terms of tensor train (TT) rank which can effectively capture
global information of tensors thanks to its construction by a well-balanced
matricization scheme. Two algorithms are proposed to solve the corresponding
tensor completion problem. The first, called simple low-rank tensor
completion via tensor train (SiLRTC-TT), is intimately related to minimizing the
TT nuclear norm. The second is based on a multilinear matrix factorization
model to approximate the TT rank of the tensor and is called tensor completion by
parallel matrix factorization via tensor train (TMac-TT). These algorithms are
applied to complete both synthetic and real-world data tensors. Simulation
results of synthetic data show that the proposed algorithms are efficient in
estimating missing entries for tensors with either low Tucker rank or TT rank
while Tucker-based algorithms are only comparable in the case of low Tucker
rank tensors. When applied to recover color images represented by ninth-order
tensors augmented from third-order ones, the proposed algorithms outperform
the Tucker-based algorithms.
Comment: 11 pages, 9 figure
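The TMac-style factorization idea can be sketched for a single unfolding: alternately refit low-rank factors to the current estimate, refill the matrix, and restore the observed data. This is our simplified single-mode stand-in; TMac-TT runs such updates in parallel over all balanced TT unfoldings.

```python
import numpy as np

def tmac_step(X, U, V, observed):
    """One alternation of the low-rank factorization model X ~ U @ V:
    update each factor by least squares against the current estimate,
    refill the estimate from the factors, then restore observed entries.
    A simplified single-unfolding sketch of the TMac idea."""
    U = X @ np.linalg.pinv(V)        # least-squares update of U given V
    V = np.linalg.pinv(U) @ X        # least-squares update of V given U
    X_new = U @ V                    # low-rank refill of missing entries
    X_new[observed] = X[observed]    # keep the known (observed) data
    return X_new, U, V
```

Iterating this step drives the unobserved entries toward a low-rank completion while never touching the observed ones.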
Tensor Ring Decomposition with Rank Minimization on Latent Space: An Efficient Approach for Tensor Completion
In tensor completion tasks, the traditional low-rank tensor decomposition
models suffer from the laborious model selection problem due to their high
model sensitivity. In particular, for tensor ring (TR) decomposition, the
number of model possibilities grows exponentially with the tensor order, which
makes it rather challenging to find the optimal TR decomposition. In this
paper, by exploiting the low-rank structure of the TR latent space, we propose
a novel tensor completion method which is robust to model selection. In
contrast to imposing the low-rank constraint on the data space, we introduce
nuclear norm regularization on the latent TR factors, resulting in the
optimization step using singular value decomposition (SVD) being performed at a
much smaller scale. By leveraging the alternating direction method of
multipliers (ADMM) scheme, the latent TR factors with optimal rank and the
recovered tensor can be obtained simultaneously. Our proposed algorithm is
shown to effectively alleviate the burden of TR-rank selection, thereby greatly
reducing the computational cost. The extensive experimental results on both
synthetic and real-world data demonstrate the superior performance and
efficiency of the proposed approach against the state-of-the-art algorithms.
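The SVD step inside the ADMM scheme is the standard singular value thresholding operator; the point of the latent-space formulation above is that it is applied to each (small) unfolded TR factor rather than to the full data tensor. A sketch of the operator itself:

```python
import numpy as np

def svt(matrix, tau):
    """Singular value thresholding: the proximal operator of the nuclear
    norm.  Applying it to unfolded latent TR factors keeps every SVD at
    the small factor scale instead of the full data scale."""
    u, s, vt = np.linalg.svd(matrix, full_matrices=False)
    return (u * np.maximum(s - tau, 0.0)) @ vt
```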
Concatenated image completion via tensor augmentation and completion
This paper proposes a novel framework called concatenated image completion
via tensor augmentation and completion (ICTAC), which recovers missing entries
of color images with high accuracy. Typical images are second- or third-order
tensors (2D/3D) depending on whether they are grayscale or color, hence tensor
completion algorithms are ideal for their recovery. The proposed framework
performs image completion by concatenating copies of a single image that has
missing entries into a third-order tensor, applying a dimensionality
augmentation technique to the tensor, utilizing a tensor completion algorithm
for recovering its missing entries, and finally extracting the recovered image
from the tensor. The solution relies on two key components that have been
recently proposed to take advantage of the tensor train (TT) rank: A tensor
augmentation tool called ket augmentation (KA) that represents a low-order
tensor by a higher-order tensor, and the algorithm tensor completion by
parallel matrix factorization via tensor train (TMac-TT), which has been
demonstrated to outperform state-of-the-art tensor completion algorithms.
Simulation results for color image recovery show the clear advantage of our
framework against current state-of-the-art tensor completion algorithms.
Comment: 7 pages, 6 figures, submitted to ICSPCS 201
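The ket augmentation (KA) step can be sketched for a grayscale 2^n x 2^n image: pair the k-th row bit with the k-th column bit so that mode k of the output indexes one of four block positions at scale k. The bit ordering below is one common choice and may differ from the exact scheme in the cited work.

```python
import numpy as np

def ket_augment(image):
    """Reshape a 2^n x 2^n array into an order-n tensor with every mode
    of size 4, pairing row bit k with column bit k so that mode k indexes
    the four quadrant positions at scale k (a simplified KA sketch)."""
    n = image.shape[0].bit_length() - 1       # image is 2^n x 2^n
    bits = image.reshape((2,) * (2 * n))      # split both axes into bits
    order = [ax for k in range(n) for ax in (k, n + k)]
    return bits.transpose(order).reshape((4,) * n)
```

The resulting higher-order tensor exposes the multi-scale block structure that TT-rank-based completion exploits.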
Tensor train rank minimization with nonlocal self-similarity for tensor completion
The tensor train (TT) rank has received increasing attention in tensor
completion due to its ability to capture the global correlation of high-order
tensors. For third-order visual data, direct TT rank
minimization has not exploited the potential of the TT rank for high-order tensors.
TT rank minimization combined with ket augmentation, which
transforms a lower-order tensor (e.g., visual data) into a higher-order one,
suffers from serious block artifacts. To tackle this issue, we suggest TT
rank minimization with nonlocal self-similarity for tensor completion by
simultaneously exploring the spatial, temporal/spectral, and nonlocal
redundancy in visual data. More precisely, the TT rank minimization is
performed on a formed higher-order tensor called group by stacking similar
cubes, which naturally and fully takes advantage of the ability of TT rank for
high-order tensors. Moreover, the perturbation analysis for the TT low-rankness
of each group is established. We develop the alternating direction method of
multipliers tailored for the specific structure to solve the proposed model.
Extensive experiments demonstrate that the proposed method is superior to
several existing state-of-the-art methods in terms of both qualitative and
quantitative measures.
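The grouping step described above can be sketched as simple block matching: stack the patches closest (in Frobenius distance) to a reference patch along a new mode, forming the higher-order "group" tensor on which TT rank minimization is then performed. Names and distance choice are ours.

```python
import numpy as np

def group_similar_cubes(patches, ref_idx, k):
    """Stack the k patches most similar to a reference patch along a new
    mode, forming the higher-order group tensor on which the TT rank
    minimization is performed (a block-matching sketch)."""
    ref = patches[ref_idx]
    dists = np.array([np.linalg.norm(p - ref) for p in patches])
    nearest = np.argsort(dists)[:k]
    return np.stack([patches[i] for i in nearest], axis=-1)
```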
Efficient Low Rank Tensor Ring Completion
Using the matrix product state (MPS) representation of the recently proposed
tensor ring decomposition, in this paper we propose a tensor completion
algorithm, which is an alternating minimization algorithm that alternates over
the factors in the MPS representation. This development is motivated in part by
the success of matrix completion algorithms that alternate over the (low-rank)
factors. In this paper, we propose a spectral initialization for the tensor
ring completion algorithm and analyze the computational complexity of the
proposed algorithm. We numerically compare it with existing methods that employ
a low rank tensor train approximation for data completion and show that our
method outperforms the existing ones for a variety of real computer vision
settings, and thus demonstrate the improved expressive power of tensor ring as
compared to tensor train.
Comment: in Proc. ICCV, Oct. 2017. arXiv admin note: text overlap with
arXiv:1609.0558
Tensor Completion Algorithms in Big Data Analytics
Tensor completion is a problem of filling the missing or unobserved entries
of partially observed tensors. Due to the multidimensional character of tensors
in describing complex datasets, tensor completion algorithms and their
applications have received wide attention and achieved notable success in areas like data
mining, computer vision, signal processing, and neuroscience. In this survey,
we provide a modern overview of recent advances in tensor completion algorithms
from the perspective of big data analytics characterized by diverse variety,
large volume, and high velocity. We characterize these advances from four
perspectives: general tensor completion algorithms, tensor completion with
auxiliary information (variety), scalable tensor completion algorithms
(volume), and dynamic tensor completion algorithms (velocity). Further, we
identify several tensor completion applications on real-world data-driven
problems and present some common experimental frameworks popularized in the
literature. Our goal is to summarize these popular methods and introduce them
to researchers and practitioners for promoting future research and
applications. We conclude with a discussion of key challenges and promising
research directions in this community for future exploration.