Finding a low-rank basis in a matrix subspace
For a given matrix subspace, how can we find a basis that consists of
low-rank matrices? This is a generalization of the sparse vector problem. It
turns out that when the subspace is spanned by rank-1 matrices, the basis
matrices can be obtained via the tensor CP decomposition. In the higher-rank case, the
situation is not as straightforward. In this work we present an algorithm based
on a greedy process applicable to higher rank problems. Our algorithm first
estimates the minimum rank by applying soft singular value thresholding to a
nuclear norm relaxation, and then computes a matrix with that rank using the
method of alternating projections. We provide local convergence results, and
compare our algorithm with several alternative approaches. Applications include
data compression beyond the classical truncated SVD, computing accurate
eigenvectors of a near-multiple eigenvalue, image separation and graph
Laplacian eigenproblems.
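The two building blocks the abstract names can be sketched in a few lines of numpy. This is a minimal illustration under assumed conventions, not the authors' implementation: the function names (`svt`, `project_rank`, `project_subspace`, `alternating_projections`) are hypothetical, the subspace is assumed to be given by a basis that is orthonormal under the trace inner product, and the paper's greedy outer loop and rank-estimation step are omitted.

```python
import numpy as np

def svt(X, tau):
    # Soft singular value thresholding: shrink all singular values by tau.
    # In the paper this operator drives the nuclear-norm rank estimate.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def project_rank(X, r):
    # Projection onto matrices of rank at most r (truncated SVD).
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

def project_subspace(X, basis):
    # Orthogonal projection onto span(basis); the basis matrices are
    # assumed orthonormal under the trace inner product <A, B> = tr(A^T B).
    return sum(np.tensordot(B, X) * B for B in basis)

def alternating_projections(basis, r, iters=200):
    # Alternate between the rank-r set and the subspace, starting from a
    # random element of the subspace, to find a rank-r matrix in it.
    rng = np.random.default_rng(0)
    X = sum(rng.standard_normal() * B for B in basis)
    for _ in range(iters):
        X = project_subspace(project_rank(X, r), basis)
    return X
```

For example, on the subspace spanned by `diag(1,0,0)` and `diag(0,1,0)` the iteration collapses a random diagonal starting point onto one of the rank-1 basis elements.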
A Splitting Augmented Lagrangian Method for Low Multilinear-Rank Tensor Recovery
This paper studies the recovery task of finding a low multilinear-rank tensor
that satisfies a set of linear constraints in a general setting, a problem with
many applications in computer vision and graphics. We call this the low
multilinear-rank tensor recovery problem. Variable splitting and convex
relaxation transform it into a tractable constrained optimization problem, and,
exploiting the favorable structure of that problem, we develop a splitting
augmented Lagrangian method to solve it. The proposed algorithm is easy to
implement, and its convergence can be proved under certain conditions.
Preliminary numerical results on randomly generated and real completion
problems show that the proposed algorithm is effective and robust for the low
multilinear-rank tensor completion problem.
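The splitting idea can be sketched for the completion special case: minimize the sum of nuclear norms of the mode unfoldings, with one auxiliary tensor per mode, updated by an ADMM-style augmented Lagrangian loop. This is a generic sketch in the spirit of the abstract, not the paper's exact algorithm; the function names (`unfold`, `fold`, `svt`, `lrtc_admm`) and the penalty parameter `beta` are assumptions.

```python
import numpy as np

def unfold(T, mode):
    # Mode-i unfolding: move axis `mode` to the front and flatten the rest.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def fold(M, mode, shape):
    # Inverse of unfold: reshape and move the leading axis back to `mode`.
    full = [shape[mode]] + [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape(full), 0, mode)

def svt(M, tau):
    # Soft singular value thresholding (proximal operator of the nuclear norm).
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def lrtc_admm(T_obs, mask, beta=1.0, iters=200):
    # Completion sketch: min sum_i ||X_(i)||_* s.t. X matches the observed
    # entries, with one split variable M_i per mode and multipliers Y_i.
    shape, N = T_obs.shape, T_obs.ndim
    X = T_obs.copy()
    Ms = [X.copy() for _ in range(N)]
    Ys = [np.zeros(shape) for _ in range(N)]
    for _ in range(iters):
        # M_i update: SVT applied to each mode-i unfolding.
        for i in range(N):
            Ms[i] = fold(svt(unfold(X + Ys[i] / beta, i), 1.0 / beta), i, shape)
        # X update: average the split variables, then enforce observations.
        X = sum(Ms[i] - Ys[i] / beta for i in range(N)) / N
        X[mask] = T_obs[mask]
        # Dual (multiplier) updates.
        for i in range(N):
            Ys[i] = Ys[i] + beta * (X - Ms[i])
    return X
```

On a random rank-(1,1,1) tensor with roughly 70% of entries observed, this loop fills in the missing entries with small relative error, which matches the kind of randomly generated completion experiment the abstract mentions.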