Fast Gradient Method for Low-Rank Matrix Estimation
Projected gradient descent and its Riemannian variant are typical methods for
low-rank matrix estimation. This paper proposes a new Nesterov Accelerated
Riemannian Gradient algorithm based on an efficient orthographic
retraction and tangent space projection. The subspace relationship between the
iterative and extrapolated sequences on the low-rank matrix manifold affords
computational convenience. Using perturbation analysis of truncated singular
value decomposition and two retractions, we systematically analyze the local
convergence of gradient algorithms and Nesterov's variants in the Euclidean and
Riemannian settings. Theoretically, we characterize the exact rate of local
linear convergence under different parameter choices via the spectral radius in
closed form, and give the optimal convergence rate and the corresponding momentum
parameter. When this parameter is unknown, an adaptive restart scheme avoids
the oscillation caused by excessive momentum, thereby approaching the optimal
convergence rate. Extensive numerical experiments confirm the estimated
convergence rates and demonstrate that the proposed algorithm is competitive
with first-order methods for matrix completion and matrix sensing. Comment: Accepted for publication in Journal of Scientific Computing.
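To make the acceleration-plus-restart recipe concrete, here is a minimal sketch of Nesterov-accelerated projected gradient for matrix completion. It is not the paper's algorithm (which works on the Riemannian manifold via an orthographic retraction and tangent space projection); it uses a plain truncated-SVD projection instead, and the function names, step size, and momentum value are illustrative assumptions.

```python
import numpy as np

def svd_truncate(X, r):
    """Project X onto the set of rank-r matrices via truncated SVD."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

def nag_complete(M_obs, mask, r, step=1.0, beta=0.9, iters=500):
    """Nesterov-accelerated projected gradient with function-value restart.

    M_obs: observed matrix (zeros off-support); mask: 0/1 array of observed entries.
    """
    X = svd_truncate(mask * M_obs, r)            # spectral initialization
    Y, f_prev = X.copy(), np.inf
    for _ in range(iters):
        grad = mask * (Y - M_obs)                # grad of 0.5*||P_Omega(Y - M)||_F^2
        X_new = svd_truncate(Y - step * grad, r) # gradient step + rank-r projection
        f = 0.5 * np.linalg.norm(mask * (X_new - M_obs)) ** 2
        if f > f_prev:                           # objective rose: restart momentum
            Y = X.copy()
            continue
        Y = X_new + beta * (X_new - X)           # Nesterov extrapolation
        X, f_prev = X_new, f
    return X
```

Restarting whenever the objective increases discards the extrapolation exactly in the regime where high momentum would cause oscillation, which is the behavior the adaptive restart scheme above is designed to exploit.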
Bayesian Matrix Completion via Adaptive Relaxed Spectral Regularization
Bayesian matrix completion based on a low-rank matrix factorization
formulation has been studied with promising results. However, little work has been
done on Bayesian matrix completion based on the more direct spectral
regularization formulation. We fill this gap by presenting a novel Bayesian
matrix completion method based on spectral regularization. In order to
circumvent the difficulties of dealing with the orthonormality constraints of
singular vectors, we derive a new equivalent form with relaxed constraints,
which then leads us to design an adaptive version of spectral regularization
feasible for Bayesian inference. Our Bayesian method requires no parameter
tuning and can infer the number of latent factors automatically. Experiments on
synthetic and real datasets demonstrate encouraging results on rank recovery
and collaborative filtering, with notably good results for very sparse
matrices. Comment: Accepted to AAAI 2016.
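For context, below is a minimal sketch of the non-Bayesian spectral-regularization baseline this abstract refers to: a Soft-Impute-style iteration that soft-thresholds singular values to solve the nuclear-norm-penalized completion problem. The fixed penalty lam is precisely the tuning parameter the paper's Bayesian method replaces with adaptive inference; the function names are assumptions.

```python
import numpy as np

def svt(X, lam):
    """Singular value soft-thresholding: prox of lam * ||.||_* (nuclear norm)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s = np.maximum(s - lam, 0.0)
    return (U * s) @ Vt

def soft_impute(M_obs, mask, lam, iters=200):
    """Iteratively solve min_X 0.5*||P_Omega(X - M)||_F^2 + lam*||X||_*."""
    X = np.zeros_like(M_obs)
    for _ in range(iters):
        # Fill unobserved entries with the current estimate, then shrink the spectrum.
        X = svt(mask * M_obs + (1 - mask) * X, lam)
    return X
```

Soft-thresholding automatically zeroes out small singular values, so the recovered rank is driven by lam; inferring that rank without hand-tuning lam is the gap the Bayesian formulation addresses.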
On the Power of Adaptivity in Matrix Completion and Approximation
We consider the related tasks of matrix completion and matrix approximation
from missing data and propose adaptive sampling procedures for both problems.
We show that adaptive sampling allows one to eliminate standard incoherence
assumptions on the matrix row space that are necessary for passive sampling
procedures. For exact recovery of a low-rank matrix, our algorithm judiciously
selects a few columns to observe in full and, with few additional measurements,
projects the remaining columns onto their span. This algorithm exactly recovers
an $n \times n$ rank-$r$ matrix using $O(nr\mu_0 \log^2(r))$ observations,
where $\mu_0$ is a coherence parameter on the column space of the matrix. In
addition to completely eliminating any row space assumptions that have pervaded
the literature, this algorithm enjoys a better sample complexity than any
existing matrix completion algorithm. To certify that this improvement is due
to adaptive sampling, we establish that row space coherence is necessary for
passive sampling algorithms to achieve non-trivial sample complexity bounds.
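A hypothetical sketch of the adaptive column-sampling idea described above, assuming oracle access to entries of M on demand: subsample d entries of each column, test whether they are consistent with the span of the columns observed so far, and observe a column in full only when the test fails. The residual threshold and the least-squares span test are illustrative simplifications of the paper's procedure.

```python
import numpy as np

def adaptive_complete(M, d, rng=np.random.default_rng(0)):
    """M plays the role of an entry oracle; d is the per-column measurement budget."""
    n, m = M.shape
    U = np.zeros((n, 0))                  # columns observed in full so far
    X_hat = np.zeros_like(M, dtype=float)
    for j in range(m):
        idx = rng.choice(n, size=d, replace=False)   # few adaptive measurements
        y = M[idx, j]
        if U.shape[1] > 0:
            coef, *_ = np.linalg.lstsq(U[idx, :], y, rcond=None)
            resid = np.linalg.norm(y - U[idx, :] @ coef)
        else:
            coef, resid = np.zeros(0), np.linalg.norm(y)
        if resid > 1e-8 * max(np.linalg.norm(y), 1.0):
            U = np.column_stack([U, M[:, j]])        # new direction: observe fully
            X_hat[:, j] = M[:, j]
        else:
            X_hat[:, j] = U @ coef                   # project column onto the span
    return X_hat
```

Only columns that expand the observed span are measured in full, so for a rank-r matrix roughly r columns are fully sampled and every other column costs only d entries, which is the mechanism behind the sample complexity claimed above.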
For constructing a low-rank approximation to a high-rank input matrix, we
propose a simple algorithm that thresholds the singular values of a zero-filled
version of the input matrix. The algorithm computes an approximation that is
nearly as good as the best rank-$k$ approximation using $O(nk\mu \log^2(n))$
samples, where $\mu$ is a slightly different coherence parameter on the matrix
columns. Again, we eliminate assumptions on the row space.
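The zero-fill-and-threshold estimator for the approximation task can be sketched in a few lines; the rescaling by the empirical sampling rate and the threshold tau are assumptions standing in for the paper's exact constants.

```python
import numpy as np

def zero_fill_approx(M_obs, mask, tau):
    """Low-rank approximation from a zero-filled, rescaled observation matrix."""
    p = mask.mean()                       # empirical sampling rate
    Y = (mask * M_obs) / p                # unbiased zero-filled estimate of M
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    keep = s > tau                        # hard-threshold the spectrum
    return (U[:, keep] * s[keep]) @ Vt[keep, :]
```

Hard-thresholding keeps only singular directions whose signal survives the zero-filling noise, which is why a simple SVD of the rescaled observations suffices for a near-best rank-$k$ approximation.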