On the Power of Adaptivity in Matrix Completion and Approximation
We consider the related tasks of matrix completion and matrix approximation
from missing data and propose adaptive sampling procedures for both problems.
We show that adaptive sampling allows one to eliminate standard incoherence
assumptions on the matrix row space that are necessary for passive sampling
procedures. For exact recovery of a low-rank matrix, our algorithm judiciously
selects a few columns to observe in full and, with few additional measurements,
projects the remaining columns onto their span. This algorithm exactly recovers
a rank-r matrix using a number of observations governed by a coherence
parameter on the column space of the matrix. In
addition to completely eliminating any row space assumptions that have pervaded
the literature, this algorithm enjoys a better sample complexity than any
existing matrix completion algorithm. To certify that this improvement is due
to adaptive sampling, we establish that row space coherence is necessary for
passive sampling algorithms to achieve non-trivial sample complexity bounds.
For constructing a low-rank approximation to a high-rank input matrix, we
propose a simple algorithm that thresholds the singular values of a zero-filled
version of the input matrix. The algorithm computes an approximation that is
nearly as good as the best rank-k approximation, with a sample complexity
governed by a slightly different coherence parameter on the matrix columns.
Again, we eliminate assumptions on the row space.
Adaptive Matrix Completion for the Users and the Items in Tail
Recommender systems are widely used to recommend the most appealing items to
users. These recommendations can be generated by applying collaborative
filtering methods. The low-rank matrix completion method is the
state-of-the-art collaborative filtering method. In this work, we show that the
skewed distribution of ratings in the user-item rating matrix of real-world
datasets affects the accuracy of matrix-completion-based approaches. Also, we
show that the number of ratings that an item or a user has positively
correlates with the ability of low-rank matrix-completion-based approaches to
predict the ratings for the item or the user accurately. Furthermore, we use
these insights to develop four matrix-completion-based approaches, namely
Frequency Adaptive Rating Prediction (FARP), Truncated Matrix Factorization
(TMF), Truncated Matrix Factorization with Dropout (TMF + Dropout), and
Inverse Frequency Weighted Matrix Factorization (IFWMF), that outperform
traditional matrix-completion-based approaches for users and items with few
ratings in the user-item rating matrix. Comment: 7 pages, 3 figures, ACM WWW'1
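The inverse-frequency weighting idea behind a method like IFWMF can be
sketched as weighted SGD matrix factorization, where each rating's loss
weight is inversely proportional to its item's rating count so that tail
items are not drowned out by popular ones. Everything below (function name,
update rule, hyperparameters) is an illustrative assumption, not the paper's
exact algorithm.

```python
import numpy as np

def ifwmf(ratings, n_users, n_items, rank=8, lr=0.02, reg=0.05,
          epochs=30, seed=0):
    """Hypothetical sketch of inverse-frequency weighted matrix factorization.

    ratings: list of (user, item, value) triples.
    """
    rng = np.random.default_rng(seed)
    P = 0.1 * rng.standard_normal((n_users, rank))   # user factors
    Q = 0.1 * rng.standard_normal((n_items, rank))   # item factors
    counts = np.zeros(n_items)
    for _, i, _ in ratings:
        counts[i] += 1
    w = 1.0 / np.maximum(counts, 1)                  # inverse-frequency weights
    for _ in range(epochs):
        for u, i, r in ratings:
            err = r - P[u] @ Q[i]
            # weighted SGD step with L2 regularization
            P[u] += lr * (w[i] * err * Q[i] - reg * P[u])
            Q[i] += lr * (w[i] * err * P[u] - reg * Q[i])
    return P, Q
```

A predicted rating is then the inner product `P[u] @ Q[i]`; the weighting
only changes how strongly each observation pulls on the factors.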
Transformed Schatten-1 Iterative Thresholding Algorithms for Low Rank Matrix Completion
We study a non-convex low-rank promoting penalty function, the transformed
Schatten-1 (TS1), and its applications in matrix completion. The TS1 penalty,
as a matrix quasi-norm defined on its singular values, interpolates the rank
and the nuclear norm through a nonnegative parameter a. We consider the
unconstrained TS1 regularized low-rank matrix recovery problem and develop a
fixed point representation for its global minimizer. The TS1 thresholding
functions are in closed analytical form for all parameter values. The TS1
threshold values differ between the subcritical and supercritical parameter
regimes, where the TS1 threshold functions are continuous and discontinuous,
respectively. We propose TS1
iterative thresholding algorithms and compare them with some state-of-the-art
algorithms on matrix completion test problems. For problems with known rank, a
fully adaptive TS1 iterative thresholding algorithm consistently performs the
best under different conditions with ground truth matrix being multivariate
Gaussian at varying covariance. For problems with unknown rank, TS1
algorithms with an additional rank-estimation procedure approach the level
of IRucL-q, an iterative reweighted algorithm that is non-convex in nature
and the best performer among the methods compared.
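The family of iterative thresholding algorithms this abstract refers to
shares a common skeleton: take a gradient step on the observed entries, then
shrink the singular values of the iterate. The sketch below uses plain
soft thresholding (nuclear-norm shrinkage) as a stand-in for the closed-form
TS1 thresholding function, so it is an illustrative analogue rather than the
TS1 algorithm itself; the function name and parameters are mine.

```python
import numpy as np

def svt_complete(M_obs, mask, tau=0.5, step=1.0, iters=300):
    """Iterative singular-value thresholding for matrix completion (sketch).

    Soft thresholding is used here as a placeholder for the TS1
    thresholding function described in the paper.
    """
    X = np.zeros_like(M_obs)
    for _ in range(iters):
        # gradient step on the observed entries, then shrink singular values
        Y = X + step * mask * (M_obs - X)
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        s = np.maximum(s - tau, 0.0)         # soft-threshold
        X = (U * s) @ Vt
    return X
```

Swapping the shrinkage rule in the marked line is all that distinguishes
nuclear-norm, Schatten-q, or TS1-style variants of this loop.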
A Riemannian rank-adaptive method for low-rank matrix completion
The low-rank matrix completion problem can be solved by Riemannian
optimization on a fixed-rank manifold. However, a drawback of the known
approaches is that the rank parameter has to be fixed a priori. In this paper,
we consider the optimization problem on the set of bounded-rank matrices. We
propose a Riemannian rank-adaptive method, which consists of fixed-rank
optimization, a rank-increase step, and a rank-reduction step. We explore its
performance applied to the low-rank matrix completion problem. Numerical
experiments on synthetic and real-world datasets illustrate that the proposed
rank-adaptive method compares favorably with state-of-the-art algorithms. In
addition, we show that each aspect of this rank-adaptive framework can be
incorporated separately into existing algorithms to improve performance.
Comment: 22 pages, 12 figures, 1 table
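The overall shape of a rank-adaptive scheme (optimize at a fixed rank, grow
the rank when progress stalls) can be illustrated with a much simpler
algorithm than the Riemannian method above. The sketch below is a
hypothetical rank-adaptive alternating-projection loop, not the paper's
method; it omits the rank-reduction step for brevity, and all names and
heuristics (the 0.9 stall threshold, the iteration caps) are assumptions.

```python
import numpy as np

def rank_adaptive_complete(M_obs, mask, max_rank=10, inner=50, tol=1e-3):
    """Simplified rank-adaptive completion loop (hypothetical sketch).

    Alternates a fixed-rank step (truncated SVD) with re-imposing the
    observed entries, and increases the working rank whenever the residual
    on the observed entries stops improving.
    """
    X = M_obs.copy()
    r, prev = 1, np.inf
    for _ in range(100):                         # outer iterations (safety cap)
        for _ in range(inner):
            U, s, Vt = np.linalg.svd(X, full_matrices=False)
            Y = (U[:, :r] * s[:r]) @ Vt[:r]      # rank-r projection
            X = np.where(mask, M_obs, Y)         # keep the observed entries
        res = np.linalg.norm((Y - M_obs)[mask])
        if res < tol or r >= max_rank:
            break
        if res > 0.9 * prev:                     # progress stalled: grow rank
            r += 1
        prev = res
    return Y, r
```

The Riemannian method replaces the truncated-SVD step with retraction-based
optimization on the fixed-rank manifold, but the outer grow/shrink logic
plays the same role.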
Bayesian Matrix Completion via Adaptive Relaxed Spectral Regularization
Bayesian matrix completion has been studied based on a low-rank matrix
factorization formulation with promising results. However, little work has been
done on Bayesian matrix completion based on the more direct spectral
regularization formulation. We fill this gap by presenting a novel Bayesian
matrix completion method based on spectral regularization. In order to
circumvent the difficulties of dealing with the orthonormality constraints of
singular vectors, we derive a new equivalent form with relaxed constraints,
which then leads us to design an adaptive version of spectral regularization
feasible for Bayesian inference. Our Bayesian method requires no parameter
tuning and can infer the number of latent factors automatically. Experiments on
synthetic and real datasets demonstrate encouraging results on rank recovery
and collaborative filtering, with notably good results for very sparse
matrices. Comment: Accepted to AAAI 201
A Characterization of Deterministic Sampling Patterns for Low-Rank Matrix Completion
Low-rank matrix completion (LRMC) problems arise in a wide variety of
applications. Previous theory mainly provides conditions for completion under
missing-at-random samplings. This paper studies deterministic conditions for
completion. An incomplete matrix is finitely rank-r completable
if there are at most finitely many rank-r matrices that agree with all its
observed entries. Finite completability is the tipping point in LRMC, as a few
additional samples of a finitely completable matrix guarantee its unique
completability. The main contribution of this paper is a deterministic sampling
condition for finite completability. We use this to also derive deterministic
sampling conditions for unique completability that can be efficiently verified.
We also show that under uniform random sampling schemes, these conditions are
satisfied with high probability if sufficiently many entries per column are
observed. These findings have several implications on LRMC regarding lower
bounds, sample and computational complexity, the role of coherence, adaptive
settings and the validation of any completion algorithm. We complement our
theoretical results with experiments that support our findings and motivate
future analysis of uncharted sampling regimes.Comment: This update corrects an error in version 2 of this paper, where we
erroneously assumed that columns with more than r+1 observed entries would
yield multiple independent constraint