    Entry-wise Matrix Completion from Noisy Entries

    We address the problem of entry-wise low-rank matrix completion under a noisy observation model. We propose a new noise-robust estimator and characterize its bias and variance in a finite-sample setting. Building on this estimator, we develop a robust local matrix completion algorithm for reconstructing large rectangular matrices arising in a wide range of applications, such as athletic performance prediction and recommender systems. Simulation results on synthetic and real data show that our algorithm outperforms state-of-the-art and baseline matrix completion methods.

    Matrix Completion With Noise

    On the heels of compressed sensing, a remarkable new field has very recently emerged. This field addresses a broad range of problems of significant practical interest, namely, the recovery of a data matrix from what appears to be incomplete, and perhaps even corrupted, information. In its simplest form, the problem is to recover a matrix from a small sample of its entries, and it comes up in many areas of science and engineering, including collaborative filtering, machine learning, control, remote sensing, and computer vision, to name a few. This paper surveys the recent literature on matrix completion, which shows that under suitable conditions one can recover an unknown low-rank matrix from a nearly minimal set of entries by solving a simple convex optimization problem, namely, nuclear-norm minimization subject to data constraints. Further, this paper presents novel results showing that matrix completion is provably accurate even when the few observed entries are corrupted with a small amount of noise. A typical result is that one can recover an unknown n x n matrix of low rank r from just about nr log^2 n noisy samples with an error proportional to the noise level. We present numerical results which complement our quantitative analysis and show that, in practice, nuclear-norm minimization accurately fills in the many missing entries of large low-rank matrices from just a few noisy samples. Some analogies between matrix completion and compressed sensing are discussed throughout.
    Comment: 11 pages, 4 figures, 1 table
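    The convex program highlighted above, nuclear-norm minimization subject to data constraints, is often approximated in practice by iterative singular-value soft-thresholding. A minimal numpy sketch of such a SoftImpute-style relaxation (not this paper's exact formulation; the threshold `tau`, iteration count, and synthetic data are illustrative assumptions) might look like:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, r = 40, 2
    # Synthetic rank-r ground truth and a random mask of observed entries.
    M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
    mask = rng.random((n, n)) < 0.5  # roughly half the entries observed

    def nuclear_norm_complete(M, mask, tau=5.0, iters=500):
        """SoftImpute-style iteration: impute unobserved entries with the
        current guess, then soft-threshold the singular values (the
        proximal operator of tau times the nuclear norm)."""
        X = np.zeros_like(M)
        for _ in range(iters):
            Z = np.where(mask, M, X)                 # enforce observed data
            U, s, Vt = np.linalg.svd(Z, full_matrices=False)
            X = (U * np.maximum(s - tau, 0.0)) @ Vt  # shrink singular values
        return X

    X_hat = nuclear_norm_complete(M, mask)
    rel_err = np.linalg.norm(X_hat - M) / np.linalg.norm(M)
    ```

    The threshold `tau` trades data fit against the rank of the recovered matrix; in the noisy regime surveyed here, the exact data constraints are likewise relaxed to a tolerance proportional to the noise level.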

    Calibrated Elastic Regularization in Matrix Completion

    This paper concerns the problem of matrix completion, which is to estimate a matrix from observations in a small subset of indices. We propose a calibrated spectrum elastic net method with a sum of the nuclear and Frobenius penalties and develop an iterative algorithm to solve the convex minimization problem. The iterative algorithm alternates between imputing the missing entries of the incomplete matrix with the current guess and estimating the matrix by a scaled soft-thresholding singular value decomposition of the imputed matrix, until the resulting matrix converges. A calibration step follows to correct the bias caused by the Frobenius penalty. Under proper coherence conditions and for suitable penalty levels, we prove that the proposed estimator achieves an error bound of nearly optimal order, in proportion to the noise level. This provides a unified analysis of the noisy and noiseless matrix completion problems. Simulation results are presented to compare our proposal with previous ones.
    Comment: 9 pages; Advances in Neural Information Processing Systems, NIPS 201
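    The alternating scheme described above can be sketched in a few lines of numpy. This is only an illustration under simplifying assumptions: the penalty levels `lam_nuc` and `lam_frob` are arbitrary, and the final rescaling is a crude stand-in for the paper's calibration step, chosen here simply to undo the uniform shrinkage the Frobenius penalty induces.

    ```python
    import numpy as np

    def scaled_svd_threshold(Z, lam_nuc, lam_frob):
        # Proximal operator of lam_nuc*||X||_* + (lam_frob/2)*||X||_F^2:
        # soft-threshold the singular values, then shrink them uniformly.
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        s_new = np.maximum(s - lam_nuc, 0.0) / (1.0 + lam_frob)
        return (U * s_new) @ Vt

    def elastic_complete(M, mask, lam_nuc=5.0, lam_frob=0.1, iters=300):
        """Alternate imputing the missing entries with the current guess
        and a scaled soft-thresholding SVD of the imputed matrix."""
        X = np.zeros_like(M)
        for _ in range(iters):
            Z = np.where(mask, M, X)   # impute missing entries
            X = scaled_svd_threshold(Z, lam_nuc, lam_frob)
        # Calibration (illustrative): offset the Frobenius-penalty shrinkage.
        return (1.0 + lam_frob) * X

    # Illustrative run on synthetic low-rank data.
    rng = np.random.default_rng(1)
    n, r = 40, 2
    M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
    mask = rng.random((n, n)) < 0.5
    X_hat = elastic_complete(M, mask)
    rel_err = np.linalg.norm(X_hat - M) / np.linalg.norm(M)
    ```

    Setting `lam_frob=0` recovers plain singular-value soft-thresholding; the added Frobenius term stabilizes the iterates, which is what makes the subsequent bias-correcting calibration step necessary.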