
    High Dimensional Low Rank plus Sparse Matrix Decomposition

    This paper is concerned with the problem of low-rank plus sparse matrix decomposition for big data. Conventional algorithms for matrix decomposition use the entire data to extract the low-rank and sparse components, and are based on optimization problems whose complexity scales with the dimension of the data, which limits their scalability. Furthermore, existing randomized approaches mostly rely on uniform random sampling, which is quite inefficient for many real-world data matrices that exhibit additional structure (e.g., clustering). In this paper, a scalable subspace-pursuit approach that transforms the decomposition problem into a subspace learning problem is proposed. The decomposition is carried out using a small data sketch formed from sampled columns/rows. Even when the data is sampled uniformly at random, it is shown that the sufficient number of sampled columns/rows is roughly O(rμ), where μ is the coherency parameter and r is the rank of the low-rank component. In addition, adaptive sampling algorithms are proposed to address the problem of column/row sampling from structured data. We provide an analysis of the proposed method with adaptive sampling and show that adaptive sampling makes the required number of sampled columns/rows invariant to the distribution of the data. The proposed approach is amenable to online implementation, and an online scheme is proposed.
    Comment: IEEE Transactions on Signal Processing
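
    As a rough illustration of the sketch-based idea described in this abstract, the snippet below samples a small set of columns uniformly at random, decomposes the small sketch with a generic Principal Component Pursuit (ADMM) routine, and returns an orthonormal basis for the learned column subspace. This is a minimal sketch under simplifying assumptions, not the paper's algorithm: the PCP subroutine, its default parameters, and the plain-SVD basis extraction are standard stand-ins, the adaptive-sampling and online variants are omitted, and the names (pcp, column_sketch_subspace, n_cols) are illustrative.

```python
import numpy as np

def soft_threshold(X, tau):
    """Elementwise soft-thresholding (proximal operator of the l1 norm)."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def svt(X, tau):
    """Singular value thresholding (proximal operator of the nuclear norm)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def pcp(M, lam=None, mu=None, tol=1e-7, max_iter=500):
    """Generic Principal Component Pursuit via ADMM: split M into L (low rank) + S (sparse)."""
    M = np.asarray(M, dtype=float)
    m, n = M.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))   # standard PCP weight
    mu = mu if mu is not None else 0.25 * m * n / (np.abs(M).sum() + 1e-12)
    S = np.zeros_like(M)
    Y = np.zeros_like(M)
    norm_M = np.linalg.norm(M, "fro")
    for _ in range(max_iter):
        L = svt(M - S + Y / mu, 1.0 / mu)                 # low-rank update
        S = soft_threshold(M - L + Y / mu, lam / mu)      # sparse update
        Y = Y + mu * (M - L - S)                          # dual update
        if np.linalg.norm(M - L - S, "fro") <= tol * norm_M:
            break
    return L, S

def column_sketch_subspace(M, n_cols, rank, seed=None):
    """Sample n_cols columns uniformly at random, decompose the small sketch,
    and return an orthonormal basis of the learned column subspace."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(M.shape[1], size=n_cols, replace=False)
    L_sketch, _ = pcp(M[:, idx])
    U, _, _ = np.linalg.svd(L_sketch, full_matrices=False)
    return U[:, :rank], idx
```

    With a column-subspace basis in hand, the remaining columns can then be decomposed against that basis, which is the step the paper's subspace-pursuit formulation makes precise; the number of sampled columns n_cols plays the role of the roughly O(rμ) quantity in the abstract.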

    Performance guarantees for ReProCS - Correlated low-rank matrix entries case


    Correctness results for on-line robust principal components analysis

    This work studies two interrelated problems: online robust PCA (RPCA) and online low-rank matrix completion (MC). In recent work by Candès et al., RPCA has been defined as the problem of separating a low-rank matrix (true data), L := [ℓ₁, ℓ₂, ..., ℓ_t, ..., ℓ_{t_max}], and a sparse matrix (outliers), S := [x₁, x₂, ..., x_t, ..., x_{t_max}], from their sum, M := L + S. Our work uses this definition of RPCA. An important application where both these problems occur is in video analytics, in trying to separate sparse foregrounds (e.g., moving objects) from slowly changing backgrounds. While there has been a large amount of recent work on both developing and analyzing batch RPCA and batch MC algorithms, the online problem is largely open. In this work, we develop a practical modification of our recently proposed algorithm to solve both the online RPCA and online MC problems. The main contribution of this work is that we obtain correctness results for the proposed algorithms under mild assumptions. The assumptions that we need are: (a) a good estimate of the initial subspace is available (easy to obtain using a short sequence of background-only frames in video surveillance); (b) the ℓ_t's obey a 'slow subspace change' assumption; (c) the basis vectors for the subspace from which ℓ_t is generated are dense (non-sparse); (d) the support of x_t changes by at least a certain amount at least every so often; and (e) the algorithm parameters are appropriately set.
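
    The recursion that online RPCA algorithms of this type build on can be illustrated with a short per-frame loop: project each incoming column m_t onto the orthogonal complement of the current subspace estimate (which approximately nullifies ℓ_t), estimate the outlier support from the residual, recover x_t on that support, and take m_t minus the recovered x_t as the low-rank estimate. The snippet below is a minimal sketch under assumption (a), i.e. a given initial subspace estimate; the thresholding-based support estimate, least-squares recovery, and plain-SVD subspace refresh are simplified stand-ins for the steps of the actual algorithm, and the names (online_rpca_sketch, omega, update_every) are illustrative.

```python
import numpy as np

def online_rpca_sketch(M, P0, omega, update_every=60):
    """Minimal projection-then-sparse-recovery loop in the spirit of online RPCA.

    M            : n x t_max matrix whose columns are m_t = l_t + x_t
    P0           : n x r orthonormal basis estimate of the initial subspace of the l_t
    omega        : threshold used to detect the support of the sparse x_t
    update_every : how often the subspace estimate is refreshed
    """
    M = np.asarray(M, dtype=float)
    n, t_max = M.shape
    r = P0.shape[1]
    P = P0.copy()
    L_hat = np.zeros((n, t_max))
    S_hat = np.zeros((n, t_max))
    for t in range(t_max):
        m = M[:, t]
        # Project onto the complement of the current subspace estimate;
        # this approximately removes l_t and leaves a projected version of x_t.
        y = m - P @ (P.T @ m)
        # Estimate the outlier support by thresholding the projected residual.
        T = np.abs(y) > omega
        x = np.zeros(n)
        if T.any():
            # Columns of Phi = I - P P^T restricted to the estimated support.
            Phi_T = np.eye(n)[:, T] - P @ P.T[:, T]
            sol, *_ = np.linalg.lstsq(Phi_T, y, rcond=None)
            x[T] = sol
        S_hat[:, t] = x
        L_hat[:, t] = m - x
        # Periodically refresh the subspace estimate from recent low-rank estimates
        # (a plain SVD here, standing in for the algorithm's subspace-update step).
        if (t + 1) % update_every == 0:
            recent = L_hat[:, max(0, t + 1 - update_every):t + 1]
            U, _, _ = np.linalg.svd(recent, full_matrices=False)
            P = U[:, :min(r, U.shape[1])]
    return L_hat, S_hat
```

    Assumptions (b)-(d) above are what such a loop relies on: slow subspace change keeps the projection residual of ℓ_t small, dense basis vectors keep the restricted projection well conditioned, and the changing outlier support prevents errors from accumulating along one direction.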