Online Matrix Completion and Online Robust PCA
This work studies two interrelated problems: online robust PCA (RPCA) and
online low-rank matrix completion (MC). In recent work by Cand\`{e}s et al.,
RPCA has been defined as the problem of separating a low-rank matrix (true
data), $L_t$, and a sparse matrix (outliers), $S_t$, from their sum,
$M_t := L_t + S_t$. Our work uses this definition of RPCA. An important
application in which both problems arise is video analytics, where one tries to
separate sparse foregrounds (e.g., moving objects) from slowly changing
backgrounds.
While there has been a large amount of recent work on both developing and
analyzing batch RPCA and batch MC algorithms, the online problem is largely
open. In this work, we develop a practical modification of our recently
proposed algorithm to solve both the online RPCA and online MC problems. The
main contribution of this work is that we obtain correctness results for the
proposed algorithms under mild assumptions. The assumptions that we need are:
(a) a good estimate of the initial subspace is available (easy to obtain using
a short sequence of background-only frames in video surveillance); (b) the
$L_t$'s obey a `slow subspace change' assumption; (c) the basis vectors for
the subspace from which $L_t$ is generated are dense (non-sparse); (d) the
support of $S_t$ changes by at least a certain amount at least every so often;
and (e) algorithm parameters are appropriately set.
Comment: Presented at ISIT (IEEE Intnl. Symp. on Information Theory), 2015.
Submitted to IEEE Transactions on Information Theory. This version: changes
are in blue; the main changes are just to explain the model assumptions
better (added based on ISIT reviewers' comments).
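The role of assumption (a) can be illustrated with a toy sketch: if a good estimate $P$ of the current subspace is available, projecting an observed frame onto the orthogonal complement of span($P$) annihilates the low-rank part and exposes the sparse outliers. This is only an illustrative sketch of that one idea, not the paper's algorithm; all dimensions, magnitudes, and the threshold below are invented for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 100, 5  # ambient dimension and subspace rank (illustrative values)

# Ground-truth subspace and a "background" frame l_t lying in it.
U, _ = np.linalg.qr(rng.standard_normal((n, r)))
l_t = U @ rng.standard_normal(r)

# Sparse "foreground" outlier s_t supported on a few indices.
s_t = np.zeros(n)
support = rng.choice(n, size=5, replace=False)
s_t[support] = 10.0
m_t = l_t + s_t  # observed frame: low-rank part plus sparse part

# Assumption (a): a good subspace estimate P is available; here we use the
# exact subspace for clarity. Projecting onto the orthogonal complement of
# span(P) removes l_t, so the residual is (I - P P^T) s_t.
P = U
residual = m_t - P @ (P.T @ m_t)
est_support = np.abs(residual) > 3.0  # threshold to estimate the outlier support
```

When the subspace estimate is only approximate, the residual also contains a small leakage term, which is why the analysis additionally needs slow subspace change (b) and denseness of the basis (c).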
A Nonconvex Projection Method for Robust PCA
Robust principal component analysis (RPCA) is a well-studied problem with the
goal of decomposing a matrix into the sum of low-rank and sparse components. In
this paper, we propose a nonconvex feasibility reformulation of the RPCA
problem and apply an alternating projection method to solve it. To the best of
our knowledge, we are the first to propose a method that solves the RPCA
problem without considering any objective function, convex relaxation, or
surrogate
convex constraints. We demonstrate through extensive numerical experiments on a
variety of applications, including shadow removal, background estimation, face
detection, and galaxy evolution, that our approach matches and often
significantly outperforms the current state of the art.
Comment: In the proceedings of the Thirty-Third AAAI Conference on Artificial
Intelligence (AAAI-19).
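A generic alternating-projection scheme in the spirit of this abstract (a sketch, not necessarily the paper's exact method) alternates between the set of rank-at-most-$r$ matrices, via truncated SVD, and the set of $k$-sparse matrices, via hard thresholding. All sizes and the corruption pattern below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n, r, k = 50, 3, 60  # matrix size, target rank, sparsity budget (illustrative)

# Synthetic observation M = L + S: rank-r L plus k large sparse corruptions.
L = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
S = np.zeros((n, n))
S.flat[rng.choice(n * n, size=k, replace=False)] = rng.choice([-20.0, 20.0], size=k)
M = L + S

def project_rank(X, r):
    """Projection onto {X : rank(X) <= r} via truncated SVD."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

def project_sparse(X, k):
    """Projection onto k-sparse matrices: keep the k largest-magnitude entries."""
    Y = np.zeros_like(X)
    keep = np.argsort(np.abs(X), axis=None)[-k:]
    Y.flat[keep] = X.flat[keep]
    return Y

# Alternate the two projections to find a feasible pair (L_hat, S_hat)
# with L_hat + S_hat = M, L_hat low rank, and S_hat sparse.
L_hat = np.zeros_like(M)
for _ in range(50):
    S_hat = project_sparse(M - L_hat, k)
    L_hat = project_rank(M - S_hat, r)
```

With corruptions much larger than the clean entries, the first sparse projection already identifies the true support, and the iteration then contracts toward the true pair; no objective function or convex surrogate appears anywhere, which is the feasibility viewpoint the abstract emphasizes.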
Robust Subspace Learning: Robust PCA, Robust Subspace Tracking, and Robust Subspace Recovery
PCA is one of the most widely used dimension reduction techniques. A related
easier problem is "subspace learning" or "subspace estimation". Given
relatively clean data, both are easily solved via singular value decomposition
(SVD). The problem of subspace learning or PCA in the presence of outliers is
called robust subspace learning or robust PCA (RPCA). For long data sequences,
if one tries to use a single lower dimensional subspace to represent the data,
the required subspace dimension may end up being quite large. For such data, a
better model is to assume that it lies in a low-dimensional subspace that can
change over time, albeit gradually. The problem of tracking such data (and the
subspaces) while being robust to outliers is called robust subspace tracking
(RST). This article provides a magazine-style overview of the entire field of
robust subspace learning and tracking. In particular, solutions for three
problems are discussed in detail: RPCA via sparse+low-rank matrix decomposition
(S+LR), RST via S+LR, and "robust subspace recovery (RSR)". RSR assumes that an
entire data vector is either an outlier or an inlier. The S+LR formulation
instead assumes that outliers occur on only a few data vector indices and hence
are well modeled as sparse corruptions.
Comment: To appear, IEEE Signal Processing Magazine, July 201
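The distinction between the two outlier models can be made concrete with a toy data matrix (all dimensions and magnitudes below are invented for illustration): under S+LR, corruptions hit only a few entries of otherwise usable columns, while under RSR an entire data vector is an outlier.

```python
import numpy as np

rng = np.random.default_rng(2)
n, q, r = 40, 30, 2  # ambient dim, number of data vectors, rank (illustrative)
L = rng.standard_normal((n, r)) @ rng.standard_normal((r, q))  # clean low-rank data

# S+LR model: corruptions are sparse *entries*; every column still carries
# some information about the underlying subspace.
M_slr = L.copy()
rows = rng.choice(n, size=5, replace=False)
M_slr[rows, 0] += 10.0  # column 0: only 5 of 40 entries corrupted

# RSR model: an entire data vector is an outlier; a corrupted column carries
# no usable information about the subspace.
M_rsr = L.copy()
M_rsr[:, 0] = 10.0 * rng.standard_normal(n)  # column 0: all 40 entries replaced

frac_slr = np.count_nonzero(M_slr[:, 0] - L[:, 0]) / n  # 0.125
frac_rsr = np.count_nonzero(M_rsr[:, 0] - L[:, 0]) / n  # 1.0
```

This is why the two settings call for different algorithms: S+LR methods exploit the uncorrupted entries within each column, whereas RSR methods must identify and discard whole outlier columns.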