Robust Structured Low-Rank Approximation on the Grassmannian
Over the past years, Robust PCA has become established as a standard tool for
reliable low-rank approximation of matrices in the presence of outliers.
Recently, the Robust PCA approach via nuclear norm minimization has been
extended to matrices with linear structure, which appear in applications such
as system identification and data series analysis. At the same time, it has
been shown how to control the rank of a structured approximation via matrix
factorization approaches. The drawbacks of these methods lie either in the
lack of robustness against outliers or in their static nature, which requires
repeated batch processing. We present a Robust Structured Low-Rank
Approximation method on the Grassmannian that, on the one hand, allows for
fast re-initialization in an online setting thanks to subspace identification
on manifolds and, on the other hand, is robust against outliers due to a
smooth approximation of the ℓp-norm cost function. The method is evaluated on
online time series forecasting tasks on simulated and real-world data.
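To illustrate the general idea of replacing a least-squares objective with a smoothed, outlier-robust ℓp cost, the sketch below fits a low-rank approximation by iteratively reweighted least squares with a majorize-minimize SVD update. The function name, the weighting scheme, and all parameters are illustrative assumptions; this is not the paper's Grassmannian algorithm.

```python
import numpy as np

def robust_lowrank_irls(X, rank, p=1.0, eps=1e-6, iters=50):
    """Robust low-rank approximation via iteratively reweighted least squares.

    Approximately minimizes the smoothed entrywise cost
        sum_ij (r_ij^2 + eps)^(p/2),   r = X - L,
    over rank-`rank` matrices L. A sketch only, not the Grassmannian
    method from the paper.
    """
    # Initialize with a plain truncated SVD of the data.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    L = U[:, :rank] @ np.diag(s[:rank]) @ Vt[:rank]
    for _ in range(iters):
        R = X - L
        # IRLS weights for the smoothed l_p cost: small residuals get
        # large weights, gross outliers get small ones.
        W = (R**2 + eps) ** (p / 2 - 1)
        Wn = W / W.max()  # scale weights into (0, 1]
        # EM-style surrogate for weighted low-rank approximation:
        # truncated SVD of the convex combination W*X + (1-W)*L.
        U, s, Vt = np.linalg.svd(Wn * X + (1 - Wn) * L, full_matrices=False)
        L = U[:, :rank] @ np.diag(s[:rank]) @ Vt[:rank]
    return L
```

With p = 1 the smoothed cost approaches an entrywise ℓ1 loss, so isolated gross errors in X influence the fitted subspace far less than under a squared-error objective.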
Robust Subspace Learning: Robust PCA, Robust Subspace Tracking, and Robust Subspace Recovery
PCA is one of the most widely used dimension reduction techniques. A related
easier problem is "subspace learning" or "subspace estimation". Given
relatively clean data, both are easily solved via singular value decomposition
(SVD). The problem of subspace learning or PCA in the presence of outliers is
called robust subspace learning or robust PCA (RPCA). For long data sequences,
if one tries to use a single lower dimensional subspace to represent the data,
the required subspace dimension may end up being quite large. For such data, a
better model is to assume that it lies in a low-dimensional subspace that can
change over time, albeit gradually. The problem of tracking such data (and the
subspaces) while being robust to outliers is called robust subspace tracking
(RST). This article provides a magazine-style overview of the entire field of
robust subspace learning and tracking. In particular, solutions for three
problems are discussed in detail: RPCA via sparse+low-rank matrix decomposition
(S+LR), RST via S+LR, and "robust subspace recovery (RSR)". RSR assumes that an
entire data vector is either an outlier or an inlier. The S+LR formulation
instead assumes that outliers occur on only a few data vector indices and hence
are well modeled as sparse corruptions.
Comment: To appear, IEEE Signal Processing Magazine, July 2018
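The S+LR formulation can be sketched with the classical principal component pursuit convex program, min ||L||_* + λ||S||_1 subject to L + S = M, solved below with a simple ADMM scheme. The function names, the default λ = 1/√max(m,n), and the penalty parameter are common conventions assumed for illustration, not the specific algorithms surveyed in the article.

```python
import numpy as np

def soft(x, t):
    """Entrywise soft thresholding (proximal operator of the l1 norm)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def svt(X, t):
    """Singular value thresholding (proximal operator of the nuclear norm)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - t, 0.0)) @ Vt

def rpca_pcp(M, lam=None, mu=None, iters=100):
    """Sketch of sparse + low-rank decomposition (principal component
    pursuit) via ADMM: returns (L, S) with M approximately L + S."""
    m, n = M.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))      # common default weight
    if mu is None:
        mu = m * n / (4.0 * np.abs(M).sum())  # common penalty heuristic
    S = np.zeros_like(M)
    Y = np.zeros_like(M)  # scaled dual variable
    for _ in range(iters):
        L = svt(M - S + Y / mu, 1.0 / mu)       # low-rank update
        S = soft(M - L + Y / mu, lam / mu)      # sparse update
        Y = Y + mu * (M - L - S)                # dual ascent on L+S=M
    return L, S
```

Because the ℓ1 term penalizes individual entries, this model matches the S+LR outlier assumption above: corruptions confined to a few indices of each data vector end up in S, while the column-space structure ends up in L.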