Robust sparse principal component analysis.
A method for principal component analysis is proposed that is sparse and robust at the same time. The sparsity delivers principal components that have loadings on a small number of variables, making them easier to interpret. The robustness makes the analysis resistant to outlying observations. The principal components correspond to directions that maximize a robust measure of the variance, with an additional penalty term to take sparseness into account. We propose an algorithm to compute the sparse and robust principal components. The method is applied to several real data examples, and diagnostic plots for detecting outliers and for selecting the degree of sparsity are provided. A simulation experiment studies the loss in statistical efficiency caused by requiring both robustness and sparsity.
Keywords: dispersion measure; projection-pursuit; outliers; variable selection.
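The penalized maximization described in this abstract can be sketched in a few lines. The following is an illustrative toy, not the authors' algorithm: it assumes the MAD as the robust dispersion measure, an L1 penalty on the loadings, and the common projection-pursuit device of searching only over directions through the data points.

```python
import numpy as np

def mad(z):
    # Median absolute deviation: a robust dispersion measure
    return np.median(np.abs(z - np.median(z)))

def sparse_robust_direction(X, lam=0.05):
    """First loading vector by penalized projection pursuit:
    maximize a robust scale of the projections minus an L1
    penalty that rewards sparse loadings. Candidate directions
    run through the (robustly centered) data points."""
    Xc = X - np.median(X, axis=0)            # robust centering
    norms = np.linalg.norm(Xc, axis=1)
    D = Xc[norms > 1e-12] / norms[norms > 1e-12, None]
    scores = [mad(Xc @ d) - lam * np.abs(d).sum() for d in D]
    return D[int(np.argmax(scores))]

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
X[:, 0] *= 5                                  # most spread along variable 0
a = sparse_robust_direction(X)
```

Raising `lam` concentrates the loadings on fewer variables; here the estimated direction loads mainly on the high-dispersion variable 0.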
Robust functional principal components: A projection-pursuit approach
In many situations, data are recorded over a period of time and may be
regarded as realizations of a stochastic process. In this paper, robust
estimators for the principal components are considered by adapting the
projection pursuit approach to the functional data setting. Our approach
combines robust projection-pursuit with different smoothing methods.
Consistency of the estimators is shown under mild assumptions. The performance
of the classical and robust procedures is compared in a simulation study under
different contamination schemes.
Comment: Published at http://dx.doi.org/10.1214/11-AOS923 in the Annals of
Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical
Statistics (http://www.imstat.org)
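The two-step idea of the abstract, smooth the observed curves and then run robust projection pursuit on the discretized functions, can be sketched as follows. Everything here is an illustrative assumption: a fixed common grid, a crude moving-average smoother standing in for the paper's smoothing methods, and the MAD as the robust scale.

```python
import numpy as np

def mad(z):
    # Median absolute deviation, the robust scale measure
    return np.median(np.abs(z - np.median(z)))

# Toy functional sample: noisy curves observed on a common grid
rng = np.random.default_rng(3)
t = np.linspace(0.0, 1.0, 50)
scores = rng.normal(size=(100, 1))
curves = scores * np.sin(2 * np.pi * t) + 0.1 * rng.normal(size=(100, 50))

# Step 1: smooth each observed curve (a crude moving average
# stands in for the smoothing methods of the paper)
kernel = np.ones(5) / 5.0
smooth = np.apply_along_axis(
    lambda y: np.convolve(y, kernel, mode="same"), 1, curves)

# Step 2: projection pursuit on the discretized curves -- maximize
# a robust scale of the projections over directions through the data
Xc = smooth - np.median(smooth, axis=0)
D = Xc / np.linalg.norm(Xc, axis=1, keepdims=True)
phi = D[int(np.argmax([mad(Xc @ d) for d in D]))]
```

For this toy sample the estimated principal function `phi` closely tracks the sine component that generates the curves.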
Using Robust PCA to estimate regional characteristics of language use from geo-tagged Twitter messages
Principal component analysis (PCA) and related techniques have been
successfully employed in natural language processing. Text mining applications
in the age of the online social media (OSM) face new challenges due to
properties specific to these use cases (e.g. spelling issues specific to texts
posted by users, the presence of spammers and bots, service announcements,
etc.). In this paper, we employ a Robust PCA technique to separate typical
outliers and highly localized topics from the low-dimensional structure present
in language use in online social networks. Our focus is on identifying
geospatial features among the messages posted by the users of the Twitter
microblogging service. Using a dataset which consists of over 200 million
geolocated tweets collected over the course of a year, we investigate whether
the information present in word usage frequencies can be used to identify
regional features of language use and topics of interest. Using the PCA pursuit
method, we are able to identify important low-dimensional features, which
constitute smoothly varying functions of the geographic location.
Robust Subspace Learning: Robust PCA, Robust Subspace Tracking, and Robust Subspace Recovery
PCA is one of the most widely used dimension reduction techniques. A related
easier problem is "subspace learning" or "subspace estimation". Given
relatively clean data, both are easily solved via singular value decomposition
(SVD). The problem of subspace learning or PCA in the presence of outliers is
called robust subspace learning or robust PCA (RPCA). For long data sequences,
if one tries to use a single lower dimensional subspace to represent the data,
the required subspace dimension may end up being quite large. For such data, a
better model is to assume that it lies in a low-dimensional subspace that can
change over time, albeit gradually. The problem of tracking such data (and the
subspaces) while being robust to outliers is called robust subspace tracking
(RST). This article provides a magazine-style overview of the entire field of
robust subspace learning and tracking. In particular, solutions for three
problems are discussed in detail: RPCA via sparse+low-rank matrix decomposition
(S+LR), RST via S+LR, and "robust subspace recovery (RSR)". RSR assumes that an
entire data vector is either an outlier or an inlier. The S+LR formulation
instead assumes that outliers occur on only a few data vector indices and hence
are well modeled as sparse corruptions.
Comment: To appear, IEEE Signal Processing Magazine, July 201
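The S+LR formulation can be illustrated with a minimal alternating scheme (a toy sketch, not any specific published solver): with S fixed, the low-rank part L is updated by singular value thresholding; with L fixed, the sparse part S is updated by elementwise soft thresholding. The thresholds below are arbitrary illustrative choices.

```python
import numpy as np

def svt(M, tau):
    # Singular value thresholding: prox of the nuclear norm
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def shrink(M, tau):
    # Soft thresholding: prox of the elementwise L1 norm
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def rpca_slr(M, tau=1.0, lam=0.3, iters=100):
    """Toy alternating minimization for the S+LR split M ~ L + S:
    L low-rank (nuclear-norm prox), S sparse (L1 prox)."""
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    for _ in range(iters):
        L = svt(M - S, tau)
        S = shrink(M - L, lam)
    return L, S

rng = np.random.default_rng(1)
L0 = rng.normal(size=(40, 3)) @ rng.normal(size=(3, 40))  # rank-3 part
S0 = np.zeros((40, 40))
mask = rng.random((40, 40)) < 0.05
S0[mask] = rng.normal(scale=10.0, size=mask.sum())        # sparse spikes
L, S = rpca_slr(L0 + S0)
```

Soft thresholding produces exact zeros, so S comes out genuinely sparse while L + S stays close to the observed matrix.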
Structural Analysis of Network Traffic Matrix via Relaxed Principal Component Pursuit
The network traffic matrix is widely used in network operation and
management. It is therefore of crucial importance to analyze the components and
the structure of the network traffic matrix, for which several mathematical
approaches such as Principal Component Analysis (PCA) were proposed. In this
paper, we first argue that PCA performs poorly for analyzing a traffic matrix
that is polluted by large-volume anomalies, and then propose a new
decomposition model for the network traffic matrix. According to this model, we
carry out the structural analysis by decomposing the network traffic matrix
into three sub-matrices, namely, the deterministic traffic, the anomaly traffic
and the noise traffic matrix, which is similar to the Robust Principal
Component Analysis (RPCA) problem previously studied in [13]. Based on the
Relaxed Principal Component Pursuit (Relaxed PCP) method and the Accelerated
Proximal Gradient (APG) algorithm, we present an iterative approach for
decomposing a traffic matrix, and demonstrate its efficiency and flexibility by
experimental results. Finally, we further discuss several features of the
deterministic and noise traffic. Our study develops a novel method for the
problem of structural analysis of the traffic matrix, which is robust against
pollution by large-volume anomalies.
Comment: Accepted to Elsevier Computer Network
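The three-way split described above can be sketched with an accelerated proximal gradient (FISTA-style) iteration on a relaxed objective of the form ||A||_* + lam·||E||_1 + (mu/2)·||M − A − E||_F², where A is the deterministic (low-rank) traffic, E the sparse anomalies, and N = M − A − E the noise. This is a schematic illustration of the relaxed-PCP idea; the parameters and toy matrix are assumptions, not the paper's settings.

```python
import numpy as np

def svt(M, tau):
    # Prox of the nuclear norm (singular value thresholding)
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def shrink(M, tau):
    # Prox of the elementwise L1 norm (soft thresholding)
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def relaxed_pcp_apg(M, lam=0.3, mu=1.0, iters=300):
    """APG on ||A||_* + lam*||E||_1 + (mu/2)*||M - A - E||_F^2.
    Returns (A, E, N): deterministic, anomaly, and noise parts."""
    A = np.zeros_like(M); E = np.zeros_like(M)
    YA, YE = A.copy(), E.copy()
    t_prev = 1.0
    step = 1.0 / (2.0 * mu)                 # 1 / Lipschitz constant
    for _ in range(iters):
        G = mu * (YA + YE - M)              # shared gradient of the smooth term
        A_new = svt(YA - step * G, step)
        E_new = shrink(YE - step * G, step * lam)
        t = (1.0 + np.sqrt(1.0 + 4.0 * t_prev**2)) / 2.0
        YA = A_new + ((t_prev - 1.0) / t) * (A_new - A)  # momentum step
        YE = E_new + ((t_prev - 1.0) / t) * (E_new - E)
        A, E, t_prev = A_new, E_new, t
    return A, E, M - A - E

# Toy "traffic matrix": low-rank baseline + sparse spikes + small noise
rng = np.random.default_rng(4)
A0 = rng.normal(size=(30, 2)) @ rng.normal(size=(2, 30))
E0 = np.zeros((30, 30))
mask = rng.random((30, 30)) < 0.05
E0[mask] = 8.0
M = A0 + E0 + 0.01 * rng.normal(size=(30, 30))
A, E, N = relaxed_pcp_apg(M)
```

The momentum sequence is the standard FISTA update; both proximal steps share one gradient because the smooth term couples A and E only through their sum.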
High breakdown estimators for principal components: the projection-pursuit approach revisited.
Li and Chen (J. Amer. Statist. Assoc. 80 (1985) 759) proposed a method for principal components using projection-pursuit techniques. In classical principal components one searches for directions with maximal variance, and their approach consists of replacing this variance by a robust scale measure. Li and Chen showed that this estimator is consistent, qualitative robust and inherits the breakdown point of the robust scale estimator. We complete their study by deriving the influence function of the estimators for the eigenvectors, eigenvalues and the associated dispersion matrix. Corresponding Gaussian efficiencies are presented as well. Asymptotic normality of the estimators has been treated in a paper of Cui et al. (Biometrika 90 (2003) 953), complementing the results of this paper. Furthermore, a simple explicit version of the projection-pursuit based estimator is proposed and shown to be fast to compute, orthogonally equivariant, and to have the maximal finite-sample breakdown point property. We will illustrate the method with a real data example. (c) 2004 Elsevier Inc. All rights reserved.
Keywords: breakdown point; dispersion matrix; influence function; principal components analysis; projection-pursuit; robustness; dispersion matrices; s-estimators; robust; covariance; location; scale.
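The "simple explicit version" of the projection-pursuit estimator mentioned here is commonly implemented by scoring directions through the data points with a robust scale. The sketch below makes an illustrative assumption (MAD as the scale) and contrasts the result with classical PCA under gross contamination:

```python
import numpy as np

def mad(z):
    # Median absolute deviation, the robust scale used here
    return np.median(np.abs(z - np.median(z)))

def pp_direction(X):
    """Explicit projection-pursuit estimate of the first PC:
    candidate directions through the robustly centered data
    points, scored by a robust scale of the projections."""
    Xc = X - np.median(X, axis=0)
    D = Xc / np.linalg.norm(Xc, axis=1, keepdims=True)
    return D[int(np.argmax([mad(Xc @ d) for d in D]))]

rng = np.random.default_rng(5)
X = rng.normal(size=(300, 4))
X[:, 1] *= 4                    # true first PC along variable 1
X[:15, 3] += 100.0              # 5% gross outliers along variable 3

# Classical PCA: top eigenvector of the sample covariance,
# dragged toward the outlier direction
w, V = np.linalg.eigh(np.cov(X, rowvar=False))
classical = V[:, -1]
robust = pp_direction(X)
```

With 5% contamination the classical eigenvector aligns with the outlier axis (variable 3), while the MAD-based projection-pursuit direction stays on variable 1; searching only through the n data points is what keeps the explicit version fast and orthogonally equivariant.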