4 research outputs found

    Polynomial Time and Sample Complexity for Non-Gaussian Component Analysis: Spectral Methods

    The problem of Non-Gaussian Component Analysis (NGCA) is to find a maximal low-dimensional subspace $E$ in $\mathbb{R}^n$ such that data points projected onto $E$ follow a non-Gaussian distribution. Although this is an appropriate model for some real-world data analysis problems, there has been little progress on it over the last decade. In this paper, we attempt to address this state of affairs in two ways. First, we give a new characterization of standard Gaussian distributions in high dimensions, which leads to effective tests for non-Gaussianness. Second, we propose a simple algorithm, \emph{Reweighted PCA}, as a method for solving the NGCA problem. We prove that for a general unknown non-Gaussian distribution, this algorithm recovers at least one direction in $E$, with sample and time complexity depending polynomially on the dimension of the ambient space. We conjecture that the algorithm actually recovers the entire $E$.
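
    A minimal NumPy sketch of the reweighted-PCA idea follows; the Gaussian weight function, its parameter alpha, and the eigenvalue baseline are illustrative assumptions rather than the authors' exact construction. The point it illustrates: under a Gaussian reweighting, any direction on which the data is standard Gaussian yields a known covariance eigenvalue, so a direction whose eigenvalue deviates from that baseline is a candidate direction in $E$.

    import numpy as np

    def reweighted_pca_direction(X, alpha=0.5):
        """X: (m, n) data matrix. Returns the unit eigenvector of the reweighted
        covariance whose eigenvalue deviates most from the Gaussian baseline."""
        w = np.exp(-alpha * np.sum(X**2, axis=1))   # hypothetical Gaussian weight
        w = w / w.sum()
        C = (X * w[:, None]).T @ X                  # reweighted second-moment matrix
        eigvals, eigvecs = np.linalg.eigh(C)
        baseline = 1.0 / (1.0 + 2.0 * alpha)        # eigenvalue a standard Gaussian
                                                    # direction produces under this weight
        k = np.argmax(np.abs(eigvals - baseline))   # most non-Gaussian direction
        return eigvecs[:, k]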

    Non-Gaussian Component Analysis using Entropy Methods

    Non-Gaussian component analysis (NGCA) is a problem in multidimensional data analysis which, since its formulation in 2006, has attracted considerable attention in statistics and machine learning. In this problem, we have a random variable $X$ in $n$-dimensional Euclidean space. There is an unknown subspace $\Gamma$ of the $n$-dimensional Euclidean space such that the orthogonal projection of $X$ onto $\Gamma$ is standard multidimensional Gaussian and the orthogonal projection of $X$ onto $\Gamma^\perp$, the orthogonal complement of $\Gamma$, is non-Gaussian, in the sense that all its one-dimensional marginals differ from the Gaussian in a certain metric defined in terms of moments. The NGCA problem is to approximate the non-Gaussian subspace $\Gamma^\perp$ given samples of $X$. Vectors in $\Gamma^\perp$ correspond to 'interesting' directions, whereas vectors in $\Gamma$ correspond to directions where the data is very noisy. The most interesting application of the NGCA model is the case when the magnitude of the noise is comparable to that of the true signal, a setting in which traditional noise-reduction techniques such as PCA do not apply directly. NGCA is also related to dimension reduction and to other data analysis problems such as ICA. NGCA-like problems have been studied in statistics for a long time using techniques such as projection pursuit. We give an algorithm that takes polynomial time in the dimension $n$ and has an inverse polynomial dependence on the error parameter measuring the angle distance between the non-Gaussian subspace and the subspace output by the algorithm. Our algorithm is based on relative entropy as the contrast function and fits under the projection pursuit framework. The techniques we develop for analyzing our algorithm may be of use for other related problems.
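
    The sketch below shows the projection-pursuit shape of such an algorithm. As a stand-in for the paper's relative-entropy contrast (whose estimator is not reproduced here), it scores candidate directions with a classical moment-based negentropy approximation; the random candidate-sampling scheme is likewise an illustrative assumption.

    import numpy as np

    def negentropy_proxy(s):
        """Moment-based negentropy approximation for a standardized 1-D projection:
        J(s) ~ (1/12) E[s^3]^2 + (1/48) (E[s^4] - 3)^2, zero for Gaussian moments."""
        s = (s - s.mean()) / s.std()
        return (s**3).mean()**2 / 12.0 + ((s**4).mean() - 3.0)**2 / 48.0

    def pursue_direction(X, n_candidates=5000, rng=None):
        """Crude projection pursuit: score many random unit directions by the
        contrast above and return the most non-Gaussian one."""
        rng = np.random.default_rng(rng)
        n = X.shape[1]
        U = rng.standard_normal((n_candidates, n))
        U /= np.linalg.norm(U, axis=1, keepdims=True)   # unit directions
        scores = np.array([negentropy_proxy(X @ u) for u in U])
        return U[np.argmax(scores)]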

    Some Algorithms and Paradigms for Big Data

    The reality of big data poses both opportunities and challenges to modern researchers. Its key features -- large sample sizes, high-dimensional feature spaces, and structural complexity -- force new paradigms upon the design of effective yet computationally efficient data analysis algorithms. In this dissertation, we illustrate a few such paradigms through the analysis of three new algorithms. The first two algorithms consider the problem of phase retrieval, in which we seek to recover a signal from random rank-one quadratic measurements. We first show that an adaptation of the randomized Kaczmarz method provably exhibits linear convergence so long as the sample size is linear in the signal dimension. Next, we show that the standard SDP relaxation of sparse PCA yields an algorithm that performs signal recovery for sparse, model-misspecified phase retrieval with a sample complexity that scales according to the square of the sparsity parameter. Finally, our third algorithm addresses the problem of Non-Gaussian Component Analysis, in which we try to identify the non-Gaussian marginals of a high-dimensional distribution. We prove that our algorithm exhibits polynomial-time convergence with polynomial sample complexity.
    PhD dissertation, Mathematics, University of Michigan, Horace H. Rackham School of Graduate Studies. https://deepblue.lib.umich.edu/bitstream/2027.42/145895/1/yanshuo_1.pd
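
    To make the first algorithm concrete, here is a hedged sketch of a randomized Kaczmarz iteration adapted to phase retrieval, working from amplitude measurements b_i = |<a_i, x>| (the square roots of the rank-one quadratic measurements); the random initialization and fixed iteration count are illustrative choices, not the dissertation's analyzed setup, which may require a more careful initializer.

    import numpy as np

    def kaczmarz_phase_retrieval(A, b, n_iters=20000, rng=None):
        """A: (m, n) measurement vectors as rows; b: (m,) amplitudes |A @ x_true|.
        Returns an estimate of x_true up to a global sign."""
        rng = np.random.default_rng(rng)
        m, n = A.shape
        x = rng.standard_normal(n)              # illustrative random initialization
        row_norms = np.sum(A**2, axis=1)
        for _ in range(n_iters):
            i = rng.integers(m)                 # sample one measurement at random
            r = A[i] @ x
            # Guess the missing sign from the current iterate, then apply the
            # classical Kaczmarz projection onto {z : <a_i, z> = sign(r) * b_i}.
            x = x + ((np.sign(r) * b[i] - r) / row_norms[i]) * A[i]
        return x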

    Estimating Non-Gaussian Subspaces by Characteristic Functions

    In this article, we consider high-dimensional data which contains a low-dimensional non-Gaussian structure contaminated with Gaussian noise, and we propose a new method to identify the non-Gaussian subspace. A linear dimension reduction algorithm based on the fourth-order cumulant tensor was proposed in our previous work [4]. Although it works well for sub-Gaussian structures, its performance is not satisfactory for super-Gaussian data due to outliers. To overcome this problem, we construct an alternative using the Hessian of characteristic functions, which has been applied to (multidimensional) independent component analysis [10,11]. A numerical study demonstrates the validity of our method.
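
    A rough sketch of the characteristic-function ingredient: for standard Gaussian data, the Hessian of the log characteristic function equals $-I$ at every point, so deviations of the empirical log-CF Hessian from $-I$ point along the non-Gaussian subspace. The aggregation over random evaluation points and the final eigendecomposition below are simple stand-ins, not the authors' exact estimator.

    import numpy as np

    def log_cf_hessian(X, t):
        """Hessian of the log empirical characteristic function of X at t,
        via d^2 log phi = (phi * d^2 phi - (d phi)(d phi)^T) / phi^2."""
        e = np.exp(1j * (X @ t))                            # (m,) phases
        phi = e.mean()
        grad = 1j * (X * e[:, None]).mean(axis=0)           # d phi / dt
        hess = -np.einsum('ij,ik,i->jk', X, X, e) / len(X)  # d^2 phi / dt dt^T
        return (phi * hess - np.outer(grad, grad)) / phi**2

    def non_gaussian_subspace(X, dim, n_points=50, scale=0.5, rng=None):
        """Aggregate deviations of the log-CF Hessian from -I over random
        evaluation points; return a basis for the top-`dim` deviation directions."""
        rng = np.random.default_rng(rng)
        n = X.shape[1]
        M = np.zeros((n, n))
        for _ in range(n_points):
            t = scale * rng.standard_normal(n)
            D = log_cf_hessian(X, t) + np.eye(n)  # ~0 for standard Gaussian data
            M += D.real @ D.real.T + D.imag @ D.imag.T
        eigvals, eigvecs = np.linalg.eigh(M)
        return eigvecs[:, -dim:]                  # estimated non-Gaussian subspace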