
    Efficient independent component analysis

    Independent component analysis (ICA) has been widely used for blind source separation in many fields such as brain imaging analysis, signal processing and telecommunication. Many statistical techniques based on M-estimates have been proposed for estimating the mixing matrix. Recently, several nonparametric methods have been developed, but in-depth analysis of asymptotic efficiency has not been available. We analyze ICA using semiparametric theories and propose a straightforward estimate based on the efficient score function by using B-spline approximations. The estimate is asymptotically efficient under moderate conditions and exhibits better performance than standard ICA methods in a variety of simulations.
    Comment: Published at http://dx.doi.org/10.1214/009053606000000939 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org)
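
    To make the blind source separation setup concrete, here is a minimal sketch of the model X = AS with independent non-Gaussian sources, fitted with scikit-learn's FastICA as a generic baseline. The source distributions and the use of FastICA are illustrative choices; this is not the semiparametric B-spline score estimator proposed in the paper.

```python
# Minimal sketch of the ICA setup: X = A S with independent non-Gaussian
# sources. FastICA is used only as a standard baseline estimator, not as the
# paper's semiparametric method.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n, p = 2000, 3

# Independent non-Gaussian sources (uniform and Laplace, chosen for illustration)
S = np.column_stack([
    rng.uniform(-1, 1, n),
    rng.laplace(size=n),
    rng.uniform(-1, 1, n) ** 3,
])
A = rng.normal(size=(p, p))          # unknown mixing matrix
X = S @ A.T                          # observed mixtures

ica = FastICA(n_components=p, random_state=0)
S_hat = ica.fit_transform(X)         # recovered sources (up to sign/permutation/scale)
A_hat = ica.mixing_                  # estimated mixing matrix
```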

    Heavy-tailed Independent Component Analysis

    Independent component analysis (ICA) is the problem of efficiently recovering a matrix $A \in \mathbb{R}^{n\times n}$ from i.i.d. observations of $X = AS$, where $S \in \mathbb{R}^n$ is a random vector with mutually independent coordinates. This problem has been intensively studied, but all existing efficient algorithms with provable guarantees require that the coordinates $S_i$ have finite fourth moments. We consider the heavy-tailed ICA problem, in which this assumption is dropped; indeed, not even the second moment is assumed to exist. This problem has also received considerable attention in the applied literature. In the present work, we first give a provably efficient algorithm that works under the assumption that, for a constant $\gamma > 0$, each $S_i$ has a finite $(1+\gamma)$-moment, thus substantially weakening the moment requirement for the ICA problem to be solvable. We then give an algorithm that works under the assumption that the matrix $A$ has orthogonal columns but requires no moment assumptions. Our techniques draw ideas from convex geometry and exploit standard properties of the multivariate spherical Gaussian distribution in a novel way.
    Comment: 30 pages
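
    The following sketch only sets up the heavy-tailed data model described above: Student-t sources with degrees of freedom between 1 and 2 have a finite $(1+\gamma)$-moment for small $\gamma$ but no second or fourth moment, and the mixing matrix is taken orthogonal as in the paper's moment-free setting. The distributions and dimensions are arbitrary choices; the paper's convex-geometry algorithm itself is not reproduced here.

```python
# Sketch of the heavy-tailed ICA data model: X = A S where each S_i has a
# finite (1+gamma)-moment but no second or fourth moment. Student-t sources
# with df = 1.5 are an illustrative choice, not the paper's construction.
import numpy as np

rng = np.random.default_rng(0)
n, dim = 10_000, 4
df = 1.5                                   # moments of order < 1.5 exist,
                                           # but variance and kurtosis do not
S = rng.standard_t(df, size=(n, dim))      # heavy-tailed independent sources

# Orthogonal mixing matrix (the setting of the moment-free algorithm)
A, _ = np.linalg.qr(rng.normal(size=(dim, dim)))
X = S @ A.T

# Empirical fourth moments diverge as n grows, which is why classical
# kurtosis-based ICA algorithms lose their guarantees in this regime.
print((S ** 4).mean(axis=0))
```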

    Fourth Moments and Independent Component Analysis

    In independent component analysis it is assumed that the components of the observed random vector are linear combinations of latent independent random variables, and the aim is then to find an estimate of the transformation matrix back to these independent components. In the engineering literature, there are several traditional estimation procedures based on the use of fourth moments, such as FOBI (fourth order blind identification), JADE (joint approximate diagonalization of eigenmatrices), and FastICA, but the statistical properties of these estimates are not well known. In this paper, various independent component functionals based on fourth moments are discussed in detail, starting with the corresponding optimization problems, deriving the estimating equations and estimation algorithms, and finding the asymptotic statistical properties of the estimates. Comparisons of the asymptotic variances of the estimates in a wide range of independent component models show that in most cases JADE and the symmetric version of FastICA perform better than their competitors.
    Comment: Published at http://dx.doi.org/10.1214/15-STS520 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org)
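
    Since FOBI is the simplest of the fourth-moment functionals discussed above, a compact textbook-style implementation may help fix ideas: whiten the data, form the fourth-moment matrix E[||z||^2 z z^T], and rotate by its eigenvectors. This is a bare-bones sketch for illustration (it assumes the sources have distinct kurtoses), not the paper's asymptotic analysis.

```python
# Textbook FOBI sketch: whitening followed by an eigendecomposition of the
# fourth-moment (kurtosis) matrix of the whitened data.
import numpy as np

def fobi(X):
    """Return an estimated unmixing matrix W such that S_hat = X_centered @ W.T."""
    Xc = X - X.mean(axis=0)
    # Whitening via the covariance matrix
    cov = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    whitener = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ whitener.T
    # Fourth-moment matrix E[||z||^2 z z^T]
    B = (Z * (Z ** 2).sum(axis=1, keepdims=True)).T @ Z / len(Z)
    # Its eigenvectors give the remaining rotation (requires distinct kurtoses)
    _, U = np.linalg.eigh(B)
    return U.T @ whitener

rng = np.random.default_rng(0)
S = np.column_stack([rng.uniform(-1, 1, 5000), rng.laplace(size=5000)])
A = rng.normal(size=(2, 2))
X = S @ A.T
W = fobi(X)
S_hat = (X - X.mean(axis=0)) @ W.T   # sources up to sign, scale and permutation
```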

    Independent Component Analysis of Spatiotemporal Chaos

    Two types of spatiotemporal chaos exhibited by ensembles of coupled nonlinear oscillators are analyzed using independent component analysis (ICA). For diffusively coupled complex Ginzburg-Landau oscillators that exhibit smooth amplitude patterns, ICA extracts localized one-humped basis vectors that reflect the characteristic hole structures of the system, and for nonlocally coupled complex Ginzburg-Landau oscillators with fractal amplitude patterns, ICA extracts localized basis vectors with characteristic gap structures. Statistics of the decomposed signals also provide insight into the complex dynamics of the spatiotemporal chaos.
    Comment: 5 pages, 6 figures, JPSJ Vol 74, No.
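
    As a rough illustration of this kind of decomposition, the sketch below builds a synthetic space-time field from localized bumps driven by independent random amplitudes and lets ICA recover spatial basis vectors. The synthetic field and FastICA are stand-ins chosen here for illustration; they are not the Ginzburg-Landau simulations or the specific procedure of the paper.

```python
# Treat snapshots of a spatiotemporal field as observations and let ICA
# extract spatial basis vectors. The field below is synthetic: localized
# Gaussian bumps with independent Laplace-distributed amplitudes.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_time, n_space, n_modes = 4000, 128, 5
x = np.linspace(0, 1, n_space)

# Localized spatial structures at random positions
centers = rng.uniform(0.1, 0.9, n_modes)
basis = np.exp(-((x[None, :] - centers[:, None]) / 0.03) ** 2)   # (n_modes, n_space)
amps = rng.laplace(size=(n_time, n_modes))                       # independent drivers
field = amps @ basis + 0.01 * rng.normal(size=(n_time, n_space))

ica = FastICA(n_components=n_modes, random_state=0)
signals = ica.fit_transform(field)      # decomposed temporal signals
spatial_basis = ica.mixing_.T           # rows ~ localized spatial basis vectors
```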

    Quantifying identifiability in independent component analysis

    We are interested in consistent estimation of the mixing matrix in the ICA model when the error distribution is close to (but different from) Gaussian. In particular, we consider $n$ independent samples from the ICA model $X = A\epsilon$, where we assume that the coordinates of $\epsilon$ are independent and identically distributed according to a contaminated Gaussian distribution, and the amount of contamination is allowed to depend on $n$. We then investigate how the ability to consistently estimate the mixing matrix depends on the amount of contamination. Our results suggest that, in an asymptotic sense, if the amount of contamination decreases at rate $1/\sqrt{n}$ or faster, then the mixing matrix is only identifiable up to transpose products. These results also have implications for causal inference from linear structural equation models with near-Gaussian additive noise.
    Comment: 22 pages, 2 figures
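
    A short simulation sketch may clarify the contaminated-Gaussian setup: each coordinate of $\epsilon$ is Gaussian with probability $1 - \delta_n$ and drawn from a non-Gaussian contaminating distribution with probability $\delta_n$, where $\delta_n$ shrinks at the critical $1/\sqrt{n}$ rate. The Laplace contamination and the sample sizes are arbitrary choices made here for illustration.

```python
# Sketch of the contaminated-Gaussian ICA model X = A * epsilon, with the
# contamination level tied to the sample size at the 1/sqrt(n) rate.
import numpy as np

rng = np.random.default_rng(0)
n, dim = 100_000, 3
delta_n = 1.0 / np.sqrt(n)                 # contamination probability per coordinate

is_contaminated = rng.random((n, dim)) < delta_n
eps = np.where(is_contaminated,
               rng.laplace(size=(n, dim)),     # non-Gaussian contamination
               rng.normal(size=(n, dim)))      # Gaussian bulk

A = rng.normal(size=(dim, dim))            # mixing matrix to be estimated
X = eps @ A.T                              # observations from X = A * epsilon
# Per the abstract, when delta_n decreases at rate 1/sqrt(n) or faster, the
# mixing matrix is only identifiable up to transpose products.
```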

    Statistical physics of independent component analysis

    Statistical physics is used to investigate independent component analysis with polynomial contrast functions. While the replica method fails, an adapted cavity approach yields valid results. The learning curves, obtained in a suitable thermodynamic limit, display a first-order phase transition from poor to perfect generalization.
    Comment: 7 pages, 1 figure, to appear in Europhys. Lett.
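
    To make "polynomial contrast function" concrete, the sketch below runs the standard one-unit fixed-point iteration for the quartic (kurtosis) contrast on whitened data. This only illustrates the contrast class being analyzed; the replica/cavity calculations and learning-curve analysis of the paper are not reproduced.

```python
# One-unit ICA with the quartic contrast E[(w^T z)^4]: whiten, then iterate
# the standard kurtosis-based fixed-point update on the unit sphere.
import numpy as np

rng = np.random.default_rng(0)
n, dim = 20_000, 3
S = np.column_stack([rng.uniform(-1, 1, n), rng.laplace(size=n), rng.standard_t(6, size=n)])
X = S @ rng.normal(size=(dim, dim)).T

# Whiten the observations
Xc = X - X.mean(axis=0)
evals, evecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
Z = Xc @ (evecs @ np.diag(evals ** -0.5) @ evecs.T)

# Fixed-point iteration: w <- E[z (w^T z)^3] - 3 w, then renormalize
w = rng.normal(size=dim)
w /= np.linalg.norm(w)
for _ in range(200):
    w_new = (Z * (Z @ w)[:, None] ** 3).mean(axis=0) - 3 * w
    w_new /= np.linalg.norm(w_new)
    converged = abs(np.dot(w_new, w)) > 1 - 1e-9
    w = w_new
    if converged:
        break

recovered = Z @ w   # one independent component, up to sign and scale
```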