19 research outputs found
Max vs Min: Tensor Decomposition and ICA with nearly Linear Sample Complexity
We present a simple, general technique for reducing the sample complexity of
matrix and tensor decomposition algorithms applied to distributions. We use the
technique to give a polynomial-time algorithm for standard ICA with sample
complexity nearly linear in the dimension, thereby improving substantially on
previous bounds. The analysis is based on properties of random polynomials,
namely the spacings of an ensemble of polynomials. Our technique also applies
to other applications of tensor decompositions, including spherical Gaussian
mixture models.
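The ICA setting referenced above observes mixtures x = As of a hidden random vector s with mutually independent coordinates. A minimal numpy sketch (variable names illustrative, not from the paper) of the generative model together with the standard whitening step, after which the effective mixing matrix is orthogonal and ICA reduces to finding a rotation:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 20000, 3

# Independent non-Gaussian sources: uniform on [-sqrt(3), sqrt(3)] has unit variance
s = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(n, d))
A = rng.normal(size=(d, d))   # unknown mixing matrix
x = s @ A.T                   # observations x = A s

# Whitening: multiply by the inverse square root of the sample covariance,
# so the whitened data has identity covariance
cov = np.cov(x, rowvar=False)
eigval, eigvec = np.linalg.eigh(cov)
W = eigvec @ np.diag(eigval ** -0.5) @ eigvec.T
z = x @ W.T

print(np.allclose(np.cov(z, rowvar=False), np.eye(d), atol=1e-6))
```

Since W is built from the same sample covariance it whitens, the check holds up to numerical precision; the sample-complexity gains described above concern how much data the subsequent decomposition step needs, not this preprocessing.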
On Measure Transformed Canonical Correlation Analysis
In this paper, linear canonical correlation analysis (LCCA) is generalized by
applying a structured transform to the joint probability distribution of the
considered pair of random vectors, i.e., a transformation of the joint
probability measure defined on their joint observation space. This framework,
called measure transformed canonical correlation analysis (MTCCA), applies LCCA
to the data after transformation of the joint probability measure. We show that
judicious choice of the transform leads to a modified canonical correlation
analysis, which, in contrast to LCCA, is capable of detecting non-linear
relationships between the considered pair of random vectors. Unlike kernel
canonical correlation analysis, where the transformation is applied to the
random vectors, in MTCCA the transformation is applied to their joint
probability distribution. This results in performance advantages and reduced
implementation complexity. The proposed approach is illustrated for graphical
model selection in simulated data having non-linear dependencies, and for
measuring long-term associations between companies traded in the NASDAQ and
NYSE stock markets.
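The limitation of plain LCCA that motivates the measure transformation can be seen in a small numpy sketch (the MTCCA transform itself is not reproduced here): a purely quadratic dependence between two random vectors produces canonical correlations near zero.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# A pair of random vectors with a purely non-linear (quadratic) dependence:
# v is a deterministic function of u, yet linearly uncorrelated with it
u = rng.normal(size=(n, 2))
v = u ** 2

def canonical_correlations(x, y):
    """Linear CCA: singular values of the whitened cross-covariance."""
    x = x - x.mean(0)
    y = y - y.mean(0)
    cxx, cyy, cxy = x.T @ x / n, y.T @ y / n, x.T @ y / n
    def inv_sqrt(c):
        w, q = np.linalg.eigh(c)
        return q @ np.diag(w ** -0.5) @ q.T
    return np.linalg.svd(inv_sqrt(cxx) @ cxy @ inv_sqrt(cyy), compute_uv=False)

rho = canonical_correlations(u, v)
print(rho.max())   # near 0: LCCA is blind to this dependence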
Heavy-tailed Independent Component Analysis
Independent component analysis (ICA) is the problem of efficiently recovering
a matrix A from i.i.d. observations of X = AS, where S is a random vector with
mutually independent coordinates. This problem has been intensively studied,
but all existing efficient algorithms with provable guarantees require that the
coordinates S_i have finite fourth moments. We consider the heavy-tailed ICA
problem, where we do not make this assumption, even about the second moment.
This problem has also received considerable attention in the applied
literature. In the present work, we first give a provably efficient algorithm
that works under the assumption that for a constant γ > 0, each S_i has a
finite (1+γ)-moment, thus substantially weakening the moment requirement for
the ICA problem to be solvable. We then give an algorithm that works under the
assumption that the matrix A has orthogonal columns but requires no moment
assumptions. Our techniques draw ideas from convex geometry and exploit
standard properties of the multivariate spherical Gaussian distribution in a
novel way.
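A minimal sketch, assuming numpy, of generating data from the heavy-tailed model in the orthogonal-columns setting described above; the recovery algorithms themselves are not reproduced. Student-t sources with 3 degrees of freedom have finite (1+γ)-moments for γ < 2 but an infinite fourth moment, so fourth-moment-based ICA guarantees do not apply.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 5000, 3

# Heavy-tailed independent sources: t(3) has finite moments only below order 3,
# in particular no finite fourth moment
s = rng.standard_t(df=3, size=(n, d))

# Orthogonal mixing matrix, obtained here from a QR factorization
# of a random Gaussian matrix
A, _ = np.linalg.qr(rng.normal(size=(d, d)))
x = s @ A.T   # observations X = A S

print(np.allclose(A.T @ A, np.eye(d)))  # columns of A are orthonormal
```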
Determining When an Algebra Is an Evolution Algebra
Funding: Distinguished Visitor Grant of the School of Mathematics and Statistics, University College Dublin; grant MTM216-76327-C3-2-