
    Randomized Dimension Reduction on Massive Data

    Scalability of statistical estimators is of increasing importance in modern applications, and dimension reduction is often used to extract relevant information from data. A variety of popular dimension reduction approaches can be framed as symmetric generalized eigendecomposition problems. In this paper we outline how taking into account the low-rank structure assumption implicit in these dimension reduction approaches provides both computational and statistical advantages. We adapt recent randomized low-rank approximation algorithms to provide efficient solutions to three dimension reduction methods: Principal Component Analysis (PCA), Sliced Inverse Regression (SIR), and Localized Sliced Inverse Regression (LSIR). A key observation in this paper is that randomization serves a dual role, improving both computational and statistical performance. This point is highlighted in our experiments on real and simulated data.

    Comment: 31 pages, 6 figures. Key words: dimension reduction, generalized eigendecomposition, low-rank, supervised, inverse regression, random projections, randomized algorithms, Krylov subspace method
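
    For readers who want a concrete picture, the following is a minimal sketch of the randomized range-finding idea that underlies such methods, in the spirit of the Halko-Martinsson-Tropp randomized SVD. It is an illustration only: the function randomized_pca, its parameters, and the power-iteration count are assumptions for this sketch, not the paper's actual algorithm for PCA, SIR, or LSIR.

```python
import numpy as np

def randomized_pca(X, k, oversample=10, n_iter=2, seed=0):
    """Illustrative sketch: approximate the top-k principal components of X
    (rows = samples) via randomized range finding. Hypothetical helper, not
    the paper's implementation."""
    rng = np.random.default_rng(seed)
    Xc = X - X.mean(axis=0)                      # center the data
    n, p = Xc.shape
    # Sketch the range (column space) of the centered data with a Gaussian
    # test matrix of k + oversample columns.
    Omega = rng.standard_normal((p, k + oversample))
    Y = Xc @ Omega
    # A few power iterations sharpen the sketch when singular values decay slowly.
    for _ in range(n_iter):
        Y = Xc @ (Xc.T @ Y)
    Q, _ = np.linalg.qr(Y)                       # orthonormal basis for range(Xc)
    # Solve the small (k + oversample)-dimensional problem, then read off the PCs.
    B = Q.T @ Xc
    _, s, Vt = np.linalg.svd(B, full_matrices=False)
    components = Vt[:k]                          # approximate principal directions
    explained_variance = s[:k] ** 2 / (n - 1)
    return components, explained_variance

# Usage: 10,000 samples in 500 dimensions, 5 leading components.
X = np.random.default_rng(1).standard_normal((10_000, 500))
comps, var = randomized_pca(X, k=5)
print(comps.shape)  # (5, 500)
```

    The appeal is cost: the sketch touches the data in roughly O(npk) time rather than the O(np min(n, p)) of a full decomposition, which is what makes randomized methods attractive on massive data.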

    Dense PGL-orbits in products of Grassmannians

    In this paper, we find necessary and sufficient conditions on the dimension vector $\underline{\mathbf{d}} = (d_1, \ldots, d_k; n)$ so that the diagonal action of $\mathbb{P}GL(n)$ on $\prod_{i=1}^{k} Gr(d_i; n)$ has a dense orbit. Consequently, we obtain algorithms for finding dense and sparse dimension vectors and classify dense dimension vectors of small length or size. We also characterize the dense dimension vectors of the form $(d, d, \ldots, d; n)$.

    Comment: 21 pages
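
    As context for the density question (an elementary dimension count, not a result stated in the abstract): a dense orbit can exist only if the acting group has dimension at least that of the variety, which gives a quick necessary condition on the dimension vector:

\[
% dim PGL(n) must be at least the dimension of the product of Grassmannians:
\dim \mathbb{P}GL(n) \;\ge\; \dim \prod_{i=1}^{k} Gr(d_i; n)
\quad\Longleftrightarrow\quad
n^2 - 1 \;\ge\; \sum_{i=1}^{k} d_i\,(n - d_i).
\]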