103 research outputs found

    Fast Dimension Reduction Using Rademacher Series on Dual BCH Codes

    Tighter Bounds on Johnson Lindenstrauss Transforms

    Johnson and Lindenstrauss (1984) proved that any ļ¬nite set of data in a high dimensional space can be projected into a low dimensional space with the Euclidean metric information of the set being preserved within any desired accuracy. Such dimension reduction plays a critical role in many applications with massive data. There has been extensive effort in the literature on how to ļ¬nd explicit constructions of Johnson-Lindenstrauss projections. In this poster, we show how algebraic codes over ļ¬nite ļ¬elds can be used for fast Johnson-Lindenstrauss projections of data in high dimensional Euclidean spaces

    Johnson-Lindenstrauss projection of high dimensional data

    Johnson and Lindenstrauss (1984) proved that any finite set of data in a high dimensional space can be projected into a low dimensional space with the Euclidean metric information of the set being preserved within any desired accuracy. Such dimension reduction plays a critical role in many applications with massive data. There has been extensive effort in the literature on how to find explicit constructions of Johnson-Lindenstrauss projections. In this poster, we show how algebraic codes over finite fields can be used for fast Johnson-Lindenstrauss projections of data in high dimensional Euclidean spaces. This is joint work with Shuhong Gao and Yue Mao.
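The guarantee described in these abstracts is most simply realized by a dense Gaussian random projection. The codes-based construction the posters announce is not reproduced here, so the following is a minimal pure-Python sketch of the standard dense version only; the function name, dimensions, and seeding are illustrative, not from the source:

```python
# Sketch of a dense Johnson-Lindenstrauss projection (the classical Gaussian
# construction, NOT the codes-over-finite-fields map from the poster).
# Entries of the k x d matrix are i.i.d. N(0, 1/k); pairwise Euclidean
# distances among N points are preserved to within (1 +/- eps) with high
# probability once k = O(eps^-2 log N).
import math
import random

def jl_project(points, k, seed=0):
    """Project a list of d-dimensional points down to k dimensions."""
    rng = random.Random(seed)
    d = len(points[0])
    # Random Gaussian matrix, scaled so squared norms are preserved in expectation.
    R = [[rng.gauss(0.0, 1.0) / math.sqrt(k) for _ in range(d)] for _ in range(k)]
    return [[sum(row[j] * p[j] for j in range(d)) for row in R] for p in points]
```

Applying the map costs O(kd) per point; the "fast" constructions in this listing aim to beat that dense-matrix cost.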

    Almost Optimal Unrestricted Fast Johnson-Lindenstrauss Transform

    The problems of random projections and sparse reconstruction have much in common and have individually received much attention. Surprisingly, until now they progressed in parallel and remained mostly separate. Here, we employ new tools from probability in Banach spaces that were successfully used in the context of sparse reconstruction to advance on an open problem in random projection. In particular, we generalize and use an intricate result by Rudelson and Vershynin for sparse reconstruction, which uses Dudley's theorem for bounding Gaussian processes. Our main result states that any set of N = exp(Õ(n)) real vectors in n-dimensional space can be linearly mapped to a space of dimension k = O(log N · polylog(n)), while (1) preserving the pairwise distances among the vectors to within any constant distortion and (2) being able to apply the transformation in time O(n log n) on each vector. This improves on the best known N = exp(Õ(n^{1/2})) achieved by Ailon and Liberty and N = exp(Õ(n^{1/3})) by Ailon and Chazelle. The dependence on the distortion constant, however, is believed to be suboptimal and subject to further investigation. For constant distortion, this settles the open question posed by these authors up to a polylog(n) factor while considerably simplifying their constructions.
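The O(n log n) application time in results of this kind typically comes from a randomized Hadamard transform. The sketch below shows that general idea only, as a subsampled randomized Hadamard map in pure Python; it is not the exact construction of this abstract, and it omits the sparse second stage used by Ailon and Chazelle:

```python
# Sketch of a subsampled randomized Hadamard transform: x -> P·H·D·x, where
# D is a random diagonal sign matrix, H the (unnormalized) Walsh-Hadamard
# transform, and P samples k coordinates. Illustrative of the O(n log n)
# family only, not the specific map in the abstract.
import math
import random

def fwht(a):
    """In-place fast Walsh-Hadamard transform, O(n log n); len(a) must be a power of two."""
    n = len(a)
    h = 1
    while h < n:
        for i in range(0, n, h * 2):
            for j in range(i, i + h):
                x, y = a[j], a[j + h]
                a[j], a[j + h] = x + y, x - y
        h *= 2
    return a

def srht(x, k, seed=0):
    """Map x (len a power of two) to k dimensions in O(n log n) time."""
    rng = random.Random(seed)
    n = len(x)
    signs = [rng.choice((-1.0, 1.0)) for _ in range(n)]  # random diagonal D
    y = fwht([s * v for s, v in zip(signs, x)])          # compute H·D·x
    idx = rng.sample(range(n), k)                        # coordinate sampling P
    # Unnormalized H satisfies sum_i (HDx)_i^2 = n·||x||^2, so sampling k of the
    # n coordinates and dividing by sqrt(k) preserves squared norm in expectation.
    return [y[i] / math.sqrt(k) for i in idx]
```

The sign flip D spreads out the "energy" of sparse inputs so that uniform coordinate sampling is safe; this preconditioning step is what the Rudelson-Vershynin machinery mentioned above is used to analyze.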