
    Randomized Tensor Ring Decomposition and Its Application to Large-scale Data Reconstruction

    Dimensionality reduction is an essential technique for multi-way large-scale data, i.e., tensors. Tensor ring (TR) decomposition has become popular due to its high representation ability and flexibility. However, traditional TR decomposition algorithms suffer from high computational cost when facing large-scale data. In this paper, taking advantage of the recently proposed tensor random projection method, we propose two randomized TR decomposition algorithms. By applying a random projection to every mode of the large-scale tensor, the TR decomposition can be carried out at a much smaller scale. Simulation experiments show that the proposed algorithms are 4-25 times faster than traditional algorithms without loss of accuracy, and they outperform other randomized algorithms in deep learning dataset compression and hyperspectral image reconstruction experiments. Comment: ICASSP submission
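    The mode-wise projection step the abstract describes can be pictured with a short NumPy sketch. Everything below is a hypothetical illustration, not the authors' released code: `mode_product` and `tensor_random_projection` are made-up helper names, and the tensor and sketch sizes are arbitrary. Each mode of the tensor is contracted with an independent Gaussian matrix, so any TR decomposition routine can then be run on the much smaller projected tensor.

```python
import numpy as np

def mode_product(X, A, mode):
    """Multiply tensor X by matrix A along `mode`:
    out[..., j, ...] = sum_i A[j, i] * X[..., i, ...]."""
    Xm = np.moveaxis(X, mode, 0)                # bring `mode` to the front
    out = np.tensordot(A, Xm, axes=([1], [0]))  # contract over that mode
    return np.moveaxis(out, 0, mode)

def tensor_random_projection(X, sketch_dims, seed=None):
    """Compress every mode of X down to the sizes in `sketch_dims`
    using independent Gaussian random projections."""
    rng = np.random.default_rng(seed)
    Y = X
    for mode, k in enumerate(sketch_dims):
        G = rng.standard_normal((k, Y.shape[mode])) / np.sqrt(k)
        Y = mode_product(Y, G, mode)
    return Y

# Toy example: a 200 x 200 x 200 tensor is compressed to 20 x 20 x 20;
# a (third-party) TR decomposition would then be run on the small tensor.
X = np.random.default_rng(0).standard_normal((200, 200, 200))
Y = tensor_random_projection(X, (20, 20, 20), seed=1)
print(Y.shape)  # -> (20, 20, 20)
```

    Projecting all modes shrinks the number of entries multiplicatively (here by a factor of 1000), which is where the reported 4-25x speedups would come from: the cost of the subsequent decomposition is driven by the projected size, not the original one.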

    Fast and Guaranteed Tensor Decomposition via Sketching

    Tensor CANDECOMP/PARAFAC (CP) decomposition has wide applications in statistical learning of latent variable models and in data mining. In this paper, we propose fast and randomized tensor CP decomposition algorithms based on sketching. We build on the idea of count sketches, but introduce many novel ideas that are unique to tensors. We develop novel methods for randomized computation of tensor contractions via FFTs, without explicitly forming the tensors. Such tensor contractions are encountered in decomposition methods such as tensor power iterations and alternating least squares. We also design novel colliding hashes for symmetric tensors to further save time in computing the sketches. We then combine these sketching ideas with existing whitening and tensor power iteration techniques to obtain the fastest algorithm on both sparse and dense tensors. The quality of approximation under our method does not depend on properties such as sparsity or uniformity of elements. We apply the method to topic modeling and obtain competitive results. Comment: 29 pages. Appeared in Proceedings of Advances in Neural Information Processing Systems (NIPS), held at Montreal, Canada in 2015.
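    The count-sketch/FFT idea at the core of this abstract can be illustrated in its simplest case, a rank-1 second-order tensor x ⊗ y. The snippet below is a minimal sketch under stated assumptions (a hypothetical `count_sketch` helper and toy sizes, not the paper's implementation): the count sketch of the outer product equals the circular convolution of the factors' individual count sketches, computable via FFTs without ever materializing the n x n tensor.

```python
import numpy as np

def count_sketch(x, h, s, b):
    """Count sketch of vector x into b buckets using hash h and signs s."""
    sk = np.zeros(b)
    np.add.at(sk, h, s * x)  # bucket i accumulates s[j]*x[j] over h[j] == i
    return sk

rng = np.random.default_rng(0)
n, b = 1000, 64
x, y = rng.standard_normal(n), rng.standard_normal(n)

# Independent hash/sign pairs for each factor.
h1, h2 = rng.integers(0, b, n), rng.integers(0, b, n)
s1, s2 = rng.choice([-1.0, 1.0], n), rng.choice([-1.0, 1.0], n)

# Pointwise product of the FFTs = circular convolution of the sketches,
# which is the count sketch of the rank-1 tensor x (outer) y.
sk = np.fft.irfft(np.fft.rfft(count_sketch(x, h1, s1, b)) *
                  np.fft.rfft(count_sketch(y, h2, s2, b)), n=b)

# Sanity check: <sk, sk> is an unbiased (but noisy, for small b)
# estimate of ||x (outer) y||^2 = ||x||^2 * ||y||^2.
print(sk @ sk, (x @ x) * (y @ y))
```

    The same convolution identity extends to higher-order rank-1 terms by multiplying more FFTs pointwise, which is what makes sketched tensor power iterations and alternating least squares updates cheap: each iteration works in the b-dimensional sketch space rather than on the full tensor.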