
    Decomposing tensors with structured matrix factors reduces to rank-1 approximations

    Tensor decompositions allow the parameters of a multi-linear model to be estimated in a deterministic way. Applications have already been identified in antenna array processing and digital communications, among others, and are extremely attractive provided some diversity is available at the receiver. As opposed to the widely used ALS algorithm, this paper proposes non-iterative algorithms to compute the required tensor decomposition into a sum of rank-1 terms when some factor matrices enjoy a particular structure, such as block-Hankel, triangular, or banded.
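    As a point of reference, the sketch below shows the generic rank-1 building block that such decompositions reduce to, assuming a dense 3-way numpy array and a plain higher-order power iteration. It is not the paper's structured, non-iterative algorithm; it only illustrates what a best rank-1 term of a third-order tensor looks like.

    import numpy as np

    def rank1_approx(T, iters=50):
        """Best rank-1 approximation T ~ lam * outer(a, b, c) via power iteration."""
        I, J, K = T.shape
        rng = np.random.default_rng(0)
        b = rng.standard_normal(J); b /= np.linalg.norm(b)
        c = rng.standard_normal(K); c /= np.linalg.norm(c)
        for _ in range(iters):
            # Alternately contract T against the other two factors and renormalize.
            a = np.einsum('ijk,j,k->i', T, b, c); a /= np.linalg.norm(a)
            b = np.einsum('ijk,i,k->j', T, a, c); b /= np.linalg.norm(b)
            c = np.einsum('ijk,i,j->k', T, a, b); c /= np.linalg.norm(c)
        lam = np.einsum('ijk,i,j,k->', T, a, b, c)
        return lam, a, b, c

    # Usage: lam, a, b, c = rank1_approx(np.random.rand(4, 5, 6))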


    A constructive arbitrary-degree Kronecker product decomposition of tensors

    We propose the tensor Kronecker product singular value decomposition (TKPSVD) that decomposes a real $k$-way tensor $\mathcal{A}$ into a linear combination of tensor Kronecker products with an arbitrary number of $d$ factors, $\mathcal{A} = \sum_{j=1}^R \sigma_j\, \mathcal{A}^{(d)}_j \otimes \cdots \otimes \mathcal{A}^{(1)}_j$. We generalize the matrix Kronecker product to tensors such that each factor $\mathcal{A}^{(i)}_j$ in the TKPSVD is a $k$-way tensor. The algorithm relies on reshaping and permuting the original tensor into a $d$-way tensor, after which a polyadic decomposition with orthogonal rank-1 terms is computed. We prove that for many different structured tensors, the Kronecker product factors $\mathcal{A}^{(1)}_j, \ldots, \mathcal{A}^{(d)}_j$ are guaranteed to inherit this structure. In addition, we introduce the new notion of general symmetric tensors, which includes many different structures such as symmetric, persymmetric, centrosymmetric, Toeplitz and Hankel tensors.
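    For intuition, the matrix special case of this idea (d = 2, k = 2) is the classical nearest-Kronecker-product problem: rearranging the matrix turns each Kronecker term into a rank-1 term, which an SVD then recovers. Below is a hedged numpy sketch of that special case only; the shapes, function name, and row-major vectorization convention are illustrative assumptions, not the paper's algorithm or code.

    import numpy as np

    def kron_svd(A, m1, n1, m2, n2, R):
        """Approximate A (shape (m1*m2, n1*n2)) by sum of R terms B_j kron C_j."""
        # Rearrange A so that A = B kron C becomes the rank-1 matrix vec(B) vec(C)^T.
        Ra = A.reshape(m1, m2, n1, n2).transpose(0, 2, 1, 3).reshape(m1 * n1, m2 * n2)
        U, s, Vt = np.linalg.svd(Ra, full_matrices=False)
        terms = []
        for j in range(R):
            B = np.sqrt(s[j]) * U[:, j].reshape(m1, n1)
            C = np.sqrt(s[j]) * Vt[j].reshape(m2, n2)
            terms.append((B, C))
        return terms

    # Sanity check: a single Kronecker product is recovered exactly with R = 1.
    B0, C0 = np.random.rand(2, 3), np.random.rand(4, 5)
    B, C = kron_svd(np.kron(B0, C0), 2, 3, 4, 5, R=1)[0]
    assert np.allclose(np.kron(B, C), np.kron(B0, C0))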

    Network Sketching: Exploiting Binary Structure in Deep CNNs

    Convolutional neural networks (CNNs) with deep architectures have substantially advanced the state of the art in computer vision tasks. However, deep networks are typically resource-intensive and thus difficult to deploy on mobile devices. Recently, CNNs with binary weights have shown compelling efficiency, but the accuracy of such models is usually unsatisfactory in practice. In this paper, we introduce network sketching as a novel technique for pursuing binary-weight CNNs, targeting more faithful inference and a better trade-off for practical applications. Our basic idea is to exploit binary structure directly in pre-trained filter banks and produce binary-weight models via tensor expansion. The whole process can be treated as a coarse-to-fine model approximation, akin to the pencil-drawing steps of outlining and shading. To further speed up the generated models, namely the sketches, we also propose an associative implementation of binary tensor convolutions. Experimental results demonstrate that a proper sketch of AlexNet (or ResNet) outperforms existing binary-weight models by large margins on the ImageNet large-scale classification task, while committing only slightly more memory for network parameters.
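    The coarse-to-fine expansion can be pictured with a small greedy sketch: repeatedly fit a binary tensor and a scale to the current residual of a pre-trained filter bank. This is only a conceptual illustration under the assumption of plain numpy arrays; the paper's exact fitting and refinement steps may differ.

    import numpy as np

    def sketch(W, num_terms=3):
        """Greedy expansion W ~ sum_i a_i * B_i with binary tensors B_i in {-1, +1}."""
        residual = W.copy()
        scales, binaries = [], []
        for _ in range(num_terms):
            B = np.where(residual >= 0, 1.0, -1.0)   # binary direction of the residual
            a = np.abs(residual).mean()              # least-squares optimal scale for this B
            scales.append(a)
            binaries.append(B)
            residual = residual - a * B              # refine on what is left (coarse-to-fine)
        return scales, binaries

    # Usage: the reconstruction sum_i a_i * B_i approaches W as terms are added.
    W = np.random.randn(64, 3, 3, 3)
    scales, binaries = sketch(W, num_terms=4)
    approx = sum(a * B for a, B in zip(scales, binaries))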

    A Tour of Constrained Tensor Canonical Polyadic Decomposition

    This paper surveys the use of constraints in tensor decomposition models. Constrained tensor decompositions have been applied extensively in chemometrics and array processing, but there is growing interest in understanding these methods independently of the application of interest. We suggest a formalism that unifies various instances of constrained tensor decomposition, while shedding light on possible extensions of existing methods.
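    As one concrete instance of a constrained decomposition, the sketch below fits a nonnegative CP (canonical polyadic) model by alternating least squares with a projection onto the nonnegative orthant after each factor update. The function names and the simple project-after-update heuristic are illustrative assumptions, not a method taken from the survey.

    import numpy as np

    def khatri_rao(B, C):
        """Column-wise Kronecker product, shape (J*K, R)."""
        return np.einsum('jr,kr->jkr', B, C).reshape(-1, B.shape[1])

    def nonnegative_cp_als(T, R, iters=100):
        """CP model T[i,j,k] ~ sum_r A[i,r] B[j,r] C[k,r] with A, B, C >= 0."""
        I, J, K = T.shape
        rng = np.random.default_rng(0)
        A, B, C = rng.random((I, R)), rng.random((J, R)), rng.random((K, R))
        for _ in range(iters):
            # Unconstrained least-squares update of each factor, then project onto >= 0.
            A = np.maximum(T.reshape(I, -1) @ np.linalg.pinv(khatri_rao(B, C).T), 0)
            B = np.maximum(np.moveaxis(T, 1, 0).reshape(J, -1) @ np.linalg.pinv(khatri_rao(A, C).T), 0)
            C = np.maximum(np.moveaxis(T, 2, 0).reshape(K, -1) @ np.linalg.pinv(khatri_rao(A, B).T), 0)
        return A, B, C

    # Usage: A, B, C = nonnegative_cp_als(np.random.rand(4, 5, 6), R=3)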