Decomposing tensors with structured matrix factors reduces to rank-1 approximations
Tensor decompositions make it possible to estimate the parameters of a multilinear model in a deterministic way. Applications have already been pointed out in antenna array processing and digital communications, among others, and are extremely attractive provided some diversity is available at the receiver. As opposed to the widely used ALS algorithm, non-iterative algorithms are proposed in this paper to compute the required tensor decomposition into a sum of rank-1 terms when some factor matrices enjoy particular structure, such as block-Hankel, triangular, banded, etc.
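For context, the ALS baseline that this abstract contrasts with fits each factor matrix in turn by linear least squares. Below is a minimal numpy sketch of plain, unstructured CP-ALS for a 3-way tensor; `cp_als` and `khatri_rao` are illustrative names, not from the paper.

```python
import numpy as np

def khatri_rao(X, Y):
    """Column-wise Kronecker product: row (i, j) of the result is X[i, :] * Y[j, :]."""
    return (X[:, None, :] * Y[None, :, :]).reshape(X.shape[0] * Y.shape[0], -1)

def cp_als(T, rank, iters=200):
    """Plain ALS for a 3-way CP model T[i, j, k] ~= sum_r A[i, r] B[j, r] C[k, r]."""
    I, J, K = T.shape
    rng = np.random.default_rng(0)
    A, B, C = (rng.standard_normal((n, rank)) for n in (I, J, K))
    T1 = T.reshape(I, J * K)                     # mode-1 unfolding
    T2 = T.transpose(1, 0, 2).reshape(J, I * K)  # mode-2 unfolding
    T3 = T.transpose(2, 0, 1).reshape(K, I * J)  # mode-3 unfolding
    for _ in range(iters):
        # Each update is a linear least-squares solve with the other two factors fixed.
        A = T1 @ np.linalg.pinv(khatri_rao(B, C).T)
        B = T2 @ np.linalg.pinv(khatri_rao(A, C).T)
        C = T3 @ np.linalg.pinv(khatri_rao(A, B).T)
    return A, B, C
```

Structured variants constrain these least-squares updates, whereas the paper's point is that the structured case can be solved non-iteratively via rank-1 approximations.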
A constructive arbitrary-degree Kronecker product decomposition of tensors
We propose the tensor Kronecker product singular value decomposition (TKPSVD) that decomposes a real $k$-way tensor $\mathcal{A}$ into a linear combination of tensor Kronecker products with an arbitrary number $d$ of factors, $\mathcal{A} = \sum_j \sigma_j\, \mathcal{A}^{(d)}_j \otimes \cdots \otimes \mathcal{A}^{(1)}_j$. We generalize the matrix Kronecker product to tensors such that each factor $\mathcal{A}^{(i)}_j$ in the TKPSVD is a $k$-way tensor. The algorithm relies on reshaping and permuting the original tensor into a $d$-way tensor, after which a polyadic decomposition with orthogonal rank-1 terms is computed. We prove that for many different structured tensors, the Kronecker product factors are guaranteed to inherit this structure. In addition, we introduce the new notion of general symmetric tensors, which includes many different structures such as symmetric, persymmetric, centrosymmetric, Toeplitz and Hankel tensors.
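In the matrix case ($k = 2$, $d = 2$) this reduces to the classical nearest-Kronecker-product problem: rearranging the matrix turns the Kronecker fit into a rank-1 matrix approximation solvable by an SVD. A minimal numpy sketch of that special case, with `nearest_kronecker` as an illustrative name:

```python
import numpy as np

def nearest_kronecker(A, m1, n1, m2, n2):
    """Best Frobenius-norm approximation A ~= B kron C, with B (m1 x n1), C (m2 x n2)."""
    # Rearranging A turns the Kronecker fit into a rank-1 approximation of R.
    R = A.reshape(m1, m2, n1, n2).transpose(0, 2, 1, 3).reshape(m1 * n1, m2 * n2)
    U, s, Vt = np.linalg.svd(R, full_matrices=False)
    B = np.sqrt(s[0]) * U[:, 0].reshape(m1, n1)
    C = np.sqrt(s[0]) * Vt[0].reshape(m2, n2)
    return B, C

# An exact Kronecker product is recovered exactly (up to a sign shared by B and C).
B0, C0 = np.random.randn(3, 4), np.random.randn(2, 5)
B, C = nearest_kronecker(np.kron(B0, C0), 3, 4, 2, 5)
assert np.allclose(np.kron(B, C), np.kron(B0, C0))
```

The TKPSVD's reshape-and-permute step generalizes this rearrangement to higher orders and to $d > 2$ factors, replacing the SVD with an orthogonal polyadic decomposition.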
Network Sketching: Exploiting Binary Structure in Deep CNNs
Convolutional neural networks (CNNs) with deep architectures have substantially advanced the state of the art in computer vision tasks. However, deep networks are typically resource-intensive and thus difficult to deploy on mobile devices. Recently, CNNs with binary weights have demonstrated compelling efficiency, whereas the accuracy of such models is usually unsatisfactory in practice. In this paper, we introduce network sketching as a novel technique for producing binary-weight CNNs, targeting more faithful inference and a better trade-off for practical applications. Our basic idea is to exploit binary structure directly in pre-trained filter banks and produce binary-weight models via tensor expansion. The whole process can be treated as a coarse-to-fine model approximation, akin to the pencil-drawing steps of outlining and shading. To further speed up the generated models, namely the sketches, we also propose an associative implementation of binary tensor convolutions. Experimental results demonstrate that a proper sketch of AlexNet (or ResNet) outperforms existing binary-weight models by large margins on the ImageNet large-scale classification task, while the memory committed to network parameters grows only slightly.
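The coarse-to-fine approximation can be pictured as a greedy binary expansion of each pre-trained filter: repeatedly take the sign pattern of the current residual as the next binary tensor and fit its scale by least squares. A hedged numpy sketch of that idea (not necessarily the paper's exact algorithm; `sketch_filter` is an illustrative name):

```python
import numpy as np

def sketch_filter(w, num_terms=3):
    """Greedy expansion w ~= sum_i a_i * b_i with binary tensors b_i in {-1, +1}."""
    residual = w.astype(np.float64).copy()
    scales, bases = [], []
    for _ in range(num_terms):
        b = np.sign(residual)
        b[b == 0] = 1.0
        # Least-squares scale for a fixed binary tensor: a = <b, r> / <b, b> = <b, r> / n.
        a = np.dot(b.ravel(), residual.ravel()) / b.size
        scales.append(a)
        bases.append(b)
        residual -= a * b  # each extra term refines the sketch
    return scales, bases
```

Each additional term trades a little memory (one scale plus one binary mask) for a more faithful approximation of the original real-valued filter.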
A Tour of Constrained Tensor Canonical Polyadic Decomposition
This paper surveys the use of constraints in tensor decomposition models. Constrained tensor decompositions have been extensively applied in chemometrics and array processing, but there is growing interest in understanding these methods independently of the application of interest. We suggest a formalism that unifies various instances of constrained tensor decomposition, while shedding light on possible extensions of existing methods.
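As a concrete instance of a constrained decomposition, a non-negative CP fit can be computed with the TensorLy library. The sketch below assumes TensorLy's `non_negative_parafac`; the exact return convention may vary across library versions.

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import non_negative_parafac

# Random non-negative 3-way tensor to factorize.
T = tl.tensor(np.random.rand(4, 5, 6))

# CP decomposition with non-negativity constraints on all factor matrices.
weights, factors = non_negative_parafac(T, rank=3)
for f in factors:
    assert (tl.to_numpy(f) >= 0).all()  # the constraint holds by construction
```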