
    On uniqueness conditions for Candecomp/Parafac and Indscal with full column rank in one mode

    In the Candecomp/Parafac (CP) model, a three-way array X̲ is written as the sum of R outer vector product arrays and a residual array. The former comprise the columns of the component matrices A, B and C. For fixed residuals, (A,B,C) is unique up to trivial ambiguities, if 2R+2 is less than or equal to the sum of the k-ranks of A, B and C. This classical result was shown by Kruskal in 1977. In this paper, we consider the case where one of A, B, C has full column rank, and show that in this case Kruskal's uniqueness condition implies a recently obtained uniqueness condition. Moreover, we obtain Kruskal-type uniqueness conditions that are weaker than Kruskal's condition itself. Also, for (A,B,C) with rank(A)=R-1 and C full column rank, we obtain easy-to-check necessary and sufficient uniqueness conditions. We extend our results to the Indscal decomposition in which the array X̲ has symmetric slices and A=B is imposed. We consider the real-valued CP and Indscal decompositions, but our results are also valid for their complex-valued counterparts.
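    Kruskal's condition is concrete enough to check directly. A minimal sketch (not from the paper; the brute-force `k_rank` helper and the random example matrices are illustrative assumptions) that computes k-ranks and tests 2R+2 ≤ k_A + k_B + k_C:

    ```python
    import numpy as np
    from itertools import combinations

    def k_rank(M):
        """k-rank: the largest k such that EVERY set of k columns of M
        is linearly independent (brute force, fine for small matrices)."""
        n = M.shape[1]
        for k in range(n, 0, -1):
            if all(np.linalg.matrix_rank(M[:, list(c)]) == k
                   for c in combinations(range(n), k)):
                return k
        return 0

    # Illustrative example: generic 4x3 component matrices, R = 3 components.
    rng = np.random.default_rng(0)
    R = 3
    A, B, C = (rng.standard_normal((4, R)) for _ in range(3))

    # Kruskal's sufficient condition for essential uniqueness of (A, B, C):
    ok = 2 * R + 2 <= k_rank(A) + k_rank(B) + k_rank(C)
    ```

    For generic matrices each k-rank equals min(rows, R), so the condition holds comfortably here; the interesting cases in the paper arise when columns are linearly dependent and the k-ranks drop.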

    Overview of Constrained PARAFAC Models

    In this paper, we present an overview of constrained PARAFAC models in which the constraints model linear dependencies among columns of the factor matrices of the tensor decomposition, or alternatively, the pattern of interactions between different modes of the tensor that are captured by the equivalent core tensor. Some tensor prerequisites, with a particular emphasis on mode combination using Kronecker products of canonical vectors to simplify matricization operations, are first introduced. This Kronecker-product-based approach is also formulated in terms of the index notation, which provides an original and concise formalism both for matricizing tensors and for writing tensor models. Then, after a brief reminder of the PARAFAC and Tucker models, two families of constrained tensor models, the so-called PARALIND/CONFAC and PARATUCK models, are described in a unified framework for Nth-order tensors. New tensor models, called nested Tucker models and block PARALIND/CONFAC models, are also introduced. A link between PARATUCK models and constrained PARAFAC models is then established. Finally, new uniqueness properties of PARATUCK models are deduced from sufficient conditions for essential uniqueness of their associated constrained PARAFAC models.

    Report on "Geometry and representation theory of tensors for computer science, statistics and other areas."

    This is a technical report on the proceedings of the workshop held July 21 to July 25, 2008 at the American Institute of Mathematics, Palo Alto, California, organized by Joseph Landsberg, Lek-Heng Lim, Jason Morton, and Jerzy Weyman. We include a list of open problems coming from applications in four different areas: signal processing, the Mulmuley-Sohoni approach to P vs. NP, matchgates and holographic algorithms, and entanglement and quantum information theory. We emphasize the interactions between geometry and representation theory and these applied areas.

    Tensor and Matrix Inversions with Applications

    Higher-order tensor inversion is possible for even order. We have shown that a tensor group endowed with the Einstein (contracted) product is isomorphic to the general linear group of degree n. With the isomorphic group structures, we derived new tensor decompositions which we have shown to be related to the well-known canonical polyadic decomposition and multilinear SVD. Moreover, within this group-structure framework, multilinear systems are derived, specifically for solving high-dimensional PDEs and large discrete quantum models. We also address multilinear systems which do not fit the framework in the least-squares sense, that is, when the tensor has an odd number of modes or when the tensor has distinct dimensions in each mode. With the notion of tensor inversion, multilinear systems are solvable. Numerically, we solve multilinear systems using iterative techniques, namely the biconjugate gradient and Jacobi methods in tensor format.
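    The isomorphism behind this abstract is easy to see in the order-4 case: under the Einstein product, which contracts the last two indices of one tensor against the first two of the other, an n×n×n×n tensor behaves exactly like an n²×n² matrix. A small numerical sketch (the shapes and `einstein_product` helper are illustrative, not the paper's notation):

    ```python
    import numpy as np

    def einstein_product(A, B):
        """Einstein (contracted) product of two order-4 tensors:
        C[i,j,m,n] = sum_{k,l} A[i,j,k,l] * B[k,l,m,n]."""
        return np.einsum('ijkl,klmn->ijmn', A, B)

    rng = np.random.default_rng(1)
    A = rng.standard_normal((2, 2, 2, 2))
    B = rng.standard_normal((2, 2, 2, 2))

    C = einstein_product(A, B)

    # Grouping index pairs (i,j) and (k,l) via reshape realizes the
    # isomorphism with 4x4 matrices: the Einstein product becomes matmul.
    same = np.allclose(C.reshape(4, 4), A.reshape(4, 4) @ B.reshape(4, 4))
    ```

    Under this identification, an even-order tensor is invertible exactly when its matrix image is, which is the sense in which the abstract's tensor inversion works.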

    Iterative Methods for Symmetric Outer Product Tensor Decompositions

    We study the symmetric outer product decomposition, which decomposes a fully (partially) symmetric tensor into a sum of rank-one fully (partially) symmetric tensors. We present iterative algorithms for the third-order partially symmetric tensor and fourth-order fully symmetric tensor. The numerical examples indicate that the new algorithms converge faster than the standard method of alternating least squares.
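    The "standard method" used as the baseline above is alternating least squares (ALS) for the CP decomposition. A minimal sketch of plain third-order CP-ALS, applied to a rank-one fully symmetric tensor (this is the generic baseline algorithm, not the paper's new symmetry-exploiting methods; `cp_als` and the test tensor are illustrative):

    ```python
    import numpy as np

    def cp_als(T, R, iters=50, seed=0):
        """Plain CP-ALS: fit T[i,j,k] ~ sum_r A[i,r]*B[j,r]*C[k,r] by
        cyclically solving a linear least-squares problem for each factor."""
        I, J, K = T.shape
        rng = np.random.default_rng(seed)
        A, B, C = (rng.standard_normal((n, R)) for n in (I, J, K))
        for _ in range(iters):
            # Each update solves  unfolding ~ factor @ khatri_rao(...).T
            M = np.einsum('jr,kr->jkr', B, C).reshape(J * K, R)
            A = np.linalg.lstsq(M, T.reshape(I, J * K).T, rcond=None)[0].T
            M = np.einsum('ir,kr->ikr', A, C).reshape(I * K, R)
            B = np.linalg.lstsq(M, T.transpose(1, 0, 2).reshape(J, I * K).T,
                                rcond=None)[0].T
            M = np.einsum('ir,jr->ijr', A, B).reshape(I * J, R)
            C = np.linalg.lstsq(M, T.transpose(2, 0, 1).reshape(K, I * J).T,
                                rcond=None)[0].T
        return A, B, C

    # Rank-one fully symmetric example: T = a ∘ a ∘ a.
    a = np.array([1.0, 2.0, 3.0])
    T = np.einsum('i,j,k->ijk', a, a, a)

    A, B, C = cp_als(T, R=1)
    err = np.linalg.norm(np.einsum('ir,jr,kr->ijk', A, B, C) - T)
    ```

    Note that plain ALS does not enforce A = B = C; it only recovers factors proportional to a up to scaling. Methods specialized to symmetric tensors, like those in the abstract, exploit that structure directly.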