    Efficient Orthogonal Tensor Decomposition, with an Application to Latent Variable Model Learning

    Decomposing tensors into orthogonal factors is a well-known task in statistics, machine learning, and signal processing. We study orthogonal outer product decompositions, in which the factors are required to be orthogonal across summands, by relating such a decomposition to the singular value decompositions of the tensor's flattenings. We show that it is a non-trivial assumption for a tensor to admit such an orthogonal decomposition, and that the decomposition is unique (up to natural symmetries) whenever it exists, in which case it can be efficiently and reliably computed by a sequence of singular value decompositions. We demonstrate how the factoring algorithm can be applied for parameter identification in latent variable and mixture models.
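    The flattening-based recovery this abstract describes can be sketched in a few lines. The example below is a minimal illustration, not the paper's algorithm: it assumes NumPy, uses made-up weights and factors, builds a symmetric orthogonally decomposable third-order tensor, and reads the weights and factors off the SVD of one flattening.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 5, 3

# Orthonormal factors and distinct positive weights -- illustrative data.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = Q[:, :r]
lam = np.array([3.0, 2.0, 1.0])

# Symmetric orthogonally decomposable tensor: T = sum_i lam_i * a_i (x) a_i (x) a_i
T = np.einsum('i,pi,qi,si->pqs', lam, A, A, A)

# Mode-1 flattening (n x n^2): its SVD exposes the decomposition,
# because the vectors vec(a_i a_i^T) are orthonormal whenever the a_i are.
T1 = T.reshape(n, n * n)
U, S, _ = np.linalg.svd(T1, full_matrices=False)

# The leading r singular values equal the weights, and the leading left
# singular vectors equal the factors up to sign.
print(np.round(S[:r], 6))
```

    Uniqueness up to natural symmetries shows up here as the sign ambiguity of the recovered singular vectors.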

    From exceptional collections to motivic decompositions via noncommutative motives

    Making use of noncommutative motives, we relate exceptional collections (and more generally semi-orthogonal decompositions) to motivic decompositions. On one hand, we prove that the Chow motive M(X)_Q of every smooth proper Deligne–Mumford stack X, whose bounded derived category D^b(X) of coherent sheaves admits a full exceptional collection, decomposes into a direct sum of tensor powers of the Lefschetz motive. Examples include projective spaces, quadrics, toric varieties, homogeneous spaces, Fano threefolds, and moduli spaces. On the other hand, we prove that if M(X)_Q decomposes into a direct sum of tensor powers of the Lefschetz motive and moreover D^b(X) admits a semi-orthogonal decomposition, then the noncommutative motive of each piece of the semi-orthogonal decomposition is a direct sum of tensor units. As an application we obtain a simplification of Dubrovin's conjecture. Comment: 14 pages; revised version

    Empirical Evaluation of Four Tensor Decomposition Algorithms

    Higher-order tensor decompositions are analogous to the familiar Singular Value Decomposition (SVD), but they transcend the limitations of matrices (second-order tensors). SVD is a powerful tool that has achieved impressive results in information retrieval, collaborative filtering, computational linguistics, computational vision, and other fields. However, SVD is limited to two-dimensional arrays of data (two modes), and many potential applications have three or more modes, which require higher-order tensor decompositions. This paper evaluates four algorithms for higher-order tensor decomposition: Higher-Order Singular Value Decomposition (HO-SVD), Higher-Order Orthogonal Iteration (HOOI), Slice Projection (SP), and Multislice Projection (MP). We measure the time (elapsed run time), space (RAM and disk space requirements), and fit (tensor reconstruction accuracy) of the four algorithms, under a variety of conditions. We find that standard implementations of HO-SVD and HOOI do not scale up to larger tensors, due to increasing RAM requirements. We recommend HOOI for tensors that are small enough for the available RAM and MP for larger tensors.
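    Of the four algorithms compared, HO-SVD is the simplest: one SVD per mode-n unfolding gives the factor matrices, and projecting onto them gives the core tensor. The sketch below (NumPy assumed; tensor size, ranks, and the `unfold` helper are illustrative, not the paper's implementation) also computes the "fit" metric the paper measures, i.e. one minus the relative reconstruction error.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((4, 5, 6))
ranks = (2, 3, 3)  # target multilinear ranks, chosen for illustration

def unfold(T, mode):
    """Mode-n unfolding: move `mode` to the front, flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

# HO-SVD: the leading left singular vectors of each unfolding
# become that mode's factor matrix.
factors = [np.linalg.svd(unfold(X, m), full_matrices=False)[0][:, :r]
           for m, r in enumerate(ranks)]

# Core tensor: project X onto the factor subspace in every mode.
G = np.einsum('pqr,pa,qb,rc->abc', X, *factors)

# Reconstruction and fit (1 - relative error).
Xhat = np.einsum('abc,pa,qb,rc->pqr', G, *factors)
fit = 1 - np.linalg.norm(X - Xhat) / np.linalg.norm(X)
print(round(fit, 3))
```

    HOOI refines exactly these factor matrices by alternating the same per-mode SVD step until the fit stops improving, which is why its memory profile resembles HO-SVD's.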

    Orthogonal tensor decompositions


    Rank properties and computational methods for orthogonal tensor decompositions

    The orthogonal decomposition factorizes a tensor into a sum of a pairwise orthogonal list of rank-one tensors. We present several properties of orthogonal rank. We find that a subtensor may have a larger orthogonal rank than the whole tensor, and we prove the lower semicontinuity of orthogonal rank. Lower semicontinuity guarantees the existence of a best low orthogonal rank approximation. To fit the orthogonal decomposition, we propose an algorithm based on the augmented Lagrangian method and guarantee orthogonality by a novel orthogonalization procedure. Numerical experiments show that the proposed method has a great advantage over existing methods for strongly orthogonal decompositions in terms of approximation error. Comment: 19 pages, 2 figures, 3 tables
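    The orthogonality notion used here can be made concrete with a small check (this is an illustration of the definition, not the paper's augmented Lagrangian method): if the rank-one summands are pairwise orthogonal as tensors, the squared Frobenius norm splits into the sum of squared weights. Orthonormal factors in a single mode already suffice. NumPy and made-up data are assumed.

```python
import numpy as np

rng = np.random.default_rng(2)
n, r = 4, 3

# Factors in the first mode are orthonormal; that alone makes the
# rank-one summands pairwise orthogonal as tensors (illustrative data).
A, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = A[:, :r]
B = rng.standard_normal((n, r))
B /= np.linalg.norm(B, axis=0)
C = rng.standard_normal((n, r))
C /= np.linalg.norm(C, axis=0)
sigma = np.array([5.0, 2.0, 1.0])

# T = sum_i sigma_i * a_i (x) b_i (x) c_i
T = np.einsum('i,pi,qi,si->pqs', sigma, A, B, C)

# Pythagorean identity for an orthogonal decomposition:
# ||T||_F^2 = sum_i sigma_i^2, since all cross terms vanish.
print(np.isclose(np.linalg.norm(T) ** 2, np.sum(sigma ** 2)))
```

    The stronger "strongly orthogonal" decompositions mentioned in the abstract additionally constrain the factors within each mode, so they form a subclass of the decompositions sketched here.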
