
    Decomposability of Tensors

    Tensor decomposition is a relevant topic for both theoretical and applied mathematics due to its interdisciplinary nature, which ranges from multilinear algebra and algebraic geometry to numerical analysis, algebraic statistics, quantum physics, signal processing, artificial intelligence, and more. The study of decompositions starts from the idea that knowledge of the elementary components of a tensor is fundamental for implementing procedures that can understand and efficiently handle the information the tensor encodes. Recent advances have been obtained through a systematic application of geometric methods: secant varieties, symmetries of special decompositions, and an analysis of the geometry of finite sets. Thanks to new applications of theoretical results, criteria for deciding when a given decomposition is minimal or unique have been introduced or significantly improved. New types of decompositions, whose elementary blocks can be chosen from a range of different models (e.g., Chow decompositions or mixed decompositions), are now studied systematically and yield deeper insights into the topic. The aim of this Special Issue is to collect papers that illustrate some of the directions in which recent research is moving, as well as to provide a broad overview of several new approaches to the problem of tensor decomposition.
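    For orientation (an editorial aside, not part of the Special Issue description), the prototypical example of a decomposition into elementary components is the rank decomposition of a third-order tensor,

        T = \sum_{i=1}^{r} \lambda_i \, u_i \otimes v_i \otimes w_i,

    where each summand u_i \otimes v_i \otimes w_i is an elementary (rank-one) tensor and the smallest admissible r is the rank of T; the questions of minimality and uniqueness mentioned above concern decompositions of this kind and their generalizations.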

    Efficient Orthogonal Tensor Decomposition, with an Application to Latent Variable Model Learning

    Decomposing tensors into orthogonal factors is a well-known task in statistics, machine learning, and signal processing. We study orthogonal outer product decompositions in which the factors are required to be orthogonal across summands, relating such a decomposition to the singular value decompositions of the tensor's flattenings. We show that having an orthogonal decomposition is a non-trivial assumption on a tensor, and that the decomposition is unique (up to natural symmetries) whenever it exists; in that case it can also be obtained efficiently and reliably by a sequence of singular value decompositions. Finally, we demonstrate how the factoring algorithm can be applied to parameter identification in latent variable and mixture models.
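    The following NumPy sketch illustrates the flattening-based approach described in this abstract; it is a toy implementation under simplifying assumptions (not the authors' code), requiring that the input tensor actually admits an orthogonal decomposition with distinct positive coefficients, so that the SVD of a single flattening already exposes the factors.

```python
# Minimal sketch: recover an orthogonal decomposition
#   T = sum_i sigma_i * u_i (x) v_i (x) w_i
# of a third-order tensor from SVDs of its flattenings, assuming such a
# decomposition exists and the sigma_i are distinct and positive.
import numpy as np

def orthogonal_decomposition(T, rank):
    """Return (sigmas, U, V, W) with T ~ sum_i sigmas[i] * U[:,i] (x) V[:,i] (x) W[:,i]."""
    n1, n2, n3 = T.shape
    # Mode-1 flattening: an n1 x (n2*n3) matrix.
    T1 = T.reshape(n1, n2 * n3)
    # For an orthogonally decomposable tensor this flattening is already in SVD form.
    U, sigmas, Vt = np.linalg.svd(T1, full_matrices=False)
    U, sigmas, Vt = U[:, :rank], sigmas[:rank], Vt[:rank]
    V = np.zeros((n2, rank))
    W = np.zeros((n3, rank))
    for i in range(rank):
        # Each right singular vector reshapes to a rank-one matrix (up to sign, v_i w_i^T);
        # a further SVD splits it into the remaining two factors.
        M = Vt[i].reshape(n2, n3)
        u2, _, v2t = np.linalg.svd(M)
        V[:, i], W[:, i] = u2[:, 0], v2t[0]
    return sigmas, U, V, W

# Self-check on a synthetic orthogonally decomposable tensor.
rng = np.random.default_rng(0)
r, n = 3, 4
U0 = np.linalg.qr(rng.standard_normal((n, r)))[0]   # orthonormal columns
V0 = np.linalg.qr(rng.standard_normal((n, r)))[0]
W0 = np.linalg.qr(rng.standard_normal((n, r)))[0]
s0 = np.array([3.0, 2.0, 1.0])                      # distinct, positive coefficients
T = np.einsum('i,ai,bi,ci->abc', s0, U0, V0, W0)
sig, U, V, W = orthogonal_decomposition(T, r)
T_hat = np.einsum('i,ai,bi,ci->abc', sig, U, V, W)
print(np.allclose(T, T_hat))                        # True, up to sign symmetries in the factors
```

    The observation behind the sketch is that when the factor matrices have orthonormal columns, the mode-1 flattening of the tensor is already written in singular value decomposition form, so one SVD of the flattening followed by rank-one SVDs of the reshaped right singular vectors recovers all three factor sets.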

    Report on "Geometry and representation theory of tensors for computer science, statistics and other areas."

    This is a technical report on the proceedings of the workshop held July 21 to July 25, 2008, at the American Institute of Mathematics, Palo Alto, California, organized by Joseph Landsberg, Lek-Heng Lim, Jason Morton, and Jerzy Weyman. We include a list of open problems coming from applications in four different areas: signal processing, the Mulmuley-Sohoni approach to P vs. NP, matchgates and holographic algorithms, and entanglement and quantum information theory. We emphasize the interactions of geometry and representation theory with these applied areas.