Tensor and Matrix Inversions with Applications
Higher-order tensor inversion is possible for even order. We have shown that
a tensor group endowed with the Einstein (contracted) product is isomorphic to
the general linear group of degree . With the isomorphic group structures,
we derived new tensor decompositions, which we have shown to be related to the
well-known canonical polyadic decomposition and multilinear SVD. Moreover,
within this group-structure framework, multilinear systems are derived,
specifically for solving high-dimensional PDEs and large discrete quantum
models. We also address multilinear systems that do not fit the framework in
the least-squares sense, that is, when the tensor has an odd number of modes or
distinct dimensions in each mode. With the notion of tensor inversion,
multilinear systems are solvable. Numerically, we solve multilinear systems
using iterative techniques, namely the biconjugate gradient and Jacobi methods
in tensor format.
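The isomorphism the abstract relies on can be sketched numerically: under a square unfolding, the Einstein product of even-order tensors becomes ordinary matrix multiplication, so tensor inversion reduces to matrix inversion. The shapes and the `einsum` contraction below are illustrative assumptions, not the paper's notation.

```python
import numpy as np

# Hedged sketch: an order-4 tensor A of shape (m, n, m, n) is invertible under
# the Einstein product exactly when its (m*n) x (m*n) unfolding is invertible.
m, n = 2, 3
rng = np.random.default_rng(0)
A = rng.standard_normal((m, n, m, n))

# Unfold, invert as an ordinary matrix, and fold back.
A_inv = np.linalg.inv(A.reshape(m * n, m * n)).reshape(m, n, m, n)

# Einstein product: contract the last two modes of A with the first two modes
# of A_inv; the result should be the identity tensor.
I = np.einsum('ijkl,klpq->ijpq', A, A_inv)
I_expected = np.eye(m * n).reshape(m, n, m, n)
print(np.allclose(I, I_expected))  # True
```

The same unfolding trick is what lets iterative matrix solvers such as biconjugate gradient be restated "in tensor format": each matrix-vector product becomes an Einstein-product contraction.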
Multilinear Time Invariant System Theory
In biological and engineering systems, structure, function and dynamics are
highly coupled. Such interactions can be naturally and compactly captured via
tensor based state space dynamic representations. However, such representations
are not amenable to the standard system and controls framework which requires
the state to be in the form of a vector. In order to address this limitation,
recently a new class of multiway dynamical systems has been introduced in which
the states, inputs and outputs are tensors. We propose a new form of
multilinear time invariant (MLTI) systems based on the Einstein product and
even-order paired tensors. We extend classical linear time invariant (LTI)
system notions including stability, reachability and observability for the new
MLTI system representation by leveraging recent advances in tensor algebra.
Comment: 8 pages, SIAM Conference on Control and its Applications 2019,
accepted to appear.
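The MLTI state update described above can be sketched as an Einstein-product contraction of an even-order paired system tensor with a tensor-valued state. The shapes, the contraction pattern, and the unfolded stability check below are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

# Hedged sketch of an MLTI update X_{k+1} = A * X_k (Einstein product), with an
# even-order paired system tensor A of shape (m, n, m, n) acting on an
# order-2 tensor state X of shape (m, n).
m, n = 2, 2
rng = np.random.default_rng(1)
A = rng.standard_normal((m, n, m, n)) * 0.3  # small entries for illustration
X = rng.standard_normal((m, n))

def einstein_step(A, X):
    """One MLTI step: contract the trailing two modes of A against X."""
    return np.einsum('ijkl,kl->ij', A, X)

# Classical notions transfer through the unfolding: e.g. asymptotic stability
# can be checked via the spectral radius of the (m*n) x (m*n) unfolded matrix.
rho = max(abs(np.linalg.eigvals(A.reshape(m * n, m * n))))
X_next = einstein_step(A, X)
print(X_next.shape, rho)
```

The design point is that the tensor step agrees with the unfolded matrix step on vectorized states, which is what lets LTI notions such as stability, reachability, and observability carry over.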
A dual framework for low-rank tensor completion
One of the popular approaches for low-rank tensor completion is to use the
latent trace norm regularization. However, most existing works in this
direction learn a sparse combination of tensors. In this work, we fill this gap
by proposing a variant of the latent trace norm that helps in learning a
non-sparse combination of tensors. We develop a dual framework for solving the
low-rank tensor completion problem. We first show a novel characterization of
the dual solution space with an interesting factorization of the optimal
solution. Overall, the optimal solution is shown to lie on a Cartesian product
of Riemannian manifolds. Furthermore, we exploit the versatile Riemannian
optimization framework to propose a computationally efficient trust-region
algorithm. The experiments illustrate the efficacy of the proposed algorithm
on several real-world datasets across applications.
Comment: Accepted to appear in Advances in Neural Information Processing
Systems (NIPS), 2018. A shorter version appeared in the NIPS workshop on
Synergies in Geometric Data Analysis 201
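The trace-norm regularizers discussed above are built from nuclear norms of mode-wise unfoldings: the overlapped variant penalizes every unfolding of one tensor, while latent-trace-norm variants split the tensor into a sum of components, each low-rank in one mode. A minimal sketch of the per-mode quantities both use, with hypothetical helper names:

```python
import numpy as np

def mode_unfold(T, k):
    """Mode-k unfolding: mode k becomes the rows, remaining modes the columns."""
    return np.moveaxis(T, k, 0).reshape(T.shape[k], -1)

def nuclear_norms(T):
    """Per-mode nuclear norms (sums of singular values of each unfolding)."""
    return [np.linalg.norm(mode_unfold(T, k), 'nuc') for k in range(T.ndim)]

rng = np.random.default_rng(2)
T = rng.standard_normal((3, 4, 5))
print(nuclear_norms(T))  # one nonnegative value per mode
```

These norms act as convex surrogates for the multilinear ranks; the paper's contribution (a non-sparse latent combination, its dual characterization, and the Riemannian trust-region solver) goes well beyond this sketch.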