Multilinear Time Invariant System Theory
In biological and engineering systems, structure, function and dynamics are
highly coupled. Such interactions can be naturally and compactly captured via
tensor based state space dynamic representations. However, such representations
are not amenable to the standard system and controls framework which requires
the state to be in the form of a vector. In order to address this limitation,
recently a new class of multiway dynamical systems has been introduced in which
the states, inputs and outputs are tensors. We propose a new form of
multilinear time invariant (MLTI) systems based on the Einstein product and
even-order paired tensors. We extend classical linear time invariant (LTI)
system notions including stability, reachability and observability for the new
MLTI system representation by leveraging recent advances in tensor algebra.

Comment: 8 pages, SIAM Conference on Control and its Applications 2019, accepted to appear.
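The Einstein product underlying these MLTI systems can be illustrated concretely. The sketch below (an assumption of this listing, not code from the paper) contracts the trailing index pair of one 4th-order tensor against the leading pair of another, and checks that this agrees with ordinary matrix multiplication of the unfoldings, which is what makes LTI-style notions transferable:

```python
import numpy as np

def einstein_product(A, B):
    """Einstein (contracted) product for 4th-order tensors:
    C[i,j,m,n] = sum_{k,l} A[i,j,k,l] * B[k,l,m,n]."""
    return np.einsum('ijkl,klmn->ijmn', A, B)

A = np.random.rand(2, 3, 4, 5)
B = np.random.rand(4, 5, 2, 3)
C = einstein_product(A, B)
print(C.shape)  # (2, 3, 2, 3)

# The product matches matrix multiplication of the (row-major) unfoldings,
# reflecting the tensor/matrix isomorphism used by this line of work.
assert np.allclose(C.reshape(6, 6), A.reshape(6, 20) @ B.reshape(20, 6))
```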
Tensor and Matrix Inversions with Applications
Higher-order tensor inversion is possible for even order. We have shown that
a tensor group endowed with the Einstein (contracted) product is isomorphic to
the general linear group of degree . With the isomorphic group structures,
we derived new tensor decompositions which we have shown to be related to the
well-known canonical polyadic decomposition and multilinear SVD. Moreover,
within this group structure framework, multilinear systems are derived,
specifically, for solving high dimensional PDEs and large discrete quantum
models. We also address, in the least-squares sense, multilinear systems that do not fit the framework, that is, when the tensor has an odd number of modes or when the tensor has distinct dimensions in each mode. With the notion of tensor inversion, multilinear systems are solvable. Numerically, we solve multilinear systems using iterative techniques, namely the biconjugate gradient and Jacobi methods in tensor format.
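The group-isomorphism idea in this abstract suggests a direct way to solve an even-order tensor equation A *_E X = B: unfold A to a square matrix, solve the matrix system, and fold the solution back. A minimal sketch, with illustrative shapes and a direct solver standing in for the iterative methods mentioned above:

```python
import numpy as np

def solve_multilinear(A, B):
    """Solve A *_E X = B for A in R^{n1 x n2 x n1 x n2}, B in R^{n1 x n2},
    by unfolding A to an (n1*n2) x (n1*n2) matrix (assumed sketch)."""
    n1, n2 = B.shape
    M = A.reshape(n1 * n2, n1 * n2)       # matrix unfolding of A
    x = np.linalg.solve(M, B.reshape(n1 * n2))
    return x.reshape(n1, n2)

# Usage: a well-conditioned A (identity tensor plus noise) and a random RHS.
rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3, 2, 3)) + 5 * np.eye(6).reshape(2, 3, 2, 3)
B = rng.standard_normal((2, 3))
X = solve_multilinear(A, B)
assert np.allclose(np.einsum('ijkl,kl->ij', A, X), B)  # A *_E X == B
```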
Bayesian Robust Tensor Factorization for Incomplete Multiway Data
We propose a generative model for robust tensor factorization in the presence
of both missing data and outliers. The objective is to explicitly infer the
underlying low-CP-rank tensor capturing the global information and a sparse
tensor capturing the local information (also considered as outliers), thus
providing the robust predictive distribution over missing entries. The
low-CP-rank tensor is modeled by multilinear interactions between multiple
latent factors on which the column sparsity is enforced by a hierarchical
prior, while the sparse tensor is modeled by a hierarchical view of the
Student-t distribution that associates an individual hyperparameter with each
element
independently. For model learning, we develop an efficient closed-form
variational inference under a fully Bayesian treatment, which can effectively
prevent the overfitting problem and scales linearly with data size. In contrast
to existing related works, our method can perform model selection automatically
and implicitly without need of tuning parameters. More specifically, it can
discover the groundtruth of CP rank and automatically adapt the sparsity
inducing priors to various types of outliers. In addition, the tradeoff between
the low-rank approximation and the sparse representation can be optimized in
the sense of maximum model evidence. The extensive experiments and comparisons
with many state-of-the-art algorithms on both synthetic and real-world datasets
demonstrate the advantages of our method from several perspectives.

Comment: in IEEE Transactions on Neural Networks and Learning Systems, 201
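The generative model this abstract describes, a low-CP-rank tensor plus a sparse outlier tensor plus dense noise, can be sketched in a few lines. This is an illustrative construction (dimensions, rank, and outlier fraction are assumptions), not the paper's inference code:

```python
import numpy as np

rng = np.random.default_rng(1)
I, J, K, R = 10, 12, 8, 3                 # tensor dims and CP rank (assumed)
U = [rng.standard_normal((d, R)) for d in (I, J, K)]  # latent factor matrices

# Low-CP-rank part: multilinear interactions of the latent factors.
X = np.einsum('ir,jr,kr->ijk', *U)

# Sparse part: ~5% of entries are large-magnitude outliers.
S = np.zeros((I, J, K))
mask = rng.random((I, J, K)) < 0.05
S[mask] = 10 * rng.standard_normal(mask.sum())

# Observed tensor = low-rank + sparse + dense noise.
Y = X + S + 0.01 * rng.standard_normal((I, J, K))
print(Y.shape)  # (10, 12, 8)
```

Recovering X and S jointly from an incomplete Y is exactly the inference problem the hierarchical priors above are designed to solve.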
Multilinear tensor regression for longitudinal relational data
A fundamental aspect of relational data, such as from a social network, is
the possibility of dependence among the relations. In particular, the relations
between members of one pair of nodes may have an effect on the relations
between members of another pair. This article develops a type of regression
model to estimate such effects in the context of longitudinal and multivariate
relational data, or other data that can be represented in the form of a tensor.
The model is based on a general multilinear tensor regression model, a special
case of which is a tensor autoregression model in which the tensor of relations
at one time point is parsimoniously regressed on relations from previous time
points. This is done via a separable, or Kronecker-structured, regression
parameter along with a separable covariance model. In the context of an
analysis of longitudinal multivariate relational data, it is shown how the
multilinear tensor regression model can represent patterns that often appear in
relational and network data, such as reciprocity and transitivity.

Comment: Published at http://dx.doi.org/10.1214/15-AOAS839 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org)
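The separable, Kronecker-structured parameter mentioned above can be seen in the matrix (order-2) case: a bilinear map Y = A X Bᵀ is equivalent to vec(Y) = (B ⊗ A) vec(X), so two small coefficient matrices parameterize what would otherwise be one large regression matrix. A minimal check, with shapes chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 3))   # row-mode (sender) coefficients
B = rng.standard_normal((5, 2))   # column-mode (receiver) coefficients
X = rng.standard_normal((3, 2))   # predictor, e.g. relations at time t-1

Y = A @ X @ B.T                   # separable bilinear regression

# Kronecker identity: vec(A X B') = (B kron A) vec(X), column-major vec.
vecY = np.kron(B, A) @ X.reshape(-1, order='F')
assert np.allclose(vecY, Y.reshape(-1, order='F'))
```

The separable form needs 4*3 + 5*2 = 22 parameters instead of the 20*6 = 120 of an unstructured regression matrix, which is the parsimony the abstract refers to.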