Overview of Constrained PARAFAC Models
In this paper, we present an overview of constrained PARAFAC models in which the
constraints model linear dependencies among columns of the factor matrices of
the tensor decomposition or, alternatively, the pattern of interactions between
different modes of the tensor, which is captured by the equivalent core tensor.
Some tensor prerequisites are first introduced, with a particular emphasis on
mode combination using Kronecker products of canonical vectors, which
simplifies matricization operations. This Kronecker-product-based approach is
also formulated in terms of index notation, which provides an original and
concise formalism both for matricizing tensors and for writing tensor models. Then,
after a brief reminder of PARAFAC and Tucker models, two families of
constrained tensor models, the so-called PARALIND/CONFAC and PARATUCK models,
are described in a unified framework for Nth-order tensors. New tensor
models, called nested Tucker models and block PARALIND/CONFAC models, are also
introduced. A link between PARATUCK models and constrained PARAFAC models is
then established. Finally, new uniqueness properties of PARATUCK models are
deduced from sufficient conditions for essential uniqueness of their associated
constrained PARAFAC models.
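As a hedged illustration of the mode-combination idea this abstract mentions, the following NumPy sketch (the function name and all variables are hypothetical, not taken from the paper) builds a rank-R PARAFAC tensor and checks that its mode-1 matricization factors through a column-wise Kronecker (Khatri-Rao) product of the remaining factor matrices:

```python
import numpy as np

def khatri_rao(B, C):
    # column-wise Kronecker product of two matrices with R columns each
    R = B.shape[1]
    return np.stack([np.kron(B[:, r], C[:, r]) for r in range(R)], axis=1)

rng = np.random.default_rng(0)
I, J, K, R = 4, 5, 6, 3
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))

# rank-R PARAFAC tensor: X[i,j,k] = sum_r A[i,r] * B[j,r] * C[k,r]
X = np.einsum('ir,jr,kr->ijk', A, B, C)

# mode-1 matricization (rows indexed by i, columns by the combined (j,k) index)
X1 = X.reshape(I, J * K)

# the unfolding factors as A times the Khatri-Rao product, transposed
assert np.allclose(X1, A @ khatri_rao(B, C).T)
```

With NumPy's row-major flattening the combined column index is j*K + k, which is why the Khatri-Rao product here takes B before C; other matricization conventions permute the factors accordingly.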
Tensor Decompositions for Signal Processing Applications: From Two-way to Multiway Component Analysis
The widespread use of multi-sensor technology and the emergence of big
datasets have highlighted the limitations of standard flat-view matrix models
and the necessity to move towards more versatile data analysis tools. We show
that higher-order tensors (i.e., multiway arrays) enable such a fundamental
paradigm shift towards models that are essentially polynomial and whose
uniqueness, unlike that of matrix methods, is guaranteed under very mild and natural
conditions. Benefiting from the power of multilinear algebra as their mathematical
backbone, data analysis techniques using tensor decompositions are shown to
have great flexibility in the choice of constraints that match data properties,
and to find more general latent components in the data than matrix-based
methods. A comprehensive introduction to tensor decompositions is provided from
a signal processing perspective, starting from the algebraic foundations, via
basic Canonical Polyadic and Tucker models, through to advanced cause-effect
and multi-view data analysis schemes. We show that tensor decompositions enable
natural generalizations of some commonly used signal processing paradigms, such
as canonical correlation and subspace techniques, signal separation, linear
regression, feature extraction and classification. We also cover computational
aspects, and point out how ideas from compressed sensing and scientific
computing may be used for addressing the otherwise unmanageable storage and
manipulation problems associated with big datasets. The concepts are supported
by illustrative real-world case studies illuminating the benefits of the tensor
framework, as efficient and promising tools for modern signal processing, data
analysis and machine learning applications; these benefits also extend to
vector/matrix data through tensorization.
Keywords: ICA, NMF, CPD, Tucker decomposition, HOSVD, tensor networks, Tensor Train
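Since the keywords above mention HOSVD, here is a minimal, hedged sketch (my own illustration, assuming a dense third-order NumPy tensor) of the higher-order SVD: each factor matrix is taken from the SVD of a mode-n unfolding, and the reconstruction is exact when no truncation is applied:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((5, 6, 7))

# HOSVD: factor U_n = left singular vectors of the mode-n unfolding
factors = []
for n in range(3):
    Xn = np.moveaxis(X, n, 0).reshape(X.shape[n], -1)
    U, _, _ = np.linalg.svd(Xn, full_matrices=False)
    factors.append(U)

# core tensor: G = X  x1 U1^T  x2 U2^T  x3 U3^T
G = np.einsum('ijk,ia,jb,kc->abc', X, *factors)

# multiplying back by the orthogonal factors recovers X exactly
X_hat = np.einsum('abc,ia,jb,kc->ijk', G, *factors)
assert np.allclose(X, X_hat)
```

Truncating each U_n to its leading columns turns this into the multilinear-rank approximation commonly used as an initializer for the Tucker model.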
Binary component decomposition. Part II: The asymmetric case
This paper studies the problem of decomposing a low-rank matrix into a factor with binary entries, either from {±1} or from {0,1}, and an unconstrained factor. The research answers fundamental questions about the existence and uniqueness of these decompositions. It also leads to tractable factorization algorithms that succeed under a mild deterministic condition. This work builds on a companion paper that addresses the related problem of decomposing a low-rank positive-semidefinite matrix into symmetric binary factors.
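As a toy illustration of the asymmetric factorization the paper studies (a sign factor times an unconstrained factor; the construction below is my own example, not the paper's recovery algorithm):

```python
import numpy as np

rng = np.random.default_rng(2)

# a binary factor S with entries in {+1, -1} and an unconstrained factor F
S = rng.choice([-1.0, 1.0], size=(8, 3))
F = rng.standard_normal((3, 5))

# the observed matrix: low rank by construction
M = S @ F

assert np.linalg.matrix_rank(M) <= 3
assert set(np.unique(S)) <= {-1.0, 1.0}
```

The paper's contribution concerns the converse direction: when such an (S, F) pair exists for a given low-rank M, when it is essentially unique, and how to compute it tractably.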
A Tour of Constrained Tensor Canonical Polyadic Decomposition
This paper surveys the use of constraints in tensor decomposition models. Constrained tensor decompositions have been extensively applied to chemometrics and array processing, but there is a growing interest in understanding these methods independently of the application of interest. We suggest a formalism that unifies various instances of constrained tensor decomposition, while shedding light on some possible extensions of existing methods.