Efficient Orthogonal Tensor Decomposition, with an Application to Latent Variable Model Learning
Decomposing tensors into orthogonal factors is a well-known task in
statistics, machine learning, and signal processing. We study orthogonal outer
product decompositions, in which the factors of the summands are required to be
orthogonal across summands, by relating this orthogonal decomposition to the
singular value decompositions of the flattenings. We show
that it is a non-trivial assumption for a tensor to have such an orthogonal
decomposition, and we show that it is unique (up to natural symmetries) in case
it exists, in which case we also demonstrate how it can be efficiently and
reliably obtained by a sequence of singular value decompositions. We
demonstrate how the factoring algorithm can be applied to parameter
identification in latent variable and mixture models.
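The core construction the abstract describes can be sketched as follows (a hedged illustration of the idea, not the paper's full algorithm): for a symmetric orthogonally decomposable degree-3 tensor with distinct weights, a single SVD of a flattening already recovers the weights and the factors up to sign. All variable names below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))   # orthonormal factors a_1..a_n
lam = np.array([3.0, 2.0, 1.5, 1.0])               # distinct positive weights
# orthogonally decomposable tensor T = sum_i lam_i * a_i (x) a_i (x) a_i
T = np.einsum('i,ai,bi,ci->abc', lam, Q, Q, Q)
# flatten along the first mode; the SVD exposes the decomposition:
# singular values are the weights, left singular vectors the factors (up to sign)
U, s, _ = np.linalg.svd(T.reshape(n, n * n), full_matrices=False)
```

Uniqueness up to sign holds here because the weights are distinct; repeated weights introduce the rotational symmetries the abstract alludes to.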
The Algebraic Approach to Phase Retrieval and Explicit Inversion at the Identifiability Threshold
We study phase retrieval from magnitude measurements of an unknown signal as
an algebraic estimation problem. Indeed, phase retrieval from rank-one and more
general linear measurements can be treated in an algebraic way. It is verified
that a certain number of generic rank-one or generic linear measurements are
sufficient to enable signal reconstruction for generic signals, and that
slightly more generic measurements yield reconstructability for all signals.
Our results
solve a few open problems stated in the recent literature. Furthermore, we show
how the algebraic estimation problem can be solved by a closed-form algebraic
estimation technique, termed ideal regression, providing non-asymptotic success
guarantees.
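The algebraic viewpoint can be illustrated in a simplified, hedged form (this is the standard lifting trick for real signals and rank-one measurements, not the paper's ideal-regression technique): each magnitude-squared measurement is linear in the lifted unknown X = x xᵀ, so n(n+1)/2 generic measurements determine X, after which x is recovered up to a global sign.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
x = rng.standard_normal(n)
m = n * (n + 1) // 2                 # unknowns in the symmetric lift X = x x^T
A = rng.standard_normal((m, n))      # generic rank-one measurement vectors a_i
y = (A @ x) ** 2                     # magnitude-squared measurements
# each y_i = <a_i a_i^T, X> is linear in X; solve the linear system for X
S = np.stack([np.outer(a, a).ravel() for a in A])
X = np.linalg.lstsq(S, y, rcond=None)[0].reshape(n, n)
# recover x up to global sign from the top eigenpair of X
w, V = np.linalg.eigh((X + X.T) / 2)
x_hat = np.sqrt(w[-1]) * V[:, -1]
```

The count m = n(n+1)/2 mirrors the identifiability-threshold theme: it equals the number of degrees of freedom in the symmetric lift.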
Approximate Rank-Detecting Factorization of Low-Rank Tensors
We present an algorithm, AROFAC2, which detects the (CP-)rank of a degree 3
tensor and calculates its factorization into rank-one components. We provide
generative conditions for the algorithm to work and demonstrate on both
synthetic and real-world data that AROFAC2 is a potentially outperforming
alternative to the gold-standard PARAFAC method, over which it has the
advantages that it intrinsically detects the true rank, avoids spurious
components, and is stable with respect to outliers and non-Gaussian noise.
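AROFAC2's internals are beyond the abstract; as a hedged point of reference, the quantity it detects can be illustrated with a much simpler baseline that is only valid generically when the rank does not exceed the dimensions: the matrix rank of a flattening equals the CP rank. This is not the AROFAC2 algorithm, just a sketch of the target.

```python
import numpy as np

rng = np.random.default_rng(2)
n, r = 6, 3
# a random degree-3 tensor of CP rank r: a sum of r rank-one components
A, B, C = (rng.standard_normal((n, r)) for _ in range(3))
T = np.einsum('ar,br,cr->abc', A, B, C)
# generically, the matrix rank of a flattening equals the CP rank when r <= n
est_rank = np.linalg.matrix_rank(T.reshape(n, n * n))
```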
Matroid Regression
We propose an algebraic combinatorial method for solving large sparse linear
systems of equations locally - that is, a method which can compute single
evaluations of the signal without computing the whole signal. The method scales
only in the sparsity of the system and not in its size, and allows us to provide
error estimates for any solution method. At the heart of our approach is the
so-called regression matroid, a combinatorial object associated to sparsity
patterns, which allows us to replace inversion of the large matrix with
inversion of a constant-size kernel matrix. We show that our method
provides the best linear unbiased estimator (BLUE) for this setting and the
minimum variance unbiased estimator (MVUE) under Gaussian noise assumptions,
and furthermore we show that the size of the kernel matrix to be inverted can
be traded off against accuracy.
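A hedged sketch of the local-estimation idea (illustrative only; the regression-matroid machinery that selects which measurements to use is not reproduced here): given a few measurement rows supported on a small set of coordinates, a single evaluation λᵀx is estimated with BLUE weights computed from a constant-size kernel matrix K = A Aᵀ, independent of the ambient dimension n.

```python
import numpy as np

rng = np.random.default_rng(3)
n, k = 100, 3                       # ambient dimension vs. measurements used
x = rng.standard_normal(n)
support = [0, 1, 2]                 # sparsity pattern of the rows we use
A = np.zeros((k, n))
A[:, support] = rng.standard_normal((k, len(support)))
y = A @ x + 1e-3 * rng.standard_normal(k)   # noisy local measurements
# target: the single evaluation lam^T x = x[0], without computing all of x
lam = np.zeros(n)
lam[0] = 1.0
# BLUE weights c satisfy A^T c = lam with minimal variance; only the
# constant-size (k x k) kernel matrix K = A A^T is ever inverted
K = A @ A.T
c = np.linalg.solve(K, A @ lam)
estimate = c @ y                    # unbiased estimate of x[0]
```

Note that the cost of forming and solving with K depends only on k, the number of measurements used, matching the abstract's claim that the method scales in the sparsity and not in the size of the system.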
Dual-to-kernel learning with ideals
In this paper, we propose a theory which unifies kernel learning and symbolic
algebraic methods. We show that both worlds are inherently dual to each other,
and we use this duality to combine the structure-awareness of algebraic methods
with the efficiency and generality of kernels. The main idea lies in relating
polynomial rings to feature space, and ideals to manifolds, then exploiting
this generative-discriminative duality on kernel matrices. We illustrate this
by proposing two algorithms, IPCA and AVICA, for simultaneous manifold and
feature learning, and test their accuracy on synthetic and real-world data.
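A hedged toy instance of the ring-to-feature-space, ideal-to-manifold duality (not IPCA or AVICA themselves, whose details are in the paper): on samples from the unit circle, the null space of a degree-2 monomial feature matrix recovers the generator x² + y² − 1 of the vanishing ideal.

```python
import numpy as np

rng = np.random.default_rng(4)
t = rng.uniform(0.0, 2.0 * np.pi, 50)
x, y = np.cos(t), np.sin(t)            # samples on the manifold x^2 + y^2 = 1
# feature map: all monomials up to degree 2 (the polynomial-ring side)
Phi = np.stack([np.ones_like(x), x, y, x**2, x * y, y**2], axis=1)
# polynomials vanishing on the data = (near-)null space of the feature matrix
_, s, Vt = np.linalg.svd(Phi, full_matrices=False)
vanishing = Vt[s < 1e-8]               # coefficient vectors of ideal generators
```

Here the single recovered row is proportional to the coefficient vector of x² + y² − 1 in the monomial basis (1, x, y, x², xy, y²), illustrating how the discriminative (kernel/feature) side detects the generative (ideal/manifold) side.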
Algebraic matroids with graph symmetry
This paper studies the properties of two kinds of matroids: (a) algebraic
matroids, and (b) finite and infinite matroids whose ground sets have some
canonical symmetry, for example row and column symmetry and transposition
symmetry.
For (a) algebraic matroids, we expose cryptomorphisms making them accessible
to techniques from commutative algebra. This allows us to introduce, for each
circuit in an algebraic matroid, an invariant called the circuit polynomial,
generalizing the minimal polynomial from classical Galois theory, and to study
the matroid structure with multivariate methods.
For (b) matroids with symmetries we introduce combinatorial invariants
capturing structural properties of the rank function and its limit behavior,
and obtain proofs which are purely combinatorial and do not assume algebraicity
of the matroid; these imply and generalize known results in some specific cases
where the matroid is also algebraic. These results are motivated by, and
readily applicable to, framework rigidity, low-rank matrix completion and
determinantal varieties, which lie in the intersection of (a) and (b) where
additional results can be derived. We study the corresponding matroids and
their associated invariants, and for selected cases, we characterize the
matroidal structure and the circuit polynomials completely.
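A concrete hedged example at the intersection of (a) and (b), drawn from the low-rank matrix completion setting the abstract mentions: in the algebraic matroid of 2x2 rank-1 matrices, the four entries form a circuit, and its circuit polynomial is the 2x2 determinant, which vanishes identically on the determinantal variety.

```python
import numpy as np

rng = np.random.default_rng(5)
# a generic point of the determinantal variety of 2x2 rank-1 matrices
u, v = rng.standard_normal(2), rng.standard_normal(2)
M = np.outer(u, v)
# ground set of the algebraic matroid = the four entries; they form a circuit
# whose circuit polynomial is the determinant m00*m11 - m01*m10
circuit_value = M[0, 0] * M[1, 1] - M[0, 1] * M[1, 0]
# any three entries are algebraically independent; the fourth is constrained
```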