Combinatorial rigidity of Incidence systems and Application to Dictionary learning
Given a hypergraph $H$ with $m$ hyperedges and a set $X$ of $m$ \emph{pinning
subspaces}, i.e.\ globally fixed subspaces in Euclidean space $\mathbb{R}^d$, a
\emph{pinned subspace-incidence system} is the pair $(H, X)$, with the
constraint that each pinning subspace $X_i \in X$ is contained in the subspace
spanned by the point realizations in $\mathbb{R}^d$ of the vertices of the
corresponding hyperedge of $H$. This paper provides a combinatorial
characterization of pinned subspace-incidence systems that are \emph{minimally
rigid}, i.e.\ those systems that are guaranteed to generically yield a locally
unique realization.
Pinned subspace-incidence systems have applications in the \emph{Dictionary
Learning (aka sparse coding)} problem, i.e.\ the problem of obtaining a sparse
representation of a given set of data vectors by learning \emph{dictionary
vectors} in terms of which the data vectors can be written as sparse linear
combinations. When the dictionary vectors are viewed geometrically as the
spanning set of a subspace arrangement, the result gives a tight bound on the
number of dictionary vectors for sufficiently randomly chosen data vectors, and
yields a way of constructing a dictionary that meets the bound. For less
stringent restrictions on the data, combined with a natural modification of the
dictionary learning problem, a further dictionary learning algorithm is
provided. Although there are recent rigidity-based approaches to low-rank
matrix completion, we are unaware of any prior application of combinatorial
rigidity techniques in the setting of Dictionary Learning. We also provide a
systematic classification of problems related to dictionary learning, together
with various algorithms, their assumptions, and their performance.
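As a concrete illustration of the sparse-coding model described above (a minimal sketch, not code from the paper; the dimensions $d$, $n$, $m$ and sparsity $s$ are arbitrary assumed values), one can generate data vectors that are, by construction, sparse linear combinations of dictionary vectors:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (assumed, not from the paper): ambient dimension d,
# number of dictionary vectors n, number of data vectors m, sparsity s.
d, n, m, s = 5, 8, 20, 2
D = rng.standard_normal((d, n))  # dictionary: n column vectors in R^d

# Each data vector is an s-sparse linear combination of dictionary columns.
A = np.zeros((n, m))
for j in range(m):
    support = rng.choice(n, size=s, replace=False)  # pick s dictionary vectors
    A[support, j] = rng.standard_normal(s)          # random sparse coefficients
Y = D @ A  # data vectors, one per column of Y

# Every data vector lies in the span of at most s dictionary vectors,
# which is the subspace-arrangement viewpoint used in the paper.
assert all(np.count_nonzero(A[:, j]) <= s for j in range(m))
```

The dictionary learning problem runs this construction in reverse: given only $Y$ and the sparsity level $s$, recover a dictionary $D$ and sparse coefficient matrix $A$ with $Y \approx DA$.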