Graph Signal Processing: Overview, Challenges and Applications
Research in Graph Signal Processing (GSP) aims to develop tools for
processing data defined on irregular graph domains. In this paper we first
provide an overview of core ideas in GSP and their connection to conventional
digital signal processing. We then summarize recent advances in the
development of basic GSP tools, including methods for sampling, filtering, and
graph learning.
Next, we review progress in several application areas using GSP, including
processing and analysis of sensor network data, biological data, and
applications to image processing and machine learning. We finish by providing a
brief historical perspective to highlight how concepts recently developed in
GSP build on top of prior research in other areas.
Comment: To appear, Proceedings of the IEEE
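Among the basic GSP tools the abstract lists, filtering is easy to illustrate. A minimal sketch of low-pass graph filtering via the graph Fourier transform (the eigenbasis of the graph Laplacian); the 4-node path graph and the signal values are illustrative choices, not from the paper:

```python
import numpy as np

# Adjacency matrix of a 4-node path graph (illustrative example)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A      # combinatorial graph Laplacian

# Graph Fourier basis: eigenvectors of L; eigenvalues act as graph frequencies
freqs, U = np.linalg.eigh(L)

x = np.array([1.0, 3.0, 2.0, 4.0])  # graph signal: one value per node
x_hat = U.T @ x                      # graph Fourier transform

# Ideal low-pass filter: keep only the two lowest graph frequencies
h = (freqs <= freqs[1]).astype(float)
x_smooth = U @ (h * x_hat)           # filtered signal, back in the vertex domain
```

The filtered signal is smoother in the precise sense that its Laplacian quadratic form (total variation over edges) cannot exceed that of the input.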
K-Deep Simplex: Deep Manifold Learning via Local Dictionaries
We propose K-Deep Simplex (KDS), a unified optimization framework for
nonlinear dimensionality reduction that combines the strengths of manifold
learning and sparse dictionary learning. Our approach learns local dictionaries
that represent a data point with reconstruction coefficients supported on the
probability simplex. The dictionaries are learned using algorithm unrolling, an
increasingly popular technique for structured deep learning. KDS enjoys
tremendous computational advantages over related approaches and is both
interpretable and flexible. In particular, KDS is quasilinear in the number of
data points with scaling that depends on intrinsic geometric properties of the
data. We apply KDS to the unsupervised clustering problem and prove theoretical
performance guarantees. Experiments show that the algorithm is highly efficient
and performs competitively on synthetic and real data sets.
Comment: 14 pages, 6 figures
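The constraint that reconstruction coefficients live on the probability simplex can be illustrated with the standard sorting-based Euclidean projection onto the simplex. This is a generic building block, not code from the paper; the input vector is an arbitrary example:

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of v onto the probability simplex
    {w : w >= 0, sum(w) = 1}, via the standard sorting-based algorithm."""
    u = np.sort(v)[::-1]                       # sort entries in decreasing order
    css = np.cumsum(u)
    # Largest index rho with u[rho] * (rho + 1) > css[rho] - 1
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - 1))[0][-1]
    theta = (css[rho] - 1) / (rho + 1)         # shift that enforces sum-to-one
    return np.maximum(v - theta, 0.0)

w = project_simplex(np.array([0.5, 1.2, -0.3]))
# w is nonnegative and sums to 1; negative inputs are clipped to the boundary
```

In a simplex-constrained sparse coding step, a projection of this kind would follow each gradient update on the coefficients.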
Scalable Sparse Subspace Clustering by Orthogonal Matching Pursuit
Subspace clustering methods based on $\ell_1$, $\ell_2$ or nuclear norm
regularization have become very popular due to their simplicity, theoretical
guarantees and empirical success. However, the choice of the regularizer can
greatly impact both theory and practice. For instance, $\ell_1$ regularization
is guaranteed to give a subspace-preserving affinity (i.e., there are no
connections between points from different subspaces) under broad conditions
(e.g., arbitrary subspaces and corrupted data). However, it requires solving a
large scale convex optimization problem. On the other hand, $\ell_2$ and
nuclear norm regularization provide efficient closed form solutions, but
require very strong assumptions to guarantee a subspace-preserving affinity,
e.g., independent subspaces and uncorrupted data. In this paper we study a
subspace clustering method based on orthogonal matching pursuit. We show that
the method is both computationally efficient and guaranteed to give a
subspace-preserving affinity under broad conditions. Experiments on synthetic
data verify our theoretical analysis, and applications in handwritten digit and
face clustering show that our approach achieves the best trade-off between
accuracy and efficiency.
Comment: 13 pages, 1 figure, 2 tables. Accepted to CVPR 2016 as an oral
presentation
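The core idea of OMP-based subspace clustering can be sketched as follows: express each data point as a sparse combination of the other points via orthogonal matching pursuit, then symmetrize the coefficients into an affinity for spectral clustering. This is a hedged toy sketch with a two-subspace example, not the authors' implementation:

```python
import numpy as np

def omp_coeffs(X, i, k):
    """Represent column X[:, i] using at most k other columns of X
    via orthogonal matching pursuit (illustrative sketch)."""
    y = X[:, i]
    residual, support = y.copy(), []
    for _ in range(k):
        corr = np.abs(X.T @ residual)   # correlation with the residual
        corr[i] = 0                     # a point may not select itself
        corr[support] = 0               # nor reuse an already-selected point
        support.append(int(np.argmax(corr)))
        # Least-squares fit on the current support, then update the residual
        c, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
        residual = y - X[:, support] @ c
    coeffs = np.zeros(X.shape[1])
    coeffs[support] = c
    return coeffs

# Toy data: two 1-dimensional subspaces in R^2 (columns are data points)
X = np.array([[1.0, 2.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 3.0]])
X = X / np.linalg.norm(X, axis=0)       # normalize points to unit norm
C = np.column_stack([omp_coeffs(X, i, 1) for i in range(4)])
W = np.abs(C) + np.abs(C).T             # symmetric affinity for spectral clustering
```

On this toy example the affinity is subspace-preserving: points are connected only to points from their own subspace, which is the property the paper's theory guarantees under broader conditions.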