A Multiscale Pyramid Transform for Graph Signals
Multiscale transforms designed to process analog and discrete-time signals
and images cannot be directly applied to analyze high-dimensional data residing
on the vertices of a weighted graph, as they do not capture the intrinsic
geometric structure of the underlying graph data domain. In this paper, we
adapt the Laplacian pyramid transform for signals on Euclidean domains so that
it can be used to analyze high-dimensional data residing on the vertices of a
weighted graph. Our approach is to study existing methods and develop new
methods for four fundamental operations: graph downsampling, graph
reduction, and the filtering and interpolation of signals on graphs. Equipped with
appropriate notions of these operations, we leverage the basic multiscale
constructs and intuitions from classical signal processing to generate a
transform that yields both a multiresolution of graphs and an associated
multiresolution of a graph signal on the underlying sequence of graphs.
Comment: 16 pages, 13 figures
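The four operations above can be illustrated with a minimal one-level pyramid sketch in NumPy. The ideal spectral low-pass filter, the even-vertex downsampling set, and the harmonic interpolation operator used here are illustrative stand-ins, not the paper's exact constructions:

```python
import numpy as np

def path_laplacian(n):
    """Combinatorial Laplacian of an unweighted path graph on n vertices."""
    A = np.zeros((n, n))
    i = np.arange(n - 1)
    A[i, i + 1] = A[i + 1, i] = 1.0
    return np.diag(A.sum(axis=1)) - A

def harmonic_interp(L, keep, values):
    """Interpolate from the vertices in `keep` by minimizing the Laplacian
    quadratic form x^T L x subject to x[keep] = values."""
    n = L.shape[0]
    comp = np.setdiff1d(np.arange(n), keep)
    x = np.empty(n)
    x[keep] = values
    x[comp] = np.linalg.solve(L[np.ix_(comp, comp)],
                              -L[np.ix_(comp, keep)] @ values)
    return x

def analyze(L, x, keep):
    """One pyramid level: low-pass filter, downsample to `keep`,
    interpolate back, and record the prediction residual."""
    lam, U = np.linalg.eigh(L)
    h = (lam <= np.median(lam)).astype(float)   # toy ideal low-pass filter
    x_lp = U @ (h * (U.T @ x))                  # filtered signal
    x_coarse = x_lp[keep]                       # downsampled coarse signal
    residual = x - harmonic_interp(L, keep, x_coarse)
    return x_coarse, residual

n = 8
L = path_laplacian(n)
x = np.sin(np.linspace(0.0, np.pi, n))
keep = np.arange(0, n, 2)                       # keep every other vertex
x_coarse, residual = analyze(L, x, keep)

# Synthesis: re-interpolate from x_coarse alone and add the stored residual.
x_rec = harmonic_interp(L, keep, x_coarse) + residual
```

Storing the coarse signal plus the residual gives perfect reconstruction by construction, mirroring the classical Laplacian pyramid; the full transform also coarsens the graph itself at each level, which this sketch omits.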
Signal Processing on Graphs Using Kron Reduction and Spline Interpolation
In applications such as image processing, the data comes in a regular pattern with a known structure, such as a grid of pixels. However, it is becoming increasingly common for large datasets to have an irregular structure. In image recognition, one of the most successful methods is wavelet analysis, also commonly known as multi-resolution analysis. Our project is to develop and explore this powerful technique in the setting where the data is not stored in the form of a rectangular table with rows and columns of pixels. While such data sets still have a great deal of structure to be exploited, we want to extend wavelet analysis to the setting where the data structure is more like a network than a rectangular table. Networks provide a flexible generalization of the rigid structure of rectangular tables.
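The Kron reduction mentioned in the title is the Schur complement of the graph Laplacian onto a retained vertex set; the result is again a valid graph Laplacian. A small self-contained sketch (my own illustration, not the project's code):

```python
import numpy as np

def kron_reduce(L, keep):
    """Kron reduction: Schur complement of the Laplacian L onto `keep`.
    Eliminated vertices are absorbed into new weighted edges among `keep`."""
    comp = np.setdiff1d(np.arange(L.shape[0]), keep)
    return (L[np.ix_(keep, keep)]
            - L[np.ix_(keep, comp)]
            @ np.linalg.solve(L[np.ix_(comp, comp)], L[np.ix_(comp, keep)]))

# Path graph on 4 vertices with unit edge weights: 0 - 1 - 2 - 3
L = np.array([[ 1., -1.,  0.,  0.],
              [-1.,  2., -1.,  0.],
              [ 0., -1.,  2., -1.],
              [ 0.,  0., -1.,  1.]])

# Eliminating the two interior vertices leaves a single edge whose weight
# is the reciprocal of the effective resistance between the endpoints.
Lr = kron_reduce(L, np.array([0, 3]))
```

Here `Lr` equals `[[1/3, -1/3], [-1/3, 1/3]]`: three unit edges in series behave like one edge of weight 1/3, the electrical-network interpretation that makes Kron reduction attractive for graph coarsening.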
Spectrally approximating large graphs with smaller graphs
How does coarsening affect the spectrum of a general graph? We provide
conditions such that the principal eigenvalues and eigenspaces of a coarsened
and original graph Laplacian matrices are close. The achieved approximation is
shown to depend on standard graph-theoretic properties, such as the degree and
eigenvalue distributions, as well as on the ratio between the coarsened and
actual graph sizes. Our results carry implications for learning methods that
utilize coarsening. For the particular case of spectral clustering, they imply
that coarse eigenvectors can be used to derive good quality assignments even
without refinement; this phenomenon was previously observed but lacked formal
justification.
Comment: 22 pages, 10 figures
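A toy version of this setup coarsens a graph by vertex contraction and inspects the Laplacian spectrum. The binary contraction matrix used below is one common convention among several in the literature; this is a sketch of the setting, not the paper's construction:

```python
import numpy as np

def laplacian(A):
    """Combinatorial Laplacian from a symmetric adjacency matrix."""
    return np.diag(A.sum(axis=1)) - A

def contract(L, partition):
    """Coarsen by contracting each group in `partition` to one supernode:
    Lc = P L P^T, where the binary contraction matrix P sums the edge
    weights between groups. (Scaling conventions for P vary.)"""
    P = np.zeros((len(partition), L.shape[0]))
    for g, group in enumerate(partition):
        P[g, group] = 1.0
    return P @ L @ P.T

# Two 4-cliques joined by a single bridge edge
A = np.zeros((8, 8))
A[:4, :4] = 1.0 - np.eye(4)
A[4:, 4:] = 1.0 - np.eye(4)
A[3, 4] = A[4, 3] = 1.0
L = laplacian(A)

# Contract each clique to a supernode: the coarse graph is one weighted edge.
Lc = contract(L, [[0, 1, 2, 3], [4, 5, 6, 7]])
lam = np.linalg.eigvalsh(L)
lam_c = np.linalg.eigvalsh(Lc)
```

The coarse matrix `Lc` is again a valid Laplacian (symmetric, zero row sums, positive semidefinite), and its null eigenvector inherits the constant eigenvector of `L`; how closely the remaining coarse eigenvalues track the principal eigenvalues of `L` is precisely what the paper's bounds quantify in terms of degree and eigenvalue distributions and the size ratio.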