Localized Manifold Harmonics for Spectral Shape Analysis
The use of Laplacian eigenfunctions is ubiquitous in a wide range of computer graphics and geometry processing applications. In particular, Laplacian eigenbases allow generalizing the classical Fourier analysis to manifolds. A key drawback of such bases is their inherently global nature, as the Laplacian eigenfunctions carry geometric and topological structure of the entire manifold. In this paper, we introduce a new framework for local spectral shape analysis. We show how to efficiently construct localized orthogonal bases by solving an optimization problem that in turn can be posed as the eigendecomposition of a new operator obtained by a modification of the standard Laplacian. We study the theoretical and computational aspects of the proposed framework and showcase our new construction on the classical problems of shape approximation and correspondence. We obtain significant improvements compared to classical Laplacian eigenbases as well as other alternatives for constructing localized bases.
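The core construction can be illustrated with a minimal sketch. Here a path graph stands in for a mesh, and the "modified Laplacian" is taken as $H = L + \mu\,\mathrm{diag}(1-v)$, where $v$ indicates the region of interest; the penalty form and the value of $\mu$ are our assumptions for illustration, not necessarily the paper's exact operator:

```python
import numpy as np

# Hypothetical sketch: a path graph stands in for a mesh, and the
# modified operator is H = L + mu * diag(1 - region), penalizing
# support outside a chosen region (mu is an assumed value).
n = 100
L = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)  # path-graph Laplacian
L[0, 0] = L[-1, -1] = 1.0                               # Neumann-like endpoints

region = np.zeros(n)
region[:30] = 1.0            # localize on the first 30 vertices
mu = 1e3                     # penalty weight (assumption)
H = L + mu * np.diag(1.0 - region)

# Eigenvectors of H form an orthogonal basis whose low-frequency members
# concentrate on the region, unlike the global eigenvectors of L.
evals, evecs = np.linalg.eigh(H)
phi = evecs[:, 0]                         # first localized harmonic
energy_inside = np.sum(phi[:30] ** 2)     # phi is unit-norm, so total energy is 1
print(energy_inside)                      # close to 1: the mode lives in the region
```

Because `eigh` is applied to a single symmetric operator, the localized functions come out mutually orthogonal for free, which is the practical appeal of posing localization as an eigendecomposition.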
Spectral Numerical Exterior Calculus Methods for Differential Equations on Radial Manifolds
We develop exterior calculus approaches for partial differential equations on radial manifolds. We introduce numerical methods that approximate with spectral accuracy the exterior derivative $d$, Hodge star $\star$, and their compositions. To achieve discretizations with high precision and symmetry, we develop hyperinterpolation methods based on spherical harmonics and Lebedev quadrature. We perform convergence studies of our numerical exterior derivative operator $\bar{d}$ and Hodge star operator $\bar{\star}$, showing each converges spectrally to $d$ and $\star$. We show how the numerical operators can be naturally composed to formulate general numerical approximations for solving differential equations on manifolds. We present results for the Laplace-Beltrami equations demonstrating our approach.
Comment: 22 pages, 13 figures
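The spectral ingredients above can be sketched in a few lines. This is not the paper's method: we use a Gauss-Legendre x trapezoid product grid in place of Lebedev quadrature (SciPy does not ship Lebedev nodes), compute spherical-harmonic coefficients by quadrature in the spirit of hyperinterpolation, and then solve a Laplace-Beltrami equation spectrally using $\Delta Y_\ell^m = -\ell(\ell+1)Y_\ell^m$:

```python
import numpy as np
from scipy.special import sph_harm
from numpy.polynomial.legendre import leggauss

# Product quadrature grid on the sphere (a stand-in for Lebedev nodes):
# Gauss-Legendre in cos(theta), uniform trapezoid in the azimuth phi.
nth, nph = 16, 33
x, w = leggauss(nth)                  # nodes/weights in cos(theta)
theta = np.arccos(x)                  # polar angles
phi = 2 * np.pi * np.arange(nph) / nph
wphi = 2 * np.pi / nph                # trapezoid weight on the periodic grid

TH, PH = np.meshgrid(theta, phi, indexing="ij")
W = np.outer(w, np.full(nph, wphi))   # full quadrature weights

def coeff(f, m, l):
    """Spherical-harmonic coefficient <f, Y_l^m> by quadrature."""
    Y = sph_harm(m, l, PH, TH)        # SciPy order: sph_harm(m, l, azimuth, polar)
    return np.sum(W * f * np.conj(Y))

# Orthonormality: <Y_3^2, Y_3^2> = 1 and <Y_3^2, Y_2^1> = 0, captured
# exactly because the grid resolves these band-limited integrands.
f = sph_harm(2, 3, PH, TH)
c_self = coeff(f, 2, 3)
c_cross = coeff(f, 1, 2)

# Spectral solve of Delta u = f: since Delta Y_3^2 = -3*(3+1) Y_3^2,
# the solution coefficient is the data coefficient divided by -12.
u_coeff = c_self / (-3 * (3 + 1))
print(abs(c_self), abs(c_cross), u_coeff.real)
```

For band-limited integrands the product rule is exact, which mirrors the spectral accuracy the abstract reports for the discrete operators.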
Functional Maps Representation on Product Manifolds
We consider the tasks of representing, analyzing and manipulating maps
between shapes. We model maps as densities over the product manifold of the
input shapes; these densities can be treated as scalar functions and therefore
are manipulable using the language of signal processing on manifolds. Being a
manifold itself, the product space endows the set of maps with a geometry of
its own, which we exploit to define map operations in the spectral domain; we
also derive relationships with other existing representations (soft maps and
functional maps). To apply these ideas in practice, we discretize product
manifolds and their Laplace--Beltrami operators, and we introduce localized
spectral analysis of the product manifold as a novel tool for map processing.
Our framework applies to maps defined between and across 2D and 3D shapes
without requiring special adjustment, and it can be implemented efficiently
with simple operations on sparse matrices.
Comment: Accepted to Computer Graphics Forum
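The spectral structure that makes this practical is that the Laplacian of a product of two discrete domains separates: its eigenvalues are sums $\lambda_i + \mu_j$ of the factor eigenvalues, and its eigenfunctions are (Kronecker) products of the factor eigenfunctions. A small sketch, with path graphs standing in for the input shapes:

```python
import numpy as np

# Sketch: the product-space Laplacian is kron(L1, I) + kron(I, L2);
# its spectrum is the set of pairwise sums of the factor spectra, and
# its eigenvectors are Kronecker products of factor eigenvectors.
def path_laplacian(n):
    L = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    L[0, 0] = L[-1, -1] = 1.0
    return L

L1, L2 = path_laplacian(8), path_laplacian(5)
Lprod = np.kron(L1, np.eye(5)) + np.kron(np.eye(8), L2)

e1, V1 = np.linalg.eigh(L1)
e2, V2 = np.linalg.eigh(L2)
prod_spectrum = np.sort(np.add.outer(e1, e2).ravel())
direct_spectrum = np.sort(np.linalg.eigvalsh(Lprod))

# A density on the product (i.e. a map between the two shapes) can be
# expanded in the product eigenbasis kron(V1[:, i], V2[:, j]).
print(np.max(np.abs(prod_spectrum - direct_spectrum)))
```

This separability is why map processing in the spectral domain of the product manifold reduces to cheap operations on the two factor eigenbases rather than an eigendecomposition of the (much larger) product.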
Geometric Wavelet Scattering Networks on Compact Riemannian Manifolds
The Euclidean scattering transform was introduced nearly a decade ago to
improve the mathematical understanding of convolutional neural networks.
Inspired by recent interest in geometric deep learning, which aims to
generalize convolutional neural networks to manifold and graph-structured
domains, we define a geometric scattering transform on manifolds. Similar to
the Euclidean scattering transform, the geometric scattering transform is based
on a cascade of wavelet filters and pointwise nonlinearities. It is invariant
to local isometries and stable to certain types of diffeomorphisms. Empirical
results demonstrate its utility on several geometric learning tasks. Our
results generalize the deformation stability and local translation invariance
of Euclidean scattering, and demonstrate the importance of linking the used
filter structures to the underlying geometry of the data.
Comment: 35 pages; 3 figures; 2 tables; v3: revisions based on reviewer comments
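The cascade structure can be sketched on a graph (a common discrete stand-in for a manifold). The wavelet design below follows the usual diffusion-wavelet recipe $\Psi_j = T^{2^{j-1}} - T^{2^j}$ with $T$ a lazy random walk; this is our illustrative choice, not necessarily the paper's exact filter bank:

```python
import numpy as np

# Sketch of a first-order geometric scattering transform on a graph:
# diffusion wavelets Psi_j = T^(2^(j-1)) - T^(2^j), pointwise modulus,
# then averaging to produce invariant coefficients.
rng = np.random.default_rng(0)
A = (rng.random((30, 30)) < 0.2).astype(float)
A = np.triu(A, 1)
A = A + A.T                                       # random undirected graph
d = np.maximum(A.sum(1), 1.0)
T = 0.5 * (np.eye(30) + A / d[:, None])           # lazy random walk D^-1 A

def scattering(T, x, J=3):
    """Zeroth- and first-order scattering coefficients of signal x."""
    coeffs = [np.mean(x)]                         # zeroth order: plain average
    M = T                                         # T^(2^0)
    for j in range(1, J + 1):
        M2 = M @ M                                # T^(2^j) by repeated squaring
        coeffs.append(np.mean(np.abs((M - M2) @ x)))  # average of |Psi_j x|
        M = M2
    return np.array(coeffs)

x = rng.random(30)
S = scattering(T, x)
print(S)
```

Relabeling the vertices permutes $T$ and $x$ consistently, and the modulus-plus-average cascade is unchanged; this is the discrete analogue of the isometry invariance stated in the abstract.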
Exact heat kernel on a hypersphere and its applications in kernel SVM
Many contemporary statistical learning methods assume a Euclidean feature
space. This paper presents a method for defining similarity based on
hyperspherical geometry and shows that it often improves the performance of
support vector machines compared to other competing similarity measures.
Specifically, the idea of using heat diffusion on a hypersphere to measure
similarity has been previously proposed, demonstrating promising results based
on a heuristic heat kernel obtained from the zeroth order parametrix expansion;
however, how well this heuristic kernel agrees with the exact hyperspherical
heat kernel remains unknown. This paper presents a higher order parametrix
expansion of the heat kernel on a unit hypersphere and discusses several
problems associated with this expansion method. We then compare the heuristic
kernel with an exact form of the heat kernel expressed in terms of a uniformly
and absolutely convergent series in high-dimensional angular momentum
eigenmodes. Being a natural measure of similarity between sample points
dwelling on a hypersphere, the exact kernel often shows superior performance in
kernel SVM classifications applied to text mining, tumor somatic mutation
imputation, and stock market analysis.
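For the 2-sphere the exact kernel takes a particularly concrete form: $K_t(x,y) = \sum_\ell \frac{2\ell+1}{4\pi} e^{-\ell(\ell+1)t} P_\ell(x\cdot y)$, where the $P_\ell$ are Legendre polynomials. A hedged sketch (the truncation level and diffusion time are assumptions; higher-dimensional spheres need Gegenbauer polynomials instead):

```python
import numpy as np
from scipy.special import eval_legendre

# Sketch of the exact heat kernel on S^2, truncated at lmax; the terms
# decay like exp(-l(l+1)t), so a modest lmax already converges for
# moderate t. The Gram matrix can feed a precomputed-kernel SVM.
def sphere_heat_kernel(X, t, lmax=60):
    """Heat-kernel Gram matrix for unit row vectors X on S^2."""
    C = np.clip(X @ X.T, -1.0, 1.0)               # cosines of pairwise angles
    K = np.zeros_like(C)
    for l in range(lmax + 1):
        K += ((2 * l + 1) / (4 * np.pi)
              * np.exp(-l * (l + 1) * t)
              * eval_legendre(l, C))
    return K

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 3))
X /= np.linalg.norm(X, axis=1, keepdims=True)     # project samples onto S^2
K = sphere_heat_kernel(X, t=0.3)
eigmin = np.linalg.eigvalsh(K).min()
print(eigmin)
```

Each Legendre term is a positive-semidefinite kernel on the sphere, so the truncated series is as well, which is what makes it a valid similarity for kernel methods (e.g. it could be passed to an SVM implementation that accepts a precomputed Gram matrix).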