    Localized bases for kernel spaces on the unit sphere

    Approximation/interpolation from spaces of positive definite or conditionally positive definite kernels is an increasingly popular tool for the analysis and synthesis of scattered data, and is central to many meshless methods. For a set of $N$ scattered sites, the standard basis for such a space utilizes $N$ \emph{globally} supported kernels; computing with it is prohibitively expensive for large $N$. Easily computable, well-localized bases with "small-footprint" basis elements, i.e., elements using only a small number of kernels, have been unavailable. Working on $\mathbb{S}^2$, with focus on the restricted surface spline kernels (e.g., the thin-plate splines restricted to the sphere), we construct easily computable, spatially well-localized, small-footprint, robust bases for the associated kernel spaces. Our theory predicts that each element of the local basis is constructed using a combination of only $\mathcal{O}((\log N)^2)$ kernels, which makes the construction computationally cheap. We prove that the new basis is $L_p$ stable and satisfies polynomial decay estimates that are stationary with respect to the density of the data sites, and we present a quasi-interpolation scheme that provides optimal $L_p$ approximation orders. Although our focus is on $\mathbb{S}^2$, much of the theory applies to other manifolds: $\mathbb{S}^d$, the rotation group, and so on. Finally, we construct algorithms to implement these schemes and use them to conduct numerical experiments, which validate our theory for interpolation problems on $\mathbb{S}^2$ involving over one hundred fifty thousand data sites. Comment: This article supersedes arXiv:1111.1013, "Better bases for kernel spaces," which proved the existence of better bases for various kernel spaces. This article treats a smaller class of kernels, but presents an algorithm for constructing better bases and demonstrates its effectiveness with more elaborate examples. A quasi-interpolation scheme is introduced that provides an optimal linear convergence rate.
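
    As a rough illustration of the small-footprint idea described above (not the authors' algorithm), the Python sketch below builds an approximate Lagrange basis function at one node from only its nearest $\mathcal{O}((\log N)^2)$ neighbors. The kernel, the constant in the footprint size, and the helper name local_lagrange_coeffs are assumptions for illustration; the paper's restricted surface splines would additionally require a low-degree spherical-harmonic augmentation.

    import numpy as np
    from scipy.spatial import cKDTree

    def matern_kernel(X, Y, eps=3.0):
        # Stand-in positive definite kernel evaluated on chordal distances;
        # chosen only to keep the small local systems simple and solvable.
        r = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=-1)
        return (1.0 + eps * r) * np.exp(-eps * r)

    def local_lagrange_coeffs(centers, j, footprint):
        # Coefficients of an approximate Lagrange function centered at node j,
        # built from only its `footprint` nearest kernels.
        tree = cKDTree(centers)
        _, idx = tree.query(centers[j], k=footprint)
        K = matern_kernel(centers[idx], centers[idx])
        rhs = np.zeros(footprint)
        rhs[np.where(idx == j)[0][0]] = 1.0   # Lagrange condition: 1 at x_j, 0 at the other stencil nodes
        return idx, np.linalg.solve(K, rhs)

    # Toy usage: N random points on S^2, footprint ~ C (log N)^2 with an assumed C = 4.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(2000, 3))
    X /= np.linalg.norm(X, axis=1, keepdims=True)
    idx, c = local_lagrange_coeffs(X, j=0, footprint=int(4 * np.log(len(X)) ** 2))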

    On Recursive Edit Distance Kernels with Application to Time Series Classification

    This paper proposes extensions to previous work on kernels dedicated to string or time series global alignment based on the aggregation of scores obtained from local alignments. The proposed extensions allow one to construct, from the classical recursive definitions of elastic distances, recursive edit distance (or time-warp) kernels that are positive definite whenever certain sufficient conditions are satisfied. The sufficient conditions we end up with are original and weaker than those proposed in earlier works, although a recursive regularizing term is required so that positive definiteness follows directly from Haussler's convolution theorem. The classification experiments we conducted on three classical time-warp distances (two of which are metrics), using a Support Vector Machine classifier, lead to the conclusion that, when the pairwise distance matrix obtained from the training data is \textit{far} from definiteness, the positive definite recursive elastic kernels generally outperform the distance-substituting kernels for the classical elastic distances we tested. Comment: 14 pages
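
    The sketch below shows the general shape of such a recursive alignment kernel and how it plugs into an SVM through a precomputed Gram matrix. The specific recursion (a global-alignment-style aggregation of local scores), the Gaussian local similarity, and the parameter gamma are assumptions for illustration, not the paper's exact regularized kernel.

    import numpy as np
    from sklearn.svm import SVC

    def local_kernel(a, b, gamma=0.5):
        # Local similarity between two (scalar or vector) samples.
        return np.exp(-gamma * np.sum((np.atleast_1d(a) - np.atleast_1d(b)) ** 2))

    def alignment_kernel(A, B, gamma=0.5):
        # Recursive aggregation of local-alignment scores over all warping paths.
        n, m = len(A), len(B)
        M = np.zeros((n + 1, m + 1))
        M[0, 0] = 1.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                k = local_kernel(A[i - 1], B[j - 1], gamma)
                M[i, j] = k * (M[i - 1, j - 1] + M[i - 1, j] + M[i, j - 1])
        return M[n, m]

    def normalized_gram(series, gamma=0.5):
        # Symmetric Gram matrix, normalized so the diagonal entries equal 1.
        n = len(series)
        G = np.empty((n, n))
        for i in range(n):
            for j in range(i, n):
                G[i, j] = G[j, i] = alignment_kernel(series[i], series[j], gamma)
        d = np.sqrt(np.diag(G))
        return G / np.outer(d, d)

    # Toy usage with a precomputed-kernel SVM.
    train = [np.sin(np.linspace(0, 2 * np.pi, 30) + s) for s in (0.0, 0.5, 3.0, 3.5)]
    labels = [0, 0, 1, 1]
    clf = SVC(kernel="precomputed").fit(normalized_gram(train), labels)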

    Sliced Wasserstein Kernel for Persistence Diagrams

    Persistence diagrams (PDs) play a key role in topological data analysis (TDA), in which they are routinely used to describe topological properties of complicated shapes. PDs enjoy strong stability properties and have proven their utility in various learning contexts. They do not, however, live in a space naturally endowed with a Hilbert structure and are usually compared with specific distances, such as the bottleneck distance. To incorporate PDs in a learning pipeline, several kernels have been proposed for PDs with a strong emphasis on the stability of the RKHS distance w.r.t. perturbations of the PDs. In this article, we use the Sliced Wasserstein approximation $\mathrm{SW}$ of the Wasserstein distance to define a new kernel for PDs, which is not only provably stable but also provably discriminative (depending on the number of points in the PDs) w.r.t. the Wasserstein distance $d_1$ between PDs. We also demonstrate its practicality by developing an approximation technique to reduce kernel computation time, and show that our proposal compares favorably to existing kernels for PDs on several benchmarks. Comment: Minor modification
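
    The sketch below follows the general recipe implied by the abstract: project both diagrams (each augmented with the diagonal projections of the other) onto sampled directions, compare the sorted projections, and exponentiate the resulting distance. Exact normalizations, the number of directions, and the bandwidth sigma are assumptions for illustration and may differ from the paper.

    import numpy as np

    def _diag_proj(D):
        # Orthogonal projection of diagram points (birth, death) onto the diagonal y = x.
        s = (D[:, 0] + D[:, 1]) / 2.0
        return np.column_stack([s, s])

    def sliced_wasserstein(D1, D2, n_dirs=50):
        # Approximate sliced Wasserstein distance between two persistence diagrams.
        A = np.vstack([D1, _diag_proj(D2)])   # augment so both point sets have equal cardinality
        B = np.vstack([D2, _diag_proj(D1)])
        thetas = np.linspace(-np.pi / 2, np.pi / 2, n_dirs, endpoint=False)
        sw = 0.0
        for t in thetas:
            w = np.array([np.cos(t), np.sin(t)])
            sw += np.sum(np.abs(np.sort(A @ w) - np.sort(B @ w)))  # 1D transport cost of the projections
        return sw / n_dirs

    def sw_kernel(D1, D2, sigma=1.0, n_dirs=50):
        # Gaussian-type kernel built on the sliced Wasserstein distance.
        return np.exp(-sliced_wasserstein(D1, D2, n_dirs) / (2 * sigma ** 2))

    # Toy usage on two tiny diagrams.
    D1 = np.array([[0.0, 1.0], [0.2, 0.5]])
    D2 = np.array([[0.1, 0.9]])
    print(sw_kernel(D1, D2))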

    Stable Bases for Kernel Based Methods

    Highly Localized RBF Lagrange Functions for Finite Difference Methods on Spheres

    The aim of this paper is to show how rapidly decaying RBF Lagrange functions on spheres can be used to create effective, stable finite difference methods based on radial basis functions (RBF-FD). For certain classes of PDEs, this approach leads to precise convergence estimates for stencils that grow only moderately with increasing discretization fineness.
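
    For orientation, the sketch below shows the generic RBF-FD weight computation that such methods build on: for each node, solve a small local system so that the weights reproduce the chosen differential operator on the kernel's translates. A Gaussian RBF and the planar Laplacian stand in for the paper's Lagrange-function stencils on the sphere, and no polynomial augmentation is included; these choices are assumptions for illustration only.

    import numpy as np
    from scipy.spatial import cKDTree

    eps = 3.0                                    # shape parameter (assumed value)
    phi = lambda r: np.exp(-(eps * r) ** 2)      # Gaussian RBF
    Lphi = lambda r: (4 * eps**4 * r**2 - 4 * eps**2) * np.exp(-(eps * r) ** 2)  # 2D Laplacian of phi

    def rbf_fd_weights(nodes, center_idx, stencil_size=12):
        # Differentiation weights for one RBF-FD stencil around nodes[center_idx].
        tree = cKDTree(nodes)
        _, idx = tree.query(nodes[center_idx], k=stencil_size)
        X = nodes[idx]
        r = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
        A = phi(r)                                               # local interpolation matrix
        b = Lphi(np.linalg.norm(X - nodes[center_idx], axis=1))  # operator applied to each translate at the center
        return idx, np.linalg.solve(A, b)

    # Toy usage on scattered planar nodes: w @ u[idx] approximates Laplacian(u) at nodes[0].
    rng = np.random.default_rng(1)
    nodes = rng.uniform(size=(400, 2))
    idx, w = rbf_fd_weights(nodes, center_idx=0)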