
    The Penalized Lebesgue Constant for Surface Spline Interpolation

    Problems involving approximation from scattered data arranged quasi-uniformly have been treated by RBF methods for decades. Data with spatially varying density has not been investigated with the same intensity and is far less well understood. In this article we consider the stability of surface spline interpolation (a popular type of RBF interpolation) for data with nonuniform arrangements. Using techniques similar to those recently employed by Hangelbroek, Narcowich and Ward to demonstrate the stability of interpolation from quasi-uniform data on manifolds, we show that surface spline interpolation on R^d is stable, but in a stronger, local sense. We also obtain pointwise estimates showing that the Lagrange function decays very rapidly, at a rate determined by the local spacing of data sites. These results, in conjunction with a Lebesgue lemma, show that surface spline interpolation enjoys the same rates of convergence as the local approximation schemes recently developed by DeVore and Ron.
    Comment: 20 pages; corrected typos; to appear in Proc. Amer. Math. Soc.
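    The stability analysis in this abstract is theoretical, but surface spline (thin-plate spline) interpolation itself is easy to carry out numerically. The sketch below is a minimal illustration rather than anything from the paper: it assembles the standard augmented system for the 2-D thin-plate spline (kernel phi(r) = r^2 log r plus a linear polynomial) and computes a Lagrange function by interpolating cardinal (delta) data. All function names are illustrative.

```python
import numpy as np

def tps_kernel(r):
    # thin-plate spline phi(r) = r^2 log r, with the limit phi(0) = 0
    with np.errstate(divide="ignore", invalid="ignore"):
        v = r ** 2 * np.log(r)
    return np.where(r > 0, v, 0.0)

def tps_interpolate(sites, values):
    """Solve the augmented (kernel + linear polynomial) system for a 2-D TPS."""
    n = sites.shape[0]
    r = np.linalg.norm(sites[:, None, :] - sites[None, :, :], axis=-1)
    A = tps_kernel(r)
    P = np.hstack([np.ones((n, 1)), sites])        # linear polynomial part
    M = np.block([[A, P], [P.T, np.zeros((3, 3))]])
    sol = np.linalg.solve(M, np.concatenate([values, np.zeros(3)]))
    return sol[:n], sol[n:]                        # kernel coeffs, poly coeffs

def tps_eval(sites, coeffs, poly, x):
    r = np.linalg.norm(x[None, :] - sites, axis=-1)
    return tps_kernel(r) @ coeffs + poly[0] + poly[1:] @ x

rng = np.random.default_rng(0)
sites = rng.random((40, 2))                        # scattered data sites
e0 = np.zeros(40); e0[0] = 1.0                     # cardinal data for site 0
coeffs, poly = tps_interpolate(sites, e0)
vals = np.array([tps_eval(sites, coeffs, poly, x) for x in sites])
```

    Evaluating the interpolant back at the data sites recovers the cardinal data, which is the defining property of a Lagrange function.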

    Nonlinear Approximation Using Gaussian Kernels

    It is well known that nonlinear approximation has an advantage over linear schemes in the sense that it provides approximation rates comparable to those of the linear schemes, but for a larger class of approximands. This was established for spline approximations and for wavelet approximations, and more recently by DeVore and Ron for homogeneous radial basis function (surface spline) approximations. However, no such results were known for the Gaussian function, the preferred kernel in machine learning and several engineering problems. We introduce and analyze in this paper a new algorithm for approximating functions using translates of Gaussian functions with varying tension parameters. At heart it employs the nonlinear approximation strategy of DeVore and Ron, but it selects kernels by a method that is not straightforward. The crux of the difficulty lies in the necessity to vary the tension parameter of the Gaussian spatially, according to local information about the approximand: error analysis of Gaussian approximation schemes with varying tension has, by and large, been an elusive target for approximators. We show that our algorithm is suitably optimal in the sense that it provides approximation rates similar to those of other established nonlinear methodologies, such as spline and wavelet approximations. As expected and desired, the approximation rates can be as high as needed and are essentially saturated only by the smoothness of the approximand.
    Comment: 15 pages; corrected typos; to appear in J. Funct. Anal.
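    As a toy illustration of spatially varying tension (this is only a linear least-squares sketch, not the paper's nonlinear, adaptive algorithm), one can fit a target with a sharp local feature using a dictionary of Gaussians whose widths follow the local spacing of their centers: narrow kernels where centers are dense, wide kernels where they are sparse.

```python
import numpy as np

def gaussian(x, c, sigma):
    return np.exp(-((x - c) / sigma) ** 2)

# Nonuniform centers: dense where the target has its sharp feature
centers = np.concatenate([np.linspace(0.0, 0.3, 15), np.linspace(0.35, 1.0, 8)])
sigmas = 2.0 * np.gradient(centers)        # tension follows local center spacing

x = np.linspace(0.0, 1.0, 400)
target = 0.5 + np.exp(-((x - 0.15) / 0.08) ** 2)   # bump sits in the dense region

# Least-squares fit in the span of the variable-width Gaussians
G = np.stack([gaussian(x, c, s) for c, s in zip(centers, sigmas)], axis=1)
coef, *_ = np.linalg.lstsq(G, target, rcond=None)
err = np.max(np.abs(G @ coef - target))
```

    With widths tied to the local spacing, the 23 kernels resolve the sharp bump without wasting narrow kernels in the flat region.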

    Kernel Approximation on Manifolds I: Bounding the Lebesgue Constant

    The purpose of this paper is to establish that for any compact, connected C^{\infty} Riemannian manifold there exists a robust family of kernels of increasing smoothness that are well suited for interpolation. They generate Lagrange functions that are uniformly bounded and decay away from their center at an exponential rate. An immediate corollary is that the corresponding Lebesgue constant will be uniformly bounded with a constant whose only dependence on the set of data sites is reflected in the mesh ratio, which measures the uniformity of the data. The analysis needed for these results was inspired by some fundamental work of Matveev where the Sobolev decay of Lagrange functions associated with certain kernels on \Omega \subset R^d was obtained. With a bit more work, one establishes the following: Lebesgue constants associated with surface splines and Sobolev splines are uniformly bounded on R^d provided the data sites \Xi are quasi-uniformly distributed. The non-Euclidean case is more involved as the geometry of the underlying surface comes into play. In addition to establishing bounded Lebesgue constants in this setting, a "zeros lemma" for compact Riemannian manifolds is established.
    Comment: 33 pages, 2 figures, new title, accepted for publication in SIAM J. Math. Anal.
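    The Lebesgue constant discussed here, the sup of the Lebesgue function sum_j |u_j(x)| over the Lagrange basis u_j, is straightforward to compute numerically for a given kernel and node set. The following sketch is illustrative only: it uses the 1-D exponential kernel e^{-eps r} (a simple Sobolev-spline-type kernel), not the manifold kernels constructed in the paper.

```python
import numpy as np

def exp_kernel(r, eps=1.0):
    # exponential (C^0 Matern / Sobolev-spline-type) kernel, positive definite on R
    return np.exp(-eps * r)

def lebesgue_constant(sites, xeval):
    """Max over xeval of the Lebesgue function sum_j |u_j(x)|, u_j = Lagrange basis."""
    A = exp_kernel(np.abs(sites[:, None] - sites[None, :]))
    K = exp_kernel(np.abs(xeval[:, None] - sites[None, :]))
    L = np.linalg.solve(A, K.T).T      # L[m, j] = u_j(xeval[m]), since L = K A^{-1}
    return np.max(np.sum(np.abs(L), axis=1))

xeval = np.linspace(0.0, 1.0, 2000)
sites = np.linspace(0.0, 1.0, 30)      # quasi-uniform (here: uniform) data sites
lam = lebesgue_constant(sites, xeval)
```

    For this particular 1-D kernel the Lagrange functions are local and nonnegative, so the computed constant is small, consistent with the uniform boundedness the abstract describes for quasi-uniform sites.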

    Extending error bounds for radial basis function interpolation to measuring the error in higher order Sobolev norms

    Radial basis functions (RBFs) are prominent examples of reproducing kernels with associated reproducing kernel Hilbert spaces (RKHSs). The convergence theory for kernel-based interpolation in that space is well understood, and optimal rates for the whole RKHS are often known. Schaback added the doubling trick, which shows that functions having double the smoothness required by the RKHS (along with complicated, but well understood, boundary behavior) can be approximated with higher convergence rates than the optimal rates for the whole space. Other advances allowed interpolation of target functions which are less smooth, and different norms which measure interpolation error. The current state of the art of error analysis for RBF interpolation treats target functions having smoothness up to twice that of the native space, but error measured in norms which are weaker than that required for membership in the RKHS. Motivated by the fact that the kernels and the approximants they generate are smoother than required by the native space, this article extends the doubling trick to error measured in norms of higher smoothness. This extension holds for a family of kernels satisfying easily checked hypotheses which we describe in this article, and includes many prominent RBFs. In the course of the proof, new convergence rates are obtained for the abstract operator considered by DeVore and Ron, and new Bernstein estimates are obtained relating high-order smoothness norms to the native space norm.
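    Convergence rates of the kind analyzed above can be observed empirically. The sketch below is a generic experiment, not the paper's setup: it interpolates a smooth 1-D target with the cubic surface spline phi(r) = r^3 plus a linear polynomial (which in one dimension reproduces the natural cubic spline) and records the maximum error as the fill distance shrinks.

```python
import numpy as np

def cubic_kernel(r):
    # 1-D surface spline phi(r) = r^3
    return r ** 3

def interpolate(sites, values, xeval):
    """Cubic surface spline interpolant with appended linear polynomial."""
    n = len(sites)
    A = cubic_kernel(np.abs(sites[:, None] - sites[None, :]))
    P = np.stack([np.ones(n), sites], axis=1)
    M = np.block([[A, P], [P.T, np.zeros((2, 2))]])
    sol = np.linalg.solve(M, np.concatenate([values, np.zeros(2)]))
    K = cubic_kernel(np.abs(xeval[:, None] - sites[None, :]))
    return K @ sol[:n] + sol[n] + sol[n + 1] * xeval

f = lambda x: np.sin(2 * np.pi * x)
xeval = np.linspace(0.0, 1.0, 1000)
errors = []
for n in (8, 16, 32, 64):                    # halve the fill distance each time
    sites = np.linspace(0.0, 1.0, n + 1)
    errors.append(np.max(np.abs(interpolate(sites, f(sites), xeval) - f(xeval))))
```

    For this smooth target the error drops by roughly a fixed factor each time the spacing is halved, the algebraic-rate behavior the theory predicts.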

    Surface Spline Approximation on SO(3)

    The purpose of this article is to introduce a new class of kernels on SO(3) for approximation and interpolation, and to estimate the approximation power of the associated spaces. The kernels we consider arise as linear combinations of Green's functions of certain differential operators on the rotation group. They are conditionally positive definite and have a simple closed-form expression, lending themselves to direct implementation via, e.g., interpolation or least-squares approximation. To gauge the approximation power of the underlying spaces, we introduce an approximation scheme providing precise L_p error estimates for linear schemes, namely with L_p approximation order conforming to the L_p smoothness of the target function.
    Comment: 22 pages, to appear in Appl. Comput. Harmon. Anal.
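    A standard way to build positive-definite zonal kernels on a compact group such as SO(3) is through character expansions with positive coefficients. This is only a generic textbook construction, not the Green's-function kernels introduced in the paper; the sketch checks numerically that the resulting Gram matrix on random rotations is positive semidefinite.

```python
import numpy as np

def random_rotation(rng):
    # A random 3x3 rotation via QR of a Gaussian matrix (sign-corrected)
    q, r = np.linalg.qr(rng.standard_normal((3, 3)))
    q = q * np.sign(np.diag(r))
    if np.linalg.det(q) < 0:
        q[:, [0, 1]] = q[:, [1, 0]]          # flip orientation to land in SO(3)
    return q

def character_kernel(R1, R2, coeffs):
    """Zonal kernel sum_l a_l chi_l(R1^T R2); chi_l = character of the spin-l irrep."""
    tr = np.trace(R1.T @ R2)
    w = np.arccos(np.clip((tr - 1.0) / 2.0, -1.0, 1.0))   # rotation angle of R1^T R2
    total = 0.0
    for l, a in enumerate(coeffs):
        if w < 1e-12:
            total += a * (2 * l + 1)                      # chi_l(identity) = 2l + 1
        else:
            total += a * np.sin((l + 0.5) * w) / np.sin(0.5 * w)
    return total

rng = np.random.default_rng(1)
rots = [random_rotation(rng) for _ in range(20)]
coeffs = [1.0 / (1 + l) ** 4 for l in range(8)]           # positive, decaying weights
G = np.array([[character_kernel(a, b, coeffs) for b in rots] for a in rots])
min_eig = np.linalg.eigvalsh(G).min()
```

    Positive coefficients guarantee positive definiteness by the Peter-Weyl theorem, so the smallest eigenvalue of the Gram matrix is nonnegative up to floating-point error; the decay of the coefficients controls the smoothness of the kernel, echoing the role of the differential operators in the paper's construction.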