2,331 research outputs found

    Rapid evaluation of radial basis functions

    Get PDF
    Over the past decade, the radial basis function method has been shown to produce high-quality solutions to the multivariate scattered data interpolation problem. However, this method has been associated with very high computational cost compared to alternative methods such as finite element or multivariate spline interpolation. For example, the direct evaluation at M locations of a radial basis function interpolant with N centres requires O(MN) floating-point operations. In this paper we introduce a fast evaluation method based on the Fast Gauss Transform and suitable quadrature rules. This method has been applied to the Hardy multiquadric, the inverse multiquadric and the thin-plate spline to reduce the computational complexity of the interpolant evaluation to O(M + N) floating-point operations. By using certain localisation properties of conditionally negative definite functions, this method has several performance advantages over traditional hierarchical rapid summation methods, which we discuss in detail.
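    For orientation, here is a minimal NumPy sketch of the direct O(MN) evaluation the abstract starts from, using the Hardy multiquadric φ(r) = √(r² + c²); the centres, coefficients and shape parameter are purely illustrative, and the paper's Fast Gauss Transform-based O(M + N) scheme is not reproduced here.

```python
import numpy as np

def direct_rbf_eval(x, centres, coeffs, c=1.0):
    """Directly evaluate s(x_j) = sum_k a_k * phi(||x_j - b_k||) with the
    Hardy multiquadric phi(r) = sqrt(r**2 + c**2).  Cost: O(M*N)."""
    # pairwise distances between the M evaluation points and the N centres
    r = np.linalg.norm(x[:, None, :] - centres[None, :, :], axis=-1)   # shape (M, N)
    return np.sqrt(r**2 + c**2) @ coeffs                               # shape (M,)

rng = np.random.default_rng(0)
centres = rng.random((500, 2))       # N = 500 centres b_k in the unit square (made up)
coeffs = rng.standard_normal(500)    # interpolation coefficients a_k (made up)
x_eval = rng.random((2000, 2))       # M = 2000 evaluation points
s = direct_rbf_eval(x_eval, centres, coeffs)
```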

    On spherical averages of radial basis functions

    Get PDF
    A radial basis function (RBF) has the general form s(x) = ∑_{k=1}^{n} a_k φ(x − b_k), x ∈ ℝ^d, where the coefficients a_1, 
, a_n are real numbers, the points, or centres, b_1, 
, b_n lie in ℝ^d, and φ: ℝ^d → ℝ is a radially symmetric function. Such approximants are highly useful and enjoy rich theoretical properties; see, for instance, Buhmann (Radial Basis Functions: Theory and Implementations, 2003), Fasshauer (Meshfree Approximation Methods with Matlab, 2007), Light and Cheney (A Course in Approximation Theory, 2000), or Wendland (Scattered Data Approximation, 2004). The important special case of polyharmonic splines results when φ is the fundamental solution of the iterated Laplacian operator, and this class includes the Euclidean norm φ(x) = ‖x‖ when d is an odd positive integer, the thin plate spline φ(x) = ‖x‖² log ‖x‖ when d is an even positive integer, and univariate splines. Now B-splines generate a compactly supported basis for univariate spline spaces, but an analyticity argument implies that a nontrivial polyharmonic spline of the form above cannot be compactly supported when d > 1. However, a pioneering paper of Jackson (Constr. Approx. 4:243–264, 1988) established that the spherical average of a radial basis function generated by the Euclidean norm can be compactly supported when the centres and coefficients satisfy certain moment conditions; Jackson then used this compactly supported spherical average to construct approximate identities, with which he was able to derive some of the earliest uniform convergence results for a class of radial basis functions. Our work extends this earlier analysis, but our technique is entirely novel and applies to all polyharmonic splines. Furthermore, we observe that the technique provides yet another way to generate compactly supported, radially symmetric, positive definite functions. Specifically, we find that the spherical averaging operator commutes with the Fourier transform operator, and we are then able to identify Fourier transforms of compactly supported functions using the Paley–Wiener theorem. Furthermore, the use of Haar measure on compact Lie groups would not have occurred without frequent exposure to Iserles’s study of geometric integration.

    TVL₁ Planarity Regularization for 3D Shape Approximation

    Get PDF
    The modern emergence of automation in many industries has given impetus to extensive research into mobile robotics. Novel perception technologies now enable cars to drive autonomously, tractors to till a field automatically and underwater robots to construct pipelines. An essential requirement for both perception and autonomous navigation is the analysis of the 3D environment using sensors such as laser scanners or stereo cameras. 3D sensors generate a very large number of 3D data points when sampling object shapes within an environment, but crucially do not provide any intrinsic information about the environment within which the robots operate. This work focuses on the fundamental task of 3D shape reconstruction and modelling from 3D point clouds. The novelty lies in the representation of surfaces by algebraic functions with limited support, which enables the extraction of smooth, consistent implicit shapes from noisy samples with heterogeneous density. Minimising the total variation of second differential degree makes it possible to enforce planar surfaces, which often occur in man-made environments. Applying the new technique means that less accurate, low-cost 3D sensors can be employed without sacrificing 3D shape reconstruction accuracy.
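    A rough 1D analogue of the planarity idea, not the author's 3D pipeline: penalising the ℓ1 norm of second-order differences (a discrete total variation of second differential degree) favours piecewise-linear, i.e. locally "planar", reconstructions. The sketch assumes cvxpy is available and uses toy data.

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
# piecewise-linear ("planar" in 1D) ground truth plus noise (made up)
f = np.where(x < 0.5, 2.0 * x, 2.0 - 2.0 * x) + 0.05 * rng.standard_normal(x.size)

u = cp.Variable(x.size)
lam = 2.0   # regularisation weight (illustrative)
# TV-L1 with second differential degree: L1 data term + L1 norm of second differences
problem = cp.Problem(cp.Minimize(cp.norm1(u - f) + lam * cp.norm1(cp.diff(u, 2))))
problem.solve()
u_hat = u.value   # denoised estimate; second differences are sparse, so u_hat is piecewise linear
```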

    Interpolating point spread function anisotropy

    Full text link
    Planned wide-field weak lensing surveys are expected to reduce the statistical errors on the shear field to unprecedented levels. In contrast, systematic errors like those induced by the convolution with the point spread function (PSF) will not benefit from that scaling effect and will require very accurate modeling and correction. While numerous methods have been devised to carry out the PSF correction itself, modeling of the PSF shape and its spatial variations across the instrument field of view has, so far, attracted much less attention. This step is nevertheless crucial because the PSF is only known at star positions, while the correction has to be performed at any position on the sky. A reliable interpolation scheme is therefore mandatory, and a popular approach has been to use low-order bivariate polynomials. In the present paper, we evaluate four other classical spatial interpolation methods based on splines (B-splines), inverse distance weighting (IDW), radial basis functions (RBF) and ordinary Kriging (OK). These methods are tested on the Star-challenge part of the GRavitational lEnsing Accuracy Testing 2010 (GREAT10) simulated data and are compared with the classical polynomial fitting (Polyfit). We also test all our interpolation methods independently of the way the PSF is modeled, by interpolating the GREAT10 star fields themselves (i.e., the PSF parameters are known exactly at star positions). In that case we find RBF to be the clear winner, closely followed by the other local methods, IDW and OK. The global methods, Polyfit and B-splines, lag well behind, especially in fields with (ground-based) turbulent PSFs. In fields with non-turbulent PSFs, all interpolators reach a variance on PSF systematics σ_sys² better than the 1×10⁻⁷ upper bound expected by future space-based surveys, with the local interpolators performing better than the global ones.
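    To make the RBF option concrete, here is a small sketch interpolating a toy PSF ellipticity component from star positions to arbitrary field positions with SciPy's RBFInterpolator; the data and kernel choice are illustrative and are not taken from the GREAT10 analysis.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(1)
star_xy = rng.random((300, 2))   # star positions where the PSF is measured (made up)
# toy PSF ellipticity component at the stars (purely illustrative field)
e1_star = 0.02 * np.sin(3.0 * star_xy[:, 0]) + 0.01 * star_xy[:, 1] ** 2

# fit the RBF model at the star positions, then evaluate it anywhere in the field of view
psf_model = RBFInterpolator(star_xy, e1_star, kernel="thin_plate_spline", smoothing=1e-8)
grid = np.stack(np.meshgrid(np.linspace(0, 1, 64),
                            np.linspace(0, 1, 64), indexing="ij"), axis=-1).reshape(-1, 2)
e1_interp = psf_model(grid)   # interpolated PSF parameter at non-star positions
```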

    On thin plate spline interpolation

    Full text link
    We present a simple, PDE-based proof of the result [M. Johnson, 2001] that the error estimates of [J. Duchon, 1978] for thin plate spline interpolation can be improved by h^{1/2}. We illustrate that ℋ-matrix techniques can successfully be employed to solve very large thin plate spline interpolation problems.

    Knot selection by boosting techniques

    Get PDF
    A novel concept for estimating smooth functions by selection techniques based on boosting is developed. It is suggested to place radial basis functions with different spreads at each knot and to perform selection and estimation simultaneously by a componentwise boosting algorithm. The methodology of various other smoothing and knot selection procedures (e.g. stepwise selection) is summarized. They are compared to the proposed approach by extensive simulations for various unidimensional settings, including varying spatial variation and heteroskedasticity, as well as on a real-world data example. Finally, an extension of the proposed method to surface fitting is evaluated numerically on both simulated and real data. The proposed knot selection technique is shown to be a strong competitor to existing methods for knot selection.
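    A minimal sketch of the componentwise boosting idea described above: Gaussian radial basis functions with several spreads are placed at every knot, and each boosting step updates only the single basis function that most reduces the residual sum of squares. The spreads, step length and number of iterations are illustrative choices, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.sort(rng.random(200))
y = np.sin(8.0 * np.pi * x ** 2) + 0.1 * rng.standard_normal(x.size)  # varying spatial variation

knots = np.linspace(0.0, 1.0, 40)
spreads = (0.01, 0.05, 0.2)
# dictionary: one Gaussian RBF per (knot, spread) combination
B = np.column_stack([np.exp(-(x - k) ** 2 / (2.0 * s ** 2)) for k in knots for s in spreads])

nu, n_steps = 0.1, 500            # shrinkage factor and number of boosting iterations
coef = np.zeros(B.shape[1])
resid = y.copy()
col_ss = (B ** 2).sum(axis=0)
for _ in range(n_steps):
    proj = B.T @ resid
    j = np.argmax(proj ** 2 / col_ss)   # basis function giving the largest RSS reduction
    step = nu * proj[j] / col_ss[j]     # shrunken least-squares coefficient update
    coef[j] += step
    resid -= step * B[:, j]

fit = B @ coef   # boosted RBF fit; nonzero entries of coef mark the selected (knot, spread) pairs
```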

    Error bound for radial basis interpolation in terms of a growth function

    Get PDF
    We suggest an improvement of the Wu–Schaback local error bound for radial basis interpolation by using a polynomial growth function. The new bound is valid without any assumptions about the density of the interpolation centers. It can be useful for the localized methods of scattered data fitting and for the meshless discretization of partial differential equations.

    A fast semi-direct least squares algorithm for hierarchically block separable matrices

    Full text link
    We present a fast algorithm for linear least squares problems governed by hierarchically block separable (HBS) matrices. Such matrices are generally dense but data-sparse and can describe many important operators, including those derived from asymptotically smooth radial kernels that are not too oscillatory. The algorithm is based on a recursive skeletonization procedure that exposes this sparsity and solves the dense least squares problem as a larger, equality-constrained, sparse one. It relies on a sparse QR factorization coupled with iterative weighted least squares methods. In essence, our scheme consists of a direct component, comprised of matrix compression and factorization, followed by an iterative component to enforce certain equality constraints. At most two iterations are typically required for problems that are not too ill-conditioned. For an M×N HBS matrix with M ≄ N having bounded off-diagonal block rank, the algorithm has optimal O(M + N) complexity. If the rank increases with the spatial dimension, as is common for operators that are singular at the origin, then this becomes O(M + N) in 1D, O(M + N^{3/2}) in 2D, and O(M + N^2) in 3D. We illustrate the performance of the method on both over- and underdetermined systems in a variety of settings, with an emphasis on radial basis function approximation and efficient updating and downdating. (Comment: 24 pages, 8 figures, 6 tables; to appear in SIAM J. Matrix Anal. Appl.)
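    One ingredient of such data-sparse schemes can be sketched briefly: an off-diagonal block of a kernel matrix built from an asymptotically smooth radial kernel is numerically low rank and can be skeletonized with an interpolative decomposition. The toy example below uses scipy.linalg.interpolative on a 1D multiquadric block; it shows only the compression step, not the recursive factorization or the constrained least squares solve described in the abstract.

```python
import numpy as np
from scipy.linalg import interpolative as sli

# two well-separated clusters of points on the line; the kernel block coupling them
# (an asymptotically smooth radial kernel, here the multiquadric) is numerically low rank
src = np.linspace(0.0, 1.0, 400)
trg = np.linspace(3.0, 4.0, 400)
A = np.sqrt(1.0 + (trg[:, None] - src[None, :]) ** 2)    # off-diagonal multiquadric block

k, idx, proj = sli.interp_decomp(A, 1e-10)               # skeletonize to tolerance 1e-10
B = sli.reconstruct_skel_matrix(A, k, idx)               # the k "skeleton" columns of A
A_id = sli.reconstruct_matrix_from_id(B, idx, proj)      # rank-k reconstruction

print(k, np.linalg.norm(A - A_id) / np.linalg.norm(A))   # small rank, tiny relative error
```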