
    Radial basis function interpolation in the limit of increasingly flat basis functions

    We propose a new approach to study Radial Basis Function (RBF) interpolation in the limit of increasingly flat basis functions. The new approach is based on the semi-analytical computation of the Laurent series of the inverse of the RBF interpolation matrix described in a previous paper [3]. Once the Laurent series is obtained, it can be used to compute the limiting polynomial interpolant, the optimal shape parameter of the RBFs used for interpolation, and the weights of RBF finite difference formulas, among other things. This work has been supported by Spanish MICINN Grants FIS2010-18473, FIS2013-41802-R, and CSD2010-00011.
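    A minimal numerical sketch of the flat-limit behaviour described above, based on a direct linear solve rather than the paper's semi-analytical Laurent-series computation: as the Gaussian shape parameter Δ decreases, the interpolation matrix becomes increasingly ill-conditioned in double precision while the interpolant approaches the polynomial interpolant on the same nodes. The node set, target function, and Δ values are illustrative choices.

```python
# Minimal illustration of the flat limit of Gaussian RBF interpolation.
# This is a direct linear solve, not the paper's semi-analytical computation
# of the Laurent series of the inverse interpolation matrix.
import numpy as np

def gaussian_rbf_interpolant(nodes, values, eps):
    """Return an evaluator for the Gaussian RBF interpolant and cond(A)."""
    A = np.exp(-(eps * (nodes[:, None] - nodes[None, :])) ** 2)
    w = np.linalg.solve(A, values)
    evaluate = lambda x: np.exp(-(eps * (x[:, None] - nodes[None, :])) ** 2) @ w
    return evaluate, np.linalg.cond(A)

nodes = np.linspace(-1.0, 1.0, 7)
values = np.sin(np.pi * nodes)
x = np.linspace(-1.0, 1.0, 201)

# Polynomial interpolant on the same nodes, for comparison with the flat limit.
poly = np.polyval(np.polyfit(nodes, values, nodes.size - 1), x)

for eps in [2.0, 1.0, 0.5, 0.2, 0.05]:
    s, cond = gaussian_rbf_interpolant(nodes, values, eps)
    print(f"eps = {eps:4.2f}  cond(A) = {cond:9.2e}  "
          f"max |RBF - poly| = {np.max(np.abs(s(x) - poly)):.2e}")
```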

    Application of Fredholm integral equations inverse theory to the radial basis function approximation problem

    This paper reveals and examines the relationship between the solution and stability of Fredholm integral equations and radial basis function approximation or interpolation. The underlying system (kernel) matrices are shown to have a smoothing property which depends on the choice of kernel. Rather than describing the ill-conditioning by the condition number alone, which reflects only the largest and smallest singular values of the matrix, techniques from inverse theory, in particular the Picard condition, show that it is the exponential decay of the singular values that is critical for interpreting and mitigating instability. Results on the spectra of certain classes of kernel matrices are reviewed, verifying the exponential decay of the singular values. Numerical results illustrating the application of integral equation inverse theory are also provided and demonstrate that interpolation weights may be regarded as samplings of a weighted solution of an integral equation. This is relevant for mapping from one set of radial basis function centers to another. Techniques for the solution of integral equations can be further exploited in future studies to find stable solutions and to reduce the impact of errors in the data.
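    A short sketch of the discrete Picard condition in this setting, under illustrative assumptions (a Gaussian kernel matrix on random nodes, smooth data with a small additive noise level): the singular values decay roughly exponentially, and the ratios |u_iᔀy| / σ_i grow once the coefficients |u_iᔀy| fall to the noise floor, which is the instability the Picard condition diagnoses.

```python
# Sketch of the discrete Picard condition for a Gaussian kernel matrix.
# The kernel, node set, and noise level are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(-1.0, 1.0, 40))
eps = 3.0
A = np.exp(-(eps * (x[:, None] - x[None, :])) ** 2)          # Gaussian kernel matrix
y = np.tanh(3.0 * x) + 1e-6 * rng.standard_normal(x.size)    # smooth data + small noise

U, sigma, _ = np.linalg.svd(A)
coeffs = np.abs(U.T @ y)          # |u_i^T y|
picard = coeffs / sigma           # Picard ratios

for i in range(0, x.size, 5):
    print(f"i = {i:2d}  sigma = {sigma[i]:.2e}  |u^T y| = {coeffs[i]:.2e}  ratio = {picard[i]:.2e}")
```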

    Bayesian interpolation

    Although Bayesian analysis has been in use since Laplace, the Bayesian method of model comparison has only recently been developed in depth. In this paper, the Bayesian approach to regularization and model comparison is demonstrated by studying the inference problem of interpolating noisy data. The concepts and methods described are quite general and can be applied to many other data modeling problems. Regularizing constants are set by examining their posterior probability distribution. Alternative regularizers (priors) and alternative basis sets are objectively compared by evaluating the evidence for them. “Occam's razor” is automatically embodied by this process. The way in which Bayes infers the values of regularizing constants and noise levels has an elegant interpretation in terms of the effective number of parameters determined by the data set. This framework is due to Gull and Skilling.
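    A minimal sketch in the spirit of this framework, for the special case of a Gaussian linear model with a known noise level: candidate regularizing constants α are compared by evaluating the log evidence log p(y | α, ÎČ). The polynomial basis, data, and parameter grid are illustrative assumptions, not the paper's examples.

```python
# Sketch of evidence-based comparison of a regularising constant for noisy
# interpolation with a Gaussian linear model (illustrative polynomial basis).
import numpy as np

def log_evidence(Phi, y, alpha, beta):
    """log p(y | alpha, beta) for y = Phi w + noise, prior w ~ N(0, I/alpha),
    noise variance 1/beta (standard Gaussian linear-model evidence)."""
    N, k = Phi.shape
    A = alpha * np.eye(k) + beta * Phi.T @ Phi            # posterior precision
    w_mp = beta * np.linalg.solve(A, Phi.T @ y)           # most probable weights
    E = 0.5 * beta * np.sum((y - Phi @ w_mp) ** 2) + 0.5 * alpha * np.sum(w_mp ** 2)
    _, logdet_A = np.linalg.slogdet(A)
    return (-E - 0.5 * logdet_A + 0.5 * k * np.log(alpha)
            + 0.5 * N * np.log(beta) - 0.5 * N * np.log(2.0 * np.pi))

rng = np.random.default_rng(1)
x = np.linspace(-1.0, 1.0, 30)
y = np.sin(2.0 * np.pi * x) + 0.1 * rng.standard_normal(x.size)
Phi = np.vander(x, 10, increasing=True)    # polynomial basis functions
beta = 1.0 / 0.1 ** 2                      # noise level assumed known here

for alpha in [1e-4, 1e-2, 1.0, 1e2, 1e4]:
    print(f"alpha = {alpha:8.1e}   log evidence = {log_evidence(Phi, y, alpha, beta):9.2f}")
```

    Choosing α by maximizing this evidence, rather than by hand-tuning, is the kind of objective comparison of regularizers the abstract describes; the same evidence can be compared across alternative basis sets.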

    Univariate interpolation by exponential functions and Gaussian RBFs for generic sets of nodes

    We consider interpolation of univariate functions on arbitrary sets of nodes by Gaussian radial basis functions or by exponential functions. We derive closed-form expressions for the interpolation error based on the Harish-Chandra-Itzykson-Zuber formula. We then prove the exponential convergence of interpolation for functions analytic in a sufficiently large domain. As an application, we prove the global exponential convergence of optimization by expected improvement for such functions.
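    A numerical sketch of the kind of convergence the paper establishes, using plain linear solves on randomly drawn node sets; the closed-form error expressions via the Harish-Chandra-Itzykson-Zuber formula are not reproduced here, the target function and shape parameter are illustrative, and in double precision the observed decay eventually stalls as the interpolation matrix becomes ill-conditioned.

```python
# Sketch: error of Gaussian RBF interpolation on generic (randomly drawn)
# node sets for an analytic target function. Illustrative choices throughout.
import numpy as np

rng = np.random.default_rng(2)
f = lambda x: np.exp(np.sin(np.pi * x))     # illustrative analytic target
x_test = np.linspace(-1.0, 1.0, 400)
eps = 2.0

for n in [5, 10, 15, 20, 25]:
    nodes = np.sort(rng.uniform(-1.0, 1.0, n))      # arbitrary set of nodes
    A = np.exp(-(eps * (nodes[:, None] - nodes[None, :])) ** 2)
    w = np.linalg.solve(A, f(nodes))
    s = np.exp(-(eps * (x_test[:, None] - nodes[None, :])) ** 2) @ w
    print(f"n = {n:2d}   max interpolation error = {np.max(np.abs(s - f(x_test))):.2e}")
```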

    How fast do radial basis function interpolants of analytic functions converge?

    The question in the title is answered using tools of potential theory. Convergence and divergence rates of interpolants of analytic functions on the unit interval are analyzed. The starting point is a complex variable contour integral formula for the remainder in RBF interpolation. We study a generalized Runge phenomenon and explore how the location of the centers affects convergence. Special attention is given to Gaussian and inverse quadratic radial functions, but some of the results can be extended to other smooth basis functions. Among other things, we prove that, under mild conditions, inverse quadratic RBF interpolants of functions that are analytic inside the strip |Im(z)| < 1/(2Δ), where Δ is the shape parameter, converge exponentially.
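    A small experiment, under illustrative assumptions (equispaced centers on [-1, 1], shape parameter Δ = 1, targets 1/(1 + (a x)ÂČ) with poles at ±i/a), that probes the role of the analyticity strip for inverse quadratic RBF interpolants; it is a sketch of the phenomenon, not a reproduction of the paper's potential-theoretic analysis.

```python
# Sketch: convergence of inverse quadratic RBF interpolants,
# phi(r) = 1/(1 + (eps*r)^2), for targets 1/(1 + (a*x)^2) with poles at +/- i/a.
# Node layout, shape parameter, and targets are illustrative choices.
import numpy as np

eps = 1.0
x_test = np.linspace(-1.0, 1.0, 500)

def iq_interp_error(f, n):
    """Max error of the inverse quadratic RBF interpolant on n equispaced centers."""
    nodes = np.linspace(-1.0, 1.0, n)
    A = 1.0 / (1.0 + (eps * (nodes[:, None] - nodes[None, :])) ** 2)
    w = np.linalg.solve(A, f(nodes))
    s = (1.0 / (1.0 + (eps * (x_test[:, None] - nodes[None, :])) ** 2)) @ w
    return np.max(np.abs(s - f(x_test)))

# A target analytic in a wide strip (a = 1) versus one whose poles lie
# close to the real axis (a = 5).
for a in [1.0, 5.0]:
    f = lambda x, a=a: 1.0 / (1.0 + (a * x) ** 2)
    errors = "  ".join(f"{iq_interp_error(f, n):.1e}" for n in (10, 20, 30, 40))
    print(f"a = {a}:  {errors}")
```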
    • 

    corecore