
    Local RBF approximation for scattered data fitting with bivariate splines

    In this paper we continue our earlier research [4] aimed at developing efficient methods of local approximation suitable for the first stage of a spline-based two-stage scattered data fitting algorithm. As an improvement to the pure polynomial local approximation method used in [5], a hybrid polynomial/radial basis scheme was considered in [4], where the local knot locations for the RBF terms were selected using a greedy knot insertion algorithm. In this paper standard radial local approximations based on interpolation or least squares are considered, and a faster procedure is used for knot selection, significantly reducing the computational cost of the method. Error analysis of the method and numerical results illustrating its performance are given.
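
    As an editorial illustration (not the authors' code), the Python sketch below performs one local step of this kind: a least-squares fit of the data in a single cell by a few radial basis functions plus a linear polynomial. The thin plate spline kernel, the random knot placement, and the toy data are assumptions standing in for the paper's knot-selection procedure.

    import numpy as np

    def tps(r):
        # thin plate spline kernel r^2 log r, with the value 0 at r = 0
        with np.errstate(divide="ignore", invalid="ignore"):
            v = r**2 * np.log(r)
        return np.where(r > 0, v, 0.0)

    def local_rbf_ls(points, values, knots):
        # least-squares fit of sum_j c_j phi(|x - k_j|) + a0 + a1 x + a2 y
        r = np.linalg.norm(points[:, None, :] - knots[None, :, :], axis=2)
        A = np.hstack([tps(r), np.ones((len(points), 1)), points])
        coef, *_ = np.linalg.lstsq(A, values, rcond=None)
        return coef

    # toy local cell: noisy samples of f(x, y) = sin(3x) * y on the unit square
    rng = np.random.default_rng(0)
    pts = rng.uniform(size=(60, 2))
    vals = np.sin(3 * pts[:, 0]) * pts[:, 1] + 0.01 * rng.standard_normal(60)
    knots = rng.uniform(size=(5, 2))   # hypothetical knots; the paper selects these adaptively
    print(local_rbf_ls(pts, vals, knots))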

    Greedy kernel methods for accelerating implicit integrators for parametric ODEs

    We present a novel acceleration method for the solution of parametric ODEs by single-step implicit solvers by means of greedy kernel-based surrogate models. In an offline phase, a set of trajectories is precomputed with a high-accuracy ODE solver for a selected set of parameter samples and used to train a kernel model which predicts the next point in the trajectory as a function of the last one. This model is cheap to evaluate, and it is used in an online phase for new parameter samples to provide a good initialization point for the nonlinear solver of the implicit integrator. The accuracy of the surrogate translates into a reduction of the number of iterations until convergence of the solver, thus providing an overall speedup of the full simulation. Interestingly, in addition to providing an acceleration, the accuracy of the solution is maintained, since the ODE solver is still used to guarantee the required precision. Although the method can be applied to a large variety of solvers and different ODEs, we present in detail its use with the implicit Euler method for the solution of the Burgers equation, which proves to be a meaningful test case for demonstrating the method's features.
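
    The Python sketch below illustrates the basic mechanism on a toy autonomous ODE under simplifying assumptions (it is not the paper's implementation): a plain Gaussian kernel ridge model stands in for the greedy kernel surrogate, is trained offline on (y_n, y_{n+1}) pairs, and is then used online to initialize the Newton iteration of an implicit Euler step, which typically needs fewer updates than when started from the previous state.

    import numpy as np

    h = 0.05

    def f(y):            # toy autonomous, mildly stiff right-hand side
        return -5.0 * y**3

    def df(y):           # its derivative with respect to y
        return -15.0 * y**2

    def implicit_euler_step(y0, y_init, tol=1e-12, max_it=50):
        # Newton iteration for the implicit Euler equation y1 = y0 + h f(y1);
        # returns the solution and the number of Newton updates performed
        y = y_init
        for k in range(max_it):
            g = y - y0 - h * f(y)
            if abs(g) < tol:
                return y, k
            y -= g / (1.0 - h * df(y))
        return y, max_it

    # offline phase: precompute a trajectory and record (y_n, y_{n+1}) pairs
    pairs, y = [], 2.0
    for _ in range(200):
        y_new, _ = implicit_euler_step(y, y)
        pairs.append((y, y_new))
        y = y_new
    X = np.array([p[0] for p in pairs])
    Y = np.array([p[1] for p in pairs])

    # plain Gaussian kernel ridge regression as a stand-in for the greedy kernel model
    K = np.exp(-(X[:, None] - X[None, :])**2 / 0.1)
    alpha = np.linalg.solve(K + 1e-8 * np.eye(len(X)), Y)
    surrogate = lambda z: float(np.exp(-(z - X)**2 / 0.1) @ alpha)

    # online phase: the surrogate prediction is a better Newton starting point
    y0 = 1.3
    _, it_plain = implicit_euler_step(y0, y0)
    _, it_surr = implicit_euler_step(y0, surrogate(y0))
    print("Newton updates: plain start", it_plain, "| surrogate start", it_surr)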

    On thin plate spline interpolation

    We present a simple, PDE-based proof of the result [M. Johnson, 2001] that the error estimates of [J. Duchon, 1978] for thin plate spline interpolation can be improved by $h^{1/2}$. We illustrate that $\mathcal{H}$-matrix techniques can successfully be employed to solve very large thin plate spline interpolation problems.
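
    For reference, here is a minimal Python sketch of bivariate thin plate spline interpolation in the setting of these estimates: the standard saddle-point system with kernel r^2 log r and an appended linear polynomial part, solved densely (for the very large problems mentioned above one would switch to hierarchical-matrix techniques; the toy data are an assumption).

    import numpy as np

    def tps(r):
        # thin plate spline kernel r^2 log r, with the value 0 at r = 0
        with np.errstate(divide="ignore", invalid="ignore"):
            v = r**2 * np.log(r)
        return np.where(r > 0, v, 0.0)

    def tps_interpolant(X, f):
        # solve [[K, P], [P^T, 0]] [c; d] = [f; 0] with K_ij = phi(|x_i - x_j|)
        # and P = [1, x, y]; returns the interpolant as a callable
        n = len(X)
        K = tps(np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2))
        P = np.hstack([np.ones((n, 1)), X])
        A = np.block([[K, P], [P.T, np.zeros((3, 3))]])
        coef = np.linalg.solve(A, np.concatenate([f, np.zeros(3)]))
        c, d = coef[:n], coef[n:]
        return lambda x: tps(np.linalg.norm(X - x, axis=1)) @ c + d[0] + d[1:] @ x

    rng = np.random.default_rng(1)
    X = rng.uniform(size=(80, 2))
    f = np.sin(2 * np.pi * X[:, 0]) * X[:, 1]
    s = tps_interpolant(X, f)
    print(abs(s(X[0]) - f[0]))   # interpolation condition holds up to roundoff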

    Reproducing Kernels of Generalized Sobolev Spaces via a Green Function Approach with Distributional Operators

    In this paper we introduce a generalized Sobolev space by defining a semi-inner product formulated in terms of a vector distributional operator $\mathbf{P}$ consisting of finitely or countably many distributional operators $P_n$, which are defined on the dual space of the Schwartz space. The types of operators we consider include not only differential operators, but also more general distributional operators such as pseudo-differential operators. We deduce that a certain appropriate full-space Green function $G$ with respect to $L := \mathbf{P}^{\ast T}\mathbf{P}$ now becomes a conditionally positive definite function. In order to support this claim we ensure that the distributional adjoint operator $\mathbf{P}^{\ast}$ of $\mathbf{P}$ is well-defined in the distributional sense. Under sufficient conditions, the native space (reproducing-kernel Hilbert space) associated with the Green function $G$ can be isometrically embedded into or even be isometrically equivalent to a generalized Sobolev space. As an application, we take linear combinations of translates of the Green function with possibly added polynomial terms and construct a multivariate minimum-norm interpolant $s_{f,X}$ to data values sampled from an unknown generalized Sobolev function $f$ at data sites located in some set $X \subset \mathbb{R}^d$. We provide several examples, such as Matérn kernels or Gaussian kernels, that illustrate how many reproducing-kernel Hilbert spaces of well-known reproducing kernels are isometrically equivalent to a generalized Sobolev space. These examples further illustrate how we can rescale the Sobolev spaces by the vector distributional operator $\mathbf{P}$. Introducing the notion of scale as part of the definition of a generalized Sobolev space may help us to choose the "best" kernel function for kernel-based approximation methods. Comment: updated version of the publication in Numer. Math., close to Qi Ye's Ph.D. thesis (http://mypages.iit.edu/~qye3/PhdThesis-2012-AMS-QiYe-IIT.pdf).
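
    Written out in the usual conditionally positive definite form (a notational sketch consistent with the abstract, where $p_1, \dots, p_Q$ is an assumed basis of the polynomial space attached to $G$), the minimum-norm interpolant and its defining conditions read

    s_{f,X}(x) \;=\; \sum_{j=1}^{N} c_j\, G(x - x_j) \;+\; \sum_{k=1}^{Q} \beta_k\, p_k(x),
    \qquad s_{f,X}(x_i) = f(x_i), \quad i = 1, \dots, N,
    \qquad \sum_{j=1}^{N} c_j\, p_k(x_j) = 0, \quad k = 1, \dots, Q.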

    On Fourier transforms of radial functions and distributions

    We find a formula that relates the Fourier transform of a radial function on $\mathbf{R}^n$ with the Fourier transform of the same function defined on $\mathbf{R}^{n+2}$. This formula enables one to explicitly calculate the Fourier transform of any radial function $f(r)$ in any dimension, provided one knows the Fourier transform of the one-dimensional function $t \to f(|t|)$ and the two-dimensional function $(x_1, x_2) \to f(|(x_1, x_2)|)$. We prove analogous results for radial tempered distributions. Comment: 12 pages.
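
    One common way of writing a relation of this kind (a sketch only; the precise constant depends on the chosen normalization of the Fourier transform), with $F_n f$ denoting the $n$-dimensional Fourier transform of $x \to f(|x|)$ viewed as a function of the radius $\rho = |\xi|$, is

    F_{n+2} f(\rho) \;=\; -\frac{1}{2\pi\rho}\,\frac{d}{d\rho}\, F_n f(\rho),

    so that repeated application starting from the one- and two-dimensional transforms mentioned above reaches every higher dimension.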