Polynomial Interpolation and Applications to Autoregressive Models
Polynomial interpolation can be used to approximate functions and their derivatives. Some autoregressive models can be stated using polynomial interpolation and function approximation.
Keywords: polynomial interpolation; autoregressive models
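As a minimal illustration of the interpolation side of this idea (the autoregressive application itself is not reproduced), the following Python sketch builds an interpolating polynomial for a function on Chebyshev-Lobatto nodes and differentiates the interpolant to approximate the derivative as well; the target function, nodes, and degree are assumptions chosen purely for illustration.

import numpy as np
from numpy.polynomial import Polynomial

f = np.cos                                               # function to approximate (illustrative choice)
nodes = np.cos(np.pi * np.arange(9) / 8)                 # 9 Chebyshev-Lobatto nodes on [-1, 1]
p = Polynomial.fit(nodes, f(nodes), deg=len(nodes) - 1)  # interpolant passing through all nodes
dp = p.deriv()                                           # derivative of the interpolant

print(p(0.3) - np.cos(0.3))    # interpolation error, close to zero
print(dp(0.3) + np.sin(0.3))   # derivative approximation error, also small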
A Discrete Adapted Hierarchical Basis Solver For Radial Basis Function Interpolation
In this paper we develop a discrete Hierarchical Basis (HB) to efficiently
solve the Radial Basis Function (RBF) interpolation problem with variable
polynomial order. The HB forms an orthogonal set and is adapted to the kernel
seed function and the placement of the interpolation nodes. Moreover, this
basis is orthogonal to a set of polynomials up to a given order defined on the
interpolating nodes. We are thus able to decouple the RBF interpolation problem
for any order of the polynomial interpolation and solve it in two steps: (1)
The polynomial orthogonal RBF interpolation problem is efficiently solved in
the transformed HB basis with a GMRES iteration and a diagonal or block SSOR
preconditioner. (2) The residual is then projected onto an orthonormal
polynomial basis. We apply our approach to several test cases to study its
effectiveness, including an application to the Best Linear Unbiased Estimator
regression problem.
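The underlying problem can be written as a saddle-point system coupling the kernel matrix with a polynomial block. The Python sketch below assembles and solves that system directly in 1D with a Gaussian kernel and a linear polynomial part; this is only the naive dense formulation, not the authors' hierarchical-basis/GMRES solver, and the kernel, shape parameter, and node placement are assumptions made for illustration.

import numpy as np

def rbf_poly_interpolate(x, f, kernel=lambda r: np.exp(-(r / 0.1) ** 2)):
    # Solve [[A, P], [P^T, 0]] [c; d] = [f; 0]: RBF coefficients c plus a
    # degree-1 polynomial part d, with the usual moment (orthogonality) conditions.
    n = len(x)
    A = kernel(np.abs(x[:, None] - x[None, :]))           # kernel matrix
    P = np.column_stack([np.ones(n), x])                  # monomials 1, x
    K = np.block([[A, P], [P.T, np.zeros((2, 2))]])
    sol = np.linalg.solve(K, np.concatenate([f, np.zeros(2)]))
    c, d = sol[:n], sol[n:]
    return lambda t: kernel(np.abs(t[:, None] - x[None, :])) @ c \
                     + np.column_stack([np.ones(len(t)), t]) @ d

x = np.linspace(0.0, 1.0, 25)                             # interpolation nodes
s = rbf_poly_interpolate(x, np.sin(3 * x))                # interpolant of sin(3x)
print(np.max(np.abs(s(x) - np.sin(3 * x))))               # residual at the nodes is small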
General Linearized Polynomial Interpolation and Its Applications
In this paper, we first propose a general interpolation algorithm in a free
module of a linearized polynomial ring, and then apply this algorithm to decode
several important families of codes: Gabidulin codes, KK codes, and MV codes.
Our decoding algorithm for Gabidulin codes is different from the polynomial
reconstruction algorithm by Loidreau. When applied to decode KK codes, our
interpolation algorithm is equivalent to the Sudan-style list-1 decoding
algorithm proposed by Kötter and Kschischang for KK codes. The general
interpolation approach is also capable of solving the interpolation problem for
the list decoding of MV codes proposed by Mahdavifar and Vardy, and has a lower
complexity than solving linear equations.
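For readers unfamiliar with the setting, a linearized polynomial over GF(2^m) has the form L(x) = a_0 x + a_1 x^2 + a_2 x^4 + ..., i.e. only powers x^(2^i), and defines a GF(2)-linear map. The Python sketch below merely evaluates such a polynomial over GF(2^4) and checks its linearity; the field modulus and coefficients are illustrative assumptions, and none of the decoding machinery of the paper is reproduced.

from functools import reduce
from operator import xor

IRRED, M = 0b10011, 4          # x^4 + x + 1, an irreducible polynomial defining GF(2^4)

def gf_mul(a, b):
    # Multiply two field elements: shift-and-add with reduction modulo IRRED.
    r = 0
    while b:
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if a & (1 << M):
            a ^= IRRED
    return r

def frobenius(a, i):
    # Apply the Frobenius map i times: a -> a^(2^i).
    for _ in range(i):
        a = gf_mul(a, a)
    return a

def lin_eval(coeffs, x):
    # Evaluate L(x) = sum_i coeffs[i] * x^(2^i) in GF(2^4); field addition is XOR.
    return reduce(xor, (gf_mul(c, frobenius(x, i)) for i, c in enumerate(coeffs)), 0)

L = [0b0011, 0b0101, 0b0001]   # arbitrary coefficients of a linearized polynomial
x, y = 0b0110, 0b1010
assert lin_eval(L, x ^ y) == lin_eval(L, x) ^ lin_eval(L, y)   # GF(2)-linearity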
Counterexample-Guided Polynomial Loop Invariant Generation by Lagrange Interpolation
We apply multivariate Lagrange interpolation to synthesize polynomial
quantitative loop invariants for probabilistic programs. We reduce the
computation of a quantitative loop invariant to solving constraints over
program variables and unknown coefficients. Lagrange interpolation allows us to
find constraints with fewer unknown coefficients. Counterexample-guided
refinement furthermore generates linear constraints that pinpoint the desired
quantitative invariants. We evaluate our technique in experiments on several
case studies with polynomial quantitative loop invariants.
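To make the constraint-solving idea concrete, the Python sketch below fits the unknown coefficients of a linear template for the expected value of a toy probabilistic loop from sampled runs; the program, the template, and the estimation-by-simulation step are illustrative assumptions, and the paper's Lagrange-interpolation encoding and counterexample-guided refinement are not reproduced.

import random
import numpy as np

def run_loop(n, p=0.5):
    # Toy probabilistic program: flip a fair coin n times and count the heads.
    x = 0
    for _ in range(n):
        x += random.random() < p
    return x

# Template for the quantitative invariant: E[x | n] = c0 + c1 * n.
# Estimate E[x | n] by simulation for several n and solve for c0, c1.
ns = np.arange(1, 8)
ex = np.array([np.mean([run_loop(int(n)) for _ in range(20000)]) for n in ns])
A = np.column_stack([np.ones_like(ns, dtype=float), ns])
c0, c1 = np.linalg.lstsq(A, ex, rcond=None)[0]
print(c0, c1)   # approximately 0 and 0.5, i.e. the invariant E[x] = n / 2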
On the constrained mock-Chebyshev least-squares
Algebraic polynomial interpolation on uniformly distributed nodes is affected
by the Runge phenomenon, even when the function to be interpolated is
analytic. Among the techniques proposed to defeat this phenomenon is
mock-Chebyshev interpolation, which interpolates on a subset of the given
nodes whose elements mimic the Chebyshev-Lobatto points as closely as
possible. In this work we use simultaneous approximation theory to combine
this technique with a polynomial regression in order to increase the accuracy
of the approximation of a given analytic function. We give indications on how
to select the degree of the simultaneous regression so as to obtain a
polynomial approximant that is good in the uniform norm, and we provide a
sufficient condition to improve, in that norm, the accuracy of the
mock-Chebyshev interpolation with a simultaneous regression. Numerical results
are provided.
Comment: 17 pages, 9 figures
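A Python sketch of the mock-Chebyshev node selection that this work builds on: from uniformly spaced nodes, keep the subset closest to the Chebyshev-Lobatto points and interpolate only on that subset, which avoids the Runge blow-up for the classic Runge function. The node counts and the subset-size rule of thumb are assumptions for illustration; the paper's constrained simultaneous regression is not reproduced.

import numpy as np

n = 101                                      # number of uniform nodes on [-1, 1]
m = int(np.pi * np.sqrt(n) / 2)              # rule-of-thumb mock-Chebyshev subset size
uniform = np.linspace(-1.0, 1.0, n)
cheb_lob = np.cos(np.pi * np.arange(m + 1) / m)                  # Chebyshev-Lobatto points
subset = np.unique([uniform[np.argmin(np.abs(uniform - c))] for c in cheb_lob])

runge = lambda x: 1.0 / (1.0 + 25.0 * x ** 2)                    # classic Runge function
p = np.polynomial.Polynomial.fit(subset, runge(subset), deg=len(subset) - 1)

t = np.linspace(-1.0, 1.0, 1001)
print(np.max(np.abs(p(t) - runge(t))))       # modest uniform error, no Runge blow-up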
