New error bounds for Legendre approximations of differentiable functions
In this paper we present a new perspective on error analysis of Legendre
approximations for differentiable functions. We start by introducing a sequence
of Legendre-Gauss-Lobatto polynomials and prove their theoretical properties,
such as an explicit and optimal upper bound. We then apply these properties to
derive a new and explicit bound for the Legendre coefficients of differentiable
functions and establish some explicit and optimal error bounds for Legendre
projections in the L^2 and L^∞ norms. Illustrative examples are
provided to demonstrate the sharpness of our new results. Comment: 22 pages.
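A small numerical companion to this abstract (our illustration, not the paper's): the kind of coefficient decay and projection error the bounds describe can be observed empirically. The choice of test function f(x) = |x|^3 and all tolerances below are assumptions for the sketch.

```python
import numpy as np
from numpy.polynomial import legendre

def legendre_coeffs(f, n_max, quad_pts=2000):
    """Approximate a_k = (2k+1)/2 * integral of f(x) P_k(x) over [-1, 1]
    using Gauss-Legendre quadrature with quad_pts nodes."""
    x, w = legendre.leggauss(quad_pts)
    fx = f(x)
    return np.array([(2 * k + 1) / 2 * np.sum(w * fx * legendre.Legendre.basis(k)(x))
                     for k in range(n_max + 1)])

def f(x):
    # a differentiable function of limited smoothness at x = 0 (assumed example)
    return np.abs(x) ** 3

a = legendre_coeffs(f, 40)
xs = np.linspace(-1, 1, 1001)

def proj_error(n):
    """Max-norm error of the degree-n Legendre projection on a fine grid."""
    p = legendre.Legendre(a[: n + 1])
    return np.max(np.abs(f(xs) - p(xs)))

print(proj_error(10), proj_error(40))
```

Increasing the truncation degree shrinks the max-norm error, consistent with the explicit bounds the paper derives for differentiable functions.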
On the spectral distribution of kernel matrices related to radial basis functions
This paper focuses on the spectral distribution of kernel matrices related to radial basis functions. The asymptotic behaviour of the eigenvalues of kernel matrices related to radial basis functions of different smoothness is studied. These results are obtained by estimating the coefficients of an orthogonal expansion of the underlying kernel function. Besides many other results, we prove that there are exactly (k+d-1 choose d-1) eigenvalues of the same order for analytic separable kernel functions, like the Gaussian, in R^d. This gives theoretical support for how to choose the diagonal scaling matrix in the RBF-QR method (Fornberg et al., SIAM J. Sci. Comput. 33, 2011), which can stably compute Gaussian radial basis function interpolants.
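The rapid eigenvalue decay for analytic kernels such as the Gaussian is easy to observe numerically. A minimal sketch, assuming a uniform point set on [-1, 1], shape parameter eps = 1, and n = 40 (all assumptions, not values from the paper); the block-size formula (k+d-1 choose d-1) is evaluated for d = 2:

```python
import numpy as np
from math import comb

n, eps = 40, 1.0                                     # assumed point count and shape parameter
x = np.linspace(-1.0, 1.0, n)
K = np.exp(-(eps * (x[:, None] - x[None, :])) ** 2)  # Gaussian RBF kernel matrix
lam = np.sort(np.linalg.eigvalsh(K))[::-1]           # eigenvalues, descending

# The abstract groups eigenvalues of the same order into blocks of size
# (k+d-1 choose d-1); for d = 2 these sizes are 1, 2, 3, ...
d = 2
block_sizes = [comb(k + d - 1, d - 1) for k in range(5)]
print(lam[:4], block_sizes)
```

Already for moderate n the eigenvalues span many orders of magnitude, which is exactly the ill-conditioning the RBF-QR method is designed to sidestep.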
On the convergence rates of Gauss and Clenshaw-Curtis quadrature for functions of limited regularity
We study the optimal general rate of convergence of the n-point quadrature
rules of Gauss and Clenshaw-Curtis when applied to functions of limited
regularity: if the Chebyshev coefficients decay at a rate O(n^{-s-1}) for some
s > 0, Clenshaw-Curtis and Gauss quadrature inherit exactly this rate. The
proof (for Gauss, if 0 < s < 2, there is numerical evidence only) is based on
work of Curtis, Johnson, Riess, and Rabinowitz from the early 1970s and on a
refined estimate for Gauss quadrature applied to Chebyshev polynomials due to
Petras (1995). The convergence rate of both quadrature rules is up to one power
of n better than polynomial best approximation; hence, the classical proof
strategy that bounds the error of a quadrature rule with positive weights by
polynomial best approximation is doomed to fail in establishing the optimal
rate. Comment: 7 pages, the figure of the revision has an unsymmetric example, to
appear in SIAM J. Numer. Anal.
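The claimed rate can be checked empirically. As a sketch (our illustration, not the paper's proof), take f(x) = |x|^3, whose Chebyshev coefficients decay like O(n^{-4}) (i.e. s = 3), and watch the Gauss quadrature error fall at roughly that rate as n doubles:

```python
import numpy as np

def f(x):
    # limited regularity at x = 0: Chebyshev coefficients decay like O(n^{-4})
    return np.abs(x) ** 3

EXACT = 0.5  # integral of |x|^3 over [-1, 1]

def gauss_error(n):
    """Error of the n-point Gauss-Legendre rule applied to f."""
    nodes, weights = np.polynomial.legendre.leggauss(n)
    return abs(np.dot(weights, f(nodes)) - EXACT)

errors = [gauss_error(n) for n in (8, 16, 32, 64)]
print(errors)
```

An O(n^{-4}) rate predicts roughly a factor of 16 per doubling of n; note this is one power of n faster than the O(n^{-3}) rate of polynomial best approximation for this function, matching the abstract's point that best-approximation arguments cannot give the optimal rate.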
The number of open paths in an oriented ρ-percolation model
We study the asymptotic properties of the number of open paths of length n
in an oriented ρ-percolation model. We show that this number grows at a
deterministic exponential rate as n → ∞. The exponent is deterministic, it
can be expressed in terms of the free energy of a polymer model, and it can
be explicitly computed in some range of the parameters. Moreover, in a
restricted range of the parameters, we even show that, after normalization by
its exponential growth, the number of such paths converges to some
nondegenerate random variable. We build on connections with the model of
directed polymers in random environment, and we use techniques and results
developed in this context. Comment: 30 pages, 2 figures.
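Counting open directed paths is a simple dynamic program. The sketch below uses standard oriented site percolation in 1+1 dimensions (a simplification of the ρ-percolation model in the abstract, where path openness depends on a density parameter ρ); the lattice geometry, p, and n are all assumptions for illustration:

```python
import numpy as np

def count_open_paths(n, p, rng):
    """Count directed open paths of length n from the origin, where each
    newly reached site is open with probability p (oriented site percolation)."""
    counts = {0: 1}                       # counts[x] = open paths reaching x
    for _ in range(n):
        nxt = {}
        for x, c in counts.items():
            for dx in (0, 1):             # two oriented neighbours per step
                nxt[dx + x] = nxt.get(dx + x, 0) + c
        # keep only sites that turn out to be open at the new level
        counts = {y: c for y, c in nxt.items() if rng.random() < p}
    return sum(counts.values())

rng = np.random.default_rng(0)
print(count_open_paths(10, 0.8, rng))
```

Sanity check on the combinatorics: with p = 1 every site is open, so the count is exactly 2^n; for p < 1 the abstract's exponential growth rate corresponds to the large-n behaviour of log(count)/n.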