
    Polyharmonic approximation on the sphere

    The purpose of this article is to provide new error estimates for a popular type of SBF approximation on the sphere: approximation by linear combinations of Green's functions of polyharmonic differential operators. We show that the $L_p$ approximation order for this kind of approximation is $\sigma$ for functions having $L_p$ smoothness $\sigma$ (for $\sigma$ up to the order of the underlying differential operator, just as in univariate spline theory). This is an improvement over previous error estimates, which penalized the approximation order when measuring error in $L_p$, $p>2$, and which held only in a restrictive setting when measuring error in $L_p$, $p<2$.
    Comment: 16 pages; revised version; to appear in Constr. Approx.
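The general SBF workflow the abstract refers to (assemble a zonal kernel matrix at scattered nodes, solve for coefficients, evaluate the resulting expansion) can be sketched with a generic positive-definite zonal kernel. The Gaussian kernel below, the node counts, and the target function are illustrative assumptions; the paper's actual kernels are polyharmonic Green's functions, not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

def sphere_points(n):
    """Roughly uniform random points on the unit sphere S^2."""
    v = rng.standard_normal((n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

def zonal_kernel(X, Y, c=4.0):
    """Gaussian restricted to the sphere: exp(-c||x-y||^2) = exp(-2c(1 - x.y)).
    A stand-in (assumption) for the paper's polyharmonic Green's functions."""
    return np.exp(-2.0 * c * (1.0 - X @ Y.T))

centers = sphere_points(400)
f = lambda X: np.exp(X[:, 2])            # smooth target function on the sphere

# SBF interpolation: solve K a = f(centers); a tiny jitter aids conditioning.
K = zonal_kernel(centers, centers) + 1e-8 * np.eye(len(centers))
a = np.linalg.solve(K, f(centers))

# Evaluate the interpolant at fresh points and measure the worst-case error.
test = sphere_points(2000)
err = np.max(np.abs(zonal_kernel(test, centers) @ a - f(test)))
```

The same two-step structure (fit at scattered nodes, evaluate anywhere) applies regardless of which conditionally positive definite kernel is used.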

    Nonlinear Approximation Using Gaussian Kernels

    It is well known that nonlinear approximation has an advantage over linear schemes in the sense that it provides comparable approximation rates to those of the linear schemes, but to a larger class of approximands. This was established for spline approximations and for wavelet approximations, and more recently by DeVore and Ron for homogeneous radial basis function (surface spline) approximations. However, no such results are known for the Gaussian function, the preferred kernel in machine learning and several engineering problems. We introduce and analyze in this paper a new algorithm for approximating functions using translates of Gaussian functions with varying tension parameters. At heart it employs the strategy for nonlinear approximation of DeVore and Ron, but it selects kernels by a method that is not straightforward. The crux of the difficulty lies in the necessity to vary the tension parameter in the Gaussian function spatially according to local information about the approximand: the error analysis of Gaussian approximation schemes with varying tension is, by and large, an elusive target for approximators. We show that our algorithm is suitably optimal in the sense that it provides approximation rates similar to other established nonlinear methodologies like spline and wavelet approximations. As expected and desired, the approximation rates can be as high as needed and are essentially saturated only by the smoothness of the approximand.
    Comment: 15 pages; corrected typos; to appear in J. Funct. Anal.
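The core idea of spatially varying the tension parameter can be illustrated with a toy fit in which kernel widths shrink where the sampled target varies fast. This is a crude linear least-squares sketch, not the paper's nonlinear selection algorithm; the centers, the width rule, and the target are illustrative assumptions.

```python
import numpy as np

def gaussian_dictionary(x, centers, widths):
    """Translates of the Gaussian with a per-center tension (width) parameter."""
    return np.exp(-((x[:, None] - centers[None, :]) / widths[None, :]) ** 2)

# Target with a sharp local feature on a smooth background.
x = np.linspace(-1.0, 1.0, 400)
f = np.tanh(20.0 * x) + 0.5 * np.sin(np.pi * x)

centers = np.linspace(-1.0, 1.0, 80)
# Crude spatial adaptation: narrower kernels where the target varies fast.
mid = 0.5 * (x[:-1] + x[1:])
slope = np.abs(np.interp(centers, mid, np.diff(f) / np.diff(x)))
widths = np.clip(0.5 / (1.0 + slope), 0.05, 0.5)

# Linear least squares in the coefficients (centers and widths held fixed).
G = gaussian_dictionary(x, centers, widths)
coeffs, *_ = np.linalg.lstsq(G, f, rcond=None)
rel_err = np.linalg.norm(G @ coeffs - f) / np.linalg.norm(f)
```

Near the steep `tanh` transition the chosen widths hit the lower clip value, while in the smooth regions they stay several times larger, which is the kind of local adaptation the abstract describes.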

    Penalized Likelihood and Bayesian Function Selection in Regression Models

    Challenging research in various fields has driven a wide range of methodological advances in variable selection for regression models with high-dimensional predictors. In comparison, selection of nonlinear functions in models with additive predictors has been considered only more recently. Several competing suggestions have been developed at about the same time and often do not refer to each other. This article provides a state-of-the-art review on function selection, focusing on penalized likelihood and Bayesian concepts, relating the various approaches to each other in a unified framework. In an empirical comparison, also including boosting, we evaluate several methods through applications to simulated and real data, thereby providing some guidance on their performance in practice.
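One family of penalized-likelihood approaches in this literature puts a group-lasso-type penalty on each covariate's block of basis coefficients, so that whole functions drop out of the additive predictor. The sketch below uses a plain proximal-gradient (ISTA) solver; the truncated-power basis, penalty level, and iteration count are illustrative assumptions rather than any specific method reviewed in the article.

```python
import numpy as np

def spline_basis(x, n_knots=8):
    """Truncated-power cubic spline basis (an illustrative stand-in for B-splines)."""
    knots = np.quantile(x, np.linspace(0.1, 0.9, n_knots))
    cols = [x, x**2, x**3] + [np.maximum(x - k, 0.0) ** 3 for k in knots]
    B = np.column_stack(cols)
    B -= B.mean(axis=0)                      # center out the intercept
    B /= np.linalg.norm(B, axis=0)           # normalize columns
    return B

def group_lasso_additive(X, y, lam, n_iter=3000):
    """ISTA for (1/2n)||y - Z beta||^2 + lam * sum_j ||beta_j||_2,
    with one coefficient block per covariate's spline basis."""
    n, p = X.shape
    bases = [spline_basis(X[:, j]) for j in range(p)]
    sizes = [B.shape[1] for B in bases]
    Z = np.hstack(bases)
    L = np.linalg.eigvalsh(Z.T @ Z / n).max()   # Lipschitz constant of the gradient
    beta = np.zeros(Z.shape[1])
    for _ in range(n_iter):
        b = beta - Z.T @ (Z @ beta - y) / (n * L)
        new, start = [], 0
        for s in sizes:                          # blockwise soft-thresholding
            blk = b[start:start + s]
            nrm = np.linalg.norm(blk)
            scale = max(0.0, 1.0 - lam / (L * max(nrm, 1e-12)))
            new.append(scale * blk)
            start += s
        beta = np.concatenate(new)
    return beta, sizes, Z

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(300, 4))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(300)  # only x0 matters
y -= y.mean()

beta, sizes, Z = group_lasso_additive(X, y, lam=0.005)
starts = np.cumsum([0] + sizes[:-1])
block_norms = [np.linalg.norm(beta[s:s + w]) for s, w in zip(starts, sizes)]
```

With the signal driven only by the first covariate, the first block's coefficient norm dominates the others, which the penalty shrinks toward zero; this is the "function selection" effect the review is about.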

    Regularized linear system identification using atomic, nuclear and kernel-based norms: the role of the stability constraint

    Inspired by ideas taken from the machine learning literature, new regularization techniques have recently been introduced in linear system identification. In particular, all the adopted estimators solve a regularized least squares problem, differing in the nature of the penalty term assigned to the impulse response. Popular choices include atomic and nuclear norms (applied to Hankel matrices) as well as norms induced by the so-called stable spline kernels. In this paper, a comparative study of estimators based on these different types of regularizers is reported. Our findings reveal that stable spline kernels outperform approaches based on atomic and nuclear norms, since they suitably embed information on impulse-response stability and smoothness. This point is illustrated using the Bayesian interpretation of regularization. We also design a new class of regularizers defined by "integral" versions of stable spline/TC kernels. Under quite realistic experimental conditions, the new estimators outperform classical prediction error methods even when the latter are equipped with an oracle for model order selection.
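The kernel-based estimators discussed above share a simple closed form. A minimal numpy sketch with the first-order stable spline (TC) kernel, $K_{ij} = c\,\lambda^{\max(i,j)}$, follows; the data-generating system, noise level, and hyperparameter values are illustrative assumptions, not taken from the paper's experiments.

```python
import numpy as np

def tc_kernel(n, lam=0.8, c=1.0):
    """TC ("tuned/correlated") stable spline kernel: K[i,j] = c * lam**max(i,j).
    It encodes exponentially decaying (stable) and smooth impulse responses."""
    idx = np.arange(1, n + 1)
    return c * lam ** np.maximum.outer(idx, idx)

def regularized_fir(u, y, n=50, lam=0.8, sigma2=0.01):
    """Kernel-regularized least squares for an FIR impulse response:
    g_hat = argmin ||y - Phi g||^2 + sigma2 * g' K^{-1} g
          = K Phi' (Phi K Phi' + sigma2 I)^{-1} y."""
    N = len(y)
    Phi = np.zeros((N, n))                   # Phi[t, k] = u[t-k], zero initial state
    for k in range(n):
        Phi[k:, k] = u[:N - k]
    K = tc_kernel(n, lam)
    S = Phi @ K @ Phi.T + sigma2 * np.eye(N)
    return K @ Phi.T @ np.linalg.solve(S, y)

rng = np.random.default_rng(0)
g_true = 0.7 ** np.arange(1, 51)             # a stable, smooth true response
u = rng.standard_normal(200)
y = np.convolve(u, g_true)[:200] + 0.05 * rng.standard_normal(200)
g_hat = regularized_fir(u, y, n=50, lam=0.8, sigma2=0.05**2)
rel_err = np.linalg.norm(g_hat - g_true) / np.linalg.norm(g_true)
```

The prior encoded by `K` (decay plus smoothness) is exactly the information that, per the paper's findings, atomic and nuclear norm penalties do not embed.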

    Polynomial (chaos) approximation of maximum eigenvalue functions: efficiency and limitations

    This paper is concerned with polynomial approximations of the spectral abscissa function (the supremum of the real parts of the eigenvalues) of a parameterized eigenvalue problem, which are closely related to polynomial chaos approximations if the parameters correspond to realizations of random variables. Unlike in existing works, we highlight the major role of the smoothness properties of the spectral abscissa function. Even if the matrices of the eigenvalue problem are analytic functions of the parameters, the spectral abscissa function may fail to be everywhere differentiable, or even everywhere Lipschitz continuous; this is related to multiple rightmost eigenvalues, or to a rightmost eigenvalue of multiplicity higher than one. The presented analysis demonstrates that these smoothness properties heavily affect the approximation errors of Galerkin and collocation-based polynomial approximations, as well as the numerical errors in evaluating the coefficients with integration methods. A documentation of the experiments, conducted on the benchmark problems through the software Chebfun, is publicly available.
    Comment: This is a pre-print of an article published in Numerical Algorithms. The final authenticated version is available online at: https://doi.org/10.1007/s11075-018-00648-
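The nonsmoothness effect is easy to reproduce with a collocation-based polynomial approximation in one parameter; the 2x2 parameterized matrix below is an illustrative assumption (the paper uses Chebfun on its own benchmark problems).

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def spectral_abscissa(A):
    """Maximum of the real parts of the eigenvalues of A."""
    return np.max(np.linalg.eigvals(A).real)

def A_of_p(p):
    """Illustrative parameterized matrix, analytic in p (an assumption)."""
    return np.array([[-1.0 + p, 2.0],
                     [0.5 * p, -2.0 - p]])

# Collocation: sample at Chebyshev points, fit a degree-d interpolant.
d = 20
nodes = np.cos(np.pi * (2 * np.arange(d + 1) + 1) / (2 * (d + 1)))
vals = np.array([spectral_abscissa(A_of_p(p)) for p in nodes])
coeffs = C.chebfit(nodes, vals, d)

# For this family, a complex pair coalesces into two real eigenvalues inside
# [-1, 1], giving the abscissa a square-root-type kink; convergence is
# therefore far slower than for an analytic function.
p_test = np.linspace(-1.0, 1.0, 401)
exact = np.array([spectral_abscissa(A_of_p(p)) for p in p_test])
max_err = np.max(np.abs(C.chebval(p_test, coeffs) - exact))
```

Despite degree 20, the interpolation error stalls well above machine precision at the kink, which is the phenomenon the paper's analysis quantifies.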

    On Homogeneous Decomposition Spaces and Associated Decompositions of Distribution Spaces

    A new construction of decomposition smoothness spaces of homogeneous type is considered. The smoothness spaces are based on structured and flexible decompositions of the frequency space $\mathbb{R}^d\backslash\{0\}$. We construct simple adapted tight frames for $L_2(\mathbb{R}^d)$ that can be used to fully characterise the smoothness norm in terms of a sparseness condition imposed on the frame coefficients. Moreover, it is proved that the frames provide a universal decomposition of tempered distributions, with convergence in the tempered distributions modulo polynomials. As an application of the general theory, the notion of homogeneous $\alpha$-modulation spaces is introduced.
    Comment: 27 pages

    Surface Spline Approximation on SO(3)

    The purpose of this article is to introduce a new class of kernels on SO(3) for approximation and interpolation, and to estimate the approximation power of the associated spaces. The kernels we consider arise as linear combinations of Green's functions of certain differential operators on the rotation group. They are conditionally positive definite and have a simple closed-form expression, lending themselves to direct implementation via, e.g., interpolation or least-squares approximation. To gauge the approximation power of the underlying spaces, we introduce an approximation scheme providing precise $L_p$ error estimates for linear schemes, namely with $L_p$ approximation order conforming to the $L_p$ smoothness of the target function.
    Comment: 22 pages; to appear in Appl. Comput. Harmon. Anal.