130 research outputs found
Convergence analysis of a Lasserre hierarchy of upper bounds for polynomial minimization on the sphere
We study the convergence rate of a hierarchy of upper bounds for polynomial
minimization problems, proposed by Lasserre [SIAM J. Optim. 21(3) (2011), pp.
864-885], for the special case when the feasible set is the unit (hyper)sphere.
The upper bound at level r of the hierarchy is defined as the minimal expected
value of the polynomial over all probability distributions on the sphere, when
the probability density function is a sum-of-squares polynomial of degree at
most 2r with respect to the surface measure.
We show that the exact rate of convergence is Theta(1/r^2), and explore the
implications for the related rate of convergence for the generalized problem of
moments on the sphere. Comment: 14 pages, 2 figures
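The level-r upper bound described above reduces to a small linear-algebra computation: minimizing the expected value of f over sum-of-squares densities of degree at most 2r is equivalent to finding the smallest generalized eigenvalue of the pair (A, G), where A and G are the f-weighted and plain Gram matrices of a basis of degree-r polynomials on the sphere. A minimal numerical sketch on the circle S^1 (the function name and the choice f(theta) = cos(theta), whose minimum is -1, are illustrative):

```python
import numpy as np
from scipy.linalg import eigh

def lasserre_upper_bound(f, r, n=2000):
    """Level-r Lasserre upper bound for minimizing f over the unit circle.

    min { E_sigma[f] : sigma an SOS density of degree <= 2r } equals the
    smallest generalized eigenvalue of (A, G), where A_ij = E_mu[f b_i b_j]
    and G_ij = E_mu[b_i b_j] for a basis {b_i} of degree-<=r polynomials
    restricted to the circle, mu the uniform (surface) measure.
    """
    theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    w = 1.0 / n  # equal quadrature weights for the uniform measure
    basis = [np.ones_like(theta)]
    for k in range(1, r + 1):
        basis.append(np.cos(k * theta))
        basis.append(np.sin(k * theta))
    B = np.array(basis)                # shape (2r+1, n)
    fv = f(theta)
    A = (B * (fv * w)) @ B.T           # A_ij = E_mu[f b_i b_j]
    G = (B * w) @ B.T                  # Gram matrix G_ij = E_mu[b_i b_j]
    return eigh(A, G, eigvals_only=True)[0]  # smallest generalized eigenvalue

# f(x, y) = x on the circle, i.e. f(theta) = cos(theta); true minimum is -1.
for r in (1, 2, 4, 8):
    print(r, lasserre_upper_bound(np.cos, r))
```

The printed bounds decrease toward -1 as r grows, and the gap to -1 shrinks at the O(1/r^2) rate the abstract establishes.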
Does generalization performance of ℓ^q regularization learning depend on q? A negative example
ℓ^q-regularization has been demonstrated to be an attractive technique in
machine learning and statistical modeling. It attempts to improve the
generalization (prediction) capability of a machine (model) by
appropriately shrinking its coefficients. The shape of an ℓ^q estimator
differs for varying choices of the regularization order q. In particular,
q = 1 leads to the LASSO estimate, while q = 2 corresponds to smooth
ridge regression. This makes the order q a potential tuning parameter in
applications. To facilitate the use of ℓ^q-regularization, we intend to
seek a modeling strategy in which an elaborate selection of q is
avoidable. In this spirit, we place our investigation within a general
framework of ℓ^q-regularized kernel learning under a sample-dependent
hypothesis space (SDHS). For a designated class of kernel functions, we show
that all ℓ^q estimators for 0 < q < ∞ attain similar generalization
error bounds. These estimated bounds are almost optimal in the sense that, up to
a logarithmic factor, the upper and lower bounds are asymptotically identical.
This finding tentatively reveals that, in some modeling contexts, the choice of
q might not have a strong impact in terms of the generalization capability.
From this perspective, q can be arbitrarily specified, or specified merely by
other non-generalization criteria such as smoothness, computational complexity,
or sparsity. Comment: 35 pages, 3 figures
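The two endpoint cases named above can be sketched concretely. The following toy comparison fits the same sparse linear model with q = 2 (ridge, closed form) and q = 1 (LASSO, via the ISTA proximal-gradient iteration); it is a plain linear-regression stand-in, not the paper's kernel/SDHS setting, and all names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 100, 20
X = rng.standard_normal((n, d))
beta = np.zeros(d)
beta[:3] = [2.0, -1.5, 1.0]                 # sparse ground truth
y = X @ beta + 0.1 * rng.standard_normal(n)

def ridge(X, y, lam):
    # q = 2: closed-form minimizer of ||y - Xb||^2 + lam * ||b||_2^2
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def lasso_ista(X, y, lam, iters=5000):
    # q = 1: proximal gradient (ISTA) for ||y - Xb||^2 + lam * ||b||_1
    L = 2.0 * np.linalg.norm(X, 2) ** 2     # Lipschitz constant of the gradient
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        g = 2.0 * X.T @ (X @ b - y)         # gradient of the quadratic term
        z = b - g / L
        b = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return b

b_ridge = ridge(X, y, lam=1.0)
b_lasso = lasso_ista(X, y, lam=1.0)
```

On well-conditioned data like this, both estimators recover beta to comparable accuracy, consistent with the abstract's claim that the order q need not matter much for generalization, even though the estimators themselves differ in shape (the LASSO solution is sparse, the ridge solution is not).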
Optimal polynomial meshes and Caratheodory-Tchakaloff submeshes on the sphere
Using the notion of Dubiner distance, we give an elementary proof of the fact
that good covering point configurations on the 2-sphere are optimal polynomial
meshes. From these we extract Caratheodory-Tchakaloff (CATCH) submeshes for
compressed Least Squares fitting.
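One standard route to extracting a Caratheodory-Tchakaloff submesh is moment matching via non-negative least squares: solve for non-negative weights on the fine mesh that reproduce the moments of the mesh measure, and keep only the points with positive weight. A hedged sketch on the circle (the degree, mesh size, and variable names are illustrative):

```python
import numpy as np
from scipy.optimize import nnls

deg = 6
M = 400                                     # fine mesh on the unit circle
theta = np.linspace(0.0, 2.0 * np.pi, M, endpoint=False)

# Vandermonde-like matrix for trigonometric polynomials of degree <= deg
cols = [np.ones(M)]
for k in range(1, deg + 1):
    cols.append(np.cos(k * theta))
    cols.append(np.sin(k * theta))
V = np.column_stack(cols)                   # shape (M, 2*deg + 1)

mom = V.T @ np.full(M, 1.0 / M)             # moments of the uniform mesh measure
w, _ = nnls(V.T, mom)                       # nonnegative weights matching moments
support = w > 1e-12                         # submesh: points with positive weight
```

Because the NNLS solution is a basic feasible solution, its support has at most as many points as there are basis functions (here 2*deg + 1), so the submesh is dramatically smaller than the mesh while integrating all polynomials of degree <= deg exactly with respect to the mesh measure.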
Inversion of noisy Radon transform by SVD based needlets
A linear method for inverting noisy observations of the Radon transform is
developed based on decomposition systems (needlets) with rapidly decaying
elements induced by the Radon transform SVD basis. Upper bounds of the risk of
the estimator are established in L^p (1 <= p <= infinity) norms for functions
with Besov space smoothness. A practical implementation of the method is given
and several examples are discussed.
- …