Weighted L_2 B Discrepancy and Approximation of Integrals over Reproducing Kernel Hilbert Spaces
We extend the notion of discrepancy provided in
[E. Novak, H. Woźniakowski, L_2 discrepancy and multivariate
integration, in: Analytic Number Theory. Essays in Honour of Klaus
Roth, W. W. L. Chen, W. T. Gowers, H. Halberstam, W. M. Schmidt,
and R. C. Vaughan (Eds.), Cambridge University Press, Cambridge,
2009, pp. 359-388] to the weighted discrepancy.
This newly defined notion allows us to consider not only weights, but
also volume measures different from the Lebesgue measure and classes
of test sets other than measurable subsets of some Euclidean space.
We relate the weighted discrepancy to numerical integration defined
over weighted reproducing kernel Hilbert spaces and in this way
settle an open problem posed by Novak and Woźniakowski.
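
For orientation, a minimal sketch (not taken from the paper) of the classical unweighted L_2 discrepancy that this notion generalizes, written for anchored boxes [0,t) in [0,1]^d and a point set P = {x_1, ..., x_n}:

\[
  \operatorname{disc}(t;P) \;=\; \prod_{j=1}^{d} t_j
    \;-\; \frac{1}{n}\sum_{i=1}^{n} \mathbf{1}_{[0,t)}(x_i),
  \qquad
  D_2(P) \;=\; \Bigl(\int_{[0,1]^d} \operatorname{disc}(t;P)^2 \,\mathrm{d}t\Bigr)^{1/2}.
\]

By a standard reproducing-kernel argument, D_2(P) equals the worst-case error of the quasi-Monte Carlo rule (1/n) \sum_i f(x_i) over the unit ball of a suitable anchored Sobolev space; the weighted notion replaces the Lebesgue measure, the anchored boxes, and the uniform weighting by general counterparts.
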
Learning curves of generic features maps for realistic datasets with a teacher-student model
Teacher-student models provide a framework in which the typical-case
performance of high-dimensional supervised learning can be described in closed
form. The assumption of Gaussian i.i.d. input data underlying the canonical
teacher-student model may, however, be perceived as too restrictive to capture
the behaviour of realistic data sets. In this paper, we introduce a Gaussian
covariate generalisation of the model where the teacher and student can act on
different spaces, generated with fixed, but generic feature maps. While still
solvable in a closed form, this generalisation is able to capture the learning
curves for a broad range of realistic data sets, thus redeeming the potential
of the teacher-student framework. Our contribution is then two-fold: First, we
prove a rigorous formula for the asymptotic training loss and generalisation
error. Second, we present a number of situations where the learning curve of
the model captures the one of a realistic data set learned with kernel
regression and classification, with out-of-the-box feature maps such as random
projections or scattering transforms, or with pre-learned ones, such as the
features learned by training multi-layer neural networks. We discuss both the
power and the limitations of the framework.
Comment: v3, NeurIPS camera-ready.
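
As a concrete illustration of the setting (a hedged Python sketch, not the authors' code: the linear teacher, the tanh random-projection student features, and all dimensions and the ridge penalty below are illustrative assumptions):

import numpy as np

rng = np.random.default_rng(0)
d, p, lam = 200, 300, 1e-2                # input dim, feature dim, ridge penalty (assumed)

# Teacher: noiseless linear target acting on Gaussian i.i.d. inputs.
theta = rng.standard_normal(d) / np.sqrt(d)
# Student: fixed random-projection feature map x -> tanh(F x).
F = rng.standard_normal((p, d)) / np.sqrt(d)
phi = lambda X: np.tanh(X @ F.T)

def test_error(n, n_test=2000):
    """Ridge regression on the student features; mean-squared test error."""
    Xtr = rng.standard_normal((n, d))
    Xte = rng.standard_normal((n_test, d))
    ytr, yte = Xtr @ theta, Xte @ theta
    Z = phi(Xtr)
    w = np.linalg.solve(Z.T @ Z + lam * np.eye(p), Z.T @ ytr)
    return np.mean((phi(Xte) @ w - yte) ** 2)

for n in (50, 100, 200, 400, 800):        # empirical learning curve
    print(n, test_error(n))

The paper's closed-form asymptotics describe learning curves of exactly this kind in the high-dimensional limit where the sample size and the dimensions grow together.
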
Generalized averaged Gaussian quadrature and applications
A simple numerical method for constructing the optimal generalized averaged Gaussian quadrature formulas will be presented. These formulas exist in many cases in which real positive Gauss-Kronrod formulas do not exist, and can be used as an adequate alternative for estimating the error of a Gaussian rule. We also investigate the conditions under which the optimal averaged Gaussian quadrature formulas and their truncated variants are internal.
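
One classical ingredient behind such error estimates, sketched here under stated assumptions (Laurie's averaged rule for the Legendre weight, built on the Golub-Welsch eigenvalue method; the generalized averaged rules of the abstract refine this construction, and the function names are ours):

import numpy as np

def gauss_legendre(n):
    """n-point Gauss-Legendre rule via the Golub-Welsch eigenvalue method."""
    k = np.arange(1, n)
    e = k / np.sqrt(4.0 * k**2 - 1.0)      # off-diagonals sqrt(beta_k)
    J = np.diag(e, 1) + np.diag(e, -1)     # symmetric tridiagonal Jacobi matrix
    x, V = np.linalg.eigh(J)
    return x, 2.0 * V[0, :]**2             # beta_0 = 2 for the Legendre weight

def anti_gauss_legendre(n):
    """(n+1)-point anti-Gauss rule: last off-diagonal of J_{n+1} scaled by sqrt(2)."""
    k = np.arange(1, n + 1)
    e = k / np.sqrt(4.0 * k**2 - 1.0)
    e[-1] *= np.sqrt(2.0)
    J = np.diag(e, 1) + np.diag(e, -1)
    x, V = np.linalg.eigh(J)
    return x, 2.0 * V[0, :]**2

def averaged_rule(f, n):
    """Averaged rule (G_n + A_{n+1})/2 and an error estimate for G_n."""
    xg, wg = gauss_legendre(n)
    xa, wa = anti_gauss_legendre(n)
    G, A = wg @ f(xg), wa @ f(xa)
    return 0.5 * (G + A), 0.5 * abs(A - G)

val, err = averaged_rule(np.exp, 5)        # integral of exp over [-1, 1]
print(val, err)                            # ~2.3504, with a small estimated error

The optimal generalized averaged formulas replace the anti-Gauss companion by a (2n+1)-point rule obtained from a suitably extended Jacobi matrix, which is what makes them available in many cases where real positive Gauss-Kronrod rules fail to exist.
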
MS FT-2-2 7 Orthogonal polynomials and quadrature: Theory, computation, and applications
Quadrature rules find many applications in science and engineering. Their analysis is a classical area of applied mathematics and continues to attract considerable attention. This seminar brings together speakers with expertise in a large variety of quadrature rules. It is the aim of the seminar to provide an overview of recent developments in the analysis of quadrature rules. The computation of error estimates and novel applications are also described.