Bernstein Numbers of Embeddings of Isotropic and Dominating Mixed Besov Spaces
The purpose of the present paper is to investigate the decay of Bernstein
numbers of embeddings between isotropic and dominating mixed Besov spaces.
The asymptotic behaviour of Bernstein numbers of the identity will also be
considered. Comment: 31 pages, 1 figure
Metric entropy, n-widths, and sampling of functions on manifolds
We first investigate the asymptotics of the Kolmogorov metric entropy and
nonlinear n-widths of approximation spaces on some function classes on
manifolds and quasi-metric measure spaces. Secondly, we develop constructive
algorithms to represent those functions within a prescribed accuracy. The
constructions can be based on either spectral information or scattered samples
of the target function. Our algorithmic scheme is asymptotically optimal in the
sense of nonlinear n-widths and asymptotically optimal up to a logarithmic
factor with respect to the metric entropy
Weyl Numbers of Embeddings of Tensor Product Besov Spaces
In this paper we investigate the asymptotic behaviour of Weyl numbers of
embeddings of tensor product Besov spaces into Lebesgue spaces. These results
will be compared with the known behaviour of entropy numbers. Comment: 54 pages, 2 figures
Predictability, complexity and learning
We define {\em predictive information} as the mutual
information between the past and the future of a time series. Three
qualitatively different behaviors are found in the limit of large observation
times: the predictive information can remain finite, grow logarithmically, or grow
as a fractional power law. If the time series allows us to learn a model with a
finite number of parameters, then the predictive information grows logarithmically with
a coefficient that counts the dimensionality of the model space. In contrast,
power-law growth is associated, for example, with the learning of infinite
parameter (or nonparametric) models such as continuous functions with
smoothness constraints. There are connections between the predictive
information and measures of complexity that have been defined both in learning
theory and in the analysis of physical systems through statistical mechanics
and dynamical systems theory. Further, in the same way that entropy provides
the unique measure of available information consistent with some simple and
plausible conditions, we argue that the divergent part of the predictive information
provides the unique measure for the complexity of dynamics underlying a time
series. Finally, we discuss how these ideas may be useful in different problems
in physics, statistics, and biology. Comment: 53 pages, 3 figures, 98 references, LaTeX2e
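The "finite" regime described in this abstract can be illustrated with a small numerical sketch (not taken from the paper): for a Markov chain, the mutual information between past and future reduces to the mutual information between adjacent symbols, and it stays bounded no matter how long we observe. The chain parameters and variable names below are our own illustrative choices.

```python
# Illustrative sketch: estimate the predictive information (mutual
# information between past and future) of a two-state Markov chain.
# For a Markov chain, one step of the past is a sufficient statistic,
# so I(past; future) = I(x_t; x_{t+1}), which remains finite.
import numpy as np

rng = np.random.default_rng(0)

# Transition matrix of a "sticky" two-state chain (an assumed example).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Sample a long trajectory.
n = 200_000
x = np.empty(n, dtype=int)
x[0] = 0
for t in range(1, n):
    x[t] = rng.choice(2, p=P[x[t - 1]])

# Empirical joint distribution of (current symbol, next symbol).
joint = np.zeros((2, 2))
for a, b in zip(x[:-1], x[1:]):
    joint[a, b] += 1
joint /= joint.sum()
px = joint.sum(axis=1)  # marginal of the past symbol
py = joint.sum(axis=0)  # marginal of the future symbol

# Mutual information I(past; future) in bits.
mask = joint > 0
I = np.sum(joint[mask] * np.log2(joint[mask] / (px[:, None] * py[None, :])[mask]))
print(f"I(past; future) ~ {I:.4f} bits")
```

Because the chain has finitely many states, this estimate converges to a constant as the trajectory grows; logarithmic or power-law growth of the predictive information requires richer model classes, as the abstract explains.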