Optimal randomized multilevel algorithms for infinite-dimensional integration on function spaces with ANOVA-type decomposition
In this paper, we consider the infinite-dimensional integration problem on
weighted reproducing kernel Hilbert spaces with norms induced by an underlying
function space decomposition of ANOVA-type. The weights model the relative
importance of different groups of variables. We present new randomized
multilevel algorithms to tackle this integration problem and prove upper bounds
for their randomized error. Furthermore, we provide in this setting the first
non-trivial lower error bounds for general randomized algorithms, which, in
particular, may be adaptive or non-linear. These lower bounds show that our
multilevel algorithms are optimal. Our analysis refines and extends the
analysis provided in [F. J. Hickernell, T. Müller-Gronbach, B. Niu, K.
Ritter, J. Complexity 26 (2010), 229-254], and our error bounds improve
substantially on the error bounds presented there. As an illustrative example,
we discuss the unanchored Sobolev space and employ randomized quasi-Monte Carlo
multilevel algorithms based on scrambled polynomial lattice rules.
Comment: 31 pages, 0 figures
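The multilevel idea behind such algorithms — a telescoping sum of corrections over increasing truncation dimensions, with more samples spent on the cheap low-dimensional levels — can be sketched as follows. This is an illustration only, not the paper's method: plain Monte Carlo stands in for the scrambled polynomial lattice rules on each level, and the product integrand and the weights gamma_j = j^{-2} are hypothetical choices made so that every truncation integrates to exactly 1, which lets us check the estimator.

```python
import numpy as np

def f_trunc(x):
    """Product-form integrand truncated to the d dimensions present in x.

    f(x) = prod_j (1 + gamma_j * (x_j - 1/2)) with hypothetical product
    weights gamma_j = j^{-2}; every truncation has integral exactly 1
    over the unit cube.
    """
    d = x.shape[1]
    gamma = 1.0 / np.arange(1, d + 1) ** 2
    return np.prod(1.0 + gamma * (x - 0.5), axis=1)

def multilevel_estimate(L, n0, seed=0):
    """Telescoping-sum estimator over truncation dimensions d_l = 2^l."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for l in range(L + 1):
        d_l = 2 ** l
        n_l = n0 * 4 ** (L - l)  # spend more samples on the cheap levels
        x = rng.random((n_l, d_l))
        fine = f_trunc(x)
        # correction against the previous truncation, reusing the same points
        coarse = f_trunc(x[:, : d_l // 2]) if l > 0 else 0.0
        total += np.mean(fine - coarse)
    return total

est = multilevel_estimate(L=4, n0=64, seed=1)
```

Because the weights decay, the level corrections shrink quickly, so the coarse levels carry most of the variance and most of the samples; replacing the per-level Monte Carlo averages with randomized QMC estimates is what yields the improved rates studied in the paper.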
A universal median quasi-Monte Carlo integration
We study quasi-Monte Carlo (QMC) integration over the multi-dimensional unit
cube in several weighted function spaces with different smoothness classes. We
consider approximating the integral of a function by the median of several
integral estimates under independent and random choices of the underlying QMC
point sets (either linearly scrambled digital nets or infinite-precision
polynomial lattice point sets). Even though our approach does not require any
information on the smoothness and weights of a target function space as an
input, we can prove a probabilistic upper bound on the worst-case error for the
respective weighted function space, where the failure probability converges to
0 exponentially fast as the number of estimates increases. Our obtained rates
of convergence are nearly optimal for function spaces with finite smoothness,
and we can attain a dimension-independent super-polynomial convergence for a
class of infinitely differentiable functions. This implies that our
median-based QMC rule is universal in the sense that it does not need to be
adjusted to the smoothness and the weights of the function spaces and yet
exhibits the nearly optimal rate of convergence. Numerical experiments support
our theoretical results.
Comment: Major revision, 32 pages, 4 figures
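The median trick itself is simple to sketch: draw several independent randomizations of a QMC point set, form one integral estimate per randomization, and return the median rather than the mean. As a simplified stand-in for the linearly scrambled digital nets and infinite-precision polynomial lattice point sets studied in the paper, the sketch below randomizes a rank-1 lattice rule by a uniform random shift; the 2D Fibonacci generating vector z and the test integrand are hypothetical choices for illustration.

```python
import numpy as np

def shifted_lattice_estimate(f, n, z, rng):
    """One QMC estimate from a rank-1 lattice with a uniform random shift."""
    shift = rng.random(len(z))
    pts = np.mod(np.outer(np.arange(n), z) / n + shift, 1.0)
    return np.mean(f(pts))

def median_qmc(f, n, z, r, seed=0):
    """Median of r independent randomized QMC estimates of f over [0,1]^d."""
    rng = np.random.default_rng(seed)
    return float(np.median(
        [shifted_lattice_estimate(f, n, z, rng) for _ in range(r)]))

# smooth 2D test integrand with exact integral 1 over the unit square
f = lambda x: np.prod(x + 0.5, axis=1)
z = np.array([1, 610])  # Fibonacci generating vector for n = 1597
est = median_qmc(f, n=1597, z=z, r=11)
```

Taking the median instead of the mean is what gives the robustness described above: occasional unlucky randomizations produce outlier estimates, and the median discards them with failure probability decaying exponentially in r.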