For long-memory time series, inference based on resampling is of crucial
importance, since the asymptotic distribution is often non-Gaussian and
difficult to determine analytically. However, due to the strong dependence,
establishing the asymptotic validity of resampling methods is nontrivial. In
this paper, we derive an efficient bound for the canonical correlation between
two finite blocks of a long-memory time series. We show how this bound can be
applied to establish the asymptotic consistency of subsampling procedures for
general statistics under long memory. It allows the subsample size b to be
o(n), where n is the sample size, irrespective of the strength of the
memory. We are then able to improve many results found in the literature. We
also consider applications of subsampling procedures under long memory to the
sample covariance, M-estimation and empirical processes.

Comment: 36 pages. To appear in The Annals of Statistics.
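As a minimal sketch of the subsampling procedure the abstract refers to (in the style of Politis–Romano subsampling): evaluate the statistic on all overlapping blocks of length b, with b = o(n), and use the empirical distribution of the block values to approximate the sampling distribution. The AR(1) data, block size b = 100, and the choice of the sample mean as the statistic are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def subsample_distribution(x, b, statistic):
    """Evaluate `statistic` on every overlapping block of length b.

    The resulting values approximate the sampling distribution of the
    statistic; b should satisfy b = o(n) as the sample size n grows.
    """
    n = len(x)
    return np.array([statistic(x[i:i + b]) for i in range(n - b + 1)])

# Illustrative data: an AR(1) series (short memory, for demonstration only).
rng = np.random.default_rng(0)
n = 2000
e = rng.standard_normal(n)
x = np.empty(n)
x[0] = e[0]
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + e[t]

# Subsample values of the sample mean over blocks of size b = 100 = o(n).
vals = subsample_distribution(x, b=100, statistic=np.mean)

# Empirical 95% range of the block statistic.
lo, hi = np.quantile(vals, [0.025, 0.975])
```

Under long memory, the centering and scaling of the block statistics must match the (possibly non-Gaussian) limit; the paper's contribution is a bound that justifies this approximation for general statistics whenever b = o(n).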