
Minimax optimality of the local multi-resolution projection estimator over Besov spaces

By Jean-Baptiste Monnier


13 pages, 1 figure.

The local multi-resolution projection estimator (LMPE) was first introduced in Monnier, "Classification via local multi-resolution projections", EJS 2011 [Monnier2011]. It was proved there that plug-in classifiers built upon the LMPE can reach super-fast rates under a margin assumption. As a by-product, the LMPE was also proved to be near minimax optimal in the regression setting over a wide generalized Lipschitz (or Hölder) scale. In this paper, we show that a direct treatment of the regression loss allows us to extend the minimax optimality of the LMPE to a much wider Besov scale. More precisely, we prove that the LMPE is near minimax optimal over Besov spaces $B^s_{\tau,q}$, $s > 0$, $\tau \geq p$, $q > 0$, when the loss is measured in $L_p$-norm, $p \in [2,\infty)$ (see Theorem 2.1), and over Besov spaces $B^s_{\tau,q}$, $s > d/\tau$, $\tau, q > 0$, when the loss is measured in $L_\infty$-norm (see Theorem 2.2). Moreover, we show that an appropriate version of Lepski's method makes these results adaptive. Interestingly, the proofs detailed here differ largely from those given in [Monnier2011].
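To illustrate the kind of adaptation the abstract refers to, the following is a minimal, hypothetical sketch of a Lepski-type selection rule in a random-design regression setting. It does not reproduce the paper's LMPE; a crude histogram projection estimator stands in for it, and the function names, the tolerance formula, and the tuning constant `kappa` are all illustrative assumptions. The Lepski principle itself is shown faithfully: starting from the coarsest resolution level, select the first level whose estimate agrees with every finer level up to a stochastic tolerance.

```python
import numpy as np

def projection_estimate(x, y, j, t):
    """Histogram-type projection estimate of the regression function at
    dyadic resolution level j, evaluated at points t in [0, 1].
    (A crude stand-in for the paper's LMPE; hypothetical simplification.)"""
    bins = 2 ** j
    edges = np.linspace(0.0, 1.0, bins + 1)
    idx = np.clip(np.digitize(x, edges) - 1, 0, bins - 1)
    means = np.zeros(bins)
    for b in range(bins):
        mask = idx == b
        if mask.any():
            means[b] = y[mask].mean()
    t_idx = np.clip(np.digitize(t, edges) - 1, 0, bins - 1)
    return means[t_idx]

def lepski_select(x, y, t, j_max, kappa=1.0):
    """Lepski-type rule: pick the smallest level j whose estimate stays
    within the stochastic tolerance of every finer level j' > j.
    kappa is a hypothetical tuning constant; the tolerance mimics the
    usual sqrt(2^j log(n) / n) deviation bound."""
    n = len(x)
    ests = {j: projection_estimate(x, y, j, t) for j in range(j_max + 1)}

    def tol(j):
        return kappa * np.sqrt(2 ** j * np.log(n) / n)

    for j in range(j_max + 1):
        if all(np.max(np.abs(ests[j] - ests[jp])) <= tol(jp)
               for jp in range(j + 1, j_max + 1)):
            return j, ests[j]
    return j_max, ests[j_max]
```

The selected level balances bias (coarse levels disagree with fine ones through their bias) against variance (the tolerance grows with the level), without requiring knowledge of the smoothness parameters $s$, $\tau$, $q$.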

Topics: Nonparametric regression, Random design, Multi-resolution analysis, 62G05, 62G08, [ STAT.TH ] Statistics [stat]/Statistics Theory [stat.TH]
Publisher: HAL CCSD
Year: 2012
OAI identifier: oai:HAL:hal-00674091v2
Provided by: Hal-Diderot

