
Adaptive minimax estimation in classes of smooth functions

Abstract

In this thesis we study adaptive methods of estimation for two particular types of statistical problems: regression and density estimation. In both problems the classes of probability distributions are parameterized by real-valued functions, and in each model the underlying function is assumed to belong to some class of smooth functions. In practice the `true' smoothness of the function is unknown, and so the actual class is also unknown. We study two regression problems with fixed discrete designs: regression on the real line and regression on a bounded interval. Formally, the distinction lies only in the definition of the underlying functional classes; the construction of optimal adaptive procedures, however, differs substantially in the two cases. This reflects an essential difference between the two models: for regression on a bounded observation interval, the presence of the boundary (the so-called boundary effect) has to be incorporated into the study of optimal statistical procedures. For each of the three problems (regression on the real line, regression on bounded intervals, and density estimation) we introduce corresponding scales of functional classes for which exact, up to constants, rates of convergence are obtained in the classical minimax non-parametric framework, i.e. in the case when the class is known. We then construct adaptive estimators and prove them to be asymptotically optimal over the corresponding functional scales.
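As a point of reference for the framework described above, the minimax risk over a known class and the notion of adaptivity over a scale of classes can be sketched as follows; the notation (squared loss, classes $\Sigma_\beta$ indexed by a smoothness parameter $\beta$) is generic and is not taken from the thesis itself.

\[
  R_n(\Sigma_\beta) \;=\; \inf_{\tilde f_n} \, \sup_{f \in \Sigma_\beta} \mathbf{E}_f \bigl\| \tilde f_n - f \bigr\|^2 ,
\]
where the infimum runs over all estimators based on $n$ observations. An estimator $\hat f_n$, constructed without knowledge of $\beta$, is called adaptive over the scale $\{\Sigma_\beta\}_{\beta \in B}$ if
\[
  \sup_{f \in \Sigma_\beta} \mathbf{E}_f \bigl\| \hat f_n - f \bigr\|^2 \;\le\; C \, R_n(\Sigma_\beta)
  \qquad \text{for every } \beta \in B,
\]
with a constant $C$ independent of $\beta$; when exact (up to constants) rates are available, asymptotic optimality amounts to attaining these rates simultaneously over the whole scale.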
