The paper considers model selection in regression under structural constraints on the set of admissible models, where the number of potential predictors may even exceed the available sample size. We develop a Bayesian formalism as a natural tool for generating a wide class of model selection criteria based on penalized least squares estimation, with various complexity penalties associated with a prior on the model size. The resulting criteria are adaptive to the structural constraints.
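Schematically (the notation below is our own shorthand for illustration, not taken verbatim from the paper), such a criterion selects the model

$$
\hat{M} \;=\; \arg\min_{M\in\mathcal{M}} \Big\{\, \|y - X_M\hat{\beta}_M\|_2^2 \;+\; \mathrm{Pen}(|M|)\,\Big\},
$$

where $\mathcal{M}$ is the set of admissible models, $\hat{\beta}_M$ is the least squares estimator within model $M$, and the complexity penalty $\mathrm{Pen}(k)$ grows with $-\log\pi(k)$ for a prior $\pi$ on the model size, up to design- and variance-dependent terms that the paper makes precise.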
We establish an upper bound for the quadratic risk of the resulting MAP estimator and a corresponding lower bound for the minimax risk over the set of admissible models of a given size.
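For orientation (a standard sparse-regression benchmark stated here as background under the usual Gaussian-noise assumptions, not as the paper's exact theorem): for complete variable selection over models of size $k$ among $p$ predictors, the minimax quadratic risk is of order

$$
\sigma^2\, k \log(p/k),
$$

so "nearly minimax up to a log-factor" means attaining this rate up to an additional logarithmic term.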
We then specify the class of priors (and, therefore, of complexity penalties) for which, under a "nearly-orthogonal" design, the MAP estimator is asymptotically at least nearly minimax (up to a log-factor) simultaneously over the entire range of sparse and dense setups. Moreover, when the number of admissible models is "small" (e.g., for ordered variable selection) or, at the other extreme, for complete variable selection, the proposed estimator achieves the exact minimax rates.