We consider a Bayesian approach to model selection in Gaussian linear
regression, where the number of predictors may be much larger than the number
of observations. From a frequentist viewpoint, the proposed procedure results in
penalized least squares estimation with a complexity penalty associated with a
prior on the model size. We investigate the optimality properties of the
resulting estimator. We establish an oracle inequality and specify conditions
on the prior that imply its asymptotic minimaxity within a wide range of sparse
and dense settings for "nearly-orthogonal" and "multicollinear" designs.
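
To make the frequentist reformulation concrete, the following display is a minimal generic sketch rather than the paper's exact criterion. Assume the prior assigns probability \pi(k) to the model size k and is uniform over the \binom{p}{k} models of each size, let X_M and \widehat{\beta}_M denote the design submatrix and least squares fit over a model M, and let \sigma^2 be the noise variance; the tuning constant c > 0 is an illustrative placeholder. The resulting MAP rule then takes a penalized least squares form:

\[
\widehat{M} \;=\; \arg\min_{M}\Bigl\{\,\|y - X_M \widehat{\beta}_M\|_2^2 \;+\; \mathrm{Pen}(|M|)\Bigr\},
\qquad
\mathrm{Pen}(k) \;=\; c\,\sigma^2\Bigl(\ln\tfrac{1}{\pi(k)} + \ln\tbinom{p}{k}\Bigr),
% c and \sigma^2 are illustrative placeholders, not taken from the paper
\]

so a prior that downweights large model sizes translates directly into a complexity penalty that grows with |M|; the precise penalty and the conditions on the prior are specified in the paper.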