Model Selection Procedures in Social Research: Monte-Carlo Simulation Results

Model selection strategies play an important, if not always explicit, role in quantitative research. The inferential properties of these strategies are largely unknown, so there is little basis for recommending (or avoiding) any particular strategy. In this paper we evaluate several commonly used model selection procedures (BIC, adjusted R2, Mallows's Cp, and stepwise regression) using Monte-Carlo simulation of model selection when the true data-generating processes (DGPs) are known. We find that the ability of these procedures to include important variables and exclude irrelevant variables increases with the size of the sample and decreases with the amount of noise in the model. None of the procedures does well in small samples unless the true DGP is almost entirely deterministic, so data mining in small samples should be avoided entirely. In large samples, BIC is better than the other procedures at correctly identifying most of the generating processes we simulated, and stepwise regression does even better with some models and almost as well with others. In the absence of strong theory, both BIC and stepwise regression appear to be reasonable model selection strategies.
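The simulation design described above can be sketched as follows. This is a minimal illustration, not the authors' actual code: it assumes a hypothetical three-regressor DGP with one irrelevant variable, draws repeated samples, and records how often BIC selects exactly the true variable set.

```python
import numpy as np

rng = np.random.default_rng(0)

def bic(y, X):
    """BIC for an OLS fit of y on X (X includes an intercept column)."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / n          # ML estimate of the error variance
    return n * np.log(sigma2) + k * np.log(n)

def hit_rate(n=200, noise=1.0, reps=500):
    """Fraction of replications in which BIC picks the true variable set.

    Assumed (hypothetical) true DGP: y = 1 + 2*x1 - 1.5*x2 + e,
    with x3 generated but irrelevant; candidate models are all
    non-empty subsets of {x1, x2, x3}, each with an intercept."""
    subsets = [(1,), (2,), (3,), (1, 2), (1, 3), (2, 3), (1, 2, 3)]
    hits = 0
    for _ in range(reps):
        X = rng.standard_normal((n, 3))
        y = 1 + 2 * X[:, 0] - 1.5 * X[:, 1] + noise * rng.standard_normal(n)
        best = min(
            subsets,
            key=lambda s: bic(
                y, np.column_stack([np.ones(n), X[:, [i - 1 for i in s]]])
            ),
        )
        hits += best == (1, 2)          # true set: x1 and x2, not x3
    return hits / reps
```

Comparing `hit_rate` across different `n` and `noise` values reproduces the qualitative pattern the paper reports: performance improves with sample size and degrades as noise grows.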