Bayesian Nonparametric Estimation and Consistency of Mixed Multinomial Logit Choice Models

Abstract

This paper develops nonparametric estimation for discrete choice models based on the Mixed Multinomial Logit (MMNL) model. MMNL models have been shown to encompass all discrete choice models derived under the assumption of random utility maximization, subject to the identification of an unknown mixing distribution G. Exploiting the mixture-model description of the MMNL, we employ a Bayesian nonparametric approach, placing nonparametric priors on the unknown mixing distribution G, to estimate the unknown choice probabilities. Theoretical support for the proposed methodology is provided by establishing strong consistency of a general nonparametric prior on G under simple sufficient conditions. Consistency is defined with respect to an L1-type distance on the space of choice probabilities and is achieved by extending to a regression-model framework a recent approach to strong consistency based on the summability of square roots of prior probabilities. Turning to estimation, slightly different techniques for non-panel and panel data models are discussed. For practical implementation, we describe efficient and relatively easy-to-use blocked Gibbs sampling procedures. A simulation study illustrates the proposed methods and the flexibility they achieve relative to parametric Gaussian MMNL models.

Keywords: Bayesian consistency, Bayesian nonparametrics, Blocked Gibbs sampler, Discrete choice models, Mixed Multinomial Logit, Random probability measures, Stick-breaking priors
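To make the mixture-model description of the MMNL concrete, the minimal sketch below computes choice probabilities by mixing multinomial logit kernels over a random mixing distribution G drawn from a truncated stick-breaking prior. It is an illustration under assumed ingredients (a standard Gaussian base measure, a fixed truncation level, and hypothetical helper names mnl_probs, stick_breaking, mmnl_choice_probs), not the estimation procedure developed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def mnl_probs(X, beta):
    """Multinomial logit choice probabilities for one draw of the taste vector beta.
    X has shape (n_alternatives, n_attributes)."""
    u = X @ beta
    u -= u.max()                      # numerical stabilisation
    expu = np.exp(u)
    return expu / expu.sum()

def stick_breaking(alpha, truncation, dim, rng):
    """Truncated stick-breaking draw of a random mixing distribution G.
    Returns mixture weights and atoms; the base measure is taken to be
    standard Gaussian purely for illustration."""
    v = rng.beta(1.0, alpha, size=truncation)
    w = v * np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    w /= w.sum()                      # renormalise the truncated weights
    atoms = rng.normal(0.0, 1.0, size=(truncation, dim))
    return w, atoms

def mmnl_choice_probs(X, weights, atoms):
    """MMNL choice probabilities as a mixture of MNL kernels over the atoms of G:
    P(i | X) = sum_k w_k * MNL_i(X, beta_k)."""
    return sum(w * mnl_probs(X, b) for w, b in zip(weights, atoms))

# Toy usage: 3 alternatives described by 2 attributes.
X = rng.normal(size=(3, 2))
w, atoms = stick_breaking(alpha=1.0, truncation=50, dim=2, rng=rng)
print(mmnl_choice_probs(X, w, atoms))   # probabilities sum to 1 across alternatives
```

In a Bayesian analysis the weights and atoms of G would be updated from observed choices, for example with a blocked Gibbs sampler over the truncated stick-breaking representation; the snippet only shows how a single draw of G induces MMNL choice probabilities.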
