
    Approximation of conditional densities by smooth mixtures of regressions

    This paper shows that large nonparametric classes of conditional multivariate densities can be approximated in the Kullback--Leibler distance by different specifications of finite mixtures of normal regressions in which the means, variances, and mixing probabilities of the normal components can depend on variables in the conditioning set (covariates). These models are a special case of models known as "mixtures of experts" in the statistics and computer science literature. Flexible specifications include models in which only the mixing probabilities, modeled by a multinomial logit, depend on the covariates and, in the univariate case, models in which only the means of the mixed normals depend flexibly on the covariates. Modeling the variances of the mixed normals by flexible functions of the covariates can weaken restrictions on the class of approximable densities. The results can be generalized to mixtures of general location-scale densities. Rates of convergence and easy-to-interpret bounds are also obtained for different model specifications. These approximation results can be useful for proving consistency of Bayesian and maximum likelihood density estimators based on these models. The results also have interesting implications for applied researchers. Comment: Published at http://dx.doi.org/10.1214/09-AOS765 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
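
    As a rough illustration of the kind of specification described above, the sketch below evaluates a conditional density p(y|x) as a finite mixture of normal regressions whose means, variances, and multinomial-logit mixing probabilities all depend on a scalar covariate. The function name and all parameter values are hypothetical, chosen only for illustration and not taken from the paper.

        import numpy as np
        from scipy.stats import norm

        def smooth_mixture_density(y, x, alphas, betas, gammas, deltas, taus):
            """Toy conditional density p(y | x) as a finite mixture of normal
            regressions: component k has mean alphas[k] + betas[k] * x, log
            standard deviation taus[k] + deltas[k] * x, and a multinomial-logit
            mixing weight with logit gammas[k] * x.  Illustrative only."""
            logits = gammas * x                       # covariate-dependent logits
            weights = np.exp(logits - logits.max())
            weights /= weights.sum()                  # multinomial-logit mixing probabilities
            means = alphas + betas * x                # component means depend on x
            sds = np.exp(taus + deltas * x)           # component scales depend on x
            return float(np.sum(weights * norm.pdf(y, loc=means, scale=sds)))

        # Evaluate the mixture density at y = 0.5 given covariate x = 1.0.
        p = smooth_mixture_density(0.5, 1.0,
                                   alphas=np.array([-1.0, 0.0, 1.0]),
                                   betas=np.array([0.5, -0.2, 0.1]),
                                   gammas=np.array([0.3, 0.0, -0.3]),
                                   deltas=np.array([0.1, 0.0, -0.1]),
                                   taus=np.array([-0.5, -0.5, -0.5]))
        print(p)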

    Continuity and differentiability of expected value functions in dynamic discrete choice models


    Posterior Consistency in Conditional Density Estimation by Covariate Dependent Mixtures

    This paper considers Bayesian nonparametric estimation of conditional densities by countable mixtures of location-scale densities with covariate-dependent mixing probabilities. The mixing probabilities are modeled in two ways. First, we consider finite covariate-dependent mixture models, in which the mixing probabilities are proportional to a product of a constant and a kernel and a prior on the number of mixture components is specified. Second, we consider kernel stick-breaking processes for modeling the mixing probabilities. We show that the posterior in these two models is weakly and strongly consistent for a large class of data-generating processes. Keywords: Bayesian nonparametrics, posterior consistency, conditional density estimation, mixtures of normal distributions, location-scale mixtures, smoothly mixing regressions, mixtures of experts, dependent Dirichlet process, kernel stick-breaking process.
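
    A minimal sketch of the second device, the kernel stick-breaking construction of covariate-dependent mixing probabilities, is given below. The Gaussian kernel, the stick locations, and the stick lengths are hypothetical choices made only to show how the weights vary with the covariate; they are not the paper's specification.

        import numpy as np

        def kernel_stick_breaking_weights(x, locations, bandwidth, sticks):
            """Toy truncated kernel stick-breaking weights:
            pi_k(x) = V_k K(x, mu_k) * prod_{j<k} (1 - V_j K(x, mu_j)),
            with a Gaussian kernel K.  Illustrative only."""
            kern = np.exp(-0.5 * ((x - locations) / bandwidth) ** 2)   # kernel evaluated at x
            breaks = sticks * kern                                     # covariate-dependent breaks
            remaining = np.concatenate(([1.0], np.cumprod(1.0 - breaks[:-1])))
            return breaks * remaining

        w = kernel_stick_breaking_weights(
            x=0.3,
            locations=np.array([-1.0, 0.0, 0.5, 1.5]),
            bandwidth=0.75,
            sticks=np.array([0.6, 0.4, 0.7, 0.5]),
        )
        # Truncated weights sum to less than one; the remaining mass belongs
        # to components further down the stick.
        print(w, w.sum())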

    Semiparametric Bayesian Estimation of Dynamic Discrete Choice Models

    We propose a tractable semiparametric estimation method for dynamic discrete choice models. The distribution of additive utility shocks is modeled by location-scale mixtures of extreme value distributions with varying numbers of mixture components. Our approach exploits the analytical tractability of extreme value distributions and the flexibility of location-scale mixtures. We implement the Bayesian approach to inference using Hamiltonian Monte Carlo and an approximately optimal reversible jump algorithm. For the binary dynamic choice model, our approach delivers estimation results consistent with the previous literature. We also apply the proposed method to multinomial choice models, for which the previous literature does not provide tractable estimation methods in general settings without distributional assumptions on the utility shocks. In our simulation experiments, we show that the standard dynamic logit model can deliver misleading results, especially about counterfactuals, when the shocks are not extreme value distributed. Our semiparametric approach delivers reliable inference in these settings. We develop theoretical results on approximation by location-scale mixtures in an appropriate distance and on posterior concentration for the set-identified utility parameters and the distribution of the shocks in the model.
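
    To make the modeling device concrete, the sketch below evaluates the density of an additive utility shock as a finite location-scale mixture of type-I extreme value (Gumbel) distributions. The weights, locations, and scales are hypothetical values chosen for illustration, not estimates from the paper.

        import numpy as np
        from scipy.stats import gumbel_r

        def ev_mixture_density(e, weights, locations, scales):
            """Toy shock density: a finite location-scale mixture of type-I
            extreme value (Gumbel) distributions.  Illustrative parameters."""
            weights = np.asarray(weights, dtype=float)
            weights = weights / weights.sum()
            comps = [w * gumbel_r.pdf(e, loc=m, scale=s)
                     for w, m, s in zip(weights, locations, scales)]
            return float(np.sum(comps))

        # A two-component mixture can be far from the standard extreme value
        # distribution assumed by the dynamic logit model.
        print(ev_mixture_density(0.2, weights=[0.7, 0.3],
                                 locations=[0.0, 1.5], scales=[1.0, 0.4]))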

    Adaptive Bayesian Estimation of Mixed Discrete-Continuous Distributions under Smoothness and Sparsity

    We consider nonparametric estimation of a mixed discrete-continuous distribution under anisotropic smoothness conditions and a possibly increasing number of support points for the discrete part of the distribution. For these settings, we derive lower bounds on the estimation rates in the total variation distance. Next, we consider a nonparametric mixture of normals model that uses continuous latent variables for the discrete part of the observations. We show that the posterior in this model contracts at rates equal to the derived lower bounds up to a log factor. Thus, Bayesian mixture of normals models can be used for optimal adaptive estimation of mixed discrete-continuous distributions.
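
    One way to read the latent-variable device, under our own illustrative assumptions, is sketched below: draws from a bivariate normal mixture supply both coordinates, and the discrete coordinate of the observation is obtained by rounding the corresponding latent continuous draw to a finite support. The component parameters and the rounding rule are hypothetical stand-ins, not the paper's construction.

        import numpy as np

        rng = np.random.default_rng(0)

        # Two-component bivariate normal mixture for (latent-discrete, continuous).
        weights = np.array([0.4, 0.6])
        means = np.array([[0.0, -1.0], [2.0, 1.0]])
        sds = np.array([[0.6, 0.8], [0.6, 0.5]])

        k = rng.choice(2, size=1000, p=weights)        # component labels
        latent = rng.normal(means[k], sds[k])          # latent continuous draws
        y_disc = np.clip(np.rint(latent[:, 0]), 0, 3)  # discrete part: rounded to support {0, ..., 3}
        y_cont = latent[:, 1]                          # continuous part kept as is

        print(np.unique(y_disc, return_counts=True))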

    Inference in Dynamic Discrete Choice Models With Serially Correlated Unobserved State Variables

    This paper develops a method for inference in dynamic discrete choice models with serially correlated unobserved state variables. Estimation of these models involves computing high-dimensional integrals that are present in the solution to the dynamic program and in the likelihood function. First, the paper proposes a Bayesian Markov chain Monte Carlo estimation procedure that can handle the problem of multidimensional integration in the likelihood function. Second, the paper presents an efficient algorithm for solving the dynamic program suitable for use in conjunction with the proposed estimation procedure. Copyright 2009 The Econometric Society.
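
    For context, the sketch below shows the dynamic-programming ingredient in its simplest baseline form, assuming i.i.d. type-I extreme value shocks so that the expected value function has the closed-form log-sum-exp (Emax) expression. The utilities, transition matrices, and discount factor are hypothetical; the paper's setting with serially correlated unobserved state variables is precisely the case in which this closed form is unavailable and high-dimensional integration is required instead.

        import numpy as np

        euler_gamma = 0.5772156649015329
        beta = 0.95                                # discount factor (hypothetical)
        u = np.array([[1.0, 0.5, 0.0],             # flow utility of choice 0 by observed state
                      [0.2, 0.8, 1.2]])            # flow utility of choice 1 by observed state
        P = np.array([                             # state transition matrix for each choice
            [[0.7, 0.2, 0.1], [0.3, 0.5, 0.2], [0.1, 0.3, 0.6]],
            [[0.5, 0.4, 0.1], [0.2, 0.6, 0.2], [0.1, 0.2, 0.7]],
        ])

        ev = np.zeros(3)                           # expected value function over observed states
        for _ in range(1000):
            v = u + beta * (P @ ev)                # choice-specific values, shape (choices, states)
            ev_new = euler_gamma + np.log(np.exp(v).sum(axis=0))   # Emax over choices
            if np.max(np.abs(ev_new - ev)) < 1e-12:
                break
            ev = ev_new

        ccp = np.exp(v) / np.exp(v).sum(axis=0)    # conditional choice probabilities
        print(ev, ccp)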