
    A hybrid sampler for Poisson-Kingman mixture models

    This paper introduces a new Markov chain Monte Carlo scheme for posterior sampling in Bayesian nonparametric mixture models with priors belonging to the general Poisson-Kingman class. We present a novel, compact way of representing the infinite-dimensional component of the model: the infinite component is still represented explicitly, yet the scheme has lower memory and storage requirements than previous MCMC schemes. We describe comparative simulation results demonstrating the efficacy of the proposed MCMC algorithm against existing marginal and conditional MCMC samplers.
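    As a point of reference for the infinite-dimensional component that conditional samplers represent explicitly, the sketch below draws a truncated stick-breaking representation of a Pitman-Yor mixture, one member of the Poisson-Kingman class. The truncation level, hyperparameters, and Gaussian kernel are illustrative assumptions; this is not the paper's hybrid scheme.

```python
import numpy as np

def pitman_yor_stick_breaking(alpha, sigma, n_atoms, rng):
    """Truncated stick-breaking weights for a Pitman-Yor process
    PY(sigma, alpha), a special case of the Poisson-Kingman class."""
    # V_k ~ Beta(1 - sigma, alpha + k * sigma), k = 1, ..., n_atoms
    betas = rng.beta(1.0 - sigma, alpha + sigma * np.arange(1, n_atoms + 1))
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas[:-1])))
    return betas * remaining

rng = np.random.default_rng(0)
weights = pitman_yor_stick_breaking(alpha=1.0, sigma=0.25, n_atoms=50, rng=rng)
atoms = rng.normal(0.0, 5.0, size=50)            # component locations (base-measure draws)
z = rng.choice(50, size=200, p=weights / weights.sum())
data = rng.normal(atoms[z], 1.0)                 # draws from the resulting mixture
```

    Truncating at a fixed number of atoms is exactly the kind of explicit representation whose memory cost the paper's compact representation aims to reduce.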

    Bayesian nonparametric Plackett-Luce models for the analysis of preferences for college degree programmes

    In this paper we propose a Bayesian nonparametric model for clustering partial ranking data. We start by developing a Bayesian nonparametric extension of the popular Plackett-Luce choice model that can handle an infinite number of choice items. Our framework is based on the theory of random atomic measures, with the prior specified by a completely random measure. We characterise the posterior distribution given data, and derive a simple and effective Gibbs sampler for posterior simulation. We then develop a Dirichlet process mixture extension of our model and apply it to investigate the clustering of preferences for college degree programmes amongst Irish secondary school graduates. The existence of clusters of applicants who have similar preferences for degree programmes is established, and we determine that subject matter and geographical location of the third-level institution characterise these clusters. Comment: Published at http://dx.doi.org/10.1214/14-AOAS717 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org).
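    For readers less familiar with the finite Plackett-Luce model that the paper extends, the sketch below generates a top-m partial ranking by repeatedly picking items with probability proportional to their positive weights, without replacement. The weights and ranking length are illustrative assumptions; the nonparametric extension and the Gibbs sampler of the paper are not reproduced here.

```python
import numpy as np

def sample_partial_ranking(weights, top_m, rng):
    """Draw a length-`top_m` partial ranking from a finite Plackett-Luce
    model: at each stage, choose among the remaining items with
    probability proportional to their weights."""
    items = np.arange(len(weights))
    w = np.asarray(weights, dtype=float).copy()
    ranking = []
    for _ in range(top_m):
        p = w / w.sum()
        pick = rng.choice(items, p=p)
        ranking.append(int(pick))
        w[pick] = 0.0            # a chosen item cannot be picked again
    return ranking

rng = np.random.default_rng(1)
print(sample_partial_ranking([4.0, 2.0, 1.0, 0.5], top_m=3, rng=rng))
```

    The corresponding likelihood of an observed partial ranking is the product, over stages, of the chosen item's weight divided by the total weight of the items still available at that stage.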

    Consistency of Markov chain quasi-Monte Carlo on continuous state spaces

    The random numbers driving Markov chain Monte Carlo (MCMC) simulation are usually modeled as independent U(0,1) random variables. Tribble [Markov chain Monte Carlo algorithms using completely uniformly distributed driving sequences (2007) Stanford Univ.] reports substantial improvements when those random numbers are replaced by carefully balanced inputs from completely uniformly distributed sequences. The previous theoretical justification for using anything other than i.i.d. U(0,1) points shows consistency for estimated means, but only applies for discrete stationary distributions. We extend those results to some MCMC algorithms for continuous stationary distributions. The main motivation is the search for quasi-Monte Carlo versions of MCMC. As a side benefit, the results also establish consistency for the usual method of using pseudo-random numbers in place of random ones. Comment: Published at http://dx.doi.org/10.1214/10-AOS831 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
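    To make the idea of a driving sequence concrete, the sketch below runs a random-walk Metropolis chain that consumes its uniforms from an iterator, so i.i.d. U(0,1) draws can be swapped for a small full-period linear congruential generator run once through its period, one of the constructions studied in this line of work. The target, proposal step, and LCG parameters are illustrative assumptions, not a certified completely uniformly distributed sequence.

```python
import numpy as np
from scipy.stats import norm

def lcg_uniforms(m=2**10, a=1021, c=1, seed=1):
    """Full-period LCG run once through its period; small generators of
    this kind are one way to build balanced driving sequences
    (these parameters are illustrative only)."""
    x = seed
    for _ in range(m):
        x = (a * x + c) % m
        yield (x + 0.5) / m

def rw_metropolis(log_target, u_stream, x0=0.0, step=1.0):
    """Random-walk Metropolis where each update consumes two driving
    uniforms: one for the proposal, one for the accept/reject decision."""
    xs, x = [], x0
    it = iter(u_stream)
    for u_prop, u_acc in zip(it, it):          # pairs of driving numbers
        prop = x + step * norm.ppf(u_prop)     # proposal via inverse CDF
        if np.log(u_acc) < log_target(prop) - log_target(x):
            x = prop
        xs.append(x)
    return np.array(xs)

log_target = lambda x: -0.5 * x * x            # standard normal target
chain = rw_metropolis(log_target, lcg_uniforms())
print(chain.mean())
```

    Replacing `lcg_uniforms()` with `iter(np.random.default_rng(0).uniform(size=1024))` recovers the usual i.i.d.-driven sampler, which is exactly the comparison the consistency results in the paper address.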