
    Informative Priors for the Consensus Ranking in the Bayesian Mallows Model

    The aim of this work is to study the problem of prior elicitation for the consensus ranking in the Mallows model with Spearman’s distance, a popular distance-based model for ranking or permutation data. Previous Bayesian inference for this model has been limited to the uniform prior over the space of permutations. We present a novel strategy to elicit informative prior beliefs on the location parameter of the model, discussing the interpretation of the hyper-parameters and the implications of prior choices for the posterior analysis.
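As background for the abstract above, the Mallows model assigns probability proportional to exp(-θ·d(r, ρ)), where ρ is the consensus (location) ranking and d is, here, Spearman’s distance (the sum of squared rank differences). The following sketch, not taken from the paper, illustrates this density on a small item set, computing the normalizing constant by brute-force enumeration (feasible only for small n; the function names are illustrative):

```python
# Illustrative sketch of the Mallows model with Spearman distance.
# All names here are hypothetical; the normalizing constant is computed
# by brute force, which is only feasible for a handful of items.
from itertools import permutations
import math

def spearman_distance(r, rho):
    # Spearman distance: sum of squared differences between rank vectors.
    return sum((a - b) ** 2 for a, b in zip(r, rho))

def mallows_pmf(r, rho, theta):
    # P(r | rho, theta) = exp(-theta * d(r, rho)) / Z(theta),
    # with Z(theta) obtained by summing over all n! permutations.
    n = len(rho)
    Z = sum(math.exp(-theta * spearman_distance(s, rho))
            for s in permutations(range(1, n + 1)))
    return math.exp(-theta * spearman_distance(r, rho)) / Z

rho = (1, 2, 3, 4)                 # consensus ranking of 4 items
p = mallows_pmf(rho, rho, 0.5)     # the consensus itself is the modal ranking
```

Because Spearman’s distance is right-invariant, Z depends only on θ and n, not on ρ; this is the quantity whose closed form is unavailable for Spearman’s distance, motivating the approximations discussed in the second abstract.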

    Efficient and accurate inference for mixtures of Mallows models with Spearman distance

    The Mallows model occupies a central role in parametric modelling of ranking data to learn the preferences of a population of judges. Despite the wide range of metrics for rankings that can be considered in the model specification, the choice is typically limited to the Kendall, Cayley, or Hamming distances, due to the closed-form expression of the related model normalizing constant. This work instead focuses on the Mallows model with Spearman distance. An efficient and accurate EM algorithm for estimating finite mixtures of Mallows models with Spearman distance is developed, relying on a twofold data augmentation strategy aimed at i) enlarging the applicability of Mallows models to samples drawn from heterogeneous populations; ii) dealing with partial rankings affected by diverse forms of censoring. Additionally, a novel approximation of the model normalizing constant is introduced to support the challenging model-based clustering of rankings with a large number of items. The inferential ability of the EM scheme and the effectiveness of the approximation are assessed by extensive simulation studies. Finally, we show that the application to three real-world datasets supports our proposals, also in comparison with competing mixtures of ranking models.
    Comment: 20 pages, 6 figures, 11 tables
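To make the mixture setting above concrete, the E-step of a standard EM scheme for a finite Mallows mixture computes each ranking’s posterior membership probabilities (responsibilities) across components. The sketch below is an illustrative textbook-style E-step, not the paper’s algorithm: it uses brute-force normalizing constants rather than the approximation the paper introduces, and all function names are hypothetical.

```python
# Illustrative E-step for a finite mixture of Mallows models with
# Spearman distance. Not the paper's EM: normalizing constants are
# computed by brute force, so this only works for small item sets.
from itertools import permutations
import math

def spearman_distance(r, rho):
    # Sum of squared differences between rank vectors.
    return sum((a - b) ** 2 for a, b in zip(r, rho))

def log_mallows(r, rho, theta):
    # Log-density of the Mallows model: -theta * d(r, rho) - log Z(theta).
    n = len(rho)
    logZ = math.log(sum(math.exp(-theta * spearman_distance(s, rho))
                        for s in permutations(range(1, n + 1))))
    return -theta * spearman_distance(r, rho) - logZ

def e_step(sample, weights, centers, thetas):
    # For each observed ranking, compute posterior probabilities of
    # component membership, normalized in a numerically stable way.
    responsibilities = []
    for r in sample:
        logp = [math.log(w) + log_mallows(r, rho, th)
                for w, rho, th in zip(weights, centers, thetas)]
        m = max(logp)                      # log-sum-exp stabilization
        unnorm = [math.exp(lp - m) for lp in logp]
        total = sum(unnorm)
        responsibilities.append([u / total for u in unnorm])
    return responsibilities
```

The subsequent M-step would re-estimate the weights, consensus rankings, and concentration parameters from these responsibilities; handling partial rankings and large n is precisely where the paper’s data augmentation and normalizing-constant approximation come in.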