
    On consistency of nonparametric normal mixtures for Bayesian density estimation.

    The past decade has seen a remarkable development in the area of Bayesian nonparametric inference both from a theoretical and applied perspective. As for the latter, the celebrated Dirichlet process has been successfully exploited within Bayesian mixture models leading to many interesting applications. As for the former, some new discrete nonparametric priors have been recently proposed in the literature: their natural use is as alternatives to the Dirichlet process in a Bayesian hierarchical model for density estimation. When using such models for concrete applications, an investigation of their statistical properties is mandatory. Among them a prominent role is to be assigned to consistency. Indeed, strong consistency of Bayesian nonparametric procedures for density estimation has been the focus of a considerable amount of research and, in particular, much attention has been devoted to the normal mixture of Dirichlet process. In this paper we improve on previous contributions by establishing strong consistency of the mixture of Dirichlet process under fairly general conditions: besides the usual Kullback–Leibler support condition, consistency is achieved by finiteness of the mean of the base measure of the Dirichlet process and an exponential decay of the prior on the standard deviation. We show that the same conditions are sufficient for mixtures based on priors more general than the Dirichlet process as well. This leads to the easy establishment of consistency for many recently proposed mixture models.
    Keywords: Bayesian nonparametrics; Density estimation; Mixture of Dirichlet process; Normal mixture model; Random discrete distribution; Strong consistency
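    The normal mixture of Dirichlet process discussed in this abstract can be illustrated with a short simulation: a density is drawn from the prior by truncating the stick-breaking representation of the Dirichlet process and mixing normal kernels over the sampled atoms. All parameter values here (concentration, base measure, kernel standard deviation, truncation level) are illustrative choices, not the paper's.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def dp_normal_mixture_density(x, alpha=1.0, trunc=50,
                                  base_loc=0.0, base_scale=2.0, sigma=0.5):
        """Draw one random density from a DP normal mixture prior,
        using a truncated stick-breaking representation."""
        # Stick-breaking weights: v_k ~ Beta(1, alpha), w_k = v_k * prod_{j<k}(1 - v_j)
        v = rng.beta(1.0, alpha, size=trunc)
        w = v * np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
        # Atom locations drawn i.i.d. from the base measure (here N(base_loc, base_scale^2))
        mu = rng.normal(base_loc, base_scale, size=trunc)
        # Mix normal kernels with a common standard deviation sigma
        dens = np.zeros_like(x, dtype=float)
        for wk, mk in zip(w, mu):
            dens += wk * np.exp(-0.5 * ((x - mk) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
        return dens

    x = np.linspace(-10.0, 10.0, 401)
    f = dp_normal_mixture_density(x)
    # f is nonnegative and integrates to about 1 (up to truncation error)
    ```

    The truncation level controls how much of the stick is discarded; for alpha = 1 the leftover mass after 50 breaks is negligible in practice.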

    Posterior analysis for some classes of nonparametric models

    Recently, James [15, 16] has derived important results for various models in Bayesian nonparametric inference. In particular, he defined a spatial version of neutral to the right processes and derived their posterior distribution. Moreover, he obtained the posterior distribution for an intensity or hazard rate modeled as a mixture under a general multiplicative intensity model. His proofs rely on the so-called Bayesian Poisson partition calculus. Here we provide new proofs based on an alternative technique.
    Keywords: Bayesian nonparametrics; Completely random measure; Hazard rate; Neutral to the right prior; Multiplicative intensity model

    Bayesian semiparametric stochastic volatility modeling

    This paper extends the existing fully parametric Bayesian literature on stochastic volatility to allow for more general return distributions. Instead of specifying a particular distribution for the return innovation, nonparametric Bayesian methods are used to flexibly model the skewness and kurtosis of the distribution, while the dynamics of volatility continue to be modeled with a parametric structure. Our semiparametric Bayesian approach provides a full characterization of parametric and distributional uncertainty. A Markov chain Monte Carlo sampling approach to estimation is presented, with theoretical and computational issues addressed for simulation from the posterior predictive distributions. The new model is assessed based on simulation evidence, an empirical example, and comparison to parametric models.
    Keywords: Dirichlet process mixture; MCMC; block sampler
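    The semiparametric structure described here, parametric log-volatility dynamics combined with a flexibly distributed return innovation, can be sketched generatively. The sketch below uses a standard AR(1) log-volatility process and substitutes a finite normal mixture for the Dirichlet process mixture innovation; all parameter values and the two-component mixture are illustrative assumptions, not the paper's specification.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def simulate_sv(T=500, mu=-1.0, phi=0.95, sigma_h=0.2,
                    weights=(0.7, 0.3), locs=(0.0, 0.0), scales=(0.8, 1.6)):
        """Simulate returns from a stochastic-volatility model whose
        innovation distribution is a finite normal mixture (a truncated
        stand-in for a Dirichlet process mixture)."""
        h = np.empty(T)
        # Initialize log-volatility from its stationary distribution
        h[0] = rng.normal(mu, sigma_h / np.sqrt(1.0 - phi**2))
        for t in range(1, T):
            # Parametric AR(1) dynamics for log-volatility
            h[t] = mu + phi * (h[t - 1] - mu) + sigma_h * rng.normal()
        # Flexible innovation: mixture of normals can capture skewness/kurtosis
        comp = rng.choice(len(weights), size=T, p=weights)
        eps = rng.normal(np.asarray(locs)[comp], np.asarray(scales)[comp])
        returns = np.exp(h / 2.0) * eps
        return returns, h

    r, h = simulate_sv()
    ```

    In the paper's fully Bayesian treatment the mixture itself is random (a Dirichlet process mixture) and is updated by MCMC; the simulation above only illustrates the generative form of the model.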

    A hybrid sampler for Poisson-Kingman mixture models

    This paper concerns the introduction of a new Markov chain Monte Carlo scheme for posterior sampling in Bayesian nonparametric mixture models with priors that belong to the general Poisson-Kingman class. We present a novel, compact way of representing the infinite-dimensional component of the model that, while representing this component explicitly, has lower memory and storage requirements than previous MCMC schemes. We describe comparative simulation results demonstrating the efficacy of the proposed MCMC algorithm against existing marginal and conditional MCMC samplers.

    Dependent Dirichlet Process Rating Model (DDP-RM)

    Typical IRT rating-scale models assume that the rating category threshold parameters are the same over examinees. However, it can be argued that many rating data sets violate this assumption. To address this practical psychometric problem, we introduce a novel, Bayesian nonparametric IRT model for rating scale items. The model is an infinite mixture of Rasch partial credit models, based on a localized Dependent Dirichlet process (DDP). The model treats the rating thresholds as the random parameters that are subject to the mixture, and has (stick-breaking) mixture weights that are covariate-dependent. Thus, the novel model allows the rating category thresholds to vary flexibly across items and examinees, and allows the distribution of the category thresholds to vary flexibly as a function of covariates. We illustrate the new model through the analysis of a simulated data set, and through the analysis of a real rating data set that is well-known in the psychometric literature. The model is shown to have better predictive-fit performance, compared to other commonly used IRT rating models.
    Comment: 2 tables and 5 figures
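    The covariate-dependent (stick-breaking) mixture weights mentioned in this abstract can be illustrated with a small sketch. Logistic stick-breaking is used here purely for illustration; the paper's exact DDP construction may differ, and the slopes and intercepts below are arbitrary assumed values.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def covariate_dependent_weights(x, trunc=10, slopes=None, intercepts=None):
        """Stick-breaking mixture weights that vary with a covariate x,
        in the spirit of dependent-DP constructions (logistic
        stick-breaking used for illustration)."""
        if slopes is None:
            slopes = rng.normal(size=trunc)
        if intercepts is None:
            intercepts = rng.normal(size=trunc)
        # Stick fractions in (0, 1), shifted by the covariate
        v = 1.0 / (1.0 + np.exp(-(intercepts + slopes * x)))
        v[-1] = 1.0  # final break takes the remaining stick, so weights sum to 1
        w = v * np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
        return w

    w = covariate_dependent_weights(0.5)
    # w is a probability vector: nonnegative entries summing to 1,
    # and it changes smoothly as the covariate x changes
    ```

    Because the stick fractions depend on x, two examinees with different covariate values receive different mixture weights, which is what lets the threshold distribution vary with covariates.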