
    From here to infinity - sparse finite versus Dirichlet process mixtures in model-based clustering

    In model-based clustering, mixture models are used to group data points into clusters. A useful concept introduced for Gaussian mixtures by Malsiner Walli et al. (2016) is that of sparse finite mixtures, where the prior on the weight distribution of a mixture with K components is chosen such that, a priori, the number of clusters in the data is random and is allowed to be smaller than K with high probability. The number of clusters is then inferred a posteriori from the data. The present paper makes the following contributions in the context of sparse finite mixture modelling. First, it illustrates that the concept of sparse finite mixtures is very generic and easily extended to clustering various types of non-Gaussian data, in particular discrete data and continuous multivariate data arising from non-Gaussian clusters. Second, sparse finite mixtures are compared to Dirichlet process mixtures with respect to their ability to identify the number of clusters. For both model classes, a random hyperprior is considered for the parameters determining the weight distribution. By suitably matching these priors, it is shown that the choice of this hyperprior is far more influential for the cluster solution than whether a sparse finite mixture or a Dirichlet process mixture is used.
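As a rough illustration of the sparse finite mixture idea (a minimal sketch, not the authors' code: it assumes univariate Gaussian components with known variance, and hyperparameter values such as e0 = 0.01 are purely illustrative), one can run a Gibbs sampler for a deliberately overfitted mixture with a symmetric Dirichlet(e0) prior on the weights and estimate the number of clusters by the posterior mode of the number of non-empty components:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated data: two well-separated Gaussian clusters.
y = np.concatenate([rng.normal(-2.0, 0.5, 100), rng.normal(2.0, 0.5, 100)])
K = 10          # deliberately overfitted number of components
e0 = 0.01       # sparse Dirichlet hyperparameter: favours empty components
sigma2 = 0.25   # component variance, assumed known for simplicity
b0, B0 = 0.0, 10.0  # prior mean and variance of the component means

mu = rng.normal(b0, np.sqrt(B0), K)
w = np.full(K, 1.0 / K)
k_plus = []     # number of non-empty components per sweep

for it in range(2000):
    # 1. Allocations: P(z_i = k) proportional to w_k * N(y_i | mu_k, sigma2).
    logp = np.log(w + 1e-300) - 0.5 * (y[:, None] - mu[None, :]) ** 2 / sigma2
    p = np.exp(logp - logp.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    z = np.array([rng.choice(K, p=row) for row in p])

    # 2. Weights: Dirichlet full conditional with occupation counts.
    counts = np.bincount(z, minlength=K)
    w = rng.dirichlet(e0 + counts)

    # 3. Means: conjugate Gaussian full conditionals (prior draw if empty).
    for k in range(K):
        Bk = 1.0 / (1.0 / B0 + counts[k] / sigma2)
        bk = Bk * (b0 / B0 + y[z == k].sum() / sigma2)
        mu[k] = rng.normal(bk, np.sqrt(Bk))

    if it >= 500:   # discard burn-in
        k_plus.append((counts > 0).sum())

vals, freq = np.unique(k_plus, return_counts=True)
print("posterior mode of the number of clusters:", vals[freq.argmax()])
```

Because e0 is small, superfluous components are emptied out during sampling, so the estimated number of clusters can sit well below the chosen upper bound K.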

    Nonparametric Bayesian methods for one-dimensional diffusion models

    In this paper we review recently developed methods for nonparametric Bayesian inference for one-dimensional diffusion models. We discuss different possible prior distributions, computational issues, and asymptotic results.
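A standard setting in this literature is inference for the drift b of dX_t = b(X_t) dt + dW_t. The sketch below is not taken from the paper: it assumes a unit diffusion coefficient, an Euler discretisation of the likelihood, and an illustrative truncated basis expansion of the drift with a Gaussian prior on the coefficients, under which the posterior is conjugate:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate dX_t = b(X_t) dt + dW_t by Euler-Maruyama; true drift b(x) = -x.
dt, N = 0.01, 50_000
x = np.empty(N + 1)
x[0] = 0.0
for i in range(N):
    x[i + 1] = x[i] - x[i] * dt + np.sqrt(dt) * rng.standard_normal()

# Prior: b(x) = sum_j theta_j phi_j(x) with Gaussian-bump basis functions
# and theta ~ N(0, s2 * I). Under the Euler scheme the increments satisfy
# x[i+1] - x[i] ~ N(b(x[i]) * dt, dt), so the posterior for theta is conjugate.
centers = np.linspace(-3.0, 3.0, 12)

def phi(u):
    """Design matrix of Gaussian bumps evaluated at the points u."""
    return np.exp(-0.5 * (u[:, None] - centers[None, :]) ** 2)

Phi = phi(x[:-1])
s2 = 10.0
A = dt * Phi.T @ Phi + np.eye(len(centers)) / s2    # posterior precision
theta_hat = np.linalg.solve(A, Phi.T @ np.diff(x))  # posterior mean

grid = np.linspace(-2.0, 2.0, 5)
print("posterior mean drift:", np.round(phi(grid) @ theta_hat, 2))
print("true drift:          ", -grid)
```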

    MCMC methods for functions: modifying old algorithms to make them faster

    Many problems arising in applications result in the need to probe a probability distribution for functions. Examples include Bayesian nonparametric statistics and conditioned diffusion processes. Standard MCMC algorithms typically become arbitrarily slow under the mesh refinement dictated by nonparametric description of the unknown function. We describe an approach to modifying a whole range of MCMC methods which ensures that their speed of convergence is robust under mesh refinement. In the applications of interest the data are often sparse and the prior specification is an essential part of the overall modeling strategy. The algorithmic approach that we describe is applicable whenever the desired probability measure has a density with respect to a Gaussian process or Gaussian random field prior, and to some useful non-Gaussian priors constructed through random truncation. Applications are shown in density estimation, data assimilation in fluid mechanics, subsurface geophysics, and image registration. The key design principle is to formulate the MCMC method for functions. This leads to algorithms which can be implemented via minor modification of existing algorithms, yet which show enormous speed-up on a wide range of applied problems.
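A representative algorithm of this kind is the preconditioned Crank-Nicolson (pCN) proposal, which perturbs the current state with a draw from the Gaussian prior in a way that leaves the prior invariant, so the accept/reject step involves only the likelihood. The sketch below (with an illustrative grid, covariance kernel, observation setup, and step size beta) shows why the acceptance rate does not collapse as the mesh is refined:

```python
import numpy as np

rng = np.random.default_rng(0)

# Discretised Gaussian random field prior on a grid of m points in [0, 1].
m = 200
xs = np.linspace(0.0, 1.0, m)
C = np.exp(-0.5 * (xs[:, None] - xs[None, :]) ** 2 / 0.1 ** 2)
L = np.linalg.cholesky(C + 1e-6 * np.eye(m))   # jitter for stability

# Toy likelihood: noisy point observations of the unknown function u.
idx = np.array([40, 100, 160])
y_obs = np.array([0.5, -1.0, 0.8])
tau2 = 0.01

def potential(u):
    """Negative log-likelihood Phi(u); the prior never enters the ratio."""
    return 0.5 * np.sum((u[idx] - y_obs) ** 2) / tau2

beta = 0.2                       # pCN step size in (0, 1]
u = L @ rng.standard_normal(m)   # start from a prior draw
accepted = 0
for it in range(5000):
    # Prior-preserving proposal: v = sqrt(1 - beta^2) * u + beta * xi.
    xi = L @ rng.standard_normal(m)
    v = np.sqrt(1.0 - beta ** 2) * u + beta * xi
    # Accept with probability min(1, exp(Phi(u) - Phi(v))); because the
    # ratio involves only the likelihood, it stays stable as m grows.
    if np.log(rng.uniform()) < potential(u) - potential(v):
        u = v
        accepted += 1

print("acceptance rate:", accepted / 5000)
```

Doubling m changes only the size of the prior draws, not the acceptance ratio, which is the mesh-refinement robustness the abstract describes; a standard random-walk proposal, by contrast, would need its step size shrunk as m grows.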