    Dynamic density estimation with diffusive Dirichlet mixtures

    We introduce a new class of nonparametric prior distributions on the space of continuously varying densities, induced by Dirichlet process mixtures which diffuse in time. These select time-indexed random functions without jumps, whose sections are continuous or discrete distributions depending on the choice of kernel. The construction exploits the widely used stick-breaking representation of the Dirichlet process and induces the time dependence by replacing the stick-breaking components with one-dimensional Wright-Fisher diffusions. These features combine appealing properties of the model, inherited from the Wright-Fisher diffusions and the Dirichlet mixture structure, with great flexibility and tractability for posterior computation. The construction can be easily extended to multi-parameter GEM marginal states, which include, for example, the Pitman-Yor process. A full inferential strategy is detailed and illustrated on simulated and real data.

    Comment: Published at http://dx.doi.org/10.3150/14-BEJ681 in Bernoulli (http://isi.cbs.nl/bernoulli/) by the International Statistical Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm)
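    The core construction described above, replacing the Beta(1, alpha) stick-breaking proportions of the Dirichlet process with one-dimensional Wright-Fisher diffusions whose stationary law is Beta(1, alpha), can be sketched as follows. This is a minimal illustrative simulation, not the paper's inferential algorithm; the parameter names (`alpha`, `n_sticks`, `dt`) and the Euler-Maruyama discretization are assumptions made for the sketch.

```python
import numpy as np

def wf_sticks(n_sticks=10, alpha=2.0, T=1.0, dt=0.01, seed=0):
    """Simulate stick proportions V_i(t), each an independent Wright-Fisher
    diffusion dV = 0.5*((1-V) - alpha*V) dt + sqrt(V(1-V)) dW, whose
    stationary distribution Beta(1, alpha) matches the Dirichlet process."""
    rng = np.random.default_rng(seed)
    steps = int(T / dt)
    V = rng.beta(1.0, alpha, size=n_sticks)      # start at stationarity
    path = np.empty((steps + 1, n_sticks))
    path[0] = V
    for t in range(steps):
        drift = 0.5 * ((1.0 - V) - alpha * V)
        diff = np.sqrt(np.clip(V * (1.0 - V), 0.0, None))
        V = V + drift * dt + diff * np.sqrt(dt) * rng.standard_normal(n_sticks)
        V = np.clip(V, 1e-6, 1 - 1e-6)           # keep paths inside (0, 1)
        path[t + 1] = V
    return path

def weights(V):
    """Stick-breaking map: w_i = V_i * prod_{j<i} (1 - V_j)."""
    cum = np.concatenate([[1.0], np.cumprod(1.0 - V[:-1])])
    return V * cum

path = wf_sticks()
w_start, w_end = weights(path[0]), weights(path[-1])
```

    Because each V_i moves continuously in t, the induced mixture weights (and hence the random density, once kernels are attached to atoms) vary continuously in time, which is the "no jumps" property the abstract highlights.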

    Dynamic Clustering via Asymptotics of the Dependent Dirichlet Process Mixture

    This paper presents a novel algorithm, based upon the dependent Dirichlet process mixture model (DDPMM), for clustering batch-sequential data containing an unknown number of evolving clusters. The algorithm is derived via a low-variance asymptotic analysis of the Gibbs sampling algorithm for the DDPMM, and provides a hard clustering with convergence guarantees similar to those of the k-means algorithm. Empirical results from a synthetic test with moving Gaussian clusters and a test with real ADS-B aircraft trajectory data demonstrate that the algorithm requires orders of magnitude less computational time than contemporary probabilistic and hard clustering algorithms, while providing higher accuracy on the examined datasets.

    Comment: This paper is from NIPS 2013. Please use the following BibTeX citation: @inproceedings{Campbell13_NIPS, Author = {Trevor Campbell and Miao Liu and Brian Kulis and Jonathan P. How and Lawrence Carin}, Title = {Dynamic Clustering via Asymptotics of the Dependent Dirichlet Process}, Booktitle = {Advances in Neural Information Processing Systems (NIPS)}, Year = {2013}}
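    The low-variance-asymptotics idea underlying this paper can be illustrated on the static case: as the mixture-component variance shrinks, Gibbs sampling for a DP mixture collapses to a k-means-like hard assignment with a penalty `lam` for opening a new cluster (the DP-means objective of Kulis and Jordan). The sketch below shows only this static collapsed step; the paper's dynamic algorithm additionally carries clusters across batches, which is not reproduced here, and the penalty value and data are illustrative assumptions.

```python
import numpy as np

def dp_means(X, lam, n_iter=20):
    """Hard clustering with a cluster-creation penalty lam (squared distance)."""
    centers = [X[0].copy()]
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        # Assignment step: nearest center, or open a new cluster if all
        # squared distances exceed the penalty lam.
        new_labels = []
        for x in X:
            d2 = [np.sum((x - c) ** 2) for c in centers]
            j = int(np.argmin(d2))
            if d2[j] > lam:
                centers.append(x.copy())
                j = len(centers) - 1
            new_labels.append(j)
        labels = np.array(new_labels)
        # Update step: each center becomes the mean of its assigned points.
        centers = [X[labels == k].mean(axis=0) for k in range(len(centers))]
    return labels, centers

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.1, (30, 2)),   # tight cluster near the origin
               rng.normal(5, 0.1, (30, 2))])  # tight cluster near (5, 5)
labels, centers = dp_means(X, lam=1.0)
```

    On this well-separated example the procedure discovers the number of clusters (two) rather than requiring it as input, which is the key behavior inherited from the Dirichlet process prior.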