Construction of Dependent Dirichlet Processes Based on Poisson Processes
We present a method for constructing dependent Dirichlet processes. The new approach
exploits the intrinsic relationship between Dirichlet and Poisson processes
in order to create a Markov chain of Dirichlet processes suitable for use as a prior
over evolving mixture models. The method allows for the creation, removal, and
location variation of component models over time while maintaining the property
that the random measures are marginally DP distributed. Additionally, we derive
a Gibbs sampling algorithm for model inference and test it on both synthetic and
real data. Empirical results demonstrate that the approach is effective in estimating
dynamically varying mixture models.
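As background to constructions like the one above, a draw from a single Dirichlet process can be approximated by truncated stick-breaking. The sketch below is illustrative only; it shows the marginal DP representation, not the paper's Poisson-process-based Markov chain:

```python
import numpy as np

def sample_dp_stick_breaking(alpha, base_sampler, truncation=100, rng=None):
    """Approximate draw from a Dirichlet process via truncated stick-breaking:
    v_k ~ Beta(1, alpha), weights w_k = v_k * prod_{j<k}(1 - v_j),
    atoms drawn i.i.d. from the base measure."""
    rng = rng or np.random.default_rng()
    v = rng.beta(1.0, alpha, size=truncation)
    # prod_{j<k}(1 - v_j), with an empty product of 1 for the first stick
    w = v * np.cumprod(np.concatenate(([1.0], 1.0 - v[:-1])))
    atoms = base_sampler(truncation, rng)
    return w, atoms

rng = np.random.default_rng(0)
w, atoms = sample_dp_stick_breaking(5.0, lambda n, r: r.normal(size=n), rng=rng)
```

With a large truncation level the weights sum to nearly 1, so the discarded tail mass is negligible.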
Inverse clustering of Gibbs Partitions via independent fragmentation and dual dependent coagulation operators
Gibbs partitions of the integers generated by stable subordinators of index
α ∈ (0, 1) form remarkable classes of random partitions about which much is
known in principle, including the practically effortless derivation of
otherwise complex asymptotic results potentially relevant to applications in
general combinatorial stochastic processes, random
tree/graph growth models and Bayesian statistics. This class includes the
well-known models based on the two-parameter Poisson-Dirichlet distribution
which forms the bulk of explicit applications. This work continues efforts to
provide interpretations for a larger class of Gibbs partitions by embedding
important operations within this framework. Here we address the formidable
problem of extending the dual, infinite-block, coagulation/fragmentation
results of Jim Pitman (1999, Annals of Probability), where in terms of
coagulation they are based on independent two-parameter Poisson-Dirichlet
distributions, to all such Gibbs (stable Poisson-Kingman) models. Our results
create nested families of Gibbs partitions, and corresponding mass partitions,
over any such model. We primarily focus on the fragmentation
operations, which remain independent in this setting, and corresponding
remarkable calculations for Gibbs partitions derived from that operation. We
also present definitive results for the dual coagulation operations, now based
on our construction of dependent processes, and demonstrate their relatively
simple application in terms of Mittag-Leffler and generalized gamma models. The
latter demonstrates another approach to recover the duality results in Pitman
(1999).
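The two-parameter Poisson-Dirichlet models mentioned above induce exchangeable partitions that can be sampled directly via the two-parameter Chinese restaurant process. The sketch below shows that standard sampler, not the paper's coagulation/fragmentation machinery:

```python
import numpy as np

def crp_two_parameter(n, alpha, theta, rng=None):
    """Sample a partition of {0,...,n-1} from the two-parameter (alpha, theta)
    Chinese restaurant process: customer i joins an existing block of size m
    with probability (m - alpha)/(i + theta), and opens a new block with
    probability (theta + alpha*k)/(i + theta), k = current number of blocks."""
    rng = rng or np.random.default_rng()
    blocks = [[0]]
    for i in range(1, n):
        k = len(blocks)
        probs = np.array([len(b) - alpha for b in blocks] + [theta + alpha * k])
        probs /= i + theta
        j = rng.choice(k + 1, p=probs)
        if j == k:
            blocks.append([i])
        else:
            blocks[j].append(i)
    return blocks

blocks = crp_two_parameter(500, 0.5, 1.0, np.random.default_rng(42))
```

For index alpha in (0, 1) the number of blocks grows like n**alpha, one of the "effortless" asymptotic facts available for this class.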
A unifying representation for a class of dependent random measures
We present a general construction for dependent random measures based on
thinning Poisson processes on an augmented space. The framework is not
restricted to dependent versions of a specific nonparametric model, but can be
applied to all models that can be represented using completely random measures.
Several existing dependent random measures can be seen as specific cases of
this framework. Interesting properties of the resulting measures are derived
and the efficacy of the framework is demonstrated by constructing a
covariate-dependent latent feature model and a topic model that obtain superior
predictive performance.
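A minimal sketch of the thinning idea: atoms of one Poisson process on an augmented space are shared across covariate values, each of which keeps an atom with a covariate-dependent probability. The squared-exponential thinning kernel and the finite atom approximation below are illustrative choices, not the paper's:

```python
import numpy as np

def dependent_crms_by_thinning(n_atoms, covariates, rng=None):
    """Draw atoms (weight, location, auxiliary coordinate) of a single Poisson
    process on an augmented space, then build one random measure per covariate
    value by keeping each atom with a probability that decays with the distance
    between the covariate and the atom's auxiliary coordinate."""
    rng = rng or np.random.default_rng()
    weights = rng.gamma(0.5, 1.0, size=n_atoms)  # stand-in for CRM jump sizes
    locs = rng.normal(size=n_atoms)              # atom locations
    aux = rng.uniform(0, 1, size=n_atoms)        # augmented-space coordinate
    measures = {}
    for x in covariates:
        keep_prob = np.exp(-5.0 * (x - aux) ** 2)  # assumed thinning kernel
        keep = rng.uniform(size=n_atoms) < keep_prob
        measures[x] = (weights[keep], locs[keep])
    return measures

ms = dependent_crms_by_thinning(200, [0.2, 0.25, 0.9], np.random.default_rng(1))
```

Nearby covariate values retain overlapping subsets of the shared atoms, which is what makes the resulting random measures dependent.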
Beta-Product Poisson-Dirichlet Processes
Time series data may exhibit clustering over time and, in a multiple time
series context, the clustering behavior may differ across the series. This
paper is motivated by the Bayesian nonparametric modeling of the dependence
between the clustering structures and the distributions of different time
series. We follow a Dirichlet process mixture approach and introduce a new
class of multivariate dependent Dirichlet processes (DDPs). The proposed DDPs
are represented in terms of a vector of stick-breaking processes with dependent
weights. The weights are beta random vectors that determine different and
dependent clustering effects along the dimension of the DDP vector. We discuss
some theoretical properties and provide an efficient Markov chain Monte Carlo
algorithm for posterior computation. The effectiveness of the method is
illustrated with a simulation study and an application to the United States and
the European Union industrial production indexes.
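One hedged way to obtain dependent stick-breaking weight vectors with the correct beta marginals is to share a common beta factor per stick across series, using the standard beta-product identity (if U ~ Beta(a, b) and E ~ Beta(a+b, c) independently, then U*E ~ Beta(a, b+c)). This is an illustrative scheme, not necessarily the paper's exact construction:

```python
import numpy as np

def dependent_stick_weights(alpha, b_shared, n_sticks, n_series, rng=None):
    """Stick-breaking weight vectors for several series with Beta(1, alpha)
    marginal sticks, made dependent by a shared beta factor per stick.
    Beta-product identity with a = 1, b = b_shared, c = alpha - b_shared
    guarantees each v is marginally Beta(1, alpha)."""
    assert 0.0 < b_shared < alpha
    rng = rng or np.random.default_rng()
    u = rng.beta(1.0, b_shared, size=n_sticks)  # shared across all series
    e = rng.beta(1.0 + b_shared, alpha - b_shared, size=(n_series, n_sticks))
    v = u * e                                   # marginally Beta(1, alpha)
    # convert sticks to weights: w_k = v_k * prod_{j<k}(1 - v_j), per series
    pad = np.hstack([np.ones((n_series, 1)), 1.0 - v[:, :-1]])
    return v * np.cumprod(pad, axis=1)

w = dependent_stick_weights(3.0, 1.5, 50, 2, np.random.default_rng(7))
```

The shared factor u couples the clustering structures of the series while each series alone still looks like a Dirichlet process.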
Dynamic density estimation with diffusive Dirichlet mixtures
We introduce a new class of nonparametric prior distributions on the space of
continuously varying densities, induced by Dirichlet process mixtures which
diffuse in time. These select time-indexed random functions without jumps,
whose sections are continuous or discrete distributions depending on the choice
of kernel. The construction exploits the widely used stick-breaking
representation of the Dirichlet process and induces the time dependence by
replacing the stick-breaking components with one-dimensional Wright-Fisher
diffusions. These features combine appealing properties of the model, inherited
from the Wright-Fisher diffusions and the Dirichlet mixture structure, with
great flexibility and tractability for posterior computation. The construction
can be easily extended to multi-parameter GEM marginal states, which include,
for example, the Pitman--Yor process. A full inferential strategy is detailed
and illustrated on simulated and real data. (Comment: Published at
http://dx.doi.org/10.3150/14-BEJ681 in the Bernoulli
(http://isi.cbs.nl/bernoulli/) by the International Statistical
Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm).)
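The time dependence described above replaces each Beta(1, theta) stick variable with a stationary one-dimensional Wright-Fisher diffusion. A crude Euler-Maruyama sketch of one such path follows; the step size and the clipping to (0, 1) are implementation choices, not part of the model:

```python
import numpy as np

def wright_fisher_path(x0, theta1, theta2, n_steps, dt, rng=None):
    """Euler-Maruyama discretization of the Wright-Fisher diffusion
        dX = 0.5*(theta1*(1 - X) - theta2*X) dt + sqrt(X*(1 - X)) dW,
    whose stationary law is Beta(theta1, theta2); theta1 = 1 gives the
    Beta(1, theta2) sticks of the GEM representation."""
    rng = rng or np.random.default_rng()
    eps = 1e-6
    x = np.empty(n_steps + 1)
    x[0] = x0
    for t in range(n_steps):
        drift = 0.5 * (theta1 * (1.0 - x[t]) - theta2 * x[t])
        diff = np.sqrt(max(x[t] * (1.0 - x[t]), 0.0))
        step = drift * dt + diff * np.sqrt(dt) * rng.normal()
        x[t + 1] = np.clip(x[t] + step, eps, 1.0 - eps)  # keep path in (0, 1)
    return x

path = wright_fisher_path(0.5, 1.0, 3.0, 1000, 1e-3, np.random.default_rng(11))
```

Running one such diffusion per stick yields weights that evolve continuously in time while each time slice retains the stationary GEM marginals.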
Augment-and-Conquer Negative Binomial Processes
By developing data augmentation methods unique to the negative binomial (NB)
distribution, we unite seemingly disjoint count and mixture models under the NB
process framework. We develop fundamental properties of the models and derive
efficient Gibbs sampling inference. We show that the gamma-NB process can be
reduced to the hierarchical Dirichlet process with normalization, highlighting
its unique theoretical, structural and computational advantages. A variety of
NB processes with distinct sharing mechanisms are constructed and applied to
topic modeling, with connections to existing algorithms, showing the importance
of inferring both the NB dispersion and probability parameters. (Comment:
Neural Information Processing Systems, NIPS 2012.)
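The augmentation at the heart of such NB models is the classical gamma-Poisson mixture identity, sketched below; the parameter values are illustrative:

```python
import numpy as np

def nb_via_gamma_poisson(r, p, size, rng=None):
    """Gamma-Poisson mixture representation of the negative binomial:
    draw lambda ~ Gamma(shape=r, scale=p/(1-p)), then N | lambda ~
    Poisson(lambda); marginally N ~ NB(r, p) with mean r*p/(1-p)
    and variance r*p/(1-p)**2."""
    rng = rng or np.random.default_rng()
    lam = rng.gamma(r, p / (1.0 - p), size=size)
    return rng.poisson(lam)

draws = nb_via_gamma_poisson(5.0, 0.3, 200_000, np.random.default_rng(3))
```

Conditioning on the latent gamma rates is what turns NB likelihoods into conditionally Poisson ones, enabling the conjugate Gibbs updates the abstract refers to.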