Distance Dependent Infinite Latent Feature Models
Latent feature models are widely used to decompose data into a small number
of components. Bayesian nonparametric variants of these models, which use the
Indian buffet process (IBP) as a prior over latent features, allow the number
of features to be determined from the data. We present a generalization of the
IBP, the distance dependent Indian buffet process (dd-IBP), for modeling
non-exchangeable data. It relies on distances defined between data points,
biasing nearby data to share more features. The choice of distance measure
allows for many kinds of dependencies, including temporal and spatial. Further,
the original IBP is a special case of the dd-IBP. In this paper, we develop the
dd-IBP and theoretically characterize its feature-sharing properties. We derive
a Markov chain Monte Carlo sampler for a linear Gaussian model with a dd-IBP
prior and study its performance on several non-exchangeable data sets.
Comment: 28 pages, 9 figures
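The abstract's core idea — nearby data points are biased to share more features, with the standard IBP recovered when distances carry no information — can be illustrated with a toy generative draw. The sketch below is a deliberate simplification, not the paper's exact dd-IBP construction: each point inherits an existing feature with a probability built from an exponential-decay kernel over distances to that feature's current owners, and then creates Poisson-distributed new features as in the ordinary IBP. The function name, the kernel, and the normalisation are all illustrative assumptions.

```python
import math
import random

def decayed_ibp_sample(positions, alpha=2.0, ell=1.0, seed=0):
    """Toy simplification of a distance-dependent IBP draw.

    Each point inherits an existing feature with probability driven by an
    exponential-decay kernel over distances to the feature's owners, so
    nearby points tend to share more features; each point also creates
    Poisson(alpha / (i + 1)) brand-new features, as in the standard IBP.
    (Hypothetical sketch, not the exact dd-IBP construction.)
    """
    rng = random.Random(seed)
    Z = []       # Z[i] = set of feature indices held by point i
    owners = []  # owners[k] = indices of points holding feature k
    for i, x in enumerate(positions):
        feats = set()
        for k, own in enumerate(owners):
            # decayed "popularity" of feature k as seen from point i
            w = sum(math.exp(-abs(x - positions[j]) / ell) for j in own)
            if rng.random() < w / (i + 1):  # w <= i, so this is < 1
                feats.add(k)
                own.append(i)
        # sample Poisson(alpha / (i + 1)) new features via inverse CDF
        lam = alpha / (i + 1)
        n_new, p = 0, math.exp(-lam)
        cum, u = p, rng.random()
        while u > cum:
            n_new += 1
            p *= lam / n_new
            cum += p
        for _ in range(n_new):
            feats.add(len(owners))
            owners.append([i])
        Z.append(feats)
    return Z
```

With a large decay length `ell` every prior owner contributes weight near 1 and the inheritance probability approaches the IBP's popularity-proportional rule, mirroring the claim that the original IBP is a special case.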
The supervised IBP: neighbourhood preserving infinite latent feature models
We propose a probabilistic model to infer supervised latent variables in the Hamming space from observed data. Our model allows simultaneous inference of the number of binary latent variables, and their values. The latent variables preserve neighbourhood structure of the data in the sense that objects in the same semantic concept have similar latent values, and objects in different concepts have dissimilar latent values. We formulate the supervised infinite latent variable problem based on an intuitive principle of pulling objects together if they are of the same type, and pushing them apart if they are not. We then combine this principle with a flexible Indian Buffet Process prior on the latent variables. We show that the inferred supervised latent variables can be directly used to perform a nearest neighbour search for the purpose of retrieval. We introduce a new application of dynamically extending hash codes, and show how to effectively couple the structure of the hash codes with the continuously growing structure of the neighbourhood preserving infinite latent feature space.
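The retrieval step the abstract describes — using inferred binary latent variables directly as hash codes for nearest neighbour search in the Hamming space — reduces to ranking stored codes by Hamming distance from a query code. The snippet below sketches only that final lookup, with made-up toy codes; the inference of the codes themselves (the paper's actual contribution) is not shown, and the function names are illustrative.

```python
def hamming(a, b):
    """Hamming distance between two equal-length binary codes."""
    return sum(x != y for x, y in zip(a, b))

def nearest_neighbours(query, codes, k=2):
    """Indices of the k stored codes closest to `query` in Hamming distance."""
    ranked = sorted(range(len(codes)), key=lambda i: hamming(query, codes[i]))
    return ranked[:k]

# Toy "inferred" binary latent codes: the first two share a semantic
# concept (similar codes), the third belongs to a different concept.
codes = [
    (1, 1, 0, 0),
    (1, 1, 0, 1),
    (0, 0, 1, 1),
]
print(nearest_neighbours((1, 1, 0, 0), codes, k=2))  # → [0, 1]
```

Because same-concept objects are pulled toward similar latent values, ranking by Hamming distance returns same-concept neighbours first; dynamically extending the codes corresponds to appending new bits as new latent features are inferred.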
A unifying representation for a class of dependent random measures
We present a general construction for dependent random measures based on
thinning Poisson processes on an augmented space. The framework is not
restricted to dependent versions of a specific nonparametric model, but can be
applied to all models that can be represented using completely random measures.
Several existing dependent random measures can be seen as specific cases of
this framework. Interesting properties of the resulting measures are derived
and the efficacy of the framework is demonstrated by constructing a
covariate-dependent latent feature model and topic model that obtain superior
predictive performance.
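The construction the abstract outlines — thinning a Poisson process on an augmented space to obtain dependent random measures — can be caricatured in a few lines. In the toy sketch below (every detail is an illustrative assumption, not the paper's construction), a shared pool of weighted atoms carries an auxiliary coordinate, and each covariate value keeps an atom with a probability that decays with distance to that coordinate; nearby covariates therefore retain overlapping atom sets, making their measures dependent.

```python
import math
import random

def dependent_measures(covariates, n_atoms=50, ell=1.0, seed=0):
    """Toy sketch of covariate-dependent thinning (hypothetical details).

    Draw one shared pool of weighted atoms on an augmented space
    (weight, auxiliary coordinate), then build a random measure for each
    covariate by keeping each atom with probability exp(-|c - u| / ell).
    Nearby covariates keep overlapping subsets of the shared atoms, so
    the resulting measures are dependent.
    """
    rng = random.Random(seed)
    # shared atom pool: (weight, auxiliary coordinate)
    atoms = [(rng.expovariate(1.0), rng.uniform(0.0, 10.0))
             for _ in range(n_atoms)]
    measures = []
    for c in covariates:
        kept = [(w, u) for (w, u) in atoms
                if rng.random() < math.exp(-abs(c - u) / ell)]
        measures.append(kept)
    return measures
```

The framework's generality comes from the fact that any model built on completely random measures can be represented through such an atom pool, so the same thinning device yields, e.g., covariate-dependent latent feature models and topic models.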