Stochastic Divergence Minimization for Biterm Topic Model
With the emergence and rapid development of social networks, huge numbers of
short texts accumulate and need to be processed. Inferring the latent topics of
collected short texts is useful for understanding their hidden structure and
predicting new content. Unlike conventional topic models such as latent
Dirichlet allocation (LDA), the biterm topic model (BTM) was recently proposed
for short texts to overcome the sparseness of document-level word
co-occurrences by directly modeling the generation process of word pairs.
Stochastic inference algorithms based on collapsed Gibbs sampling (CGS) and
collapsed variational inference have been proposed for BTM. However, they
either incur high computational complexity or rely on very crude estimation.
In this work, we develop a stochastic divergence minimization inference
algorithm for BTM that estimates latent topics more accurately in a scalable
way. Experiments demonstrate the superiority of our proposed algorithm over
existing inference algorithms.
Comment: 19 pages, 4 figures
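The abstract above notes that BTM models the generation of word pairs (biterms) rather than whole documents. A minimal sketch of that generative process, with toy values for the corpus-level topic mixture `theta` and per-topic word distributions `phi` (both names and sizes are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

K, V = 3, 5                              # number of topics, vocabulary size
theta = np.array([0.5, 0.3, 0.2])        # corpus-level topic mixture (toy)
phi = rng.dirichlet(np.ones(V), size=K)  # per-topic word distributions (toy)

def draw_biterm():
    """Draw one biterm: pick a single topic for the pair, then two words
    from that topic's word distribution."""
    z = rng.choice(K, p=theta)                  # one topic for the whole pair
    w_i, w_j = rng.choice(V, size=2, p=phi[z])  # both words share topic z
    return z, (int(w_i), int(w_j))

z, biterm = draw_biterm()
```

Because both words of a biterm share one topic draw, word co-occurrence is modeled at the corpus level rather than per document, which is what sidesteps the sparsity of short texts.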
Efficient Correlated Topic Modeling with Topic Embedding
Correlated topic modeling has been limited to small model and problem sizes
due to its high computational cost and poor scaling. In this paper, we propose
a new model that learns compact topic embeddings and captures topic
correlations through the closeness between the topic vectors. Our method
enables efficient inference in the low-dimensional embedding space, reducing
the previous cubic or quadratic time complexity to linear w.r.t. the number of
topics. We further speed up variational inference with a fast sampler that
exploits the sparsity of topic occurrence. Extensive experiments show that our
approach can handle model and data scales several orders of magnitude larger
than existing correlated topic models, without sacrificing modeling quality,
providing competitive or superior performance in document classification and
retrieval.
Comment: KDD 2017 oral. The first two authors contributed equally
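The key claim above is that topic correlations can be induced by closeness of low-dimensional topic vectors, so per-document inference touches a K x d matrix (linear in K) instead of an explicit K x K covariance (quadratic or cubic). A hedged sketch of that idea; all names and sizes here are illustrative, not the paper's actual model:

```python
import numpy as np

rng = np.random.default_rng(1)
K, d = 1000, 32                           # many topics, small embedding dim
topic_vecs = rng.standard_normal((K, d))  # learned topic embeddings (toy)
doc_vec = rng.standard_normal(d)          # a document's embedding (toy)

# Document-topic affinities: a single K x d product, O(K*d) — linear in K.
logits = topic_vecs @ doc_vec
doc_topics = np.exp(logits - logits.max())
doc_topics /= doc_topics.sum()            # softmax over topics

# Correlation between any two topics is induced by their vector closeness,
# without ever materializing a K x K covariance matrix.
corr_01 = float(topic_vecs[0] @ topic_vecs[1])
```

Nearby topic vectors yield correlated affinities across documents, which is how the embedding geometry stands in for the explicit covariance of classical correlated topic models.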