Combinatorial Topic Models using Small-Variance Asymptotics
Topic models have emerged as fundamental tools in unsupervised machine
learning. Most modern topic modeling algorithms take a probabilistic view and
derive inference algorithms based on Latent Dirichlet Allocation (LDA) or its
variants. In contrast, we study topic modeling as a combinatorial optimization
problem, and propose a new objective function derived from LDA by passing to
the small-variance limit. We minimize the derived objective by using ideas from
combinatorial optimization, which results in a new, fast, and high-quality
topic modeling algorithm. In particular, we show that our results are
competitive with popular LDA-based topic modeling approaches, and also discuss
the (dis)similarities between our approach and its probabilistic counterparts.

Comment: 19 pages
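The small-variance idea described above is easiest to see in the classical analogue the paper builds on: letting the variance of a Gaussian mixture model shrink to zero turns soft probabilistic inference into the hard-assignment k-means objective. The sketch below illustrates that limit on toy "documents" represented as word-count vectors; it is not the paper's algorithm, only a minimal hypothetical example of the kind of combinatorial objective (sum of squared distances, minimized by alternating hard assignments and center updates) that such a limit produces.

```python
import numpy as np

def hard_assignment_clustering(X, k, iters=50, seed=0):
    """k-means: the hard-assignment algorithm obtained from a
    Gaussian mixture in the small-variance limit."""
    rng = np.random.default_rng(seed)
    # Initialize centers at k distinct data points (fancy indexing copies).
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # The probabilistic E-step collapses to a hard argmin assignment.
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        z = dists.argmin(axis=1)
        # M-step: each center becomes the mean of its assigned points.
        for j in range(k):
            if (z == j).any():
                centers[j] = X[z == j].mean(axis=0)
    return z, centers

# Toy "documents" as word-count vectors over a 4-word vocabulary;
# the first two use one half of the vocabulary, the last two the other.
X = np.array([[5, 4, 0, 0],
              [6, 5, 1, 0],
              [0, 1, 5, 6],
              [0, 0, 4, 5]], dtype=float)
z, centers = hard_assignment_clustering(X, k=2)
```

The paper's contribution is the analogous limit for LDA, which yields a richer combinatorial objective over topic assignments rather than plain squared distances, but the alternating hard-minimization structure is the same.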