ZenLDA: An Efficient and Scalable Topic Model Training System on Distributed Data-Parallel Platform
This paper presents our recent effort, zenLDA, an efficient and scalable
Collapsed Gibbs Sampling (CGS) system for Latent Dirichlet Allocation (LDA)
training. Training at this scale is challenging: both data parallelism and
model parallelism are required because of big sampling data with up to
billions of documents and big model sizes with up to trillions of parameters.
zenLDA combines algorithm-level improvements with system-level optimizations.
It first presents a novel CGS algorithm that balances time complexity, model
accuracy, and parallelization flexibility. The input corpus in zenLDA is
represented as a directed graph, and model parameters are annotated as the
corresponding vertex attributes. Distributed training is parallelized by
partitioning the graph: in each iteration, a CGS step is applied to all
partitions in parallel, followed by synchronization of the computed models
across partitions. In this way, both data parallelism and model parallelism
are achieved by converting them to graph parallelism. We revisit the tradeoff
between system efficiency and model accuracy and present approximations such
as an unsynchronized model, sparse model initialization, and "converged" token
exclusion. zenLDA is built on GraphX in Spark, which provides a distributed
data abstraction (RDD) and expressive APIs that simplify programming effort
while hiding system complexity. This enabled us to implement other CGS
algorithms with only a few lines of code change. To better fit the distributed
data-parallel framework and achieve performance comparable to contemporary
systems, we also present several system-level optimizations that push the
performance limit. zenLDA was evaluated against a web-scale corpus; the
results indicate that zenLDA achieves much better performance than the other
CGS algorithms we implemented, while simultaneously achieving better model
accuracy.

Comment: 11 pages, 10 figures. arXiv admin note: text overlap with
arXiv:1412.4986 by other authors
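
The graph encoding the abstract describes lends itself to a short
illustration. Below is a minimal, hypothetical sketch (not zenLDA's actual
code) of how a token corpus might be expressed as a Spark GraphX property
graph: documents and words as vertices, token occurrences as topic-annotated
edges, and one aggregateMessages pass standing in for the per-iteration model
synchronization. The toy corpus and names such as numTopics are illustrative
assumptions.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.graphx._

object LdaGraphSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("lda-graph-sketch").setMaster("local[*]")
    val sc = new SparkContext(conf)
    val numTopics = 4 // illustrative assumption, not a zenLDA default

    // Token edges: (docId, wordId, currently assigned topic).
    // Word vertex ids are offset so they do not collide with doc ids.
    val tokens = sc.parallelize(Seq(
      Edge(0L, 100L, 1), Edge(0L, 101L, 3), Edge(1L, 100L, 1)
    ))

    // Vertex attribute: a per-vertex topic-count vector, initialized to zeros.
    val graph = Graph.fromEdges(tokens, Array.fill(numTopics)(0))

    // One synchronization step: aggregate per-topic counts from edges to both
    // endpoints, analogous to recomputing the doc-topic and word-topic tables
    // after a round of sampling.
    val counts: VertexRDD[Array[Int]] = graph.aggregateMessages[Array[Int]](
      ctx => {
        val msg = Array.fill(numTopics)(0)
        msg(ctx.attr) = 1           // one token assigned to topic ctx.attr
        ctx.sendToSrc(msg)          // contributes to the document's counts
        ctx.sendToDst(msg)          // contributes to the word's counts
      },
      (a, b) => a.zip(b).map { case (x, y) => x + y }
    )

    counts.collect().foreach { case (id, cnt) =>
      println(s"vertex $id -> topic counts [${cnt.mkString(",")}]")
    }
    sc.stop()
  }
}
```

Because each partition only touches its own edges, the sampling step
parallelizes over graph partitions, which is the sense in which data and
model parallelism reduce to graph parallelism here.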
The Hitchhiker's Guide to LDA
The Latent Dirichlet Allocation (LDA) model is a well-known topic model that
has been studied for years due to its extensive application value in industry
and academia. However, the mathematical derivation of LDA is challenging,
which makes the model difficult for beginners to learn. To help beginners
learn LDA, this book analyzes the mathematical derivation of LDA in detail
and introduces all of the background knowledge needed to make it easy for
beginners to understand. The book also contains the author's own insights. It
should be noted that this book is written in Chinese.

Comment: 148 pages, in Chinese
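
For orientation, the derivation such a book works toward typically culminates
in the standard collapsed Gibbs sampling update for LDA, stated here from
common knowledge rather than from the book itself:

$$p(z_i = k \mid \mathbf{z}_{\neg i}, \mathbf{w}) \;\propto\; \left(n_{d_i,k}^{\neg i} + \alpha\right) \frac{n_{k,w_i}^{\neg i} + \beta}{n_k^{\neg i} + V\beta},$$

where $n_{d,k}$ counts the tokens in document $d$ assigned to topic $k$,
$n_{k,w}$ counts assignments of word $w$ to topic $k$, $n_k$ is the total
count for topic $k$, $V$ is the vocabulary size, $\alpha$ and $\beta$ are
symmetric Dirichlet hyperparameters, and the superscript $\neg i$ means the
current token is excluded from the counts.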