User representation learning is vital for capturing diverse user preferences,
but it is also challenging: user intents are latent and scattered across
complex, heterogeneous modalities of user-generated data, and thus not directly
measurable. Inspired by the concept of user schema in social psychology, we
take a new perspective to perform user representation learning by constructing
a shared latent space to capture the dependency among different modalities of
user-generated data. Both users and topics are embedded into the same space to
encode users' social connections and text content, enabling joint modeling of
the different modalities via a probabilistic generative framework. We evaluated
the proposed solution on large collections of Yelp reviews and StackOverflow
discussion posts, with their associated network structures. The proposed model
outperformed several state-of-the-art topic-modeling-based user models with
better predictive power on unseen documents, and state-of-the-art
network-embedding-based user models with improved link prediction quality on
unseen nodes. The learnt user representations also prove useful for content
recommendation, e.g., expert finding in StackOverflow.
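The core idea of a shared latent space can be illustrated with a minimal sketch. Everything below is a hypothetical toy setup, not the paper's actual generative framework: the dimensions, the sigmoid link score, and the softmax topic affinities are illustrative assumptions showing how one space can serve both modalities.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 4 users, 3 topics, a shared 5-dim latent space.
n_users, n_topics, dim = 4, 3, 5
user_emb = rng.normal(size=(n_users, dim))    # user embeddings
topic_emb = rng.normal(size=(n_topics, dim))  # topic embeddings

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Social modality: score the probability that users i and j are
# connected by the inner product of their embeddings.
def link_prob(i, j):
    return sigmoid(user_emb[i] @ user_emb[j])

# Text modality: a user's topic proportions, from a softmax over
# affinities between the user and each topic in the same space.
def topic_proportions(i):
    scores = topic_emb @ user_emb[i]
    e = np.exp(scores - scores.max())
    return e / e.sum()
```

Because both scores are functions of the same user embedding, fitting one modality (e.g., observed links) constrains the embedding used for the other (e.g., document topics), which is what allows the joint model to transfer signal across modalities.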