
Learning Latent Vector Spaces for Product Search

By C. Van Gysel, M. de Rijke and E. Kanoulas

Abstract

We introduce a novel latent vector space model that jointly learns the latent representations of words, e-commerce products and a mapping between the two without the need for explicit annotations. The power of the model lies in its ability to directly model the discriminative relation between products and a particular word. We compare our method to existing latent vector space models (LSI, LDA and word2vec) and evaluate it as a feature in a learning to rank setting. Our latent vector space model achieves its enhanced performance as it learns better product representations. Furthermore, the mapping from words to products and the representations of words benefit directly from the errors propagated back from the product representations during parameter estimation. We provide an in-depth analysis of the performance of our model and analyze the structure of the learned representations.
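The abstract describes the model only at a high level: latent word vectors, latent product vectors, a learned mapping from the word space to the product space, and a discriminative objective whose errors flow back into both the mapping and the word representations. The sketch below is a hypothetical illustration of that idea, not the authors' released code: the averaging of word vectors, the tanh projection, the dimensionalities, and the sampled-negative loss are all assumptions made for the sake of a concrete, runnable example.

import torch
import torch.nn as nn
import torch.nn.functional as F

class LatentVectorSpaceModel(nn.Module):
    """Illustrative joint word/product embedding model with a learned mapping
    from word space into product space (assumed architecture, not the paper's
    exact formulation)."""

    def __init__(self, vocab_size, num_products, word_dim=128, product_dim=128):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, word_dim)          # latent word representations
        self.product_emb = nn.Embedding(num_products, product_dim)  # latent product representations
        self.mapping = nn.Linear(word_dim, product_dim)             # learned mapping: words -> product space

    def project(self, word_ids):
        """Map a bag of word ids (batch, n_words) to one vector in product space."""
        avg = self.word_emb(word_ids).mean(dim=1)   # average the word vectors
        return torch.tanh(self.mapping(avg))        # non-linear projection into product space

    def forward(self, word_ids, pos_products, neg_products):
        q = self.project(word_ids)                                    # (batch, product_dim)
        pos = (q * self.product_emb(pos_products)).sum(-1)            # score of the associated product
        neg = torch.bmm(self.product_emb(neg_products),               # scores of sampled negatives
                        q.unsqueeze(-1)).squeeze(-1)
        # Discriminative objective: the associated product should outscore the negatives.
        return -(F.logsigmoid(pos) + F.logsigmoid(-neg).sum(-1)).mean()

# Toy usage: batches of 3-word phrases, each tied to one product, with 5 sampled negatives.
model = LatentVectorSpaceModel(vocab_size=10_000, num_products=500)
words = torch.randint(0, 10_000, (4, 3))
pos = torch.randint(0, 500, (4,))
neg = torch.randint(0, 500, (4, 5))
loss = model(words, pos, neg)
loss.backward()  # errors at the product scores propagate back into the mapping and the word embeddings

Minimizing this kind of loss by gradient descent mirrors the property highlighted in the abstract: because the only supervision is the match between word sequences and products, the word vectors and the mapping are shaped entirely by errors propagated back from the product representations.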

Publisher: Association for Computing Machinery (ACM)
Year: 2016
DOI identifier: 10.1145/2983323.2983702
Provided by: NARCIS
Full text available at: https://dare.uva.nl/personal/p... (external link)
