A Multimodal Recommender System for Large-scale Assortment Generation in E-commerce
E-commerce platforms surface interesting products largely through product
recommendations that capture users' styles and aesthetic preferences. Curating
recommendations as a complete complementary set, or assortment, is critical for
a successful e-commerce experience, especially for product categories such as
furniture, where items are selected together with the overall theme, style or
ambiance of a space in mind. In this paper, we propose two visually-aware
recommender systems that can automatically curate an assortment of living room
furniture around a couple of pre-selected seed pieces for the room. The first
system maximizes the visual style compatibility of the entire selection using
transfer learning and topic modeling. The second
system extends the first by incorporating text data and applying polylingual
topic modeling to infer style over both modalities. We review the production
pipeline for surfacing these visually-aware recommender systems and compare
them through offline validations and large-scale online A/B tests on Overstock.
Our experimental results show that complementary style is best discovered over
product sets when both visual and textual data are incorporated.
Comment: SIGIR eComm Accepted Paper
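
The abstract does not include an implementation, but the first system's recipe
(a pretrained CNN reused as a frozen feature extractor, with a topic model
fitted over quantized visual features to give each product a "style"
distribution) can be sketched. The following is a minimal, illustrative Python
sketch assuming torchvision and scikit-learn; catalog_paths, the cluster and
topic counts, and the one-visual-word-per-product document construction are
all hypothetical simplifications, not the paper's pipeline, and the polylingual
text extension of the second system is not reproduced here.

import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image
from sklearn.cluster import KMeans
from sklearn.decomposition import LatentDirichletAllocation

# Transfer learning: reuse a pretrained CNN as a frozen feature extractor.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()  # drop the ImageNet classifier head
backbone.eval()

preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def embed(paths):
    with torch.no_grad():
        batch = torch.stack([preprocess(Image.open(p).convert("RGB")) for p in paths])
        return backbone(batch).numpy()

catalog_paths = ["sofa.jpg", "coffee_table.jpg", "rug.jpg", "lamp.jpg"]  # hypothetical files
feats = embed(catalog_paths)

# Quantize CNN embeddings into "visual words" so a topic model can treat each
# product as a bag-of-visual-words document. (A real pipeline would use many
# regions/crops per item; one word per product is a toy shortcut.)
codebook = KMeans(n_clusters=3, n_init=10, random_state=0).fit(feats)
docs = np.zeros((len(feats), codebook.n_clusters))
docs[np.arange(len(feats)), codebook.predict(feats)] = 1

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(docs)
styles = lda.transform(docs)  # per-product style (topic) distribution

def style_compatibility(i, j):
    # Cosine similarity between two products' topic distributions.
    a, b = styles[i], styles[j]
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

# Rank candidate pieces against a seed piece, e.g. the sofa at index 0.
ranked = sorted(range(1, len(catalog_paths)), key=lambda j: -style_compatibility(0, j))
print([catalog_paths[j] for j in ranked])

Ranking by topic-distribution similarity, rather than by raw CNN-embedding
distance, is what lets "style" act as the shared latent space an assortment
is assembled around.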
"Does it come in black?" CLIP-like models are zero-shot recommenders
Product discovery is a crucial component for online shopping. However,
item-to-item recommendations today do not allow users to explore changes along
selected dimensions: given a query item, can a model suggest something similar
but in a different color? We consider item recommendations of a comparative
nature (e.g. "something darker") and show how CLIP-based models can support
this use case in a zero-shot manner. Leveraging a large model built for
fashion, we introduce GradREC, discuss its industry potential, and offer a
first rounded assessment of its strengths and weaknesses.
Comment: Accepted at ACL 2022 (ECNLP)
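
GradREC itself performs a gradient-based traversal of CLIP's similarity
landscape, which is not reproduced here. The sketch below only illustrates the
underlying zero-shot idea with the public openai/clip-vit-base-patch32
checkpoint via Hugging Face transformers, substituting a simpler
text-difference direction ("dark" minus "light") for the paper's gradient
walk; catalog_paths, the prompts, and the step size are hypothetical.

import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")
model.eval()

def image_embs(paths):
    inputs = processor(images=[Image.open(p).convert("RGB") for p in paths],
                       return_tensors="pt")
    with torch.no_grad():
        feats = model.get_image_features(**inputs)
    return feats / feats.norm(dim=-1, keepdim=True)

def text_emb(prompt):
    inputs = processor(text=[prompt], return_tensors="pt", padding=True)
    with torch.no_grad():
        feats = model.get_text_features(**inputs)
    return (feats / feats.norm(dim=-1, keepdim=True))[0]

catalog_paths = ["dress_01.jpg", "dress_02.jpg", "dress_03.jpg"]  # hypothetical files
catalog = image_embs(catalog_paths)

# A comparative "darker" axis: the vector from a light prompt to a dark one.
direction = text_emb("a photo of a dark garment") - text_emb("a photo of a light garment")
direction = direction / direction.norm()

def something_darker(query_path, step=0.4, k=3):
    # Move the query item along the "darker" axis, then retrieve neighbours.
    q = image_embs([query_path])[0]
    moved = q + step * direction
    moved = moved / moved.norm()
    scores = catalog @ moved  # cosine similarity: all embeddings are unit-norm
    top = scores.topk(min(k, len(catalog_paths)))
    return [catalog_paths[i] for i in top.indices.tolist()]

print(something_darker("query_dress.jpg"))

Because image and text share one embedding space, the comparative dimension
comes entirely from natural-language prompts, so no task-specific training is
needed, which is what makes the recommender zero-shot.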