A Transformer-based Embedding Model for Personalized Product Search
Product search is an important way for people to browse and purchase items on
E-commerce platforms. While customers tend to make choices based on their
personal tastes and preferences, analysis of commercial product search logs has
shown that personalization does not always improve product search quality. Most
existing product search techniques, however, conduct undifferentiated
personalization across search sessions. They either use a fixed coefficient to
control the influence of personalization or let personalization take effect all
the time with an attention mechanism. The only notable exception is the
recently proposed zero-attention model (ZAM) that can adaptively adjust the
effect of personalization by allowing the query to attend to a zero vector.
Nonetheless, in ZAM, personalization can be at most as important as the
query, and item representations are static across the collection, regardless
of which items co-occur in the user's purchase history. Aware
of these limitations, we propose a transformer-based embedding model (TEM) for
personalized product search, which can dynamically control the influence of
personalization by encoding the sequence of the query and the user's purchase
history with a transformer architecture. Personalization can have a dominant impact
when necessary and interactions between items can be taken into consideration
when computing attention weights. Experimental results show that TEM
outperforms state-of-the-art personalized product retrieval models
significantly.

Comment: In the proceedings of SIGIR 202
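The core idea, letting the query and previously purchased items attend to one another so that attention weights are computed dynamically per session, can be illustrated with a minimal self-attention sketch. This is an illustrative assumption, not the authors' TEM implementation: it uses plain NumPy, a single head, identity projections in place of learned query/key/value matrices, and random vectors in place of trained embeddings.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    # Single-head scaled dot-product self-attention with identity
    # projections (a simplification of a transformer encoder layer).
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)       # pairwise interactions, incl. item-item
    A = softmax(scores, axis=-1)        # dynamic attention weights
    return A @ X, A

rng = np.random.default_rng(0)
d = 8
query = rng.normal(size=(1, d))         # embedding of the current query
history = rng.normal(size=(3, d))       # three previously purchased items
seq = np.vstack([query, history])       # sequence [query; i1; i2; i3]

out, attn = self_attention(seq)
search_repr = out[0]                    # output at the query position
```

Because the weights in `attn` depend jointly on the query and the history, personalization can dominate for some sessions and nearly vanish for others, and each item's contribution is shaped by the other items it co-occurs with, in contrast to a fixed coefficient or to ZAM's query-only attention.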