Multi-Behavior Hypergraph-Enhanced Transformer for Sequential Recommendation
Learning dynamic user preference has become an increasingly important
component for many online platforms (e.g., video-sharing sites, e-commerce
systems) to make sequential recommendations. Previous works have made many
efforts to model item-item transitions over user interaction sequences, based
on various architectures, e.g., recurrent neural networks and self-attention
mechanism. Recently emerged graph neural networks also serve as useful backbone
models to capture item dependencies in sequential recommendation scenarios.
Despite their effectiveness, existing methods have thus far focused on item
sequence representations with a single type of interaction, and are therefore
limited in capturing the dynamic heterogeneous relational structures between
users and items (e.g., page view, add-to-favorite, purchase). To tackle this
challenge, we
design a Multi-Behavior Hypergraph-enhanced Transformer framework (MBHT) to
capture both short-term and long-term cross-type behavior dependencies.
Specifically, a multi-scale Transformer is equipped with low-rank
self-attention to jointly encode behavior-aware sequential patterns from
fine-grained and coarse-grained levels. Additionally, we incorporate the global
multi-behavior dependency into the hypergraph neural architecture to capture
the hierarchical long-range item correlations in a customized manner.
Experimental results demonstrate the superiority of our MBHT over various
state-of-the-art recommendation solutions across different settings. Further
ablation studies validate the effectiveness of our model design and benefits of
the new MBHT framework. Our implementation code is released at:
https://github.com/yuh-yang/MBHT-KDD22. Comment: Published as a KDD'22 full paper.
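The abstract mentions a multi-scale Transformer equipped with low-rank self-attention. As an illustration only, the NumPy sketch below shows one common low-rank scheme (a Linformer-style projection of the sequence-length axis down to a small rank k); the actual MBHT mechanism may differ, and all names, shapes, and the projection matrix E here are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def low_rank_self_attention(X, Wq, Wk, Wv, E):
    """Low-rank self-attention over a length-n sequence X (shape (n, d)).

    E has shape (k, n) and compresses the length axis of keys/values to
    rank k, so the score matrix is (n, k) instead of (n, n).
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv          # each (n, d)
    K_proj, V_proj = E @ K, E @ V             # each (k, d)
    scores = Q @ K_proj.T / np.sqrt(Q.shape[-1])  # (n, k)
    return softmax(scores) @ V_proj           # (n, d)

# Tiny usage example with random weights (all dimensions illustrative).
rng = np.random.default_rng(0)
n, d, k = 8, 4, 2
X = rng.normal(size=(n, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
E = rng.normal(size=(k, n))
out = low_rank_self_attention(X, Wq, Wk, Wv, E)
print(out.shape)  # → (8, 4)
```

The cost of the score computation drops from O(n^2·d) to O(n·k·d), which is what makes attention over long, fine-grained behavior sequences tractable.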
Lifelong Sequential Modeling with Personalized Memorization for User Response Prediction
User response prediction, which models user preference w.r.t. the presented
items, plays a key role in online services. After two decades of rapid
development, the accumulated user behavior sequences on mature Internet
service platforms have become extremely long, dating back to each user's
first registration. Each user not only has intrinsic tastes, but also keeps
changing her personal interests over her lifetime. Hence, it is challenging to
handle such
lifelong sequential modeling for each individual user. Existing methodologies
for sequential modeling are only capable of dealing with relatively recent user
behaviors, which leaves huge space for modeling long-term especially lifelong
sequential patterns to facilitate user modeling. Moreover, one user's behavior
may be accounted for by various previous behaviors within her whole online
activity history, i.e., a long-term dependency with multi-scale sequential
patterns. To tackle these challenges, in this paper, we propose a
Hierarchical Periodic Memory Network for lifelong sequential modeling with
personalized memorization of sequential patterns for each user. The model also
adopts a hierarchical and periodical updating mechanism to capture multi-scale
sequential patterns of user interests while supporting the evolving user
behavior logs. The experimental results over three large-scale real-world
datasets have demonstrated the advantages of our proposed model with
significant improvement in user response prediction performance over
state-of-the-art methods. Comment: SIGIR 2019. Reproducible code and datasets:
https://github.com/alimamarankgroup/HPM
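The abstract describes a hierarchical and periodical updating mechanism over personalized memory. The toy class below is a minimal sketch of that idea (not the HPM architecture itself): each memory layer is refreshed on its own period with an exponential moving average of recent behaviors, so slower layers retain longer-term, coarser-scale interests. All periods, the decay constant, and the attention-style read are hypothetical choices for illustration.

```python
import numpy as np

class PeriodicMemory:
    """Toy multi-scale memory: layer i is refreshed every periods[i] events.

    Fast layers track recent behavior; slow layers summarize long-term
    interests. A hypothetical stand-in for hierarchical periodic updating.
    """
    def __init__(self, dim, periods=(1, 4, 16), decay=0.5):
        self.periods = periods
        self.decay = decay
        self.memory = np.zeros((len(periods), dim))  # one slot per scale
        self.buffer = np.zeros(dim)                  # running behavior summary
        self.t = 0

    def write(self, behavior_vec):
        """Ingest one behavior event; refresh each layer on its period."""
        self.t += 1
        self.buffer = self.decay * self.buffer + (1 - self.decay) * behavior_vec
        for i, p in enumerate(self.periods):
            if self.t % p == 0:  # periodic refresh of layer i
                self.memory[i] = (self.decay * self.memory[i]
                                  + (1 - self.decay) * self.buffer)

    def read(self, query):
        """Attention-style read: weight the scale-specific memories by query."""
        scores = self.memory @ query
        w = np.exp(scores - scores.max())
        w /= w.sum()
        return w @ self.memory

# Usage: after 4 events, only the fast layers have been refreshed; the
# period-16 layer is still empty.
mem = PeriodicMemory(dim=3)
for _ in range(4):
    mem.write(np.ones(3))
summary = mem.read(np.ones(3))
print(summary.shape)  # → (3,)
```

The design point the abstract hints at: because slow layers update rarely, the memory supports ever-growing (lifelong) behavior logs at bounded per-user storage while still exposing multi-scale sequential patterns at read time.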