Discrete Factorization Machines for Fast Feature-based Recommendation
Side information in the form of user and item features is crucial for accurate
recommendation. However, the large number of feature dimensions, often
larger than 10^7, incurs expensive storage and computational cost. This
prohibits fast recommendation, especially on mobile applications where
computational resources are very limited. In this paper, we develop a generic
feature-based recommendation model, called Discrete Factorization Machine
(DFM), for fast and accurate recommendation. DFM binarizes the real-valued
model parameters (e.g., float32) of every feature embedding into binary codes
(e.g., boolean), and thus supports efficient storage and fast user-item score
computation. To avoid severe quantization loss from binarization, we
propose a convergent update rule that solves the challenging discrete
optimization problem of DFM. Through extensive experiments on two real-world datasets,
we show that 1) DFM consistently outperforms state-of-the-art binarized
recommendation models, and 2) DFM performs very competitively with its
real-valued counterpart (FM), demonstrating minimal quantization loss.
This work was accepted by IJCAI 2018.
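The speedup DFM targets rests on a standard identity: for codes in {-1, +1}, the inner product of two length-d codes equals d minus twice their Hamming distance, so a float dot product can be replaced by a cheap mismatch count (and, with bit-packed codes, by XOR + popcount). A minimal NumPy sketch of that identity, with illustrative function names that are not the paper's code:

```python
import numpy as np

def binarize(embedding):
    """Hypothetical sign-based binarization: real values -> {-1, +1} codes."""
    return np.where(embedding >= 0, 1, -1).astype(np.int8)

def fast_score(u_code, i_code):
    """Inner product of two {-1,+1} codes of length d: d - 2 * Hamming distance."""
    d = u_code.size
    hamming = np.count_nonzero(u_code != i_code)
    return d - 2 * hamming

rng = np.random.default_rng(0)
u = rng.standard_normal(64)   # stand-in for a learned user embedding
v = rng.standard_normal(64)   # stand-in for a learned item embedding
bu, bv = binarize(u), binarize(v)

# The mismatch-count score matches the exact integer dot product of the codes.
assert fast_score(bu, bv) == int(bu @ bv)
```

In a deployed system the codes would be packed into machine words so the mismatch count becomes a hardware popcount, which is where the storage and speed gains over float32 embeddings come from.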
Attentional Factorization Machines: Learning the Weight of Feature Interactions via Attention Networks
Factorization Machines (FMs) are a supervised learning approach that enhances
the linear regression model by incorporating the second-order feature
interactions. Despite their effectiveness, FMs can be hindered by modelling all
feature interactions with the same weight, as not all feature interactions are
equally useful and predictive. For example, interactions with useless
features may even introduce noise and degrade performance. In
this work, we improve FM by discriminating the importance of different feature
interactions. We propose a novel model named Attentional Factorization Machine
(AFM), which learns the importance of each feature interaction from data via a
neural attention network. Extensive experiments on two real-world datasets
demonstrate the effectiveness of AFM. Empirically, it is shown that on the
regression task AFM betters FM with a relative improvement, and consistently
outperforms the state-of-the-art deep learning methods Wide&Deep and DeepCross
with a much simpler structure and fewer model parameters. Our implementation of
AFM is publicly available at:
https://github.com/hexiangnan/attentional_factorization_machine
Comment: 7 pages, 5 figures
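The core AFM idea — score each pairwise element-wise interaction with a small attention net, then pool the weighted interactions — can be sketched in a few lines of NumPy. This is a simplified forward pass under assumed shapes (W, h, p, and b are illustrative parameter names), not the authors' implementation:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def afm_score(embeddings, W, h, p, b=0.0):
    """Sketch of an AFM-style prediction.

    embeddings: (m, k) array of the m active features' embeddings.
    Pairwise layer: e_ij = v_i * v_j (element-wise) for all i < j.
    Attention net:  a_ij proportional to exp(h^T relu(W e_ij)).
    Prediction:     b + p^T sum_ij a_ij * e_ij.
    """
    m, _ = embeddings.shape
    pairs = [embeddings[i] * embeddings[j]
             for i in range(m) for j in range(i + 1, m)]
    E = np.stack(pairs)                       # (num_pairs, k)
    scores = np.maximum(E @ W.T, 0.0) @ h     # one attention score per pair
    a = softmax(scores)                       # normalized interaction weights
    pooled = (a[:, None] * E).sum(axis=0)     # attention-weighted pooling, (k,)
    return b + p @ pooled

rng = np.random.default_rng(1)
emb = rng.standard_normal((4, 8))             # 4 active features, k = 8
W = rng.standard_normal((4, 8))               # attention hidden layer, t = 4
h = rng.standard_normal(4)
p = rng.standard_normal(8)
y = afm_score(emb, W, h, p)
```

Because the attention weights sum to 1, setting them all equal recovers plain FM's uniform treatment of interactions; learning them from data is what lets AFM down-weight noisy interactions.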