Interaction-aware Factorization Machines for Recommender Systems
Factorization Machine (FM) is a widely used supervised learning approach that
effectively models feature interactions. Despite the successful application of
FM and its many deep learning variants, treating every feature interaction
equally may degrade performance. For example, the interactions of a useless
feature may introduce noise; the importance of a feature may also
differ when interacting with different features. In this work, we propose a
novel model named \emph{Interaction-aware Factorization Machine} (IFM) by
introducing an Interaction-Aware Mechanism (IAM), which comprises the
\emph{feature aspect} and the \emph{field aspect}, to learn flexible
interactions on two levels. The feature aspect learns feature interaction
importance via an attention network while the field aspect learns the feature
interaction effect as a parametric similarity of the feature interaction vector
and the corresponding field interaction prototype. IFM introduces more
structured control and learns feature interaction importance in a stratified
manner, which allows for more leverage in tweaking the interactions on both
feature-wise and field-wise levels. In addition, we present a more general
architecture and propose the Interaction-aware Neural Network (INN) and DeepIFM to
capture higher-order interactions. To further improve both the performance and
efficiency of IFM, a sampling scheme is developed to select interactions based
on the field aspect importance. The experimental results from two well-known
datasets show the superiority of the proposed models over the state-of-the-art
methods.
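For intuition, the sketch below is a minimal NumPy rendering of the two IAM aspects described above; the parameter names (W_attn, h, S, prototypes) and the multiplicative combination of the two aspects are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def softmax(s):
    e = np.exp(s - s.max())
    return e / e.sum()

def iam_effects(inter_vecs, field_pair_ids, W_attn, h, S, prototypes):
    """Hypothetical sketch of the Interaction-Aware Mechanism (IAM).

    inter_vecs: (m, k) pairwise feature-interaction vectors.
    field_pair_ids: (m,) index of the field pair each interaction belongs to.
    Feature aspect: attention over interactions via a one-layer network.
    Field aspect: parametric similarity v^T S u between an interaction
    vector v and its field-pair prototype u.
    """
    # feature aspect: softmax-normalized attention scores
    feat = softmax(np.array([h @ np.maximum(v @ W_attn, 0.0)
                             for v in inter_vecs]))
    # field aspect: bilinear similarity to the field interaction prototype
    fld = np.array([v @ S @ prototypes[f]
                    for v, f in zip(inter_vecs, field_pair_ids)])
    # combining the two aspects multiplicatively is an assumption here
    return feat * fld
```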
FiBiNET: Combining Feature Importance and Bilinear feature Interaction for Click-Through Rate Prediction
Advertising and feed ranking are essential to many Internet companies such as
Facebook and Sina Weibo. Among many real-world advertising and feed ranking
systems, click-through rate (CTR) prediction plays a central role. There are
many proposed models in this field, such as logistic regression, tree-based
models, factorization-machine-based models, and deep-learning-based CTR models.
However, many current works compute feature interactions in a simple way, such
as the Hadamard product or the inner product, and pay little attention to the
importance of features. In this paper, a new model named FiBiNET, an
abbreviation for Feature Importance and Bilinear feature Interaction NETwork, is
proposed to dynamically learn the feature importance and fine-grained feature
interactions. On the one hand, FiBiNET dynamically learns the importance of
features via the Squeeze-and-Excitation network (SENET) mechanism; on the other
hand, it effectively learns feature interactions via a bilinear
function. We conduct extensive experiments on two real-world datasets and show
that our shallow model outperforms other shallow models such as factorization
machine (FM) and field-aware factorization machine (FFM). To further improve
performance, we combine a classical deep neural network (DNN) component with
the shallow model to form a deep model. The deep FiBiNET consistently
outperforms other state-of-the-art deep models such as DeepFM and extreme
deep factorization machine (xDeepFM).
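For orientation, here is a minimal NumPy sketch of the two ingredients named above, SENET-style field reweighting and a bilinear interaction; shapes, names (W1, W2, W), and activation choices are assumptions rather than the paper's exact configuration.

```python
import numpy as np

def senet_reweight(E, W1, W2):
    """SENET-style field reweighting (a sketch; names are illustrative).

    E: (f, k) matrix of f field embeddings of size k.
    """
    z = E.mean(axis=1)              # squeeze: one summary scalar per field
    a = np.maximum(z @ W1, 0.0)     # excitation: reduction layer, (f,) -> (r,)
    a = np.maximum(a @ W2, 0.0)     # excitation: back to one weight per field
    return E * a[:, None]           # re-scale each field embedding

def bilinear_interaction(e_i, e_j, W):
    """Bilinear feature interaction between two field embeddings,
    (e_i W) * e_j, a middle ground between inner and Hadamard products."""
    return (e_i @ W) * e_j          # (k,) result
```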
Network On Network for Tabular Data Classification in Real-world Applications
Tabular data is the most common data format adopted by our customers, ranging
from retail and finance to e-commerce, and tabular data classification plays an
essential role in their businesses. In this paper, we present Network On
Network (NON), a practical tabular data classification model based on deep
neural networks that provides accurate predictions. Various deep methods have been
proposed and promising progress has been made. However, most of them use
operations such as neural networks and factorization machines to fuse the
embeddings of different features directly, and linearly combine the outputs of
those operations to obtain the final prediction. As a result, the intra-field
information and the non-linear interactions between those operations (e.g.,
neural networks and factorization machines) are ignored. Intra-field information
refers to the fact that the features inside a field belong to the same field.
NON is proposed to take full advantage of intra-field information and
non-linear interactions. It consists of three components: a field-wise network
at the bottom to capture the intra-field information, an across-field network
in the middle to choose suitable operations in a data-driven manner, and an
operation fusion network on the top to deeply fuse the outputs of the chosen
operations. Extensive
experiments on six real-world datasets demonstrate that NON can significantly
outperform state-of-the-art models. Furthermore, both qualitative and
quantitative studies of the features in the embedding space show that NON can
capture intra-field information effectively.
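A minimal sketch of the bottom component, assuming each field owns its own small dense layer; the per-field parameters are hypothetical, and the across-field and operation fusion networks are omitted.

```python
import numpy as np

def field_wise_network(field_embeds, field_params):
    """Sketch of NON's bottom, field-wise component (names illustrative):
    every field gets its own dense layer, so the transform can exploit
    the fact that features inside a field belong together.
    """
    outs = []
    for e, (W, b) in zip(field_embeds, field_params):
        outs.append(np.maximum(e @ W + b, 0.0))  # per-field affine + ReLU
    # the outputs would feed the across-field network in the middle
    return outs
```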
Attentional Factorization Machines: Learning the Weight of Feature Interactions via Attention Networks
Factorization Machines (FMs) are a supervised learning approach that enhances
the linear regression model by incorporating the second-order feature
interactions. Despite its effectiveness, FM can be hindered by modelling all
feature interactions with the same weight, as not all feature interactions are
equally useful and predictive. For example, interactions with useless features
may even introduce noise and adversely degrade performance. In
this work, we improve FM by discriminating the importance of different feature
interactions. We propose a novel model named Attentional Factorization Machine
(AFM), which learns the importance of each feature interaction from data via a
neural attention network. Extensive experiments on two real-world datasets
demonstrate the effectiveness of AFM. Empirically, it is shown that on the
regression task AFM betters FM with a relative improvement, and consistently
outperforms the state-of-the-art deep learning methods Wide&Deep and DeepCross
with a much simpler structure and fewer model parameters. Our implementation of
AFM is publicly available at:
https://github.com/hexiangnan/attentional_factorization_machine
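For reference alongside the linked implementation, here is a minimal NumPy sketch of the AFM scoring described above: a one-layer attention network weights each pairwise Hadamard interaction before pooling; parameter names are illustrative.

```python
import numpy as np

def afm_predict(x, w0, w, V, W_a, b_a, h, p):
    """Minimal NumPy sketch of AFM scoring (parameter names illustrative).

    x: (n,) feature values; V: (n, k) feature embeddings.
    The attention network scores each pairwise Hadamard interaction,
    softmax-normalizes the scores, and pools the weighted interactions.
    """
    n, k = V.shape
    inters, scores = [], []
    for i in range(n):
        for j in range(i + 1, n):
            v = (V[i] * V[j]) * (x[i] * x[j])           # pairwise interaction
            inters.append(v)
            scores.append(h @ np.maximum(W_a @ v + b_a, 0.0))
    a = np.exp(np.array(scores) - max(scores))
    a /= a.sum()                                        # attention weights
    pooled = (a[:, None] * np.array(inters)).sum(axis=0)
    return w0 + w @ x + p @ pooled                      # linear part + pooled pairs
```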