Representation Learning for Attributed Multiplex Heterogeneous Network
Network embedding (or graph embedding) has been widely used in many
real-world applications. However, existing methods mainly focus on networks
with single-typed nodes/edges and cannot scale well to handle large networks.
Many real-world networks consist of billions of nodes and edges of multiple
types, and each node is associated with different attributes. In this paper, we
formalize the problem of embedding learning for the Attributed Multiplex
Heterogeneous Network and propose a unified framework to address this problem.
The framework supports both transductive and inductive learning. We also
provide a theoretical analysis of the proposed framework, showing its
connection with previous work and proving its greater expressiveness. We
conduct systematic evaluations of the proposed framework on four challenging
datasets: Amazon, YouTube, Twitter, and Alibaba. Experimental results
demonstrate that with the learned embeddings from the proposed framework, we
can achieve statistically significant improvements (e.g., a 5.99-28.23% lift
in F1 scores; p << 0.01, t-test) over previous state-of-the-art methods for link
prediction. The framework has also been successfully deployed on the
recommendation system of a worldwide leading e-commerce company, Alibaba Group.
Results of the offline A/B tests on product recommendation further confirm the
effectiveness and efficiency of the framework in practice.
Comment: Accepted to KDD 2019. Website: https://sites.google.com/view/gatn
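The abstract above describes learning one embedding per node per edge type and combining them with attention. The following is a minimal sketch of that idea; all dimensions, the attention form, and the per-type projection matrices are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: 4 nodes, 2 edge types; dimensions are arbitrary choices.
n_nodes, n_types, d_base, d_edge = 4, 2, 8, 4

base = rng.normal(size=(n_nodes, d_base))               # base embedding per node
edge_emb = rng.normal(size=(n_nodes, n_types, d_edge))  # edge-type-specific embeddings
W_att = rng.normal(size=(d_edge, 1))                    # attention projection (assumed form)
M = rng.normal(size=(n_types, d_edge, d_base))          # per-type transform to base space

def softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def node_embedding(i, r):
    """Overall embedding of node i under edge type r: the base embedding
    plus an attention-weighted mix of the node's edge-type-specific
    embeddings, projected back into the base space."""
    scores = softmax((edge_emb[i] @ W_att).ravel())   # weight each edge type
    mixed = np.einsum('t,td->d', scores, edge_emb[i]) # attended edge embedding
    return base[i] + mixed @ M[r]

v = node_embedding(0, 1)
print(v.shape)  # (8,)
```

The per-edge-type attention is what lets a single node carry different representations for, e.g., "clicks" versus "purchases" in the same multiplex network.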
Explainable Reasoning over Knowledge Graphs for Recommendation
Incorporating knowledge graphs into recommender systems has attracted
increasing attention in recent years. By exploring the interlinks within a
knowledge graph, the connectivity between users and items can be discovered as
paths, which provide rich and complementary information to user-item
interactions. Such connectivity not only reveals the semantics of entities and
relations, but also helps to comprehend a user's interest. However, existing
efforts have not fully explored this connectivity to infer user preferences,
especially in terms of modeling the sequential dependencies within a path and
its holistic semantics. In this paper, we contribute a new model named
Knowledge-aware Path Recurrent Network (KPRN) to exploit the knowledge graph for
recommendation. KPRN can generate path representations by composing the
semantics of both entities and relations. By leveraging the sequential
dependencies within a path, we allow effective reasoning on paths to infer the
underlying rationale of a user-item interaction. Furthermore, we design a new
weighted pooling operation to discriminate the strengths of different paths in
connecting a user with an item, endowing our model with a certain level of
explainability. We conduct extensive experiments on two datasets, on movies
and music, demonstrating significant improvements over the state-of-the-art
solutions Collaborative Knowledge Base Embedding and Neural Factorization
Machine.
Comment: 8 pages, 5 figures, AAAI-2019
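The two ingredients named above, composing entity and relation embeddings along a path and pooling per-path scores, can be sketched as follows. A plain RNN step stands in for the paper's LSTM, and the log-sum-exp pooling is a simplified reading of the weighted pooling it describes; all sizes and weight matrices are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 6  # embedding size (arbitrary)

def path_score(path, Wh, Wx, w_out):
    """Score one path by composing entity+relation embeddings with a
    plain recurrent update (stand-in for KPRN's LSTM), then projecting
    the final hidden state to a scalar."""
    h = np.zeros(d)
    for ent, rel in path:  # path = sequence of (entity_emb, relation_emb)
        h = np.tanh(Wh @ h + Wx @ np.concatenate([ent, rel]))
    return float(w_out @ h)

def weighted_pool(scores, gamma=1.0):
    """Weighted pooling over per-path scores via log-sum-exp, with gamma
    controlling how sharply the strongest paths dominate."""
    s = np.asarray(scores) / gamma
    m = s.max()
    return float(m + np.log(np.exp(s - m).sum()))

Wh = rng.normal(size=(d, d))
Wx = rng.normal(size=(d, 2 * d))
w_out = rng.normal(size=d)

# Two hypothetical 3-hop paths connecting one user-item pair.
paths = [[(rng.normal(size=d), rng.normal(size=d)) for _ in range(3)]
         for _ in range(2)]
scores = [path_score(p, Wh, Wx, w_out) for p in paths]
print(weighted_pool(scores))  # pooled user-item score over both paths
```

Because the pooled score is differentiable in each path score, gradients flow back through every connecting path, which is what makes path-level reasoning trainable end to end.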
KGAT: Knowledge Graph Attention Network for Recommendation
To provide more accurate, diverse, and explainable recommendation, it is
essential to go beyond modeling user-item interactions and take side
information into account. Traditional methods like factorization machine (FM)
cast it as a supervised learning problem, which assumes each interaction as an
independent instance with side information encoded. Because these methods
overlook the relations among instances or items (e.g., the director of a movie
is also an actor of another movie), they are insufficient to distill the
collaborative signal from the collective behaviors of users. In this work, we
investigate the utility of knowledge graph (KG), which breaks down the
independent interaction assumption by linking items with their attributes. We
argue that in such a hybrid structure of KG and user-item graph, high-order
relations, which connect two items with one or multiple linked attributes, are
an essential factor for successful recommendation. We propose a new
method named Knowledge Graph Attention Network (KGAT) which explicitly models
the high-order connectivities in KG in an end-to-end fashion. It recursively
propagates the embeddings from a node's neighbors (which can be users, items,
or attributes) to refine the node's embedding, and employs an attention
mechanism to discriminate the importance of the neighbors. Our KGAT is
conceptually advantageous over existing KG-based recommendation methods, which
either exploit high-order relations by extracting paths or implicitly model
them with regularization. Empirical results on three public benchmarks show
that KGAT significantly outperforms state-of-the-art methods like Neural FM and
RippleNet. Further studies verify the efficacy of embedding propagation for
high-order relation modeling and the interpretability benefits brought by the
attention mechanism.
Comment: KDD 2019 research track
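One attentive propagation step of the kind described above can be sketched as follows: score each (head, relation, tail) triple, softmax the scores into attention weights, and aggregate neighbor embeddings to refine the head. The scoring function here is a simplification of the paper's relation-aware form, and all entities, relations, and dimensions are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
d = 5  # embedding size (arbitrary)

# Hypothetical hybrid graph: user-item edges and KG attribute edges together.
emb = {n: rng.normal(size=d) for n in ["user", "movieA", "director"]}
rel = {"watched": rng.normal(size=d), "directed_by": rng.normal(size=d)}
triples = [("movieA", "directed_by", "director"),
           ("movieA", "watched", "user")]

def propagate(head):
    """One attentive embedding-propagation step: attention over the
    head's (relation, tail) neighbors, then aggregation into an
    updated head embedding."""
    neigh = [(r, t) for h, r, t in triples if h == head]
    # Relation-aware attention score (simplified from the paper's form).
    scores = np.array([emb[t] @ np.tanh(emb[head] + rel[r]) for r, t in neigh])
    att = np.exp(scores - scores.max())
    att /= att.sum()
    agg = sum(a * emb[t] for a, (r, t) in zip(att, neigh))
    return np.tanh(emb[head] + agg)  # refined head embedding

print(propagate("movieA").shape)  # (5,)
```

Stacking such steps lets the refined embedding of a movie absorb signal from its director, and recursively from the director's other movies, which is the high-order connectivity the abstract emphasizes.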
Attributed Multi-order Graph Convolutional Network for Heterogeneous Graphs
Heterogeneous graph neural networks aim to discover discriminative node
embeddings and relations from multi-relational networks. One challenge of
heterogeneous graph learning is the design of learnable meta-paths, which
significantly influences the quality of learned embeddings. Thus, in this paper,
we propose an Attributed Multi-Order Graph Convolutional Network (AMOGCN),
which automatically studies meta-paths containing multi-hop neighbors from an
adaptive aggregation of multi-order adjacency matrices. The proposed model
first builds different orders of adjacency matrices from manually designed node
connections. After that, an intact multi-order adjacency matrix is obtained
through the automatic fusion of various orders of adjacency matrices. This process
is supervised by the node semantic information, which is extracted from the
node homophily evaluated by attributes. Eventually, we utilize a one-layer
simplifying graph convolutional network with the learned multi-order adjacency
matrix, which is equivalent to the cross-hop node information propagation with
multi-layer graph neural networks. Extensive experiments reveal that AMOGCN
achieves superior semi-supervised classification performance compared with
state-of-the-art competitors.
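The fusion-then-one-layer pipeline described above can be sketched as follows. The fusion weights are fixed here purely for illustration (in AMOGCN they are learned under attribute-homophily supervision), and the graph, sizes, and normalization are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
n, f, c = 6, 4, 2  # nodes, feature dim, classes (arbitrary)

# Random symmetric adjacency with self-loops.
A = (rng.random((n, n)) < 0.3).astype(float)
A = np.maximum(A, A.T)
np.fill_diagonal(A, 1.0)
X = rng.normal(size=(n, f))

def sym_norm(A):
    """Symmetric normalization D^{-1/2} A D^{-1/2}."""
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
    return d_inv_sqrt @ A @ d_inv_sqrt

# Multi-order adjacency matrices: 1-hop, 2-hop, 3-hop propagation.
An = sym_norm(A)
orders = [An, An @ An, An @ An @ An]

# Fusion weights: fixed here; learned from node-homophily signals in AMOGCN.
w = np.array([0.5, 0.3, 0.2])
A_fused = sum(wi * Ai for wi, Ai in zip(w, orders))

# One-layer simplified GCN on the fused matrix; equivalent to cross-hop
# propagation with a single weight matrix, as the abstract notes.
W = rng.normal(size=(f, c))
logits = A_fused @ X @ W
print(logits.shape)  # (6, 2)
```

The single weight matrix is the reason this is cheap: the multi-hop structure lives entirely in `A_fused`, so no deep message-passing stack is needed at inference time.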
GRAF: Graph Attention-aware Fusion Networks
A large number of real-world networks include multiple types of nodes and
edges. Graph Neural Networks (GNNs) have emerged as a deep learning framework
that utilizes node features on graph-structured data, showing superior
performance. However, popular GNN-based architectures operate on a single
homogeneous network.
Enabling them to work on multiple networks brings additional challenges due to
the heterogeneity of the networks and the multiplicity of the existing
associations. In this study, we present a computational approach named GRAF
utilizing GNN-based approaches on multiple networks with the help of attention
mechanisms and network fusion. Using attention-based neighborhood aggregation,
GRAF learns the importance of each neighbor per node (called node-level
attention) followed by the importance of association (called association-level
attention) in a hierarchical way. Then, GRAF performs a network fusion step,
weighting each edge according to the learned node- and association-level attentions,
which results in a fused, enriched network. Considering that the fused network
could be highly dense, with many weak edges depending on the given input
networks, we include an edge elimination step based on edge weights. Finally,
GRAF utilizes a Graph Convolutional Network (GCN) on the fused
network and incorporates the node features on the graph-structured data for the
prediction task or any other downstream analysis. Our extensive evaluations of
prediction tasks from different domains showed that GRAF outperformed the
state-of-the-art methods. Utilization of learned node-level and
association-level attention allowed us to prioritize the edges properly. The
source code for our tool is publicly available at
https://github.com/bozdaglab/GRAF.
Comment: 11 pages, 1 figure
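The fusion and edge-elimination steps described above can be sketched as follows. The attention values are random stand-ins for what GRAF learns, and the threshold is a fixed illustrative choice rather than the tool's actual selection rule.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 5  # shared node set across input networks

# Two input networks (associations) over the same nodes.
nets = [(rng.random((n, n)) < 0.5).astype(float) for _ in range(2)]

# Stand-ins for learned attentions: per-edge node-level weights for each
# network, and one scalar association-level weight per network.
node_att = [rng.random((n, n)) * A for A in nets]
assoc_att = np.array([0.7, 0.3])

# Fusion: each edge weighted by its node-level attention times the
# association-level attention of the network it came from, summed
# across networks.
fused = sum(a * att for a, att in zip(assoc_att, node_att))

# Edge elimination: drop edges below a weight threshold so the fused
# network does not stay densely packed with weak edges.
threshold = 0.2
fused_pruned = np.where(fused >= threshold, fused, 0.0)

print(int((fused > 0).sum()), int((fused_pruned > 0).sum()))
```

The pruned fused matrix would then feed a standard GCN together with the node features, which is the downstream step the abstract describes.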