Relphormer: Relational Graph Transformer for Knowledge Graph Representations
Transformers have achieved remarkable performance in widespread fields,
including natural language processing, computer vision and graph mining.
However, vanilla Transformer architectures have not yielded promising
improvements in Knowledge Graph (KG) representation, an area still dominated
by the translational-distance paradigm, because they struggle to capture the
intrinsically heterogeneous structural and semantic information of knowledge
graphs. To this end, we
propose a new variant of Transformer for knowledge graph representations dubbed
Relphormer. Specifically, we introduce Triple2Seq which can dynamically sample
contextualized sub-graph sequences as the input to alleviate the heterogeneity
issue. We propose a novel structure-enhanced self-attention mechanism to encode
the relational information and keep the semantic information within entities
and relations. Moreover, we utilize masked knowledge modeling for general
knowledge graph representation learning, which can be applied to various
KG-based tasks including knowledge graph completion, question answering, and
recommendation. Experimental results on six datasets show that Relphormer
achieves better performance than the baselines. Code is available at
https://github.com/zjunlp/Relphormer.
Comment: Work in progress
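The structure-enhanced self-attention idea described above can be sketched as standard scaled dot-product attention with an additive structural bias. This is a minimal NumPy illustration of the general pattern, not Relphormer's actual implementation; all names, shapes, and the form of the bias term are assumptions.

```python
import numpy as np

def structure_enhanced_attention(X, A_bias, W_q, W_k, W_v):
    """Self-attention with an additive structural bias (illustrative sketch).

    X:      (n, d) embeddings of a sampled sub-graph sequence
    A_bias: (n, n) precomputed structural bias, e.g. derived from the
            relational structure of the sub-graph (hypothetical form)
    """
    d_k = W_q.shape[1]
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    # inject relational structure into the attention scores
    scores = Q @ K.T / np.sqrt(d_k) + A_bias
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)      # row-wise softmax
    return attn @ V
```

With a zero bias this reduces to plain self-attention; a nonzero `A_bias` shifts attention toward structurally related entities and relations.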
Self-Supervised Hypergraph Convolutional Networks for Session-based Recommendation
Session-based recommendation (SBR) focuses on next-item prediction at a
certain time point. As user profiles are generally not available in this
scenario, capturing the user intent lying in the item transitions plays a
pivotal role. Recent graph neural network (GNN)-based SBR methods regard
item transitions as pairwise relations, neglecting the complex high-order
information among items. Hypergraphs provide a natural way to capture
beyond-pairwise relations, but their potential for SBR has remained unexplored.
In this paper, we fill this gap by modeling session-based data as a hypergraph
and then propose a hypergraph convolutional network to improve SBR. Moreover,
to enhance hypergraph modeling, we devise another graph convolutional network
which is based on the line graph of the hypergraph and then integrate
self-supervised learning into the training of the networks by maximizing mutual
information between the session representations learned via the two networks,
serving as an auxiliary task to improve the recommendation task. Since both
networks are based on the hypergraph and can be seen as two channels for
hypergraph modeling, we name our model \textbf{DHCN} (Dual Channel
Hypergraph Convolutional Networks). Extensive experiments on three benchmark
datasets demonstrate the superiority of our model over the SOTA methods, and
the results validate the effectiveness of the hypergraph modeling and the
self-supervised task. The implementation of our model is available at
https://github.com/xiaxin1998/DHCN
Comment: 9 pages, 4 figures, accepted by AAAI'2
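A hypergraph convolution of the kind the abstract builds on can be sketched as two propagation steps through an incidence matrix: node to hyperedge, then hyperedge back to node, with degree normalization. This is a minimal NumPy sketch of the generic rule D_v^{-1} H D_e^{-1} H^T X Theta, not DHCN's exact layer; the variable names are illustrative.

```python
import numpy as np

def hypergraph_conv(X, H, Theta):
    """One hypergraph convolution layer (illustrative sketch).

    X:     (n_items, d) item embeddings
    H:     (n_items, n_edges) incidence matrix; H[i, j] = 1 if item i
           occurs in hyperedge (session) j
    Theta: (d, d_out) learnable weight matrix
    """
    Dv = H.sum(axis=1)  # node degrees
    De = H.sum(axis=0)  # hyperedge degrees
    Dv_inv = np.where(Dv > 0, 1.0 / Dv, 0.0)
    De_inv = np.where(De > 0, 1.0 / De, 0.0)
    # propagate: items -> hyperedges -> items, averaging at each step
    msg = (H * De_inv) @ (H.T @ X)
    return (msg * Dv_inv[:, None]) @ Theta
```

Each output row is a degree-weighted average over all items co-occurring in shared sessions, which is how high-order (beyond-pairwise) relations enter the item representations.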
A Survey on Knowledge Graphs: Representation, Acquisition and Applications
Human knowledge provides a formal understanding of the world. Knowledge
graphs that represent structural relations between entities have become an
increasingly popular research direction towards cognition and human-level
intelligence. In this survey, we provide a comprehensive review of knowledge
graphs covering overall research topics on 1) knowledge graph representation
learning, 2) knowledge acquisition and completion, 3) temporal knowledge
graphs, and 4) knowledge-aware applications, and we summarize recent
breakthroughs and prospective directions to facilitate future research. We
propose a full-view
categorization and new taxonomies on these topics. Knowledge graph embedding is
organized from four aspects of representation space, scoring function, encoding
models, and auxiliary information. For knowledge acquisition, especially
knowledge graph completion, we review embedding methods, path inference, and
logical rule reasoning. We further explore several emerging topics, including
meta relational learning, commonsense reasoning, and temporal knowledge graphs.
To facilitate future research on knowledge graphs, we also provide a curated
collection of datasets and open-source libraries on different tasks. Finally,
we offer a thorough outlook on several promising research directions
- …
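The translational-distance paradigm that recurs across these abstracts is easiest to see in a concrete scoring function. The sketch below shows TransE-style scoring, the canonical example of that paradigm (TransE itself, not any specific method surveyed above): a triple (h, r, t) is plausible when the relation vector approximately translates the head embedding onto the tail embedding.

```python
import numpy as np

def transe_score(h, r, t):
    """Translational-distance score for a triple (head, relation, tail).

    Lower is more plausible: a good embedding satisfies h + r ≈ t,
    so the L1 distance ||h + r - t|| is near zero for true triples.
    """
    return np.linalg.norm(h + r - t, ord=1)
```

Scoring functions like this define one axis of a KG-embedding taxonomy (alongside representation space, encoding model, and auxiliary information); semantic-matching scorers such as bilinear products are the usual contrast.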