RulE: Neural-Symbolic Knowledge Graph Reasoning with Rule Embedding
Knowledge graph (KG) reasoning predicts missing links by reasoning over
existing facts. Knowledge graph
embedding (KGE) is one of the most popular methods to address this problem. It
embeds entities and relations into low-dimensional vectors and uses the learned
entity/relation embeddings to predict missing facts. However, KGE only uses
zeroth-order (propositional) logic to encode existing triplets (e.g., "Alice
is Bob's wife."); it is unable to leverage first-order (predicate) logic to
represent generally applicable logical rules. On the other hand, traditional
rule-based KG reasoning methods usually rely on hard logical rule inference,
which makes them brittle and hardly competitive with KGE. In this paper, we
propose RulE, a novel and principled
framework to represent and model logical rules and triplets. RulE jointly
represents entities, relations and logical rules in a unified embedding space.
By learning an embedding for each logical rule, RulE can perform logical rule
inference in a soft way and give a confidence score to each grounded rule,
similar to how KGE gives each triplet a confidence score. Compared to KGE
alone, RulE allows injecting prior logical rule information into the embedding
space, which improves the generalization of knowledge graph embedding. Besides,
the learned confidence scores of rules improve the logical rule inference
process by softly controlling the contribution of each rule, which alleviates
the brittleness of logic. We evaluate our method with link prediction tasks.
Experimental results on multiple benchmark KGs demonstrate the effectiveness of
RulE.
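To illustrate the idea of scoring both triplets and grounded rules in a single embedding space, here is a minimal NumPy sketch. The entity/relation names, the TransE-style distance, and the way the rule embedding composes relations are all illustrative assumptions, not RulE's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8

# Hypothetical toy vocabulary (illustrative names).
entities = {e: rng.normal(size=dim) for e in ["alice", "bob"]}
relations = {r: rng.normal(size=dim) for r in ["wife_of", "husband_of"]}
# A rule such as "wife_of(x, y) -> husband_of(y, x)" gets its own embedding,
# here modeled (illustratively) as a vector in the same space.
rules = {"wife_husband": rng.normal(size=dim)}

def triplet_score(h, r, t):
    # TransE-style plausibility: higher (closer to zero) is more plausible.
    return -np.linalg.norm(entities[h] + relations[r] - entities[t])

def rule_confidence(rule, body_rel, head_rel):
    # Soft rule score: how well the rule embedding ties the body relation
    # to the head relation, analogous to a triplet's confidence score.
    return -np.linalg.norm(relations[body_rel] + rules[rule] - relations[head_rel])

s = triplet_score("alice", "wife_of", "bob")
c = rule_confidence("wife_husband", "wife_of", "husband_of")
print(s <= 0.0, c <= 0.0)  # distances are non-negative, so both scores are <= 0
```

The point of the sketch is only that a rule, like a triplet, becomes a differentiable scoring target, so rule confidence can softly weight each grounded rule during inference.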
Incorporating GAN for Negative Sampling in Knowledge Representation Learning
Knowledge representation learning aims to model a knowledge graph by
encoding entities and relations into a low-dimensional space. Most
traditional knowledge embedding works need negative sampling to minimize a
margin-based ranking loss. However, those works construct negative samples
by random sampling, which often yields samples too trivial to train the
model effectively. In this paper, we propose a novel knowledge representation
learning framework based on Generative Adversarial Networks (GAN). In this
GAN-based framework, we take advantage of a generator to obtain high-quality
negative samples. Meanwhile, the discriminator in the GAN learns the embeddings
of the entities and relations in the knowledge graph. Thus, we can incorporate
the proposed GAN-based framework into various traditional models to improve the
ability of knowledge representation learning. Experimental results show that
our proposed GAN-based framework outperforms baselines on triplet
classification and link prediction tasks.
Comment: Accepted to AAAI 201
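A minimal sketch of adversarial negative sampling, assuming a TransE-style discriminator and a generator that samples negatives via a softmax over the discriminator's own scores (the specific scoring function and margin are illustrative choices, not the paper's exact formulation):

```python
import numpy as np

rng = np.random.default_rng(1)
n_entities, dim = 5, 4
ent = rng.normal(size=(n_entities, dim))
rel = rng.normal(size=dim)

def score(h, t):
    # Discriminator = a TransE-style KGE model (illustrative choice).
    return -np.linalg.norm(ent[h] + rel - ent[t])

def generator_sample(h, candidates):
    # Generator: sample a negative tail with probability proportional to how
    # plausible the discriminator currently finds it (softmax over scores),
    # so harder negatives are drawn more often than with uniform sampling.
    logits = np.array([score(h, t) for t in candidates])
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return rng.choice(candidates, p=probs)

neg_tail = generator_sample(0, [1, 2, 3, 4])
# The discriminator then trains on a margin-based ranking loss
# between the positive triplet (0, rel, 1) and the generated negative.
margin_loss = max(0.0, 1.0 - score(0, 1) + score(0, neg_tail))
print(neg_tail in {1, 2, 3, 4}, margin_loss >= 0.0)
```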
Multi-task Neural Network for Non-discrete Attribute Prediction in Knowledge Graphs
Many popular knowledge graphs such as Freebase, YAGO or DBPedia maintain a
list of non-discrete attributes for each entity. Intuitively, these attributes
such as height, price or population count are able to richly characterize
entities in knowledge graphs. This additional source of information may help to
alleviate the inherent sparsity and incompleteness problems that are prevalent
in knowledge graphs. Unfortunately, many state-of-the-art relational learning
models ignore this information due to the challenging nature of dealing with
non-discrete data types in the inherently binary-natured knowledge graphs. In
this paper, we propose a novel multi-task neural network approach for both
encoding and prediction of non-discrete attribute information in a relational
setting. Specifically, we train a neural network for triplet prediction along
with a separate network for attribute value regression. Via multi-task
learning, we are able to learn representations of entities, relations and
attributes that encode information about both tasks. Moreover, such attributes
are not only central to many predictive tasks as an information source but also
as a prediction target. Therefore, models that are able to encode, incorporate
and predict such information in a relational learning context are highly
attractive as well. We show that our approach outperforms many state-of-the-art
methods for the tasks of relational triplet classification and attribute value
prediction.
Comment: Accepted at CIKM 201
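The multi-task setup can be sketched as two heads sharing one set of entity embeddings: a triplet-scoring head and a linear attribute regressor. The entity/attribute names, the TransE-style score, and the target value below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
dim = 6
# Shared entity embeddings used by both tasks (hypothetical names).
ent = {"paris": rng.normal(size=dim), "france": rng.normal(size=dim)}
rel = {"capital_of": rng.normal(size=dim)}
# Attribute head: one linear regressor per non-discrete attribute.
attr_w = {"population": rng.normal(size=dim)}

def triplet_score(h, r, t):
    # Relational head: TransE-style triplet plausibility (illustrative).
    return -np.linalg.norm(ent[h] + rel[r] - ent[t])

def attr_predict(e, a):
    # Attribute head: regress the attribute value from the shared embedding.
    return float(ent[e] @ attr_w[a])

# A joint multi-task loss couples both objectives through the shared
# embeddings, so each task regularizes the other.
target_pop = 2.1  # hypothetical normalized target value
loss = -triplet_score("paris", "capital_of", "france") \
       + (attr_predict("paris", "population") - target_pop) ** 2
print(loss >= 0.0)
```

Because both heads read the same entity vectors, gradients from attribute regression shape the representations used for triplet prediction, and vice versa.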
KGAT: Knowledge Graph Attention Network for Recommendation
To provide more accurate, diverse, and explainable recommendations, it is
essential to go beyond modeling user-item interactions and take side
information into account. Traditional methods like factorization machines (FM)
cast it as a supervised learning problem, which assumes each interaction is an
independent instance with side information encoded. Because they overlook the
relations among instances or items (e.g., the director of a movie is also an
actor of another movie), these methods are insufficient to distill the
collaborative signal from the collective behaviors of users. In this work, we
investigate the utility of knowledge graph (KG), which breaks down the
independent interaction assumption by linking items with their attributes. We
argue that in such a hybrid structure of KG and user-item graph, high-order
relations, which connect two items via one or more linked attributes, are an
essential factor for successful recommendation. We propose a new
method named Knowledge Graph Attention Network (KGAT) which explicitly models
the high-order connectivities in KG in an end-to-end fashion. It recursively
propagates the embeddings from a node's neighbors (which can be users, items,
or attributes) to refine the node's embedding, and employs an attention
mechanism to discriminate the importance of the neighbors. Our KGAT is
conceptually advantageous over existing KG-based recommendation methods, which
either exploit high-order relations by extracting paths or implicitly model
them with regularization. Empirical results on three public benchmarks show
that KGAT significantly outperforms state-of-the-art methods like Neural FM and
RippleNet. Further studies verify the efficacy of embedding propagation for
high-order relation modeling and the interpretability benefits brought by the
attention mechanism.
Comment: KDD 2019 research track
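A toy sketch of one attentive propagation step: a node's embedding is refined by an attention-weighted aggregation of its neighbors. The dot-product attention below is a simplification; KGAT's actual attention also conditions on the relation embedding:

```python
import numpy as np

rng = np.random.default_rng(3)
dim = 4
# Toy hybrid graph: node 0 with three neighbors, which in KGAT
# could be users, items, or attribute entities.
emb = rng.normal(size=(4, dim))
neighbors = {0: [1, 2, 3]}

def softmax(x):
    z = np.exp(x - x.max())
    return z / z.sum()

def propagate(node):
    # Attention scores: a plain dot product between the node and each
    # neighbor (simplified; the paper's attention is relation-aware).
    nbrs = neighbors[node]
    att = softmax(np.array([emb[node] @ emb[n] for n in nbrs]))
    # Refine the node embedding with the attention-weighted neighborhood.
    agg = sum(a * emb[n] for a, n in zip(att, nbrs))
    return np.tanh(emb[node] + agg)

h = propagate(0)
print(h.shape == (4,), bool(np.all(np.abs(h) <= 1.0)))
```

Stacking such layers is what captures high-order connectivities: after k steps, a node's embedding reflects neighbors up to k hops away, with attention deciding how much each path contributes.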
End-to-end Structure-Aware Convolutional Networks for Knowledge Base Completion
Knowledge graph embedding has been an active research topic for knowledge
base completion, with progressive improvement from the initial TransE, TransH,
DistMult, and others to the current state-of-the-art ConvE. ConvE uses 2D convolution
over embeddings and multiple layers of nonlinear features to model knowledge
graphs. The model can be efficiently trained and is scalable to large knowledge
graphs. However, there is no structure enforcement in the embedding space of
ConvE. The recent graph convolutional network (GCN) provides another way of
learning graph node embedding by successfully utilizing graph connectivity
structure. In this work, we propose a novel end-to-end Structure-Aware
Convolutional Network (SACN) that takes the benefit of GCN and ConvE together.
SACN consists of an encoder of a weighted graph convolutional network (WGCN),
and a decoder of a convolutional network called Conv-TransE. WGCN utilizes
knowledge graph node structure, node attributes and edge relation types. It has
learnable weights that adapt the amount of information from neighbors used in
local aggregation, leading to more accurate embeddings of graph nodes. Node
attributes in the graph are represented as additional nodes in the WGCN. The
decoder Conv-TransE enables the state-of-the-art ConvE to be translational
between entities and relations while keeping the same link prediction
performance as ConvE. We demonstrate the effectiveness of the proposed SACN on
standard FB15k-237 and WN18RR datasets, where it gives about a 10% relative
improvement over the state-of-the-art ConvE in terms of HITS@1, HITS@3 and
HITS@10.
Comment: The Thirty-Third AAAI Conference on Artificial Intelligence (AAAI 2019)
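The encoder/decoder split can be sketched as follows: a weighted GCN layer scales each neighbor's contribution by a learnable per-relation-type weight, and a decoder scores triplets over the encoded embeddings. The graph, the weights, and the simplified translational decoder (standing in for Conv-TransE's convolutional scoring) are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
n_nodes, dim = 4, 6
x = rng.normal(size=(n_nodes, dim))
# Edges labeled with relation types; WGCN learns one scalar weight per
# type, adapting how much information flows from each kind of neighbor.
edges = [(0, 1, "r1"), (0, 2, "r1"), (0, 3, "r2")]
rel_weight = {"r1": 0.8, "r2": 0.3}  # illustrative "learned" values

def wgcn_layer(x):
    # Encoder: aggregate neighbors, scaled by the relation-type weight.
    out = x.copy()
    for h, t, r in edges:
        out[h] = out[h] + rel_weight[r] * x[t]
    return np.tanh(out)

z = wgcn_layer(x)
# Decoder (simplified stand-in for Conv-TransE): score a triplet on the
# encoded embeddings while keeping the translational h + r ~ t semantics.
rel_emb = rng.normal(size=dim)
score = -np.linalg.norm(z[0] + rel_emb - z[1])
print(z.shape == (4, 6), score <= 0.0)
```

Note how attributes would enter this sketch: as extra nodes wired in with their own relation type, so the same weighted aggregation handles them without special-casing.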