Interaction Embeddings for Prediction and Explanation in Knowledge Graphs
Knowledge graph embedding aims to learn distributed representations for
entities and relations, and is proven to be effective in many applications.
Crossover interactions --- bi-directional effects between entities and
relations --- help select related information when predicting a new triple, but
have not been formally discussed before. In this paper, we propose CrossE, a
novel knowledge graph embedding method that explicitly simulates crossover
interactions. It not only learns one general embedding for each entity and
relation as most previous methods do, but also generates multiple triple
specific embeddings for both of them, named interaction embeddings. We evaluate
embeddings on typical link prediction tasks and find that CrossE achieves
state-of-the-art results on complex and more challenging datasets. Furthermore,
we evaluate embeddings from a new perspective --- giving explanations for
predicted triples, which is important for real applications. In this work, an
explanation for a triple is regarded as a reliable closed-path between the head
and the tail entity. Compared to other baselines, we show experimentally that
CrossE, benefiting from interaction embeddings, is more capable of generating
reliable explanations to support its predictions.
Comment: This paper is accepted by WSDM201
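As a hedged illustration of what simulating a crossover interaction could look like, the sketch below modulates general head and relation embeddings with a relation-specific interaction vector before scoring a triple. All names, the dimension, and the exact scoring form are illustrative assumptions, not the paper's verified implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # embedding dimension (illustrative)

# General embeddings for one head entity, relation, and tail entity.
h, r, t = rng.normal(size=(3, d))
# Relation-specific interaction vector c_r: modulates the head embedding
# per relation, simulating the bi-directional crossover effect.
c_r = rng.normal(size=d)
b = np.zeros(d)  # bias term

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def crossover_score(h, r, t, c_r, b):
    x_h = c_r * h   # triple-specific interaction embedding of the head
    x_r = x_h * r   # triple-specific interaction embedding of the relation
    # Combine the interaction embeddings and match against the tail.
    return sigmoid(np.tanh(x_h + x_r + b) @ t)

s = crossover_score(h, r, t, c_r, b)
```

The sigmoid keeps the score in (0, 1), so it can be read as a plausibility estimate for the candidate triple.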
Modeling Relation Paths for Representation Learning of Knowledge Bases
Representation learning of knowledge bases (KBs) aims to embed both entities
and relations into a low-dimensional space. Most existing methods only consider
direct relations in representation learning. We argue that multiple-step
relation paths also contain rich inference patterns between entities, and
propose a path-based representation learning model. This model considers
relation paths as translations between entities for representation learning,
and addresses two key challenges: (1) Since not all relation paths are
reliable, we design a path-constraint resource allocation algorithm to measure
the reliability of relation paths. (2) We represent relation paths via semantic
composition of relation embeddings. Experimental results on real-world datasets
show that, as compared with baselines, our model achieves significant and
consistent improvements on knowledge base completion and relation extraction
from text.
Comment: 10 page
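One way to picture the paths-as-translations idea is the sketch below, where a two-hop path is composed by adding its relation embeddings and scored with a TransE-style translation energy, downweighted by a path-reliability factor. The additive composition, L1 energy, and the reliability value are assumptions for illustration; the model also considers other composition operations:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 8  # embedding dimension (illustrative)

h, t = rng.normal(size=(2, d))    # head and tail entity embeddings
r1, r2 = rng.normal(size=(2, d))  # relations along a 2-hop path
path_r = r1 + r2                  # additive semantic composition of the path

def trans_energy(h, r, t):
    # Translation energy: lower means h + r is closer to t,
    # i.e. the (composed) relation fits the entity pair better.
    return float(np.linalg.norm(h + r - t, ord=1))

# Hypothetical reliability score for this path, as would be produced by
# a path-constraint resource allocation algorithm.
alpha = 0.8
weighted_energy = alpha * trans_energy(h, path_r, t)
```

Unreliable paths get a small alpha and thus contribute little to training or inference.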
One-Shot Relational Learning for Knowledge Graphs
Knowledge graphs (KGs) are the key components of various natural language
processing applications. To further expand KGs' coverage, previous studies on
knowledge graph completion usually require a large number of training instances
for each relation. However, we observe that long-tail relations are actually
more common in KGs and those newly added relations often do not have many known
triples for training. In this work, we aim at predicting new facts under a
challenging setting where only one training instance is available. We propose a
one-shot relational learning framework, which utilizes the knowledge extracted
by embedding models and learns a matching metric by considering both the
learned embeddings and one-hop graph structures. Empirically, our model yields
considerable performance improvements over existing embedding models, and also
eliminates the need to re-train the embedding models when dealing with newly
added relations.
Comment: EMNLP 201
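A minimal sketch of a matching metric in this spirit: each entity is encoded together with its one-hop neighborhood, and a candidate pair is scored against the single reference pair. The mean-pooled neighbor encoder and cosine similarity are illustrative simplifications, not the paper's exact architecture:

```python
import numpy as np

rng = np.random.default_rng(2)
d = 8  # embedding dimension (illustrative)

def neighbor_encode(entity_vec, neighbor_vecs):
    # Encode an entity together with its one-hop neighbors by mean-pooling.
    return np.vstack([entity_vec[None, :], neighbor_vecs]).mean(axis=0)

def match_score(ref_pair, query_pair):
    # Cosine similarity between the concatenated (head, tail) encodings of
    # the single reference pair and a candidate query pair.
    a = np.concatenate(ref_pair)
    b = np.concatenate(query_pair)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# The one available training instance for a new relation.
h_ref = neighbor_encode(rng.normal(size=d), rng.normal(size=(3, d)))
t_ref = neighbor_encode(rng.normal(size=d), rng.normal(size=(2, d)))
score_self = match_score((h_ref, t_ref), (h_ref, t_ref))
```

Because the metric only compares encodings, scoring a newly added relation requires no re-training, only its one reference pair.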
Modeling relation paths for knowledge base completion via joint adversarial training
Knowledge Base Completion (KBC), which aims at determining the missing
relations between entity pairs, has received increasing attention in recent
years. Most existing KBC methods focus on either embedding the Knowledge Base
(KB) into a specific semantic space or leveraging the joint probability of
Random Walks (RWs) on multi-hop paths. Only a few unified models adequately
take both semantic and path-related features into consideration. In this
paper, we propose a novel method to explore the intrinsic relationship between
the single relation (i.e. 1-hop path) and multi-hop paths between paired
entities. We use Hierarchical Attention Networks (HANs) to select important
relations in multi-hop paths and encode them into low-dimensional vectors. By
treating relations and multi-hop paths as two different input sources, we use a
feature extractor, which is shared by two downstream components (i.e. relation
classifier and source discriminator), to capture shared/similar information
between them. By joint adversarial training, we encourage our model to extract
features from the multi-hop paths which are representative for relation
completion. We apply the trained model (except for the source discriminator) to
several large-scale KBs for relation completion. Experimental results show that
our method outperforms existing path information-based approaches. Since each
sub-module of our model can be well interpreted, our model can be applied to a
large number of relation learning tasks.
Comment: Accepted by Knowledge-Based System
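A hedged sketch of attending over the relations on a multi-hop path: each relation embedding is scored against a query vector (e.g. the target relation), softmax-normalized, and pooled into one low-dimensional path vector. The dot-product scoring and single attention level are simplifications of a full hierarchical attention network:

```python
import numpy as np

rng = np.random.default_rng(3)
d = 8  # embedding dimension (illustrative)

def softmax(x):
    e = np.exp(x - x.max())  # shift for numerical stability
    return e / e.sum()

def attend_path(relation_vecs, query_vec):
    # Weight each relation on the path by its relevance to the query,
    # then pool the path into a single low-dimensional vector.
    weights = softmax(relation_vecs @ query_vec)
    return weights @ relation_vecs, weights

path = rng.normal(size=(4, d))  # embeddings of a 4-hop relation path
query = rng.normal(size=d)      # e.g. the candidate relation's embedding
path_vec, w = attend_path(path, query)
```

The attention weights are directly inspectable, which is one reason such sub-modules lend themselves to interpretation.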