Graph Neural Networks with Generated Parameters for Relation Extraction
Recently, progress has been made towards improving relational reasoning in
machine learning. Among existing models, graph neural networks (GNNs) are
one of the most effective approaches for multi-hop relational reasoning. In
fact, multi-hop relational reasoning is indispensable in many natural language
processing tasks such as relation extraction. In this paper, we propose to
generate the parameters of graph neural networks (GP-GNNs) according to natural
language sentences, which enables GNNs to perform relational reasoning on
unstructured text inputs. We verify GP-GNNs in relation extraction from text.
Experimental results on a human-annotated dataset and two distantly supervised
datasets show that our model achieves significant improvements compared to
baselines. We also perform a qualitative analysis to demonstrate that our model
could discover more accurate relations by multi-hop relational reasoning.
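The core idea, generating a propagation matrix per edge from that edge's textual context instead of learning one shared weight, can be sketched as follows. This is a minimal NumPy illustration; the class name, dimensions, and the sigmoid update rule are simplifying assumptions, not the paper's exact architecture, which uses learned sentence encoders and stacked propagation layers:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GPGNNLayer:
    """Sketch of a generated-parameter GNN layer: the transition matrix
    for each edge is produced from that edge's sentence encoding rather
    than being a single shared learned weight."""

    def __init__(self, enc_dim, hid_dim, seed=0):
        rng = np.random.default_rng(seed)
        # Linear map from a sentence encoding to a flattened hid_dim x hid_dim matrix.
        self.W = rng.standard_normal((enc_dim, hid_dim * hid_dim)) * 0.1
        self.hid_dim = hid_dim

    def propagate(self, node_states, edges, edge_encodings):
        # node_states: (num_entities, hid_dim); edges: [(src, dst), ...]
        # edge_encodings: (num_edges, enc_dim), one per entity-pair context.
        new_states = node_states.copy()
        for k, (src, dst) in enumerate(edges):
            # Generate this edge's transition matrix from its text encoding.
            A = (edge_encodings[k] @ self.W).reshape(self.hid_dim, self.hid_dim)
            new_states[dst] += sigmoid(A @ node_states[src])
        return new_states
```

Stacking several such propagation steps is what allows information to flow over multi-hop paths between entities mentioned in different sentences.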
Learning Relation Prototype from Unlabeled Texts for Long-tail Relation Extraction
Relation Extraction (RE) is a vital step to complete Knowledge Graphs (KGs) by
extracting entity relations from texts. However, it usually suffers from the
long-tail issue: the training data mainly concentrates on a few types of
relations, leading to the lack of sufficient annotations for the remaining types
of relations. In this paper, we propose a general approach to learn relation
prototypes from unlabeled texts, to facilitate long-tail relation extraction
by transferring knowledge from the relation types with sufficient training data.
We learn relation prototypes as an implicit factor between entities, which
reflects the meanings of relations as well as their proximities for transfer
learning. Specifically, we construct a co-occurrence graph from texts, and
capture both first-order and second-order entity proximities for embedding
learning. Based on this, we further optimize the distance from entity pairs
to corresponding prototypes, which can be easily adapted to almost arbitrary RE
frameworks. Thus, the learning of infrequent or even unseen relation types will
benefit from semantically proximate relations through pairs of entities and
large-scale textual information. We have conducted extensive experiments on two
publicly available datasets: New York Times and Google Distant Supervision.
Compared with eight state-of-the-art baselines, our proposed model
achieves significant improvements (4.1% F1 on average). Further results on
long-tail relations demonstrate the effectiveness of the learned relation
prototypes. We further conduct an ablation study to investigate the impacts of
varying components, and apply it to four basic relation extraction models to
verify the generalization ability. Finally, we analyze several example cases to
give intuitive impressions as qualitative analysis. Our code will be released
later.
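The "distance from entity pairs to corresponding prototypes" objective can be illustrated with a small sketch. The function name, the hinge form, and the margin are illustrative assumptions; the paper's actual objective and embedding spaces may differ:

```python
import numpy as np

def prototype_distance_loss(pair_embs, labels, prototypes, margin=1.0):
    """Illustrative prototype-distance objective: each entity-pair embedding
    is pulled toward the prototype of its gold relation and pushed at least
    `margin` farther away from the nearest competing prototype (hinge form)."""
    total = 0.0
    for emb, y in zip(pair_embs, labels):
        d = np.linalg.norm(prototypes - emb, axis=1)   # distance to every prototype
        pos = d[y]                                     # distance to the gold prototype
        neg = np.min(np.delete(d, y))                  # closest wrong prototype
        total += pos + max(0.0, margin + pos - neg)
    return total / len(labels)
```

Because the loss only compares pairs against prototype vectors, it can be bolted onto almost any RE framework that produces an embedding per entity pair, which is the adaptability the abstract claims.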
A Survey on Knowledge Graphs: Representation, Acquisition and Applications
Human knowledge provides a formal understanding of the world. Knowledge
graphs that represent structural relations between entities have become an
increasingly popular research direction towards cognition and human-level
intelligence. In this survey, we provide a comprehensive review of knowledge
graphs, covering research topics on 1) knowledge graph representation
learning, 2) knowledge acquisition and completion, 3) temporal knowledge graphs,
and 4) knowledge-aware applications, and summarize recent breakthroughs and
prospective directions to facilitate future research. We propose a full-view
categorization and new taxonomies on these topics. Knowledge graph embedding is
organized from four aspects of representation space, scoring function, encoding
models, and auxiliary information. For knowledge acquisition, especially
knowledge graph completion, embedding methods, path inference, and logical rule
reasoning are reviewed. We further explore several emerging topics, including
meta relational learning, commonsense reasoning, and temporal knowledge graphs.
To facilitate future research on knowledge graphs, we also provide a curated
collection of datasets and open-source libraries on different tasks. Finally,
we offer a thorough outlook on several promising research directions.
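As a concrete instance of the "scoring function" aspect of knowledge graph embedding, translational models such as TransE score a triple by how well the relation vector translates the head embedding onto the tail embedding:

```python
import numpy as np

def transe_score(h, r, t, norm=1):
    """TransE-style scoring function: a triple (head, relation, tail) is
    plausible when the tail embedding lies close to head + relation, so a
    higher (less negative) score means a more plausible triple."""
    return -np.linalg.norm(h + r - t, ord=norm)
```

Other families in the survey's taxonomy replace this translation with a bilinear product, a rotation in complex space, or a neural encoder, but all reduce to some such plausibility score over (h, r, t).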
Self-Attention Enhanced Selective Gate with Entity-Aware Embedding for Distantly Supervised Relation Extraction
Distantly supervised relation extraction intrinsically suffers from noisy
labels due to the strong assumption of distant supervision. Most prior works
adopt a selective attention mechanism over sentences in a bag to denoise from
wrongly labeled data, which however could be incompetent when there is only one
sentence in a bag. In this paper, we propose a novel, lightweight neural
framework to address the distantly supervised relation extraction problem and
to alleviate the defects of the previous selective attention framework. Specifically,
in the proposed framework, 1) we use an entity-aware word embedding method to
integrate both relative position information and head/tail entity embeddings,
aiming to highlight the essence of entities for this task; 2) we develop a
self-attention mechanism to capture the rich contextual dependencies as a
complement for local dependencies captured by piecewise CNN; and 3) instead of
using selective attention, we design a pooling-equipped gate, which is based on
rich contextual representations, as an aggregator to generate bag-level
representation for final relation classification. Compared to selective
attention, one major advantage of the proposed gating mechanism is that it
performs stably and promisingly even if only one sentence appears in a bag, and
thus keeps the consistency across all training examples. The experiments on the NYT
dataset demonstrate that our approach achieves a new state-of-the-art
performance in terms of both AUC and top-n precision metrics.
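The gate's advantage over selective attention in the one-sentence case can be sketched as follows. The gate parameters W_g and b_g are hypothetical placeholders, and the real model computes sentence representations with a piecewise CNN plus self-attention rather than taking them as given:

```python
import numpy as np

def gated_bag_aggregation(sent_reprs, W_g, b_g):
    """Sketch of a pooling-equipped gate: a sigmoid gate computed from the
    max-pooled bag context rescales the mean sentence representation.
    Unlike selective attention, nothing degenerates when the bag holds a
    single sentence; the same gate computation applies unchanged."""
    pooled = sent_reprs.max(axis=0)                       # bag context via max pooling
    gate = 1.0 / (1.0 + np.exp(-(W_g @ pooled + b_g)))    # element-wise gate in (0, 1)
    return gate * sent_reprs.mean(axis=0)                 # gated bag-level representation
```

With selective attention, a one-sentence bag forces the attention weight to 1 regardless of sentence quality; here the gate can still down-weight an unreliable lone sentence, which is the consistency property the abstract highlights.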
MatSciRE: Leveraging Pointer Networks to Automate Entity and Relation Extraction for Material Science Knowledge-base Construction
Material science literature is a rich source of factual information about
various categories of entities (like materials and compositions) and various
relations between these entities, such as conductivity, voltage, etc.
Automatically extracting this information to generate a material science
knowledge base is a challenging task. In this paper, we propose MatSciRE
(Material Science Relation Extractor), a Pointer Network-based encoder-decoder
framework, to jointly extract entities and relations from material science
articles as triplets. Specifically, we target
the battery materials and identify five relations to work on - conductivity,
coulombic efficiency, capacity, voltage, and energy. Our proposed approach
achieved a much better F1-score (0.771) than a previous attempt using
ChemDataExtractor (0.716). The overall graphical framework of MatSciRE is shown
in Fig 1. The material information is extracted from material science
literature in the form of entity-relation triplets using MatSciRE.
- …