LUKE-Graph: A Transformer-based Approach with Gated Relational Graph Attention for Cloze-style Reading Comprehension
Incorporating prior knowledge can improve existing pre-trained models in
cloze-style machine reading comprehension and has become a new trend in
recent studies. Notably, most existing models integrate external knowledge
graphs (KGs) and transformer-based models, such as BERT, into a unified
data structure. However, selecting the most relevant entities among
ambiguous candidates in the KG and extracting the best subgraph remain
challenging. In this paper, we propose LUKE-Graph, a model that builds a
heterogeneous graph from the intuitive relationships between entities in a
document without using any external KG. We then use a Relational Graph
Attention (RGAT) network to fuse the graph's reasoning information with
the contextual representation encoded by the pre-trained LUKE model. In
this way, we take advantage of LUKE to derive an entity-aware
representation and of the graph model to exploit a relation-aware
representation. Moreover, we propose Gated-RGAT, which augments RGAT with
a gating mechanism that regulates the question information used in the
graph convolution operation. This closely mirrors human reasoning, in
which the best candidate entity is chosen in light of the question.
Experimental results demonstrate that LUKE-Graph achieves state-of-the-art
performance on the ReCoRD dataset, which requires commonsense reasoning.

Comment: submitted to the Neurocomputing journal
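The question-conditioned gate can be sketched in a few lines. The PyTorch
fragment below is a minimal illustration under stated assumptions, not the
authors' implementation: the class name GatedRGATLayer, the tensor shapes,
and all parameter names are hypothetical. It shows the three ingredients
the abstract names: relation-specific messages, attention over each node's
incoming edges, and a sigmoid gate driven by the question vector that
decides how much aggregated graph evidence updates each entity state.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedRGATLayer(nn.Module):
    """Illustrative gated relational graph attention layer (sketch only,
    not the paper's code)."""

    def __init__(self, dim: int, num_relations: int):
        super().__init__()
        # One linear transform per relation type (heterogeneous edges).
        self.rel_proj = nn.ModuleList(
            [nn.Linear(dim, dim) for _ in range(num_relations)]
        )
        self.attn = nn.Linear(2 * dim, 1)    # scores (destination, message) pairs
        self.gate = nn.Linear(2 * dim, dim)  # question-conditioned gate

    def forward(self, h, edges, rel_types, q):
        # h: (N, dim) entity states, e.g. pooled from the LUKE encoder
        # edges: (E, 2) tensor of (src, dst) node indices
        # rel_types: (E,) relation id per edge; q: (dim,) question vector
        src, dst = edges[:, 0], edges[:, 1]
        # Relation-specific message for each edge (looped for clarity, not speed).
        msg = torch.stack([self.rel_proj[r](h[s])
                           for s, r in zip(src.tolist(), rel_types.tolist())])
        # Attention logits, normalized over the incoming edges of each node.
        logits = self.attn(torch.cat([h[dst], msg], dim=-1)).squeeze(-1)
        alpha = torch.zeros_like(logits)
        for node in dst.unique():
            mask = dst == node
            alpha[mask] = F.softmax(logits[mask], dim=0)
        # Attention-weighted aggregation of messages into destination nodes.
        agg = torch.zeros_like(h).index_add_(0, dst, alpha.unsqueeze(-1) * msg)
        # Gate decides, per entity, how much graph evidence to admit given q.
        g = torch.sigmoid(self.gate(torch.cat([agg, q.expand_as(agg)], dim=-1)))
        return g * agg + (1.0 - g) * h
```

One design note: the final convex combination behaves like a GRU- or
highway-style update; when the gate saturates near zero an entity keeps its
contextual (LUKE) representation, and near one the question-filtered graph
evidence dominates.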