Towards Deeper Graph Neural Networks
Graph neural networks have shown significant success in the field of graph
representation learning. Graph convolutions perform neighborhood aggregation
and represent one of the most important graph operations. Nevertheless, a
single layer of these neighborhood aggregation methods considers only immediate
neighbors, and performance degrades when models go deeper to enlarge the
receptive field. Several recent studies attribute this performance
deterioration to the over-smoothing issue, which states that repeated
propagation makes node representations of different classes indistinguishable.
In this work, we study this observation systematically and develop new insights
towards deeper graph neural networks. First, we provide a systematic analysis
of this issue and argue that the key factor that significantly compromises
performance is the entanglement of representation transformation and
propagation in current graph convolution operations. After decoupling these two
operations, deeper graph neural networks can be used to learn graph node
representations from larger receptive fields. We further provide a theoretical
analysis of the above observation when building very deep models, which can
serve as a rigorous and gentle description of the over-smoothing issue. Based
on our theoretical and empirical analysis, we propose Deep Adaptive Graph
Neural Network (DAGNN) to adaptively incorporate information from large
receptive fields. A set of experiments on citation, co-authorship, and
co-purchase datasets have confirmed our analysis and insights and demonstrated
the superiority of our proposed methods.
Comment: 11 pages, KDD202
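The decoupling the abstract describes can be illustrated with a minimal numpy sketch: one feature transformation, followed by K propagation steps over a symmetrically normalized adjacency matrix, followed by a combination over hops. This is an assumption-laden simplification, not DAGNN's actual implementation; in particular, the plain mean over hops below stands in for DAGNN's learned adaptive retention scores, and the function names are hypothetical.

```python
import numpy as np

def normalize_adj(A):
    # Symmetric normalization D^{-1/2} (A + I) D^{-1/2}, the usual GCN propagation matrix.
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def decoupled_gnn_sketch(X, A, W, K):
    # 1. Transformation: a single feature transform, applied once,
    #    decoupled from any propagation.
    Z = np.tanh(X @ W)
    # 2. Propagation: K hops of parameter-free smoothing with the
    #    normalized adjacency, so depth no longer multiplies transformations.
    A_norm = normalize_adj(A)
    hops = [Z]
    for _ in range(K):
        hops.append(A_norm @ hops[-1])
    # 3. Combination: a simple mean over all hop representations
    #    (a stand-in for DAGNN's learned per-node hop weighting).
    return np.mean(np.stack(hops), axis=0)
```

Because propagation carries no trainable weights, the receptive field can be enlarged by increasing K without deepening the transformation, which is the structural point the abstract makes.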
Relation Extraction with Self-determined Graph Convolutional Network
Relation Extraction is a way of obtaining the semantic relationship between
entities in text. The state-of-the-art methods use linguistic tools to build a
graph for the text in which the entities appear and then a Graph Convolutional
Network (GCN) is employed to encode the pre-built graphs. Although their
performance is promising, the reliance on linguistic tools results in a
pipeline that is not end-to-end. In this work, we propose a novel model, the
Self-determined Graph Convolutional Network (SGCN), which determines a weighted
graph using a self-attention mechanism rather than any linguistic tool. Then,
the
self-determined graph is encoded using a GCN. We test our model on the TACRED
dataset and achieve the state-of-the-art result. Our experiments show that SGCN
outperforms the traditional GCN, which uses dependency parsing tools to build
the graph.
Comment: CIKM-202
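The two-stage idea in this abstract, deriving a weighted adjacency from self-attention and then running a GCN layer over it, can be sketched as follows. This is a hedged illustration under assumptions: the scaled dot-product attention, the projection matrices, and the function names are generic choices, not SGCN's published architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_determined_graph(H, Wq, Wk):
    # Scaled dot-product self-attention over token representations H;
    # the resulting attention matrix is used as a weighted adjacency,
    # replacing a dependency-parser-built graph.
    Q, K = H @ Wq, H @ Wk
    scores = Q @ K.T / np.sqrt(K.shape[1])
    return softmax(scores, axis=1)  # each row sums to 1 (row-normalized graph)

def gcn_layer(A, H, W):
    # One GCN layer over the self-determined graph; A is already
    # row-normalized by the softmax, so no extra normalization is applied.
    return np.maximum(A @ H @ W, 0.0)  # ReLU activation
```

Since the graph is computed from the token representations themselves, gradients can flow through both stages, which is what makes the process end-to-end in contrast to parser-based pipelines.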