Deep Graph Neural Networks via Flexible Subgraph Aggregation
Graph neural networks (GNNs), which learn node representations from
graph-structured data by aggregating neighborhood information, have shown
superior performance in various downstream
tasks. However, it is known that the performance of GNNs degrades gradually as
the number of layers increases. In this paper, we evaluate the expressive power
of GNNs from the perspective of subgraph aggregation. We reveal the potential
cause of performance degradation in traditional deep GNNs, namely overlap
among aggregated subgraphs, and we show theoretically that previous
residual-based GNNs exploit the aggregation results of 1- to k-hop subgraphs
to improve effectiveness. Further, we find that the utilization of
different subgraphs by previous models is often inflexible. Based on this, we
propose a sampling-based node-level residual module (SNR) that can achieve a
more flexible utilization of different hops of subgraph aggregation by
introducing node-level parameters sampled from a learnable distribution.
Extensive experiments show that GNNs equipped with our proposed SNR module
outperform a comprehensive set of baselines.
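The node-level residual idea behind SNR can be sketched in a toy form: each node v draws a mixing coefficient s_v from a Gaussian whose mean and scale would be learnable in the real model, then blends its previous representation with a fresh 1-hop aggregation. This is a hypothetical, dependency-free illustration, not the authors' implementation (function names `aggregate` and `snr_layer` are invented here):

```python
import random

def aggregate(adj, h):
    """Mean-aggregate 1-hop neighbor features (one plain GNN layer, no weights)."""
    out = []
    for v, nbrs in enumerate(adj):
        feats = [h[u] for u in nbrs] or [h[v]]  # isolated node keeps its own feature
        out.append(sum(feats) / len(feats))
    return out

def snr_layer(adj, h, mu, sigma, rng):
    """Toy node-level residual mixing: sample s_v ~ N(mu[v], sigma[v]^2)
    (reparameterization-style) and blend new aggregation with old representation."""
    agg = aggregate(adj, h)
    out = []
    for v in range(len(h)):
        s = mu[v] + sigma[v] * rng.gauss(0.0, 1.0)  # per-node sampled coefficient
        out.append(s * agg[v] + (1.0 - s) * h[v])
    return out

# Usage on a 3-node path graph 0-1-2
rng = random.Random(0)
adj = [[1], [0, 2], [1]]
h = [1.0, 0.0, 1.0]
print(snr_layer(adj, h, mu=[0.5] * 3, sigma=[0.1] * 3, rng=rng))
```

Because s_v is sampled per node rather than fixed per layer, each node can weight different hops of subgraph aggregation differently, which is the flexibility the abstract argues prior residual schemes lack.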