Message passing has emerged as an effective tool for designing Graph Neural
Networks (GNNs). However, most existing works naively sum or average all the
neighboring features to update node representations, which suffers from two
limitations: (1) lack of interpretability, as it is hard to identify which node
features are crucial to the GNN's prediction; (2) the over-smoothing issue,
where repeated averaging aggregates excessive noise, over-mixing the features
of nodes in different classes until they become indistinguishable. In this
paper, we propose the
Node-level Capsule Graph Neural Network (NCGNN) to address these issues with an
improved message passing scheme. Specifically, NCGNN represents nodes as groups
of capsules, in which each capsule extracts distinctive features of its
corresponding node. For each node-level capsule, a novel dynamic routing
procedure is developed to adaptively select appropriate capsules for
aggregation from a subgraph identified by our designed graph filter.
Consequently, since only the advantageous capsules are aggregated and harmful
noise is restrained, NCGNN avoids over-mixing the features of interacting
nodes in different classes, thereby relieving the over-smoothing issue.
Furthermore, since the graph filter and the dynamic routing identify the
subgraph and the subset of node features that are most influential for the
model's prediction, NCGNN is inherently interpretable and does not require
complex post-hoc
explanations. Extensive experiments on six node classification benchmarks
demonstrate that NCGNN effectively addresses the over-smoothing issue and
outperforms state-of-the-art methods by producing better node embeddings for
classification.
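To make the routing idea concrete, the following is a minimal NumPy sketch of
agreement-based dynamic routing over neighbor capsules, in the spirit of
standard capsule networks. All names, shapes, and the fixed iteration count
are illustrative assumptions; the sketch omits NCGNN's specifics, such as its
node-level capsule construction and the graph filter that selects the
receptive subgraph.

```python
import numpy as np

def squash(s, eps=1e-8):
    """Capsule squashing non-linearity: keeps the vector's direction
    and maps its norm into [0, 1)."""
    norm_sq = np.sum(s ** 2, axis=-1, keepdims=True)
    return (norm_sq / (1.0 + norm_sq)) * s / np.sqrt(norm_sq + eps)

def route_neighbor_capsules(u_hat, num_iters=3):
    """Aggregate neighbor capsule "votes" into one output capsule via
    agreement-based dynamic routing (hypothetical helper, not NCGNN's
    exact procedure).

    u_hat: (num_neighbor_capsules, dim) array of transformed votes from
           capsules in the node's receptive subgraph.
    Returns the routed output capsule of shape (dim,).
    """
    b = np.zeros(u_hat.shape[0])               # routing logits
    for _ in range(num_iters):
        e = np.exp(b - b.max())                # stabilized softmax
        c = e / e.sum()                        # coupling coefficients
        v = squash((c[:, None] * u_hat).sum(axis=0))  # weighted aggregate
        b = b + u_hat @ v                      # upweight agreeing votes
    return v

# Toy usage: 5 neighbor capsules of dimension 8.
rng = np.random.default_rng(0)
votes = rng.normal(size=(5, 8))
out = route_neighbor_capsules(votes)
print(out.shape)  # (8,)
```

The key design choice this illustrates is that aggregation weights are not
fixed by the graph structure but are iteratively refined by agreement, so
votes that conflict with the consensus receive smaller coupling coefficients
and contribute less noise.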