Graph classification, which aims to learn graph-level representations for
effective class assignment, has achieved remarkable success, a success that
relies heavily on high-quality datasets with balanced class distributions.
In practice, most real-world graph data naturally follows a long-tailed
distribution, in which head classes contain far more samples than tail classes;
it is therefore essential to study graph-level classification over long-tailed
data, yet this problem remains largely unexplored. However, most existing long-tailed
learning methods in computer vision fail to jointly optimize representation
learning and classifier training, and they neglect the mining of
hard-to-classify classes. Directly applying such methods to graphs may lead
to sub-optimal performance, since models trained on graphs are more
sensitive to the long-tailed distribution due to the complex topological
characteristics of graph data. Hence, in this paper, we propose a novel long-tailed
graph-level classification framework via Collaborative Multi-expert Learning
(CoMe) to tackle this problem. To balance the contributions of head and tail
classes, we first develop balanced contrastive learning from the
representation-learning perspective, and then design individual-expert classifier
training based on hard class mining. In addition, we perform gated fusion and
disentangled knowledge distillation among multiple experts to promote
collaboration within the multi-expert framework. Comprehensive experiments are
performed on seven widely-used benchmark datasets to demonstrate the
superiority of our method CoMe over state-of-the-art baselines.

Comment: Accepted by IEEE Transactions on Big Data (TBD 2024)
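
To make the balanced contrastive learning component mentioned in the abstract more concrete, the following is a minimal, hypothetical PyTorch sketch of a class-balanced supervised contrastive loss over graph-level embeddings. The function name, signature, and the per-class averaging in the denominator are illustrative assumptions, not CoMe's exact objective.

    import torch
    import torch.nn.functional as F

    def balanced_supcon_loss(embeddings, labels, temperature=0.1):
        # Hypothetical sketch: denominator terms are averaged within each
        # class so head classes with many samples do not dominate the loss.
        z = F.normalize(embeddings, dim=1)            # (N, d) unit-norm graph embeddings
        sim = z @ z.t() / temperature                 # (N, N) scaled cosine similarities
        n = z.size(0)
        eye = torch.eye(n, dtype=torch.bool, device=z.device)
        sim = sim.masked_fill(eye, -1e9)              # exclude self-similarity

        exp_sim = torch.exp(sim)
        denom = torch.zeros(n, device=z.device)
        for c in labels.unique():
            cls_mask = (labels == c).float().unsqueeze(0)                # (1, N)
            denom += (exp_sim * cls_mask).sum(dim=1) / cls_mask.sum()    # class-mean term

        # Positives: samples sharing the anchor's label, excluding the anchor itself.
        pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~eye
        log_prob = sim - torch.log(denom + 1e-12).unsqueeze(1)
        pos_cnt = pos_mask.sum(dim=1).clamp(min=1)
        loss = -(log_prob * pos_mask.float()).sum(dim=1) / pos_cnt
        return loss.mean()

    # Usage with hypothetical graph embeddings from a GNN readout:
    emb = torch.randn(32, 64)
    lbl = torch.randint(0, 5, (32,))                  # imbalanced class labels
    print(balanced_supcon_loss(emb, lbl))

The per-class averaging is the balancing mechanism: regardless of how many samples a head class contributes to a batch, it adds only one averaged term to each anchor's denominator, which is the intuition behind equalizing head and tail contributions during representation learning.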