In recent years, graph contrastive learning (GCL) has emerged as one of the
most effective approaches for various node-level supervised tasks. However, for
unsupervised and structure-related tasks such as community detection, current
GCL algorithms face difficulties in acquiring the necessary community-level
information, resulting in poor performance. In addition, general contrastive
learning algorithms improve downstream-task performance by increasing the
number of negative samples, which leads to severe class collision and
unfairness in community detection. To address these issues, we propose a novel
Community-aware Efficient Graph Contrastive Learning Framework (CEGCL) to
jointly learn community partition and node representations in an end-to-end
manner. Specifically, we first design a personalized self-training (PeST)
strategy for unsupervised scenarios, which enables our model to capture precise
community-level personalized information in a graph. Benefiting from PeST, we
alleviate class collision and unfairness without sacrificing the
overall model performance. Furthermore, an aligned graph clustering (AlGC)
module is employed to obtain the community partition. In this module, we align
the clustering space of the downstream task with that of PeST to achieve more
consistent node embeddings. Finally, we demonstrate the effectiveness of our
model for community detection both theoretically and experimentally. Extensive
experimental results also show that our CEGCL exhibits state-of-the-art
performance on three benchmark datasets with different scales.

Comment: 12 pages, 7 figures