Few-shot node classification, which aims to predict labels for nodes on
graphs with only limited labeled nodes as references, is of great significance
in real-world graph mining tasks. Particularly, in this paper, we refer to the
task of classifying nodes in classes with a few labeled nodes as the few-shot
node classification problem. To tackle this label shortage issue, existing
works generally leverage the meta-learning framework, which utilizes a number
of episodes to extract transferable knowledge from classes with abundant
labeled nodes and generalizes the knowledge to other classes with limited
labeled nodes. In essence, the primary aim of few-shot node classification is
to learn node embeddings that are generalizable across different classes. To
accomplish this, the GNN encoder must separate node embeddings of different
classes while aligning the embeddings of nodes within the same class. Thus, in
this work, we propose to consider both the intra-class and
inter-class generalizability of the model. We create a novel contrastive
meta-learning framework on graphs, named COSMIC, with two key designs. First,
we propose to enhance the intra-class generalizability by incorporating a
contrastive two-step optimization into each episode that explicitly aligns node
embeddings within the same class. Second, we strengthen the inter-class
generalizability by generating hard node classes via a novel
similarity-sensitive mix-up strategy. Extensive experiments on few-shot node
classification datasets verify the superiority of our framework over
state-of-the-art baselines. Our code is provided at
https://github.com/SongW-SW/COSMIC.

Comment: SIGKDD 202
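The two key designs can be illustrated with a minimal NumPy sketch. The loss below is a generic supervised contrastive objective that pulls same-class node embeddings together and pushes different classes apart, and the mix-up rule interpolates two class prototypes so that more similar class pairs yield harder synthetic classes. Both function names and the exact mixing rule are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def supervised_contrastive_loss(embeddings, labels, temperature=0.5):
    """Generic supervised contrastive loss: for each node, treat same-class
    nodes as positives and all other nodes as the softmax denominator.
    Illustrative stand-in for the paper's contrastive alignment step."""
    # Normalize embeddings so similarities are cosine similarities.
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature
    n = len(labels)
    loss, count = 0.0, 0
    for i in range(n):
        mask = np.arange(n) != i          # exclude self-similarity
        logits = sim[i][mask]
        log_prob = logits - np.log(np.exp(logits).sum())
        pos = labels[mask] == labels[i]   # same-class nodes are positives
        if pos.any():
            loss += -log_prob[pos].mean()
            count += 1
    return loss / max(count, 1)

def similarity_sensitive_mixup(proto_a, proto_b, sim_ab, rng=np.random):
    """Interpolate two class prototypes into a synthetic "hard" class.
    Hypothetical rule: the higher the similarity sim_ab (in [0, 1]), the
    closer the mixing ratio stays to 0.5, i.e. the harder the new class."""
    lam = 0.5 + (1.0 - sim_ab) * (rng.rand() - 0.5)
    return lam * proto_a + (1.0 - lam) * proto_b
```

As a sanity check, embeddings that are well aligned within classes incur a lower contrastive loss than the same embeddings with shuffled labels, and two maximally similar prototypes (`sim_ab = 1.0`) mix at exactly the 0.5 ratio.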