Graph Neural Networks (GNNs) have garnered considerable interest due to their
exceptional performance in a wide range of graph machine learning tasks.
Nevertheless, most GNN-based approaches have been evaluated on
well-annotated benchmark datasets, which leads to suboptimal performance in
real-world graph learning scenarios. To bridge this gap, this paper
investigates the problem of graph transfer learning in the presence of label
noise, in which knowledge is transferred from a noisily labeled source graph to
an unlabeled target graph. We introduce a novel technique termed Balance Alignment and
Information-aware Examination (ALEX) to address this challenge. ALEX first
employs singular value decomposition to generate different views with crucial
structural semantics, which help provide robust node representations using
graph contrastive learning. To mitigate both label shift and domain shift, we
estimate a prior distribution to build subgraphs with balanced label
distributions. Building on this foundation, an adversarial domain discriminator
is incorporated for the implicit domain alignment of complex multi-modal
distributions. Furthermore, we project node representations into a different
space, optimizing the mutual information between the projected features and
labels. Subsequently, the inconsistency of similarity structures is evaluated
to identify noisy samples prone to overfitting. Comprehensive experiments
on various benchmark datasets demonstrate the superiority of the proposed
ALEX across different settings.

Comment: Accepted by the ACM International Conference on Multimedia (MM) 202
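To make the first step of the pipeline concrete, the sketch below illustrates one plausible reading of the SVD-based view generation: a truncated singular value decomposition of the adjacency matrix yields a low-rank view that retains the dominant structural semantics while suppressing noisy edges, and this view can then serve as an augmented graph for contrastive learning. This is a minimal illustration under our own assumptions, not the paper's actual implementation; the function name and the toy graph are hypothetical.

```python
import numpy as np

def svd_view(adj: np.ndarray, rank: int) -> np.ndarray:
    """Low-rank reconstruction of an adjacency matrix via truncated SVD.

    Keeping only the top-`rank` singular components preserves the dominant
    structural semantics while filtering out weaker, noisier components.
    (Hypothetical helper; the paper's view construction may differ.)
    """
    u, s, vt = np.linalg.svd(adj, full_matrices=False)
    # Reconstruct using only the leading `rank` singular triplets.
    return (u[:, :rank] * s[:rank]) @ vt[:rank]

# Toy 4-node path graph: 0-1-2-3 (symmetric, unweighted).
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)

# A structure-preserving "view" of the graph for contrastive learning.
view = svd_view(adj, rank=2)
```

In a contrastive setup, `adj` and `view` would be encoded by a shared GNN, with the contrastive objective pulling together the two representations of each node.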