Graph Neural Networks (GNNs) have shown exceptional performance in the task
of link prediction. Despite their effectiveness, the high inference latency
caused by their non-trivial neighborhood data dependency limits GNNs in
practical deployments. Conversely, MLPs, though highly efficient, are much
less effective than GNNs due to their lack of relational knowledge. In this
work, to combine the advantages of
GNNs and MLPs, we start with exploring direct knowledge distillation (KD)
methods for link prediction, i.e., predicted logit-based matching and node
representation-based matching. Upon observing that these direct KD methods do
not perform well for link prediction, we propose a relational KD framework,
Linkless Link
Prediction (LLP), to distill knowledge for link prediction with MLPs. Unlike
simple KD methods that match independent link logits or node representations,
LLP distills relational knowledge that is centered around each (anchor) node to
the student MLP. Specifically, we propose rank-based matching and
distribution-based matching strategies that complement each other. Extensive
experiments demonstrate that LLP boosts the link prediction performance of MLPs
by significant margins, and even outperforms the teacher GNNs on 7 out of 8
benchmarks. LLP also achieves a 70.68x speedup in link prediction inference
compared to GNNs on the large-scale OGB dataset.
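To make the two matching strategies concrete, below is a minimal NumPy sketch of what rank-based and distribution-based matching can look like for a single anchor node. The function names, the pairwise margin formulation, and the temperature handling are our own illustrative simplifications, not the paper's exact objectives; both losses operate on teacher and student link scores between the anchor and a set of sampled context nodes.

```python
import numpy as np

def kl_distribution_matching(teacher_scores, student_scores, temperature=1.0):
    """Distribution-based matching (illustrative): KL divergence between the
    softmax-normalized teacher and student link scores over one anchor's
    sampled context nodes."""
    def softmax(x):
        e = np.exp((x - x.max()) / temperature)  # stabilized, softened softmax
        return e / e.sum()
    p = softmax(np.asarray(teacher_scores, dtype=float))
    q = softmax(np.asarray(student_scores, dtype=float))
    return float(np.sum(p * np.log(p / q)))

def rank_matching_loss(teacher_scores, student_scores, margin=0.1):
    """Rank-based matching (illustrative): margin ranking loss over all pairs
    of context nodes, with the target ordering taken from the teacher."""
    t = np.asarray(teacher_scores, dtype=float)
    s = np.asarray(student_scores, dtype=float)
    loss, n_pairs = 0.0, 0
    for i in range(len(t)):
        for j in range(i + 1, len(t)):
            sign = np.sign(t[i] - t[j])  # teacher-preferred ordering of (i, j)
            loss += max(0.0, -sign * (s[i] - s[j]) + margin)
            n_pairs += 1
    return loss / max(n_pairs, 1)
```

The two losses are complementary in spirit: the rank-based term only asks the student MLP to preserve the teacher's relative ordering of candidate links around the anchor, while the distribution-based term matches the teacher's relative score magnitudes as well.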