While Graph Neural Networks (GNNs) have been successfully leveraged for
learning on graph-structured data across domains, several potential pitfalls
have been described recently. These include the inability to accurately
leverage information encoded in long-range connections (over-squashing), as
well as difficulties distinguishing the learned representations of nearby nodes
with growing network depth (over-smoothing). An effective way to characterize
both effects is discrete curvature: Long-range connections that underlie
over-squashing effects have low curvature, whereas edges that contribute to
over-smoothing have high curvature. This observation has given rise to rewiring
techniques, which add or remove edges to mitigate over-smoothing and
over-squashing. Several rewiring approaches utilizing graph characteristics,
such as curvature or the spectrum of the graph Laplacian, have been proposed.
However, existing methods, especially those based on curvature, often require
expensive subroutines and careful hyperparameter tuning, which limits their
applicability to large-scale graphs. Here we propose a rewiring technique based
on Augmented Forman-Ricci curvature (AFRC), a scalable curvature notion
that can be computed in linear time. We prove that AFRC effectively
characterizes over-smoothing and over-squashing effects in message-passing
GNNs. We complement our theoretical results with experiments, which demonstrate
that the proposed approach achieves state-of-the-art performance while
significantly reducing the computational cost in comparison with other methods.
Utilizing fundamental properties of discrete curvature, we propose effective
heuristics for the hyperparameters in curvature-based rewiring, which avoid
expensive hyperparameter searches and further improve the scalability of the
proposed approach.