Symmetric Positive Definite (SPD) matrices have received wide attention in
machine learning due to their intrinsic capacity to encode the underlying
structural correlations in data. To reflect the non-Euclidean geometry of SPD
manifolds, many successful Riemannian metrics, such as the Affine-Invariant and
Log-Euclidean metrics, have been proposed. However,
existing fixed metric tensors might lead to sub-optimal performance for SPD
matrix learning, especially in SPD neural networks. To remedy this
limitation, we leverage the idea of pullback metrics and propose adaptive
Riemannian metrics for SPD manifolds; the underlying construction is sketched
below. Moreover, we present a comprehensive theoretical analysis of the
proposed metrics. Experiments on three datasets demonstrate that, equipped with the
proposed metrics, SPD networks exhibit superior performance.
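
To make the pullback idea concrete, the following is a minimal sketch of the standard pullback-metric construction from differential geometry; the map $\phi$ and its learnable parameterization are illustrative assumptions here, not necessarily the paper's exact formulation. Given a manifold $\mathcal{N}$ with Riemannian metric $g$ and a diffeomorphism $\phi: \mathcal{S}_{++}^{n} \to \mathcal{N}$ from the SPD manifold $\mathcal{S}_{++}^{n}$, the pullback metric $\phi^{*}g$ is defined, for $P \in \mathcal{S}_{++}^{n}$ and tangent vectors $V, W \in T_{P}\mathcal{S}_{++}^{n}$, by
\[
(\phi^{*}g)_{P}(V, W) = g_{\phi(P)}\bigl(\phi_{*,P}(V),\, \phi_{*,P}(W)\bigr),
\]
where $\phi_{*,P}$ denotes the differential of $\phi$ at $P$. For instance, the classical Log-Euclidean metric arises by pulling back the Euclidean metric through the matrix logarithm; letting $\phi$ depend on trainable parameters (one reading of "adaptive") then yields a family of Riemannian metrics that can be learned jointly with the network.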