Deep metric learning aims to construct an embedding space where samples of
the same class are close to each other, while samples of different classes are
far apart. Most existing deep metric learning methods attempt to maximize
inter-class differences: semantically related information is obtained by
increasing the distance between samples of different classes in the embedding
space. However, compressing all positive samples together while creating large
margins between classes inadvertently destroys the local structure among
similar samples. When the intra-class variance contained in this local
structure is ignored, the learned embedding space generalizes poorly to unseen
classes, causing the network to overfit the training set and perform poorly on
the test set. To address these issues, this paper
designs a self-supervised generative-assisted ranking framework that provides a
semi-supervised intra-class variance learning scheme for typical supervised
deep metric learning. Specifically, samples satisfying certain conditions are
synthesized with varying intensity and diversity to simulate the complex
transformations of intra-class samples, and an intra-class ranking loss is
designed, following the idea of self-supervised learning, to constrain the
network to maintain the intra-class distribution during training and thereby
capture subtle intra-class variance. With this approach, a more realistic
embedding space can be obtained
in which global and local structures of samples are well preserved, thus
enhancing the effectiveness of downstream tasks. Extensive experiments on four
benchmarks show that this approach surpasses state-of-the-art methods.
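
As a concrete illustration of the intra-class ranking idea, the following is a
minimal PyTorch sketch. It assumes synthesized variants are ordered by
transformation intensity; the function names, the linear-interpolation
synthesis, and the margin-based hinge formulation are illustrative assumptions,
not the paper's exact design.

```python
import torch
import torch.nn.functional as F

def synthesize_by_intensity(anchor, reference, intensities=(0.1, 0.3, 0.5)):
    """Hypothetical synthesis: interpolate the anchor toward a reference
    sample with increasing strength, producing intra-class variants ordered
    by intensity. (An assumption for illustration, not the paper's
    generator.)"""
    return [anchor + t * (reference - anchor) for t in intensities]

def intra_class_ranking_loss(anchor_emb, synth_embs, margin=0.1):
    """Constrain the embedding so that variants synthesized with higher
    intensity lie farther from the anchor, i.e. preserve the intra-class
    ordering d(a, s_i) + margin <= d(a, s_{i+1})."""
    # distances to each synthesized variant, shape (batch, num_variants)
    d = torch.stack(
        [F.pairwise_distance(anchor_emb, s) for s in synth_embs], dim=1
    )
    # hinge penalty whenever a weaker variant is farther than a stronger one
    violations = F.relu(d[:, :-1] - d[:, 1:] + margin)
    return violations.mean()

# Usage sketch: random tensors stand in for embeddings produced by a network.
if __name__ == "__main__":
    batch, dim = 8, 128
    anchor = torch.randn(batch, dim)
    reference = torch.randn(batch, dim)
    variants = synthesize_by_intensity(anchor, reference)
    loss = intra_class_ranking_loss(anchor, variants)
    print(loss.item())
```

Because the loss only compares distances between variants of the same anchor,
it constrains the local intra-class ordering without fighting the global
inter-class objective of the supervised metric loss it accompanies.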