An Equalized Margin Loss for Face Recognition

Abstract

In this paper, we propose a new loss function, termed the equalized margin (EqM) loss, which is designed to make both the intra-class scopes and the inter-class margins similar across all classes, so that the classes are evenly distributed on the hypersphere of the feature space. The EqM loss controls both the lower limit of intra-class similarity, by exploiting hard sample mining, and the upper limit of inter-class similarity, by enforcing equalized margins. Using the EqM loss, we can therefore not only obtain more discriminative features, but also overcome the negative impact of data imbalance on the inter-class margins. We also observe that the EqM loss is stable under variation of the scale parameter in the normalized Softmax. Furthermore, extensive experiments on LFW, YTF, CFP, MegaFace and IJB-B verify the effectiveness and superiority of the EqM loss compared with other state-of-the-art loss functions for face recognition.
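The abstract does not give the exact EqM formulation, but it places the loss in the family of normalized-Softmax losses with a margin and a scale parameter. As background, the following is a minimal NumPy sketch of a standard cosine-margin Softmax (CosFace-style) over L2-normalized features and class weights; the function name, the margin form, and the default values of `s` and `m` are illustrative assumptions, not the authors' EqM loss, which additionally equalizes margins across classes and mines hard samples.

```python
import numpy as np

def cosine_margin_softmax_loss(features, weights, labels, s=30.0, m=0.35):
    """Illustrative cosine-margin Softmax loss (CosFace-style background,
    NOT the EqM loss itself): subtract a fixed margin m from the
    target-class cosine similarity, then scale all logits by s and
    apply cross-entropy."""
    # L2-normalize features (rows) and class weights (columns),
    # so their inner products are cosine similarities
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    w = weights / np.linalg.norm(weights, axis=0, keepdims=True)
    cos = f @ w                                  # (batch, num_classes)
    idx = np.arange(len(labels))
    # subtract the margin m only from the target-class cosine
    cos_m = cos.copy()
    cos_m[idx, labels] -= m
    logits = s * cos_m
    # numerically stable log-softmax + cross-entropy
    z = logits - logits.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -log_probs[idx, labels].mean()
```

Because the margin is subtracted only from the target logit, any m > 0 strictly lowers the target-class probability, which is what pushes features of the same class closer together than the raw Softmax would require.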