DNN-based PolSAR image classification on noisy labels

Abstract

Deep neural networks (DNNs) are an appealing solution for the classification of polarimetric synthetic aperture radar (PolSAR) data because they outperform classical supervised classifiers when sufficient training samples are available. Designing such a classifier is nevertheless challenging, since DNNs easily overfit when remote sensing training samples are limited and labels are unavoidably noisy. In this article, a softmax loss strategy with antinoise capability, namely the probability-aware sample grading strategy (PASGS), is developed to overcome this limitation. Combined with the proposed softmax loss strategy, two classical DNN-based classifiers are implemented for PolSAR image classification to demonstrate its effectiveness. In this framework, the difference distribution implicitly reflects the probability that a training sample is clean, and clean labels are distinguished from noisy labels by means of probability statistics. This probability is then used to reweight the loss of each training sample during training, so that likely-noisy samples are identified and excluded from the loss computation of the neural network. As the number of training iterations increases, the probability-statistics criterion for noisy labels is adjusted continuously without supervision, and the clean labels are eventually identified to train the network. Experiments on three PolSAR datasets with two DNN-based methods demonstrate that the proposed method is superior to state-of-the-art methods.

This work was supported in part by the National Natural Science Foundation of China under Grants 61871413 and 61801015, in part by the Fundamental Research Funds for the Central Universities under Grant XK2020-03, in part by the China Scholarship Council under Grant 2020006880033, and in part by Grant PID2020-114623RB-C32 funded by MCIN/AEI/10.13039/501100011033.
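To make the reweighting idea in the abstract concrete, the following is a minimal PyTorch-style sketch of a probability-aware loss. It assumes the clean-label probability is estimated from running statistics of the per-sample softmax loss and used as a weight on that loss; the function name pasgs_weighted_loss, the sigmoid weighting, and the momentum parameter are illustrative assumptions, not the authors' exact PASGS formulation.

```python
import torch
import torch.nn.functional as F

def pasgs_weighted_loss(logits, labels, running_mean, running_std, momentum=0.9):
    """Illustrative probability-aware reweighting of the softmax loss.

    Per-sample cross-entropy losses are standardized against running
    statistics of the loss distribution; samples whose loss lies far above
    the running mean are treated as likely noisy and down-weighted.
    This is a sketch of the general idea, not the paper's implementation.
    """
    # Per-sample softmax (cross-entropy) loss, no reduction yet.
    per_sample = F.cross_entropy(logits, labels, reduction="none")

    # Update running statistics of the loss distribution (detached so the
    # statistics themselves receive no gradients). The criterion therefore
    # adapts over training iterations without supervision.
    batch_mean = per_sample.detach().mean()
    batch_std = per_sample.detach().std().clamp_min(1e-6)
    running_mean = momentum * running_mean + (1.0 - momentum) * batch_mean
    running_std = momentum * running_std + (1.0 - momentum) * batch_std

    # Clean-label probability proxy: low-loss samples get weights near 1,
    # high-loss (likely noisy) samples get weights near 0.
    z = (per_sample.detach() - running_mean) / running_std
    clean_prob = torch.sigmoid(-z)

    # Reweighted loss: likely-noisy samples barely contribute.
    loss = (clean_prob * per_sample).sum() / clean_prob.sum().clamp_min(1e-6)
    return loss, running_mean, running_std
```

In a training loop, the returned running_mean and running_std would be carried over between batches, so the grading criterion tightens or relaxes as the network's loss distribution evolves.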
