This study provides a new understanding of the adversarial attack problem by
examining the correlation between adversarial attacks and visual attention
changes. In particular, we observed that: (1) images with incomplete attention
regions are more vulnerable to adversarial attacks; and (2) successful
adversarial attacks lead to deviated and scattered attention maps. Accordingly,
an attention-based adversarial defense framework is designed to simultaneously
rectify the attention map for prediction and preserve the attention area
between adversarial and original images. The problem of incorporating
iteratively attacked samples is also discussed from the perspective of visual
attention change.
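As a rough illustration only (not the paper's actual implementation, whose loss terms and attention extractor are not specified here), an attention-preservation objective could be sketched as a distance between the normalized attention maps of an original image and its adversarial counterpart. The function names and the choice of a simple normalized L2 distance are our own assumptions:

```python
import numpy as np

def normalize_attention(att):
    """Normalize a non-negative attention map so it sums to 1."""
    return att / att.sum()

def attention_consistency_loss(att_clean, att_adv):
    """Hypothetical attention-preservation term: squared L2 distance
    between the normalized clean and adversarial attention maps.
    A large value indicates deviated/scattered attention under attack."""
    a = normalize_attention(att_clean)
    b = normalize_attention(att_adv)
    return float(np.square(a - b).sum())
```

In a defense of the kind described above, such a term would be minimized jointly with the standard classification loss, encouraging the model to keep its attention region stable between original and adversarial inputs.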
We hope the attention-related data analysis and defense solution presented in
this study will shed light on the mechanism behind adversarial attacks and
facilitate the design of future adversarial defense/attack models.