Defect-aware Super-resolution Thermography by Adversarial Learning

Abstract

Infrared thermography is a valuable non-destructive tool for the inspection of materials. It measures the evolution of the surface temperature, from which hidden defects may be detected. However, thermal cameras typically have a low native spatial resolution, resulting in blurry, low-quality thermal image sequences and videos. In this study, a novel adversarial deep learning framework, called Dual-IRT-GAN, is proposed for performing super-resolution tasks. Dual-IRT-GAN aims both to improve local texture details and to highlight defective regions. Technically, the model consists of two modules, SEGnet and SRnet, which carry out defect detection and super-resolution, respectively. By leveraging the defect information from SEGnet, SRnet generates plausible high-resolution thermal images with an enhanced focus on defect regions. The generated high-resolution images are then passed to a discriminator for adversarial training within the GAN framework. The Dual-IRT-GAN model, trained exclusively on a virtual dataset, is demonstrated on experimental thermographic data obtained from fiber-reinforced polymers containing a variety of defect types, sizes, and depths. The results show its high performance in maintaining background color consistency, removing undesired noise, and highlighting defect zones with finer texture detail at high resolution.
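To make the described two-module design concrete, the following is a minimal PyTorch sketch of the Dual-IRT-GAN idea: a SEGnet that predicts a defect map from a low-resolution thermal frame, an SRnet that upscales the frame conditioned on that map, and a discriminator for adversarial training. All layer sizes, module internals, and the fusion of the defect map into SRnet by channel concatenation are illustrative assumptions, not the authors' exact architecture.

```python
# Illustrative sketch only; layer choices and the concatenation-based fusion
# of the defect map are assumptions, not the paper's exact architecture.
import torch
import torch.nn as nn


class SEGnet(nn.Module):
    """Predicts a per-pixel defect probability map from a low-res thermal frame."""
    def __init__(self, in_ch=1):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 1, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.body(x)  # (B, 1, H, W) defect map


class SRnet(nn.Module):
    """Upscales the thermal frame, conditioned on the defect map from SEGnet."""
    def __init__(self, in_ch=1, scale=4):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch + 1, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, in_ch * scale ** 2, 3, padding=1),
            nn.PixelShuffle(scale),  # sub-pixel upsampling to high resolution
        )

    def forward(self, x, defect_map):
        # Fuse the low-res frame with the defect map along the channel axis
        return self.body(torch.cat([x, defect_map], dim=1))


class Discriminator(nn.Module):
    """Scores high-resolution thermal frames as real or generated."""
    def __init__(self, in_ch=1):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(128, 1),
        )

    def forward(self, x):
        return self.body(x)


if __name__ == "__main__":
    lr = torch.randn(2, 1, 64, 64)        # batch of low-res thermal frames
    seg, sr, disc = SEGnet(), SRnet(), Discriminator()
    defect_map = seg(lr)
    hr_fake = sr(lr, defect_map)          # (2, 1, 256, 256) super-resolved frames
    score = disc(hr_fake)                 # real/fake logits for adversarial loss
    print(hr_fake.shape, score.shape)
```

In this sketch the defect map acts as an extra conditioning channel, which is one simple way to give the super-resolution branch "an enhanced focus on defect regions" as the abstract describes; the paper may realize this coupling differently.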
