This paper presents DAHiTrA, a novel deep-learning model with hierarchical
transformers to classify building damages based on satellite images in the
aftermath of hurricanes. An automated building damage assessment provides
critical information for decision making and resource allocation for rapid
emergency response. Satellite imagery provides real-time, high-coverage
information and offers opportunities to inform large-scale post-disaster
building damage assessment. In addition, deep-learning methods have shown
promise in classifying building damage. In this work, a novel
transformer-based network is proposed for assessing building damage. This
network leverages hierarchical spatial features at multiple resolutions and
captures temporal differences in the feature domain after applying a transformer
encoder to the spatial features. The proposed network achieves
state-of-the-art performance when tested on a large-scale disaster damage
dataset (xBD) for building localization and damage classification, as well as
on the LEVIR-CD dataset for change detection. In addition, we introduce a new
high-resolution satellite imagery dataset, Ida-BD (related to Hurricane Ida,
which struck Louisiana in 2021), for domain adaptation to further evaluate
the capability of the model to be applied to newly damaged areas with scarce
data. The domain adaptation results indicate that the proposed model can be
adapted to a new event with only limited fine-tuning. Hence, the proposed model
advances the current state of the art through better performance and domain
adaptation capability. Moreover, Ida-BD provides a higher-resolution annotated
dataset for future studies in this field.
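
To make the core idea concrete, the following is a minimal PyTorch sketch of the mechanism described above: a shared backbone extracts multi-resolution spatial features from the pre- and post-event images, a transformer encoder is applied to the spatial features, and the temporal difference is taken in the encoded feature domain before per-pixel damage classification. The module names, layer sizes, and the single-scale simplification are illustrative assumptions, not the authors' exact DAHiTrA architecture.

```python
# Hedged sketch of bitemporal feature differencing with a transformer encoder.
# Layer sizes and structure are assumptions for illustration only.
import torch
import torch.nn as nn


class BitemporalDifferenceSketch(nn.Module):
    def __init__(self, in_ch=3, dim=64, n_heads=4, n_layers=2, n_classes=5):
        super().__init__()
        # Shared convolutional backbone producing two feature resolutions.
        self.stage1 = nn.Sequential(
            nn.Conv2d(in_ch, dim, 3, stride=2, padding=1), nn.ReLU())
        self.stage2 = nn.Sequential(
            nn.Conv2d(dim, dim, 3, stride=2, padding=1), nn.ReLU())
        enc_layer = nn.TransformerEncoderLayer(
            d_model=dim, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=n_layers)
        self.head = nn.Conv2d(dim, n_classes, kernel_size=1)

    def encode(self, x):
        f1 = self.stage1(x)            # higher-resolution features
        f2 = self.stage2(f1)           # lower-resolution features (encoded here)
        b, c, h, w = f2.shape
        tokens = f2.flatten(2).transpose(1, 2)   # (B, H*W, C) spatial tokens
        tokens = self.encoder(tokens)            # transformer over spatial tokens
        return tokens.transpose(1, 2).reshape(b, c, h, w)

    def forward(self, pre, post):
        # Temporal difference taken in the encoded feature domain.
        diff = torch.abs(self.encode(pre) - self.encode(post))
        return self.head(diff)          # per-pixel damage-class logits


if __name__ == "__main__":
    model = BitemporalDifferenceSketch()
    pre = torch.randn(1, 3, 128, 128)
    post = torch.randn(1, 3, 128, 128)
    print(model(pre, post).shape)       # torch.Size([1, 5, 32, 32])
```

For brevity, this sketch encodes only the coarser feature scale; the hierarchical design described in the abstract would repeat the encode-and-difference step at each resolution and fuse the results in a decoder.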