The field of Earth observation deals with increasingly large, multimodal data sets. An important processing step consists
of providing these data sets with labels. However, standard label propagation algorithms cannot be applied to multimodal
remote sensing data for two reasons. First, multimodal data is heterogeneous while classic label propagation algorithms
assume a homogeneous network. Second, real-world data can show both homophily (‘birds of a feather flock together’) and
heterophily (‘opposites attract’) during propagation, while standard algorithms only consider homophily. Both shortcomings
are addressed in this work, and the result is a graph-based label propagation algorithm for multimodal data that accounts for
homophily and/or heterophily. Furthermore, the method can also transfer information between uni- and multimodal
data. Experiments on the remote sensing data set of Houston, which contains a LiDAR and a hyperspectral image, show
that our approach matches state-of-the-art classification methods with an overall accuracy (OA) of 91.4%, while being more
flexible and not constrained to a specific data set or a specific combination of modalities.
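
To make the baseline concrete, the following is a minimal sketch of classic, homophily-only label propagation on a single homogeneous graph, i.e. the standard algorithm the abstract contrasts against; it does not reproduce the paper's multimodal, heterophily-aware extension. The function name, parameters, and normalization choice are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def label_propagation(W, Y, alpha=0.85, n_iter=50):
    """Classic (homophily-only) label propagation on a single graph.

    W      : (n, n) symmetric affinity matrix of the graph
    Y      : (n, c) one-hot seed labels; zero rows mark unlabeled nodes
    alpha  : weight of propagated information vs. seed labels
    n_iter : number of propagation iterations
    """
    # Symmetrically normalize the affinities: S = D^{-1/2} W D^{-1/2}
    d = W.sum(axis=1)
    d_inv_sqrt = np.where(d > 0, 1.0 / np.sqrt(d), 0.0)
    S = d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]

    F = Y.astype(float).copy()
    for _ in range(n_iter):
        # Homophily assumption: each node adopts the weighted labels
        # of its neighbours, anchored to the original seeds.
        F = alpha * (S @ F) + (1 - alpha) * Y
    return F.argmax(axis=1)
```

Under a heterophilous regime, dissimilar nodes tend to share information across an edge, so the plain neighbour-averaging step above would propagate the wrong labels; handling that case, and coupling graphs built from different modalities, is what the proposed method addresses.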