This paper focuses on addressing the practical yet challenging problem of
model heterogeneity in federated learning, where clients possess models with
different network structures. To tackle this problem, we propose a novel
framework called pFedHR, which leverages heterogeneous model reassembly to
achieve personalized federated learning. In particular, we approach the problem
of heterogeneous model personalization as a model-matching optimization task on
the server side. Moreover, pFedHR automatically and dynamically generates
informative and diverse personalized candidates with minimal human
intervention. Furthermore, our proposed heterogeneous model reassembly
technique mitigates, to a certain extent, the adverse impact of using public
data whose distribution differs from that of the client data. Experimental
results demonstrate that pFedHR outperforms baselines on three datasets under
both IID and Non-IID settings. Additionally, pFedHR effectively reduces the
adverse impact of using public data drawn from different distributions and
dynamically generates diverse personalized models in an automated manner.
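To illustrate the idea of server-side model matching described above, the following toy sketch (our own illustration, not the authors' implementation; all names and the greedy cosine-similarity criterion are assumptions) shows one way layers collected from heterogeneous client models could be matched against a target client architecture to assemble a personalized candidate.

```python
# Toy sketch of server-side layer matching for heterogeneous model reassembly.
# NOT the paper's algorithm: the greedy, shape-constrained cosine matching
# below is only an illustrative stand-in for the model-matching optimization.
import numpy as np


def cosine(a, b):
    """Cosine similarity between two flattened weight arrays."""
    a, b = a.ravel(), b.ravel()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))


def reassemble(client_layers, layer_pool):
    """Fill each layer slot of the client's architecture with the most
    similar shape-compatible layer from a pool gathered from other clients."""
    candidate = []
    for w in client_layers:
        compatible = [p for p in layer_pool if p.shape == w.shape]
        if not compatible:
            candidate.append(w)  # no compatible layer: keep the client's own
            continue
        candidate.append(max(compatible, key=lambda p: cosine(p, w)))
    return candidate


# Hypothetical usage with random weights standing in for uploaded models.
rng = np.random.default_rng(0)
client = [rng.normal(size=(8, 4)), rng.normal(size=(4, 2))]
pool = [rng.normal(size=(8, 4)), rng.normal(size=(4, 2)), rng.normal(size=(16, 4))]
personalized = reassemble(client, pool)
print([w.shape for w in personalized])  # matches the client's structure
```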