An Efficient Federated Learning Framework for Training Semantic Communication Systems

Abstract

Semantic communication has emerged as a pillar for the next generation of communication systems due to its capability to alleviate data redundancy. Most semantic communication systems are built upon advanced deep learning models whose training performance heavily relies on data availability. Existing studies often make the unrealistic assumption of a readily accessible data source, whereas in practice data is mainly created on the client side. Due to privacy and security concerns, the transmission of data is restricted, yet such transmission is necessary for conventional centralized training schemes. To address this challenge, we explore semantic communication in a federated learning (FL) setting that utilizes client data without leaking privacy. Additionally, we design our system to tackle the communication overhead by reducing the quantity of information delivered in each global round. In this way, we save significant bandwidth for resource-limited devices and reduce overall network traffic. Finally, we introduce a mechanism for aggregating the global model from clients, called FedLol. Extensive simulation results demonstrate the effectiveness of our proposed technique compared to baseline methods.
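To make the aggregation idea concrete, below is a minimal sketch of a loss-aware federated aggregation rule in the spirit of FedLol, under the assumption that each client reports its local loss and that clients with lower loss receive proportionally larger aggregation weights. The exact FedLol weighting is defined in the paper; the function and variable names here (`fedlol_aggregate`, `client_params`, `client_losses`) are purely illustrative.

```python
import numpy as np

def fedlol_aggregate(client_params, client_losses):
    """Aggregate client model parameters into a global model.

    Hypothetical loss-aware weighting (a sketch, not the paper's exact
    rule): a client with a lower local loss contributes more.  Each
    weight is (L_total - L_k) / ((N - 1) * L_total), and the weights
    sum to 1 for N >= 2 clients.

    client_params: list of {name: np.ndarray} parameter dicts.
    client_losses: list of positive local-loss values, one per client.
    """
    losses = np.asarray(client_losses, dtype=float)
    n = len(losses)
    total = losses.sum()
    weights = (total - losses) / ((n - 1) * total)  # assumes n >= 2
    # Weighted sum of each parameter tensor across all clients.
    return {
        name: sum(w * params[name] for w, params in zip(weights, client_params))
        for name in client_params[0]
    }

# Toy usage: two clients, each holding one weight matrix.
clients = [{"w": np.ones((2, 2))}, {"w": np.zeros((2, 2))}]
global_model = fedlol_aggregate(clients, client_losses=[0.2, 0.8])
print(global_model["w"])  # closer to client 0, which had the lower loss
```

In this toy run the weights come out to 0.8 and 0.2, so the global model lands much closer to the low-loss client, which is the intuition behind weighting by local loss rather than by dataset size as in plain FedAvg.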
