FedGiA: An Efficient Hybrid Algorithm for Federated Learning

Abstract

Federated learning has advanced rapidly in recent years but still faces many challenges, such as how algorithms can save communication resources, reduce computational costs, and guarantee convergence. To address these critical issues, we propose a hybrid federated learning algorithm (FedGiA) that combines gradient descent with the inexact alternating direction method of multipliers (ADMM). Theoretically and numerically, the proposed algorithm is more communication- and computation-efficient than several state-of-the-art algorithms, and it converges globally under mild conditions.

Comment: arXiv admin note: substantial text overlap with arXiv:2110.15318; text overlap with arXiv:2204.1060
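The abstract describes a hybrid of gradient descent and inexact ADMM for federated learning. The sketch below illustrates the general shape of such a scheme on a toy consensus problem: each client takes a few gradient steps on its augmented-Lagrangian subproblem (the "inexact" ADMM solve) and updates a dual variable, while the server averages the results. All losses, step sizes, and update rules here are illustrative assumptions, not the paper's exact FedGiA updates.

```python
import numpy as np

# Hedged sketch of an inexact-ADMM/gradient-descent hybrid for consensus
# optimization: min_x sum_i f_i(x), with f_i(w) = 0.5 * ||A_i w - b_i||^2.
# This is a generic illustration, not the FedGiA algorithm from the paper.

rng = np.random.default_rng(0)
m, d = 5, 3                                # number of clients, model dimension
A = [rng.standard_normal((10, d)) for _ in range(m)]
b = [rng.standard_normal(10) for _ in range(m)]

def grad(i, w):
    # Gradient of the local least-squares loss f_i.
    return A[i].T @ (A[i] @ w - b[i])

sigma = 1.0                                # augmented-Lagrangian penalty
eta = 0.01                                 # client gradient step size
x = np.zeros(d)                            # global (server) model
w = [np.zeros(d) for _ in range(m)]        # local models
lam = [np.zeros(d) for _ in range(m)]      # dual variables

for _ in range(200):                       # communication rounds
    for i in range(m):
        # Inexact ADMM subproblem: a few gradient steps replace an exact solve,
        # which is the computational saving a hybrid scheme aims for.
        for _ in range(3):
            g = grad(i, w[i]) + lam[i] + sigma * (w[i] - x)
            w[i] = w[i] - eta * g
        lam[i] = lam[i] + sigma * (w[i] - x)   # dual ascent step
    # Server update: closed-form minimizer of the augmented Lagrangian in x.
    x = np.mean([w[i] + lam[i] / sigma for i in range(m)], axis=0)

def objective(v):
    return 0.5 * sum(np.linalg.norm(A[i] @ v - b[i]) ** 2 for i in range(m))
```

Replacing the exact local solve with a handful of gradient steps is what makes each round cheap for clients, while the dual variables drive the local models toward consensus with the server model.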
