Distributed optimization methods with local updates have recently attracted considerable attention due to their potential to reduce the communication cost of distributed methods. In these algorithms, each node performs several local updates based on its local data, and the nodes then communicate with one another to exchange estimate information. While distributed local methods with centralized network connections have been studied extensively, far less work has addressed decentralized networks.
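To make the local-update pattern concrete, the following is a minimal sketch (not the paper's LED method) of a generic decentralized scheme with local updates: each node takes several gradient steps on its own data and then mixes its estimate with its neighbors'. The quadratic losses, ring topology, step size, and full-gradient (rather than stochastic) local steps are illustrative assumptions only.

```python
# Generic decentralized local-update sketch (illustrative; not the LED method).
import numpy as np

n_nodes, dim, local_steps, rounds, lr = 4, 5, 10, 50, 0.01
rng = np.random.default_rng(0)

# Each node i holds a private quadratic loss f_i(x) = 0.5 * ||A_i x - b_i||^2 (assumed for illustration).
A = [rng.standard_normal((20, dim)) for _ in range(n_nodes)]
b = [rng.standard_normal(20) for _ in range(n_nodes)]

# Doubly stochastic mixing matrix for a ring graph (decentralized topology).
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    W[i, i] = 0.5
    W[i, (i - 1) % n_nodes] = 0.25
    W[i, (i + 1) % n_nodes] = 0.25

x = np.zeros((n_nodes, dim))  # one estimate per node
for _ in range(rounds):
    # Local phase: every node takes several gradient steps on its own data.
    for i in range(n_nodes):
        for _ in range(local_steps):
            grad = A[i].T @ (A[i] @ x[i] - b[i])
            x[i] -= lr * grad
    # Communication phase: neighbors exchange and mix their estimates.
    x = W @ x

print("disagreement across nodes:", np.linalg.norm(x - x.mean(axis=0)))
```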
In this work, we propose and investigate a decentralized method with local updates, called Local Exact-Diffusion (LED). We establish the convergence of LED in both the convex and the nonconvex stochastic online settings. Our
convergence rate improves over the rate of existing decentralized methods. When
we specialize the network to the centralized case, we recover the
state-of-the-art bound for centralized methods. We also link LED to several
other independently studied distributed methods, including Scaffnew, FedGate,
and VRL-SGD. Additionally, we numerically investigate the benefits of local
updates for decentralized networks and demonstrate the effectiveness of the
proposed method.