Accelerated Gossip in Networks of Given Dimension using Jacobi Polynomial Iterations
Consider a network of agents connected by communication links, where each
agent holds a real value. The gossip problem consists in estimating the average
of the values diffused in the network in a distributed manner. We develop a
method solving the gossip problem that depends only on the spectral dimension
of the network, that is, in the communication network set-up, the dimension of
the space in which the agents live. This contrasts with previous work that
required the spectral gap of the network as a parameter, or suffered from slow
mixing. Our method shows an important improvement over existing algorithms in
the non-asymptotic regime, i.e., when the values are far from being fully mixed
in the network. Our approach stems from a polynomial-based point of view on
gossip algorithms, as well as an approximation of the spectral measure of the
graphs with a Jacobi measure. We show the power of the approach with
simulations on various graphs, and with performance guarantees on graphs of
known spectral dimension, such as grids and random percolation bonds. An
extension of this work to distributed Laplacian solvers is discussed. As a side
result, we also use the polynomial-based point of view to show the convergence
of the message passing algorithm for gossip of Moallemi & Van Roy on regular
graphs. The explicit computation of the rate of convergence shows that
message passing converges slowly on graphs with small spectral
gap.
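A minimal sketch of the gossip problem the abstract refers to, assuming agents on a cycle graph with a doubly stochastic mixing matrix. This is plain synchronous gossip (x <- Wx), not the paper's Jacobi-polynomial acceleration; it only illustrates the setting in which that method operates.

```python
import numpy as np

# Toy gossip problem: n agents on a cycle, each holding a value, repeatedly
# average with their neighbors via a doubly stochastic gossip matrix W.
n = 8
x = np.arange(n, dtype=float)          # initial values held by the agents
target = x.mean()                      # the average that gossip should estimate

W = np.zeros((n, n))                   # each agent keeps half its value and
for i in range(n):                     # takes a quarter from each neighbor
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

for _ in range(500):                   # unaccelerated gossip iteration: x <- W x
    x = W @ x

print(np.allclose(x, target, atol=1e-6))   # all agents approach the average
```

The slow part is exactly what the abstract targets: the convergence rate of this plain iteration is governed by the spectral gap of W, which shrinks on large grids and cycles.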
Accelerated Consensus via Min-Sum Splitting
We apply the Min-Sum message-passing protocol to solve the consensus problem
in distributed optimization. We show that while the ordinary Min-Sum algorithm
does not converge, a modified version of it known as Splitting yields
convergence to the problem solution. We prove that a proper choice of the
tuning parameters allows Min-Sum Splitting to yield subdiffusive accelerated
convergence rates, matching the rates obtained by shift-register methods. The
acceleration scheme embodied by Min-Sum Splitting for the consensus problem
bears similarities with lifted Markov chains techniques and with multi-step
first-order methods in convex optimization.
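For reference, the "shift-register" accelerated rates that Min-Sum Splitting matches can be illustrated by a two-step (heavy-ball style) consensus iteration. This is a generic sketch on an assumed cycle graph, not the paper's message-passing protocol; the momentum parameter below is the standard choice derived from the second-largest eigenvalue of the gossip matrix.

```python
import numpy as np

# Two-step accelerated consensus: x_{t+1} = (1+g) W x_t - g x_{t-1},
# a shift-register scheme that speeds up plain gossip x <- W x.
n = 8
W = np.zeros((n, n))                   # gossip matrix on a cycle graph
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

lam2 = np.sort(np.linalg.eigvalsh(W))[-2]            # second-largest eigenvalue
s = np.sqrt(1.0 - lam2 ** 2)
g = (1.0 - s) / (1.0 + s)                            # standard momentum choice

x_prev = np.arange(n, dtype=float)
target = x_prev.mean()
x = W @ x_prev                                       # one plain step to initialize
for _ in range(100):
    x, x_prev = (1 + g) * (W @ x) - g * x_prev, x    # shift-register update

print(np.allclose(x, target, atol=1e-8))             # reaches consensus quickly
```

The iteration preserves the average at every step (W is doubly stochastic), and the momentum term yields the subdiffusive acceleration over plain gossip that the abstract compares against.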
Optimal Algorithms for Non-Smooth Distributed Optimization in Networks
In this work, we consider the distributed optimization of non-smooth convex
functions using a network of computing units. We investigate this problem under
two regularity assumptions: (1) the Lipschitz continuity of the global
objective function, and (2) the Lipschitz continuity of local individual
functions. Under the local regularity assumption, we provide the first optimal
first-order decentralized algorithm called multi-step primal-dual (MSPD) and
its corresponding optimal convergence rate. A notable aspect of this result is
that, for non-smooth functions, while the dominant term of the error is in
O(1/sqrt(t)), the structure of the communication network only impacts a
second-order term in O(1/t), where t is time. In other words, the error due
to limits in communication resources decreases at a fast rate even in the case
of non-strongly-convex objective functions. Under the global regularity
assumption, we provide a simple yet efficient algorithm called distributed
randomized smoothing (DRS) based on a local smoothing of the objective
function, and show that DRS is within a d^{1/4} multiplicative factor of the
optimal convergence rate, where d is the underlying dimension.
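The local smoothing underlying DRS can be sketched in one dimension of the idea: replace a non-smooth f by its Gaussian smoothing f_g(x) = E[f(x + g Z)] and estimate its gradient by sampling. This is an assumed single-node illustration of the smoothing step only, not the full distributed algorithm; the function, point, and sample count below are chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Non-smooth objective for illustration: the L1 norm.
    return np.abs(x).sum()

def smoothed_grad(x, g=0.1, samples=20000):
    # Monte Carlo estimate of grad f_g(x), using
    # grad f_g(x) = E[ Z * (f(x + g Z) - f(x)) ] / g  with Z ~ N(0, I).
    Z = rng.standard_normal((samples, x.size))
    diff = np.abs(x + g * Z).sum(axis=1) - f(x)
    return (Z * diff[:, None]).mean(axis=0) / g

x = np.array([2.0, -3.0])
grad = smoothed_grad(x)
print(grad)    # close to the subgradient sign(x) = [1, -1] away from the kink
```

Smoothing trades a small bias (controlled by g) for a gradient that standard accelerated first-order methods can exploit, which is the mechanism behind the d^{1/4}-suboptimality of DRS.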