
    Gossip Algorithms for Distributed Signal Processing

    Gossip algorithms are attractive for in-network processing in sensor networks because they do not require any specialized routing, there is no bottleneck or single point of failure, and they are robust to unreliable wireless network conditions. Recently, there has been a surge of activity in the computer science, control, signal processing, and information theory communities, developing faster and more robust gossip algorithms and deriving theoretical performance guarantees. This article presents an overview of recent work in the area. We describe convergence rate results, which are related to the number of transmitted messages and thus the amount of energy consumed in the network for gossiping. We discuss issues related to gossiping over wireless links, including the effects of quantization and noise, and we illustrate the use of gossip algorithms for canonical signal processing tasks including distributed estimation, source localization, and compression. (Comment: Submitted to Proceedings of the IEEE, 29 pages.)
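
    As a concrete illustration of the kind of algorithm surveyed in this article, the sketch below runs randomized pairwise gossip averaging: at each tick a uniformly random edge activates and its two endpoints replace their values with their average, so every node's value approaches the network-wide mean. The ring topology, initial readings, and fixed number of activations are illustrative assumptions, not details taken from the paper.

    # Minimal randomized pairwise gossip averaging sketch (Python).
    import random

    def gossip_average(values, edges, iterations=10000, seed=0):
        """At each tick a uniformly random edge (i, j) activates and both
        endpoints replace their values with the pairwise average."""
        rng = random.Random(seed)
        x = list(values)
        for _ in range(iterations):
            i, j = rng.choice(edges)
            x[i] = x[j] = (x[i] + x[j]) / 2.0
        return x

    if __name__ == "__main__":
        n = 8
        ring = [(i, (i + 1) % n) for i in range(n)]
        readings = [float(i) for i in range(n)]    # e.g. local sensor measurements
        print(gossip_average(readings, ring))      # every entry approaches the mean, 3.5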

    Robust and Communication-Efficient Collaborative Learning

    We consider a decentralized learning problem, where a set of computing nodes aims to solve a non-convex optimization problem collaboratively. It is well known that decentralized optimization schemes face two major system bottlenecks: stragglers' delay and communication overhead. In this paper, we tackle these bottlenecks by proposing a novel decentralized, gradient-based optimization algorithm named QuanTimed-DSGD. Our algorithm rests on two main ideas: (i) we impose a deadline on the local gradient computations of each node at each iteration of the algorithm, and (ii) the nodes exchange quantized versions of their local models. The first idea makes the algorithm robust to straggling nodes, and the second reduces the communication overhead. The key technical contribution of our work is to prove that, despite non-vanishing quantization and stochastic gradient noise, the proposed method converges exactly to the global optimum for convex loss functions and finds a first-order stationary point in non-convex scenarios. Our numerical evaluations of QuanTimed-DSGD on the benchmark datasets MNIST and CIFAR-10 demonstrate speedups of up to 3x in run-time compared to state-of-the-art decentralized optimization methods.
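
    The two ingredients named in the abstract, a per-iteration deadline on local gradient computation and the exchange of quantized models, are sketched below. The quadratic local losses, the stochastic quantizer, the mixing weight, and the random per-node sample budgets that stand in for the deadline are illustrative assumptions; this is not the paper's exact update rule.

    # Hedged sketch of deadline-limited, quantized decentralized SGD (Python/NumPy).
    import numpy as np

    rng = np.random.default_rng(0)

    def quantize(x, step=0.05):
        """Unbiased stochastic quantization onto a uniform grid of width `step`."""
        low = np.floor(x / step) * step
        return low + step * (rng.random(x.shape) < (x - low) / step)

    def local_gradient(x, samples, budget):
        """Gradient of the average of 0.5*(x - s)^2 over the samples finished in time."""
        return x - samples[:budget].mean()

    n, dim, lr, mix = 5, 3, 0.2, 0.5
    models = [rng.standard_normal(dim) for _ in range(n)]
    data = [rng.normal(loc=i, size=20) for i in range(n)]   # node i's local samples
    ring = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}

    for _ in range(300):
        q = [quantize(m) for m in models]       # nodes only share quantized models
        budgets = rng.integers(5, 21, size=n)   # stragglers finish fewer samples by the deadline
        models = [(1 - mix) * models[i]
                  + mix * np.mean([q[j] for j in ring[i]], axis=0)
                  - lr * local_gradient(models[i], data[i], budgets[i])
                  for i in range(n)]

    print(np.round(np.mean(models, axis=0), 2))  # close to the mean of all samples (about 2)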

    Fast Discrete Consensus Based on Gossip for Makespan Minimization in Networked Systems

    In this paper we propose a novel algorithm to solve the discrete consensus problem, i.e., the problem of evenly distributing a set of tokens of arbitrary weight among the nodes of a networked system. Tokens represent tasks to be executed by the nodes, and the proposed distributed algorithm monotonically reduces the makespan of the assigned tasks. The algorithm is based on gossip-like asynchronous local interactions between the nodes. The convergence time of the proposed algorithm improves on the state of the art of discrete and quantized consensus by at least a factor of O(n), in both theoretical and empirical comparisons.
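
    The flavour of gossip-based token balancing can be conveyed with the sketch below: whenever a random edge activates, its two endpoints pool their weighted tokens and re-split them with a greedy longest-first rule so that the combined load is shared roughly evenly. The greedy re-split, the ring topology, and the fixed step count are illustrative assumptions; the paper's actual local rule, which guarantees a monotonically non-increasing makespan, is not reproduced here.

    # Hedged sketch of gossip-style balancing of weighted tokens (Python).
    import random

    def rebalance_pair(tasks_a, tasks_b):
        """Pool two task lists and re-split them longest-first, always giving
        the next task to whichever of the two nodes is currently lighter."""
        new_a, new_b = [], []
        for w in sorted(tasks_a + tasks_b, reverse=True):
            (new_a if sum(new_a) <= sum(new_b) else new_b).append(w)
        return new_a, new_b

    def gossip_balance(assignment, edges, steps=2000, seed=1):
        rng = random.Random(seed)
        tasks = [list(t) for t in assignment]
        for _ in range(steps):
            i, j = rng.choice(edges)
            tasks[i], tasks[j] = rebalance_pair(tasks[i], tasks[j])
        return tasks

    if __name__ == "__main__":
        n = 6
        ring = [(i, (i + 1) % n) for i in range(n)]
        gen = random.Random(0)
        start = [[gen.randint(1, 9) for _ in range(12)]] + [[] for _ in range(n - 1)]
        loads = [sum(t) for t in gossip_balance(start, ring)]
        print(loads, "makespan:", max(loads))   # loads end up close to the even share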

    An Upper Bound on the Convergence Time for Quantized Consensus of Arbitrary Static Graphs

    We analyze a class of distributed quantized consensus algorithms for arbitrary static networks. In the initial setting, each node in the network has an integer value. Nodes exchange their current estimates of the mean value in the network and update their estimates by communicating with their neighbors over limited-capacity channels under an asynchronous clock model. Eventually, all nodes reach consensus up to quantized precision. We analyze the expected convergence time of the general quantized consensus algorithm proposed by Kashyap et al. Using the theory of electric networks, random walks, and couplings of Markov chains, we derive an O(N^3 log N) upper bound on the expected convergence time for an arbitrary graph of size N, improving on the state-of-the-art bound of O(N^5) for quantized consensus algorithms. Our result does not depend on the graph topology. An example with complete graphs shows how to extend the analysis to graphs of a given topology. (Comment: to appear in IEEE Trans. on Automatic Control, January 2015. arXiv admin note: substantial text overlap with arXiv:1208.078.)
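
    The local interaction analyzed here can be illustrated with a stylized quantized-consensus sketch: each time a random edge activates, one unit moves from the larger value to the smaller one, so the integer sum is preserved, and two values that differ by exactly one simply swap, which is what lets extreme values diffuse through the network. The unit-transfer rule, ring topology, and uniform edge clock are illustrative assumptions and may differ in detail from the algorithm of Kashyap et al.

    # Stylized quantized consensus sketch (Python): sum-preserving integer updates
    # until all values lie within one quantization step of each other.
    import random

    def quantized_consensus(values, edges, seed=0, max_steps=100000):
        rng = random.Random(seed)
        x = list(values)
        for step in range(max_steps):
            if max(x) - min(x) <= 1:      # quantized consensus reached
                return x, step
            i, j = rng.choice(edges)
            if x[i] > x[j]:
                x[i], x[j] = x[i] - 1, x[j] + 1   # move one unit toward the smaller value
            elif x[j] > x[i]:
                x[i], x[j] = x[i] + 1, x[j] - 1
        return x, max_steps

    if __name__ == "__main__":
        n = 10
        ring = [(i, (i + 1) % n) for i in range(n)]
        gen = random.Random(1)
        start = [gen.randrange(50) for _ in range(n)]
        final, steps = quantized_consensus(start, ring)
        print(final, "after", steps, "edge activations")   # spread of at most 1, sum preserved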