27 research outputs found

    Distributed Optimization Using the Primal-Dual Method of Multipliers

    © 2015 IEEE. In this paper, we propose the primal-dual method of multipliers (PDMM) for distributed optimization over a graph. In particular, we optimize a sum of convex functions defined over a graph, where every edge carries a linear equality constraint. In designing the new algorithm, an augmented primal-dual Lagrangian function is constructed which smoothly captures the graph topology. It is shown that a saddle point of the constructed function provides an optimal solution of the original problem. Furthermore, under both synchronous and asynchronous updating schemes, PDMM has a convergence rate of O(1/K) (where K denotes the iteration index) for general closed, proper, and convex functions. Other properties of PDMM, such as convergence speed under different parameter settings and resilience to transmission failure, are also investigated through experiments on distributed averaging.
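
    The following is a minimal sketch of synchronous PDMM on the distributed-averaging experiment mentioned in the abstract. The quadratic local functions f_i(x) = 0.5*(x - a_i)^2, the ring topology, and the penalty parameter c are illustrative assumptions, not choices made by the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 10                                        # number of nodes
    a = rng.normal(size=n)                        # local data; goal: every node learns mean(a)
    neighbors = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}  # assumed ring graph
    c = 0.5                                       # penalty parameter (assumed)
    A = lambda i, j: 1.0 if i < j else -1.0       # edge constraint A_ij x_i + A_ji x_j = 0, i.e. x_i = x_j
    z = {(i, j): 0.0 for i in range(n) for j in neighbors[i]}      # dual variable z_{i|j} held at node i

    x = np.zeros(n)
    for k in range(300):
        # x-update: closed form of argmin_x 0.5*(x - a_i)^2 + sum_j (z_{i|j} A_ij x + 0.5*c*x^2)
        for i in range(n):
            s = sum(A(i, j) * z[(i, j)] for j in neighbors[i])
            x[i] = (a[i] - s) / (1.0 + c * len(neighbors[i]))
        # z-update: node i broadcasts x_i; neighbor j replaces z_{j|i} by z_{i|j} + 2c A_ij x_i
        z = {(j, i): z[(i, j)] + 2.0 * c * A(i, j) * x[i]
             for i in range(n) for j in neighbors[i]}

    print(np.abs(x - a.mean()).max())             # approaches 0: all nodes reach the average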

    Privacy-Preserving Distributed Average Consensus based on Additive Secret Sharing

    A Privacy-Preserving Asynchronous Averaging Algorithm based on Shamir’s Secret Sharing

    Convex optimization-based Privacy-Preserving Distributed Least Squares via Subspace Perturbation

    Towards accelerated rates for distributed optimization over time-varying networks

    We study the problem of decentralized optimization over time-varying networks with strongly convex, smooth cost functions. In our approach, nodes run a multi-step gossip procedure after making each gradient update, thus ensuring approximate consensus at each iteration, while the outer loop is based on an accelerated Nesterov scheme. The algorithm achieves precision $\varepsilon > 0$ in $O(\sqrt{\kappa_g}\,\chi\log^2(1/\varepsilon))$ communication steps and $O(\sqrt{\kappa_g}\log(1/\varepsilon))$ gradient computations at each node, where $\kappa_g$ is the condition number of the global objective function and $\chi$ characterizes the connectivity of the communication network. In the case of a static network, $\chi = 1/\gamma$, where $\gamma$ denotes the normalized spectral gap of the communication matrix $\mathbf{W}$. The complexity bound depends on $\kappa_g$, which can be significantly better than the worst-case condition number among the nodes.
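
    A minimal sketch of the pattern the abstract describes: each node takes a local gradient step, the nodes then run several gossip rounds to restore approximate consensus, and an outer Nesterov extrapolation accelerates the scheme. The quadratic local losses, the gossip matrix on a static ring, the number of gossip rounds T, and the constant-momentum parameters are illustrative assumptions; the paper's exact algorithm and parameter rules may differ.

    import numpy as np

    rng = np.random.default_rng(1)
    n, T = 10, 40                                 # nodes; gossip rounds per outer iteration (assumed)
    b = rng.normal(size=n)                        # local data
    curv = rng.uniform(1.0, 4.0, size=n)          # local curvatures; f_i(x) = 0.5*curv_i*(x - b_i)^2

    # Doubly stochastic (Metropolis-weight) gossip matrix W for a ring graph.
    W = np.zeros((n, n))
    for i in range(n):
        W[i, (i - 1) % n] = W[i, (i + 1) % n] = 1.0 / 3.0
        W[i, i] = 1.0 / 3.0

    def gossip(v, rounds=T):
        # Multi-step gossip: repeated local averaging drives v toward consensus.
        for _ in range(rounds):
            v = W @ v
        return v

    L, mu = curv.max(), curv.min()                # smoothness / strong-convexity bounds
    alpha = 1.0 / L                               # gradient step size
    beta = (np.sqrt(L / mu) - 1) / (np.sqrt(L / mu) + 1)  # Nesterov momentum

    x = np.zeros(n)                               # x[i] is node i's iterate
    y = x.copy()
    for k in range(80):
        grad = curv * (y - b)                     # local gradients at the extrapolated point
        x_next = gossip(y - alpha * grad)         # gradient step, then multi-step gossip
        y = x_next + beta * (x_next - x)          # accelerated (Nesterov) outer loop
        x = x_next

    x_star = (curv * b).sum() / curv.sum()        # minimizer of sum_i f_i
    print(np.abs(x - x_star).max())               # small (consensus is only approximate)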

    Model-Based Voice Activity Detection in Wireless Acoustic Sensor Networks

    Privacy-Preserving Distributed Optimization via Subspace Perturbation: A General Framework

    As the modern world becomes increasingly digitized and interconnected, distributed signal processing has proven effective in processing the resulting large volumes of data. However, a main challenge limiting the broad use of distributed signal processing techniques is the issue of privacy in handling sensitive data. To address this privacy issue, we propose a novel yet general subspace perturbation method for privacy-preserving distributed optimization, which allows each node to obtain the desired solution while protecting its private data. In particular, we show that the dual variables introduced by each distributed optimizer will not converge in a certain subspace determined by the graph topology. Additionally, the optimization variable is guaranteed to converge to the desired solution, because it is orthogonal to this non-convergent subspace. We therefore propose to insert noise in the non-convergent subspace through the dual variables such that the private data are protected and the accuracy of the desired solution is completely unaffected. Moreover, the proposed method is shown to be secure under two widely used adversary models: passive and eavesdropping. Furthermore, we consider several distributed optimizers, such as ADMM and PDMM, to demonstrate the general applicability of the proposed method. Finally, we test the performance through a set of applications. Numerical tests indicate that the proposed method outperforms existing methods in terms of estimation accuracy, privacy level, communication cost, and convergence rate.
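
    A minimal sketch of the subspace-perturbation idea, using PDMM-based distributed averaging as the optimizer (the same update as in the PDMM entry above). The dual variables are initialized with large random noise, so every transmitted message is masked, yet the primal iterates still converge to the exact average. The ring topology, penalty parameter c, and noise scale sigma are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(2)
    n, c, sigma = 10, 0.5, 100.0                  # nodes, penalty, noise scale (all assumed)
    a = rng.normal(size=n)                        # private data to be protected
    neighbors = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}  # assumed ring graph
    A = lambda i, j: 1.0 if i < j else -1.0

    # Subspace perturbation: draw the initial dual variables from a wide Gaussian.
    # The component of this noise in the non-convergent subspace remains in every
    # transmitted message and hides a_i from passive (honest-but-curious) neighbors.
    z = {(i, j): sigma * rng.normal() for i in range(n) for j in neighbors[i]}

    x = np.zeros(n)
    for k in range(500):
        for i in range(n):
            s = sum(A(i, j) * z[(i, j)] for j in neighbors[i])
            x[i] = (a[i] - s) / (1.0 + c * len(neighbors[i]))
        z = {(j, i): z[(i, j)] + 2.0 * c * A(i, j) * x[i]
             for i in range(n) for j in neighbors[i]}

    print(np.abs(x - a.mean()).max())             # accuracy of the average is unaffected

    The noisy initialization slows the convergent dual component but does not move the fixed point of the primal iterates, which is what the abstract means by the accuracy of the desired solution being completely unaffected.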

    Privacy-Preserving Distributed Processing Over Networks
