
    A Distributed Algorithm for Solving Linear Algebraic Equations Over Random Networks

    In this paper, we consider the problem of solving linear algebraic equations of the form Ax = b among multiple agents that seek a solution using local information in the presence of random communication topologies. The equation is solved by m agents, where each agent only knows a subset of the rows of the partitioned matrix [A, b]. We formulate the problem so that the formulation does not require knowledge of the distribution of the random interconnection graphs. Therefore, this framework accommodates asynchronous updates and unreliable communication protocols without a B-connectivity assumption. We apply the random Krasnoselskii-Mann iterative algorithm, which converges almost surely and in mean square to a solution of the problem for any matrices A and b and any initial conditions of the agents' states. We demonstrate that the limit point to which the agents' states converge is determined by the unique solution of a convex optimization problem, regardless of the distribution of the random communication graphs. Finally, we show by two numerical examples that the rate of convergence of the algorithm cannot be guaranteed.

    Comment: 10 pages, 2 figures; a preliminary version of this paper appears without proofs in the Proceedings of the 57th IEEE Conference on Decision and Control, Miami Beach, FL, USA, December 17-19, 2018.
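
    The update at the heart of the abstract is a Krasnoselskii-Mann (KM) iteration, x_{k+1} = (1 - alpha_k) x_k + alpha_k T(x_k), applied to a nonexpansive operator built from the agents' row blocks of [A, b]. The sketch below is a minimal, centralized illustration only, not the paper's distributed random-graph algorithm: the choice of T as the average of per-agent projections, the fixed step size alpha, and the names km_solve and projection are assumptions made for this example.

```python
# Minimal, centralized sketch of a Krasnoselskii-Mann (KM) iteration for Ax = b.
# Assumptions (not from the paper): T(x) averages the Euclidean projections onto
# each agent's affine set {x : A_i x = b_i}; alpha is a fixed constant in (0, 1);
# random communication graphs and asynchrony are ignored.
import numpy as np

def projection(Ai, bi, x):
    """Euclidean projection of x onto the affine set {x : Ai x = bi}."""
    residual = Ai @ x - bi
    # Least-squares solve handles rank-deficient Ai Ai^T (pseudoinverse behavior).
    return x - Ai.T @ np.linalg.lstsq(Ai @ Ai.T, residual, rcond=None)[0]

def km_solve(blocks, x0, alpha=0.5, iters=500):
    """blocks: list of (Ai, bi) row partitions of [A, b]; x0: initial guess."""
    x = x0.astype(float)
    m = len(blocks)
    for _ in range(iters):
        # T(x): average of the per-agent projections (an averaged, nonexpansive map).
        Tx = sum(projection(Ai, bi, x) for Ai, bi in blocks) / m
        # KM update: convex combination of the current iterate and T(x).
        x = (1 - alpha) * x + alpha * Tx
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((6, 4))
    b = A @ rng.standard_normal(4)          # consistent system
    # Split the rows of [A, b] among three "agents", two rows each.
    blocks = [(A[i:i + 2], b[i:i + 2]) for i in range(0, 6, 2)]
    x_hat = km_solve(blocks, x0=np.zeros(4))
    print(np.linalg.norm(A @ x_hat - b))    # residual should be near zero
```

    When the system is consistent, each projection is nonexpansive and fixes every solution, so the KM iterate converges to a point in the intersection of the agents' solution sets; the paper's contribution is establishing this kind of convergence under random, possibly unreliable communication.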