We consider distributed optimization by a collection of nodes, each having
access to its own convex function, with the collective goal of minimizing the
sum of these functions. The communications between nodes are described by a
time-varying sequence of directed graphs, which is uniformly strongly
connected. For such communications, assuming that every node knows its
out-degree, we develop a broadcast-based algorithm, termed the
subgradient-push, which steers every node to an optimal value under a standard
assumption of subgradient boundedness. The subgradient-push requires no
knowledge of either the number of agents or the graph sequence to implement.
Our analysis shows that the subgradient-push algorithm converges at a rate of
O(ln(t)/t), where the constant depends on the initial values at the
nodes, the subgradient norms, and, more interestingly, on both the consensus
speed and the imbalances of influence among the nodes.
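
As a rough illustration of the kind of broadcast-based, push-sum-style iteration the abstract describes, the following is a minimal Python sketch on synthetic quadratic objectives. The specific update rules, the variable names (x, y, w, z), the 1/t step size, the random graph sequence, and the local functions f_i are illustrative assumptions and are not fixed by this abstract.

```python
import numpy as np

# Hypothetical setup: n nodes, node i holding the local convex function
# f_i(x) = 0.5 * (x - a_i)^2, so the sum is minimized at the mean of a.
n = 10
rng = np.random.default_rng(0)
a = rng.normal(size=n)

def subgrad(i, z):
    # (Sub)gradient of the i-th local function at z.
    return z - a[i]

# Push-sum-style state: x carries the optimization variable, y tracks the
# weights that correct for directed, possibly unbalanced, communication.
x = np.zeros(n)
y = np.ones(n)
T = 2000

for t in range(1, T + 1):
    # Time-varying directed graph: every node keeps a self-loop and also
    # broadcasts to one randomly chosen out-neighbor (illustrative choice).
    out = [{i, int(rng.integers(n))} for i in range(n)]
    d = np.array([len(s) for s in out])  # out-degrees, known locally

    # Each node broadcasts x_i / d_i and y_i / d_i; receivers sum what arrives.
    w_new = np.zeros(n)
    y_new = np.zeros(n)
    for i in range(n):
        for j in out[i]:
            w_new[j] += x[i] / d[i]
            y_new[j] += y[i] / d[i]

    # De-biased local estimate, then a subgradient step with diminishing step size.
    z = w_new / y_new
    alpha = 1.0 / t
    x = w_new - alpha * np.array([subgrad(i, z[i]) for i in range(n)])
    y = y_new

print("node estimates:", np.round(z, 3))
print("minimizer of the sum (mean of a):", round(a.mean(), 3))
```

In this sketch each node needs only its own out-degree to scale its broadcast, and the ratio z = w / y removes the bias that the directed, unbalanced communication would otherwise introduce, which is consistent with the role the abstract attributes to knowledge of out-degrees and to the imbalances of influence among the nodes.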