FROST -- Fast row-stochastic optimization with uncoordinated step-sizes
In this paper, we discuss distributed optimization over directed graphs,
where doubly-stochastic weights cannot be constructed. Most of the existing
algorithms overcome this issue by applying push-sum consensus, which utilizes
column-stochastic weights. The formulation of column-stochastic weights
requires each agent to know (at least) its out-degree, which may be impractical
in, e.g., broadcast-based communication protocols. In contrast, we describe
FROST (Fast Row-stochastic-Optimization with uncoordinated STep-sizes), an
optimization algorithm applicable to directed graphs that does not require the
knowledge of out-degrees; the implementation of which is straightforward as
each agent locally assigns weights to the incoming information and locally
chooses a suitable step-size. We show that FROST converges linearly to the
optimal solution for smooth and strongly-convex functions given that the
largest step-size is positive and sufficiently small. Comment: Submitted for journal publication, currently under review.
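The key practical point of the abstract is that row-stochastic weights can be built from local information alone: each agent normalizes over its in-neighbors, so no out-degree knowledge is required. The sketch below illustrates only this weight-assignment step, under the assumption of uniform weights; it is not the full FROST algorithm, which additionally corrects for the non-uniform stationary distribution of the row-stochastic matrix and uses uncoordinated step-sizes.

```python
import numpy as np

def row_stochastic_weights(in_neighbors, n, agent):
    """Agent `agent` assigns equal weight to itself and each in-neighbor.

    Only incoming links are used, so the construction needs no knowledge
    of the agent's out-degree (unlike column-stochastic/push-sum weights).
    """
    row = np.zeros(n)
    members = in_neighbors + [agent]
    for j in members:
        row[j] = 1.0 / len(members)
    return row

# Hypothetical directed graph on 3 agents: 0 <- 1, 1 <- 2, 2 <- 0 (a cycle).
in_nbrs = {0: [1], 1: [2], 2: [0]}
A = np.vstack([row_stochastic_weights(in_nbrs[i], 3, i) for i in range(3)])
print(A.sum(axis=1))  # every row sums to 1, i.e. A is row-stochastic
```

Each agent fills in only its own row of the mixing matrix; the matrix A is assembled here purely to check row-stochasticity.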
A geometrically converging dual method for distributed optimization over time-varying graphs
In this paper we consider a distributed convex optimization problem over
time-varying undirected networks. We propose a dual method, primarily averaged
network dual ascent (PANDA), that is proven to converge R-linearly to the
optimal point, given that the agents' objective functions are strongly convex
and have Lipschitz continuous gradients. Like dual decomposition, PANDA
requires half as many variable exchanges per iteration as methods based on
DIGing, and can provide improved practical performance, as empirically
demonstrated. Comment: Submitted to Transactions on Automatic Control.
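PANDA is described as a dual method in the spirit of dual decomposition. As a minimal illustration of that template (not the PANDA update itself, which averages dual variables over a time-varying network), the sketch below runs dual ascent on a two-agent consensus problem with hypothetical quadratic objectives f_i(x) = 0.5*(x - b_i)^2 subject to x1 = x2.

```python
# Dual ascent on: min f1(x1) + f2(x2)  s.t.  x1 = x2,
# with f_i(x) = 0.5 * (x - b_i)**2. The coupled optimum is (b1 + b2) / 2.
b1, b2 = 1.0, 3.0   # illustrative local data
lam = 0.0           # dual variable for the constraint x1 - x2 = 0
step = 0.4          # dual step-size, chosen small enough to converge here

for _ in range(200):
    # Primal minimizers of the Lagrangian, available in closed form
    # for quadratics; only the dual variable would be exchanged.
    x1 = b1 - lam
    x2 = b2 + lam
    lam += step * (x1 - x2)   # ascent step on the constraint residual

print(x1, x2)  # both agents approach the consensus value 2.0
```

Because each agent minimizes its local Lagrangian in closed form, only the dual variable is communicated per iteration, which is the kind of exchange saving the abstract attributes to dual methods relative to DIGing-based schemes.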