In this paper, we consider the unconstrained distributed optimization
problem, in which the exchange of information in the network is captured by a
directed graph topology, and thus nodes can send information to their
out-neighbors only. Additionally, the communication channels among the nodes
have limited bandwidth; to alleviate this limitation, quantized messages are
exchanged among the nodes. To solve the distributed optimization problem,
we combine a distributed quantized consensus algorithm (which requires the
nodes to exchange quantized messages and converges in a finite number of steps)
with a gradient descent method. Specifically, at every optimization step, each
node performs a gradient descent step (i.e., subtracts the scaled gradient from
its current estimate), and then performs a finite-time calculation of the
quantized average of every node's estimate in the network. As a consequence,
this algorithm approximately mimics the centralized gradient descent algorithm.
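The iteration described above can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the local costs are assumed quadratic, and the finite-time quantized consensus routine is idealized as exact averaging followed by rounding to a quantization grid; the function names and parameters are hypothetical.

```python
import numpy as np

def quantized_average(estimates, qstep=0.01):
    # Stand-in for the finite-time quantized average consensus routine:
    # average the estimates, then round the result to the quantization grid.
    avg = np.mean(estimates)
    return np.round(avg / qstep) * qstep

def distributed_gd(targets, alpha=0.1, iters=100, qstep=0.01):
    # Assumed local costs f_i(x) = (x - a_i)^2, one per node; the global
    # minimizer of sum_i f_i is mean(targets).
    targets = np.asarray(targets, dtype=float)
    x = np.zeros_like(targets)              # each node's current estimate
    for _ in range(iters):
        grads = 2.0 * (x - targets)         # local gradient of (x_i - a_i)^2
        x = x - alpha * grads               # gradient descent step at each node
        x[:] = quantized_average(x, qstep)  # nodes agree on the quantized average
    return x

x = distributed_gd([1.0, 2.0, 6.0])
# Every node's estimate ends up near the centralized minimizer mean = 3.0,
# up to the quantization resolution.
```

Because every node replaces its estimate with the common quantized average after each gradient step, the iterates track those of centralized gradient descent up to an error governed by the quantizer resolution.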
The performance of the proposed algorithm is demonstrated via simple
illustrative examples.