Abstract
We consider a single-antenna broadcast block-fading channel (downlink scheduling) with n users, where the transmission is packet-based and all users are backlogged. We define the delay as the minimum number of channel uses that guarantees all n users successfully receive m packets. This is a more stringent notion of delay than average delay, as it is the worst-case delay among the users. A delay-optimal scheduling scheme, such as round-robin, achieves a delay of mn. In a heterogeneous network, and for the throughput-optimal strategy in which the transmitter sends the packet to the user with the best channel condition, we derive the moment generating function of the delay for any m and n. For large n and in a homogeneous network, the expected delay in receiving one packet by all the receivers scales as n log n, as opposed to n for round-robin scheduling. We also show that when m grows faster than (log n)^r, for some r > 1, the expected delay scales like mn. This roughly determines the time scale over which the system behaves fairly in a homogeneous network. We then propose a scheme that significantly reduces the delay at the expense of a small throughput loss.
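As a quick illustration of the n log n scaling (our sketch, not part of the paper's analysis): in a homogeneous network the best-channel user in each slot is, by symmetry, uniformly distributed, so for m = 1 the delay reduces to the classical coupon-collector problem. A minimal Monte Carlo sketch in Python, assuming i.i.d. fading across users and slots:

```python
import math
import random

def delay_opportunistic(n, trials=2000):
    """Average number of slots until each of the n users has received one
    packet, when every slot serves a uniformly random user (the symmetric,
    homogeneous-network view of the max-throughput scheduler)."""
    total = 0
    for _ in range(trials):
        received = set()
        slots = 0
        while len(received) < n:
            received.add(random.randrange(n))  # best-channel user; uniform by symmetry
            slots += 1
        total += slots
    return total / trials

for n in (10, 50, 200):
    print(n, round(delay_opportunistic(n)), round(n * math.log(n)))
```

The empirical averages track n log n, in contrast to the delay of exactly n that round-robin achieves for m = 1.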
We further examine two generalizations of our work: i) the effect of temporal channel correlation and ii) the advantage of multiple transmit antennas on the delay. For a channel with memory of order two, we prove that the delay again scales like n log n, no matter how severe the correlation is. For a system with M transmit antennas, we prove that the expected delay in receiving one packet by all the users scales like (n log n)/(M + O(M^2/n)) for large n, provided M does not grow faster than log n. Thus, when the temporal channel correlation is zero, multiple transmit antennas do not reduce the delay significantly. When channel correlation is present, however, they can lead to significant gains by “decorrelating” the effective channel through means such as random beamforming.
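To make the (n log n)/M scaling concrete, here is an idealized sketch (our simplification, not the paper's model): suppose random beamforming lets each slot serve M distinct users chosen uniformly at random, turning the problem into a batched coupon collector, so the leading-order delay drops by a factor of M.

```python
import math
import random

def delay_multi_antenna(n, M, trials=2000):
    """Average slots until all n users receive a packet when each slot
    serves M distinct, uniformly chosen users (idealized random
    beamforming; SINR and beam-collision effects are ignored)."""
    total = 0
    for _ in range(trials):
        received = set()
        slots = 0
        while len(received) < n:
            received.update(random.sample(range(n), M))  # users served this slot
            slots += 1
        total += slots
    return total / trials

for n, M in ((200, 1), (200, 4), (200, 8)):
    print(n, M, round(delay_multi_antenna(n, M)), round(n * math.log(n) / M))
```

For M well below log n, the simulated delay shrinks roughly by a factor of M, consistent with the (n log n)/(M + O(M^2/n)) expression above.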