242 research outputs found
The Approximate Capacity of the Gaussian N-Relay Diamond Network
We consider the Gaussian "diamond" or parallel relay network, in which a
source node transmits a message to a destination node with the help of N
relays. Even for the symmetric setting, in which the channel gains to the
relays are identical and the channel gains from the relays are identical, the
capacity of this channel is unknown in general. The best known capacity
approximation is up to an additive gap of order N bits and up to a
multiplicative gap of order N^2, with both gaps independent of the channel
gains.
In this paper, we approximate the capacity of the symmetric Gaussian N-relay
diamond network up to an additive gap of 1.8 bits and up to a multiplicative
gap of a factor 14. Both gaps are independent of the channel gains and, unlike
the best previously known result, are also independent of the number of relays
N in the network. Achievability is based on bursty amplify-and-forward, showing
that this simple scheme is uniformly approximately optimal in both the
low-rate and the high-rate regimes. The upper bound on capacity is
based on a careful evaluation of the cut-set bound. We also present
approximation results for the asymmetric Gaussian N-relay diamond network. In
particular, we show that bursty amplify-and-forward combined with optimal relay
selection achieves a rate within a factor O(log^4(N)) of capacity, with the
pre-constant in the order notation independent of the channel gains.
Comment: 23 pages, to appear in IEEE Transactions on Information Theory
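The cut-set evaluation for the symmetric network can be illustrated with a small numeric sketch. This is a simplified toy model, not the paper's exact bound: it assumes unit transmit power, real channel gains, receive combining across the relays on the source-side cut, and coherent transmission across the relays on the destination side; `snr_s` and `snr_r` are hypothetical per-link SNRs.

```python
import math

def cut_value(N, k, snr_s, snr_r):
    """Value of the cut that places k relays on the destination side:
    the source reaches the remaining N - k relays (receive combining),
    while the k cut-side relays beam coherently to the destination.
    Toy symmetric-network formula, not the paper's exact evaluation."""
    bc = 0.5 * math.log2(1 + (N - k) * snr_s)    # broadcast-side term
    mac = 0.5 * math.log2(1 + (k ** 2) * snr_r)  # coherent MAC-side term
    return bc + mac

def cutset_upper_bound(N, snr_s, snr_r):
    """Minimize over how the N relays are split across the cut."""
    return min(cut_value(N, k, snr_s, snr_r) for k in range(N + 1))

ub = cutset_upper_bound(4, 1.0, 1.0)  # bits per channel use
```

The extreme cuts k = 0 and k = N recover the pure broadcast cut 1/2 log2(1 + N * snr_s) and the coherent MISO cut 1/2 log2(1 + N^2 * snr_r), respectively.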
On Multistage Successive Refinement for Wyner-Ziv Source Coding with Degraded Side Informations
We provide a complete characterization of the rate-distortion region for the
multistage successive refinement of the Wyner-Ziv source coding problem with
degraded side informations at the decoder. Necessary and sufficient conditions
for a source to be successively refinable along a distortion vector are
subsequently derived. For the multistage case, a source-channel separation
theorem is provided when the descriptions are sent over independent channels.
Furthermore, we introduce the notion of generalized successive refinability
with multiple degraded side informations. This notion captures whether
progressive encoding to satisfy multiple distortion constraints for different
side informations is as good as encoding without the progressive requirement.
Necessary and sufficient conditions for generalized successive refinability are
given. It is shown that the following two sources are generalized successively
refinable: (1) the Gaussian source with degraded Gaussian side informations,
(2) the doubly symmetric binary source when the worse side information is a
constant. Thus, in both cases, the failure to be successively refinable is due
only to the inherent uncertainty about which side information will occur at
the decoder, and not to the progressive encoding requirement.
Comment: Submitted to IEEE Trans. Information Theory Apr. 200
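The Gaussian claim can be checked numerically against Wyner's closed-form rate-distortion function. A minimal sketch, assuming a unit-variance source, a single jointly Gaussian side information with correlation `rho` (an illustrative parameter), and squared-error distortion:

```python
import math

def gaussian_wz_rate(var_x, rho, D):
    """Wyner-Ziv rate-distortion function for jointly Gaussian (X, Y)
    with correlation rho under squared error (Wyner, 1976):
    R(D) = 1/2 log2(var(X|Y) / D) for D <= var(X|Y), else 0."""
    cond_var = var_x * (1 - rho ** 2)  # var(X | Y)
    if D >= cond_var:
        return 0.0
    return 0.5 * math.log2(cond_var / D)

# Two-stage refinement to D1 and then D2 < D1 (illustrative values):
var_x, rho, D1, D2 = 1.0, 0.5, 0.3, 0.05
stage1 = gaussian_wz_rate(var_x, rho, D1)
stage2 = gaussian_wz_rate(var_x, rho, D2) - stage1  # incremental rate
total = stage1 + stage2
```

The incremental second-stage rate equals 1/2 log2(D1/D2), so the two-stage total matches the one-shot rate: in this single-side-information setting, progressive encoding incurs no rate loss.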
Opportunistic Scheduling for Full-Duplex Uplink-Downlink Networks
We study opportunistic scheduling and the sum capacity of cellular networks
with a full-duplex multi-antenna base station and a large number of
single-antenna half-duplex users. Simultaneous uplink and downlink over the
same band results in uplink-to-downlink interference, degrading performance. We
present a simple opportunistic joint uplink-downlink scheduling algorithm that
exploits multiuser diversity and treats interference as noise. We show that in
homogeneous networks, our algorithm achieves the same sum capacity as if
there were no uplink-to-downlink interference, asymptotically in the number
of users. The algorithm does not require
interference CSI at the base station or uplink users. It is also shown that for
a simple class of heterogeneous networks without sufficient channel diversity,
it is not possible to achieve the corresponding interference-free system
capacity. We discuss the potential for using device-to-device side-channels to
overcome this limitation in heterogeneous networks.
Comment: 10 pages, 2 figures, to appear at IEEE International Symposium on Information Theory (ISIT) '1
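The multiuser-diversity effect the scheduling result relies on can be illustrated with a small simulation. This is a hypothetical single-link sketch, not the paper's joint uplink-downlink algorithm: users see i.i.d. Rayleigh fading, and the scheduler always serves the user with the best instantaneous channel.

```python
import numpy as np

rng = np.random.default_rng(0)

def avg_best_rate(num_users, trials=2000, snr=1.0):
    """Average rate of max-SNR opportunistic scheduling over i.i.d.
    Rayleigh-fading users (exponentially distributed power gains).
    A toy illustration of multiuser diversity only."""
    gains = rng.exponential(1.0, size=(trials, num_users))
    best = gains.max(axis=1)  # power gain of the scheduled user
    return float(np.mean(np.log2(1.0 + snr * best)))

# The average scheduled rate grows with the user pool:
rates = [avg_best_rate(n) for n in (1, 4, 16, 64)]
```

The best of n exponential gains grows like log n, so the scheduled rate keeps improving as users are added; this is the diversity the joint scheduler exploits to mask uplink-to-downlink interference.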
Side-information Scalable Source Coding
The problem of side-information scalable (SI-scalable) source coding is
considered in this work, where the encoder constructs a progressive
description, such that the receiver with high-quality side information will be
able to truncate the bitstream and reconstruct in the rate-distortion sense,
while the receiver with low-quality side information will have to receive
further data in order to decode. We provide inner and outer bounds for general
discrete memoryless sources. The achievable region is shown to be tight when
either decoder requires a lossless reconstruction, as well as for degraded
deterministic distortion measures. Furthermore, we show that the gap between
the achievable region and the outer bounds can be bounded by a constant when
the squared-error distortion measure is used. The notion of perfectly scalable
coding, in which both stages operate on the Wyner-Ziv bound, is introduced,
and necessary and sufficient conditions for it are given for sources
satisfying a mild support condition. Using SI-scalable coding and successive
refinement Wyner-Ziv coding as basic building blocks, a complete
characterization is provided for the important quadratic Gaussian source with
multiple jointly Gaussian side-informations, where the side information quality
does not have to be monotonic along the scalable coding order. A partial
result is provided for the doubly symmetric binary source with Hamming
distortion when the worse side information is a constant, for which one of the
outer bounds is strictly tighter than the other.
Comment: 35 pages, submitted to IEEE Transactions on Information Theory
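For intuition on the quadratic Gaussian case, the per-decoder Wyner-Ziv rates give simple benchmarks that any SI-scalable scheme must respect: the truncated stream can do no better than the high-quality decoder's Wyner-Ziv bound, and the full stream no better than the low-quality decoder's. A minimal sketch, assuming side informations of the form Y_i = X + N_i with illustrative noise variances; `wz_rate` is the standard single-decoder quadratic Gaussian Wyner-Ziv function, not the paper's full region:

```python
import math

def wz_rate(var_x, noise_var, D):
    """Single-decoder quadratic Gaussian Wyner-Ziv rate for side
    information Y = X + N, N ~ N(0, noise_var):
    R(D) = 1/2 log2(var(X|Y) / D), clipped at zero."""
    cv = var_x * noise_var / (var_x + noise_var)  # var(X | Y)
    return max(0.0, 0.5 * math.log2(cv / D))

# High-quality-SI decoder truncates the stream; low-quality-SI decoder
# reads further.  Illustrative benchmarks at distortion D = 0.05:
var_x = 1.0
r_stage1_lb = wz_rate(var_x, 0.1, 0.05)  # good side info (noise var 0.1)
r_total_lb = wz_rate(var_x, 1.0, 0.05)   # poor side info (noise var 1.0)
```

The poorer the side information, the larger the conditional variance var(X|Y) and hence the higher the rate needed, which is why the low-quality decoder must read further into the stream.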