
    The Capacity of Three-Receiver AWGN Broadcast Channels with Receiver Message Side Information

    This paper investigates the capacity region of three-receiver AWGN broadcast channels where the receivers (i) have private-message requests and (ii) know the messages requested by some other receivers as side information. We classify these channels into eight groups based on their side information and construct different transmission schemes for the groups. For six groups, we characterize the capacity region and show that it improves upon both the best-known inner and outer bounds. For the remaining two groups, we improve the best-known inner bound by using side information during channel decoding at the receivers.
    Comment: This is an extended version of the same-titled paper submitted to the IEEE International Symposium on Information Theory (ISIT) 2014.
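
    As a point of reference, the channel model underlying this line of work is the standard three-receiver AWGN broadcast channel; the LaTeX sketch below uses our own labeling, and the noise-variance ordering is only a convention for deciding which receiver is called strongest or weakest.

        \begin{align}
          Y_k &= X + Z_k, \qquad Z_k \sim \mathcal{N}(0, N_k), \qquad k = 1, 2, 3, \\
          \mathbb{E}[X^2] &\le P, \qquad N_1 \le N_2 \le N_3 ,
        \end{align}

    where receiver $k$ requests the private message $W_k$ and knows a subset $\mathcal{S}_k \subseteq \{W_1, W_2, W_3\} \setminus \{W_k\}$ of the other messages as side information.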

    A Unified Scheme for Two-Receiver Broadcast Channels with Receiver Message Side Information

    This paper investigates the capacity regions of two-receiver broadcast channels where each receiver (i) has both common- and private-message requests, and (ii) knows part of the private message requested by the other receiver as side information. We first propose a transmission scheme and derive an inner bound for the two-receiver memoryless broadcast channel. We next prove that this inner bound is tight for the deterministic channel and the more capable channel, thereby establishing their capacity regions. We show that this inner bound is also tight for all classes of two-receiver broadcast channels whose capacity regions were known prior to this work. Our proposed scheme is consequently a unified capacity-achieving scheme for these classes of broadcast channels.
    Comment: Accepted and to be presented at the 2015 IEEE International Symposium on Information Theory (ISIT 2015).
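
    For concreteness, one way to write down the message setup described here is sketched below; the symbols and the particular two-way split of the private messages are illustrative assumptions, not notation taken from the paper.

        \[
          W_1 = (W_{11}, W_{12}), \qquad W_2 = (W_{21}, W_{22}),
        \]

    where receiver 1 requests the common message $W_0$ and its private message $W_1$ while already knowing $W_{22}$, and receiver 2 requests $W_0$ and $W_2$ while already knowing $W_{12}$.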

    Joint Network and Gelfand-Pinsker Coding for 3-Receiver Gaussian Broadcast Channels with Receiver Message Side Information

    The problem of characterizing the capacity region of Gaussian broadcast channels with receiver message side information appears difficult and remains open for N >= 3 receivers. This paper proposes a joint network and Gelfand-Pinsker coding method for the 3-receiver case. Using this method, we establish a unified inner bound on the capacity region of 3-receiver Gaussian broadcast channels under general message side information configurations. The achievability proof of the inner bound uses the idea of joint interference cancellation, where interference is canceled using both dirty-paper coding at the encoder and successive decoding at some of the decoders. We show that the inner bound is larger than those achieved by state-of-the-art coding schemes. An outer bound is also established and shown to be tight in 46 of the 64 possible cases.
    Comment: Author's final version (presented at the 2014 IEEE International Symposium on Information Theory [ISIT 2014]).
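
    The dirty-paper coding step mentioned above leans on Costa's classical result, recalled here only as background rather than as a statement of this paper's bounds: for an AWGN channel with input power constraint P, noise variance N, and additive Gaussian interference known noncausally at the encoder, Gelfand-Pinsker (dirty-paper) coding achieves the interference-free capacity

        \[
          C = \tfrac{1}{2} \log_2\!\left( 1 + \frac{P}{N} \right) \quad \text{bits per channel use},
        \]

    so interference known at the encoder costs no rate, while the interference that is not pre-canceled this way is handled by successive decoding at some of the receivers, as the abstract describes.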

    Coding Schemes for a Class of Receiver Message Side Information in AWGN Broadcast Channels

    This paper considers the three-receiver AWGN broadcast channel where the receivers (i) have private-message requests and (ii) know some of the messages requested by other receivers as side information. For this setup, all possible side information configurations have recently been classified into eight groups, and the capacity of the channel has been established for six of them (Asadi et al., ISIT 2014). We propose inner and outer bounds for the two remaining groups, groups 4 and 7. A distinguishing feature of these two groups is that the weakest receiver knows the requested message of the strongest receiver as side information while the in-between receiver does not. For group 4, the inner and outer bounds coincide over certain regions. For group 7, the inner and outer bounds coincide for four of the group's eight members, thereby establishing their capacity regions; for the remaining four members, the proposed bounds reduce the gap between the best-known inner and outer bounds.
    Comment: Accepted and to be presented at the 2014 IEEE Information Theory Workshop (ITW).
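
    Using the side-information sets from the earlier sketch (with receiver 1 the strongest and receiver 3 the weakest, an indexing convention assumed here rather than taken from the paper), the distinguishing feature of groups 4 and 7 can be stated compactly as

        \[
          W_1 \in \mathcal{S}_3 \quad \text{and} \quad W_1 \notin \mathcal{S}_2 ,
        \]

    where $\mathcal{S}_k$ is the set of messages receiver $k$ knows a priori.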

    Capacity of Coded Index Modulation

    We consider the special case of index coding over the Gaussian broadcast channel where each receiver has prior knowledge of a subset of the transmitter's messages and demands all the messages from the source. We propose a concatenated coding scheme for this problem, using an index code for the Gaussian channel as an inner code/modulation to exploit side information at the receivers, and an outer code to attain coding gain against the channel noise. We derive the capacity region of this scheme by viewing the resulting channel as a multiple-access channel with many receivers, and relate it to the 'side information gain' of the inner index code/modulation, a measure of a code's advantage in utilizing receiver side information. We demonstrate the utility of the proposed architecture by simulating the performance of an index code/modulation concatenated with an off-the-shelf convolutional code through bit-interleaved coded modulation.
    Comment: To appear in Proc. IEEE Int. Symp. Inf. Theory (ISIT) 2015, Hong Kong, Jun. 2015. 5 pages, 4 figures.
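
    As a toy illustration of the inner index code/modulation idea (a generic XOR-based example, not the authors' construction): with two 2-bit messages and a receiver that already knows one of them, XORing the messages before mapping to a 4-PAM constellation lets that receiver strip its known message after demodulation. The constellation, noise level, and message sizes below are arbitrary choices for the sketch.

        # Toy sketch: XOR-based inner index code mapped to 4-PAM over AWGN.
        # A receiver that knows w2 exploits it when recovering w1.
        # All parameters are illustrative, not taken from the paper.
        import random

        PAM4 = [-3.0, -1.0, 1.0, 3.0]  # 2-bit labels 0..3 mapped to amplitudes

        def encode(w1, w2):
            """Inner index code: XOR the 2-bit messages, then modulate."""
            return PAM4[(w1 ^ w2) & 0b11]

        def decode_with_side_info(y, w2):
            """Find the label closest to y, then undo the XOR with the known w2."""
            label = min(range(4), key=lambda s: (y - PAM4[s]) ** 2)
            return label ^ w2

        def simulate(n_trials=10000, noise_std=0.8, seed=0):
            rng = random.Random(seed)
            errors = 0
            for _ in range(n_trials):
                w1, w2 = rng.randrange(4), rng.randrange(4)
                y = encode(w1, w2) + rng.gauss(0.0, noise_std)
                errors += decode_with_side_info(y, w2) != w1
            return errors / n_trials

        if __name__ == "__main__":
            print("symbol error rate with side information:", simulate())

    In the paper's architecture such an inner mapping would sit inside a concatenated scheme, with an outer code (e.g., a convolutional code used with bit-interleaved coded modulation) providing the coding gain against noise.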

    Optimal Coding Schemes for the Three-Receiver AWGN Broadcast Channel with Receiver Message Side Information

    This paper investigates the capacity region of the three-receiver AWGN broadcast channel where the receivers (i) have private-message requests and (ii) may know some of the messages requested by other receivers as side information. We first classify all 64 possible side information configurations into eight groups, each consisting of eight members. We next construct transmission schemes and derive new inner and outer bounds for the groups. This establishes the capacity region for 52 of the 64 possible side information configurations. For six groups (groups 1, 2, 3, 5, 6, and 8 in our terminology), we establish the capacity region for all their members and show that it tightens both the best-known inner and outer bounds. For group 4, our inner and outer bounds tighten the best-known inner bound and/or outer bound for all the group members; moreover, our bounds coincide over certain regions, which can be characterized by two thresholds. For group 7, our inner and outer bounds coincide for four members, thereby establishing their capacity region; for the remaining four members, our bounds tighten both the best-known inner and outer bounds.
    Comment: Authors' final version (to appear in IEEE Transactions on Information Theory).
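
    The count of 64 configurations follows from each of the three receivers independently knowing any subset of the two messages requested by the others (4 subsets per receiver, so 4^3 = 64 in total). Below is a small enumeration sketch with our own labeling; it makes no claim about how the paper assigns configurations to its eight groups.

        # Enumerate the side-information configurations of the 3-receiver setup:
        # receiver k requests W_k and may know any subset of the other two messages.
        from itertools import chain, combinations, product

        MESSAGES = (1, 2, 3)

        def subsets(items):
            """All subsets of 'items' (the power set)."""
            return list(chain.from_iterable(
                combinations(items, r) for r in range(len(items) + 1)))

        # One configuration is a triple (S_1, S_2, S_3), where S_k lists what receiver k knows.
        configs = list(product(
            *(subsets(tuple(m for m in MESSAGES if m != k)) for k in MESSAGES)))

        print(len(configs))   # 64 = 4 * 4 * 4
        print(configs[0])     # ((), (), ()): no receiver has side information
        print(configs[-1])    # ((2, 3), (1, 3), (1, 2)): every receiver knows both other messages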