
    On Constant Gaps for the Two-way Gaussian Interference Channel

    We introduce the two-way Gaussian interference channel, in which there are four nodes with four independent messages: two messages to be transmitted over a Gaussian interference channel in the $\rightarrow$ direction, simultaneously with two messages to be transmitted over an interference channel (in-band, full-duplex) in the $\leftarrow$ direction. In such a two-way network, all nodes are transmitters and receivers of messages, allowing them to adapt current channel inputs to previously received channel outputs. We propose two new outer bounds on the symmetric sum-rate for the two-way Gaussian interference channel with complex channel gains: one under full adaptation (all 4 nodes are permitted to adapt inputs to previous outputs), and one under partial adaptation (only 2 nodes are permitted to adapt, while the other 2 are restricted). We show that simple non-adaptive schemes such as the Han and Kobayashi scheme, where inputs are functions of messages only and not of past outputs, used in each direction are sufficient to achieve to within a constant gap of these fully or partially adaptive outer bounds for all channel regimes.
    Comment: presented at the 50th Annual Allerton Conference on Communication, Control, and Computing, Monticello, IL, October 2012.
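
    As a reading aid, the constant-gap claim can be stated compactly; this is a generic formalization, and the specific gap value is not given in the abstract:

        $R^{\mathrm{outer}}_{\mathrm{sum}}(\mathbf{h}) - R^{\mathrm{HK}}_{\mathrm{sum}}(\mathbf{h}) \le G \qquad \text{for all complex channel gains } \mathbf{h},$

    where $R^{\mathrm{outer}}_{\mathrm{sum}}$ is the fully (or partially) adaptive outer bound on the symmetric sum-rate, $R^{\mathrm{HK}}_{\mathrm{sum}}$ is the sum-rate achieved by the non-adaptive Han-Kobayashi scheme used in each direction, and $G$ is a constant independent of the channel parameters.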

    The adaptive zero-error capacity for a class of channels with noisy feedback

    The adaptive zero-error capacity of discrete memoryless channels (DMCs) with noiseless feedback has been shown to be positive whenever there exists at least one channel output "disprover", i.e., a channel output that cannot be reached from at least one of the inputs. Furthermore, whenever there exists a disprover, the adaptive zero-error capacity attains the Shannon (small-error) capacity. Here, we study the zero-error capacity of a DMC when the channel feedback is noisy rather than perfect. We show that the adaptive zero-error capacity with noisy feedback is lower bounded by the forward channel's zero-undetected-error capacity, and show that under certain conditions this bound is tight.
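
    To make the "disprover" notion concrete, here is a minimal illustrative sketch (not from the paper): for a row-stochastic transition matrix $W$ with $W[x,y] = P(y \mid x)$, an output $y$ is a disprover if some input $x$ can never produce it, i.e. $W[x,y] = 0$.

import numpy as np

def disprover_pairs(W):
    """Return all (input, output) index pairs (x, y) with W[x, y] == 0, i.e.
    output symbols y that can never be observed when input x is transmitted.
    A non-empty list means the channel has at least one 'disprover' output.
    W is a |X| x |Y| row-stochastic transition matrix with W[x, y] = P(y | x)."""
    return [(x, y) for x in range(W.shape[0])
                   for y in range(W.shape[1]) if W[x, y] == 0.0]

# Example: a Z-like binary channel; output 1 is unreachable from input 0,
# so the pair (x=0, y=1) witnesses a disprover.
W = np.array([[1.0, 0.0],
              [0.3, 0.7]])
print(disprover_pairs(W))  # -> [(0, 1)]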

    Inner and Outer Bounds for the Gaussian Cognitive Interference Channel and New Capacity Results

    The capacity of the Gaussian cognitive interference channel, a variation of the classical two-user interference channel in which one of the transmitters (referred to as cognitive) has knowledge of both messages, is known in several parameter regimes but remains unknown in general. In this paper we provide a comparative overview of this channel model as we proceed through our contributions: we present a new outer bound based on the idea of a broadcast channel with degraded message sets, and another series of outer bounds obtained by transforming the cognitive channel into channels with known capacity. We specialize the largest known inner bound, derived for the discrete memoryless channel, to the Gaussian noise channel and present several simplified schemes evaluated for Gaussian inputs in closed form, which we use to prove a number of results. These include new capacity results for (a) the "primary decodes cognitive" regime, a subset of the "strong interference" regime not included in the "very strong interference" regime for which capacity was previously known, and (b) the "S-channel", in which the primary transmitter does not interfere with the cognitive receiver. Next, for a general Gaussian cognitive interference channel, we determine the capacity to within one bit/s/Hz and to within a factor of two regardless of the channel parameters, thus establishing rate performance guarantees at high and low SNR, respectively. We also show how different simplified transmission schemes achieve a constant gap between the inner and outer bounds for specific channels. Finally, we numerically evaluate and compare the various simplified achievable rate regions and outer bounds in parameter regimes where the capacity is unknown, leading to further insight into the capacity region of the Gaussian cognitive interference channel.
    Comment: submitted to the IEEE Transactions on Information Theory.
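
    One common way to read the "within one bit/s/Hz" and "within a factor of two" guarantees (the paper's precise statement may differ) is per rate pair: if $\mathcal{O}$ denotes the outer bound region and $\mathcal{I}$ the inner bound region, then

        $(R_1, R_2) \in \mathcal{O} \;\Longrightarrow\; \big([R_1 - 1]^+,\, [R_2 - 1]^+\big) \in \mathcal{I} \quad \text{and} \quad (R_1/2,\, R_2/2) \in \mathcal{I},$

    where $[x]^+ = \max(x, 0)$; the additive gap is the relevant guarantee at high SNR and the multiplicative gap at low SNR.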

    A New Capacity Result for the Z-Gaussian Cognitive Interference Channel

    This work proposes a novel outer bound for the Gaussian cognitive interference channel with strong interference at the primary receiver, based on the capacity of a multi-antenna broadcast channel with degraded message sets. It then shows that for the Z-channel, i.e., when the secondary receiver experiences no interference and the primary receiver experiences strong interference, the proposed outer bound is not only the tightest among known bounds but is actually achievable for sufficiently strong interference. The latter is a novel capacity result which, from numerical evaluations, appears to generalize to a larger (i.e., non-Z) class of Gaussian channels.

    On Discrete Alphabets for the Two-user Gaussian Interference Channel with One Receiver Lacking Knowledge of the Interfering Codebook

    In multi-user information theory it is often assumed that every node in the network possesses all codebooks used in the network. This assumption is, however, impractical in distributed ad-hoc and cognitive networks. This work considers the two-user Gaussian Interference Channel with one Oblivious Receiver (G-IC-OR), i.e., one receiver lacks knowledge of the interfering codebook while the other receiver knows both codebooks. We ask whether, and if so by how much, the capacity of the G-IC-OR is reduced compared to that of the classical G-IC, where both receivers know all codebooks. Intuitively, the oblivious receiver should not be able to jointly decode its intended message along with the unintended interfering message whose codebook is unavailable. We demonstrate that in strong and very strong interference, where joint decoding is capacity achieving for the classical G-IC, lack of codebook knowledge does not reduce performance in terms of generalized degrees of freedom (gDoF). Moreover, we show that the sum-capacity of the symmetric G-IC-OR is to within O(log(log(SNR))) of that of the classical G-IC. The key novelty of the proposed achievable scheme is the use of a discrete input alphabet for the non-oblivious transmitter, whose cardinality is appropriately chosen as a function of SNR.
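
    For reference, the generalized degrees of freedom mentioned above are usually defined (this is the standard convention, not a quote from the paper) for the symmetric channel with $\mathsf{INR} = \mathsf{SNR}^{\alpha}$ as

        $d(\alpha) \;\triangleq\; \lim_{\mathsf{SNR} \to \infty} \frac{C_{\mathrm{sum}}(\mathsf{SNR}, \alpha)}{\log \mathsf{SNR}},$

    so a sum-capacity gap of $O(\log\log \mathsf{SNR})$ between the G-IC-OR and the classical G-IC implies that the two channels have the same gDoF, since $\log\log \mathsf{SNR} / \log \mathsf{SNR} \to 0$.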

    On Identifying a Massive Number of Distributions

    Finding the underlying probability distributions of a set of observed sequences, under the constraint that each sequence is generated i.i.d. according to a distinct distribution, is considered. The number of distributions, and hence the number of observed sequences, is allowed to grow with the observation blocklength $n$. Asymptotically matching upper and lower bounds on the probability of error are derived.
    Comment: Under submission.
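
    An illustrative formalization of the setup (paraphrasing the abstract, not quoting the paper): one observes $k_n$ sequences $X_1^n, \dots, X_{k_n}^n$, where $X_i^n$ is drawn i.i.d. from a distinct distribution $P_i$ and $k_n \to \infty$ with the blocklength $n$; the task is to identify which distribution generated each sequence, and the error probability $P_e^{(n)}$ of this task is bounded above and below by expressions that asymptotically match.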