
    Two remarks to noiseless coding

    An inequality concerning Kullback's I-divergence is applied to obtain a necessary condition for the possibility of encoding symbols of the alphabet of a discrete memoryless source of entropy H by sequences of symbols of another alphabet of size D in such a way that the average code length is close to the optimum H/log D. The same idea is also applied to the problem of maximizing entropy per second for unequal symbol lengths.
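
    As a minimal numeric illustration (ours, not the paper's; the dyadic source below is a made-up example): the noiseless coding theorem bounds the average code length from below by H/log D, and the bound is met exactly when the symbol probabilities are negative powers of D:

        import math

        def entropy(p):
            """Shannon entropy of a probability vector, in bits."""
            return -sum(pi * math.log2(pi) for pi in p if pi > 0)

        p = [0.5, 0.25, 0.125, 0.125]  # dyadic discrete memoryless source
        D = 2                          # code alphabet size

        H = entropy(p)
        bound = H / math.log2(D)       # lower bound on average code length

        lengths = [1, 2, 3, 3]         # optimal (Huffman) codeword lengths
        avg_len = sum(pi * li for pi, li in zip(p, lengths))

        print(f"H = {H:.3f} bits, bound H/log D = {bound:.3f}, achieved = {avg_len:.3f}")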

    Multiaccess Channels with State Known to Some Encoders and Independent Messages

    We consider a state-dependent multiaccess channel (MAC) with the state non-causally known to some encoders. We derive an inner bound on the capacity region in the general discrete memoryless case and specialize it to a binary noiseless case. In the case of a maximum-entropy channel state, we obtain the capacity region of the binary noiseless MAC with one informed encoder by deriving a non-trivial outer bound for this case. For a Gaussian state-dependent MAC with one encoder informed of the channel state, we present an inner bound, obtained by applying a slightly generalized dirty paper coding (GDPC) at the informed encoder that allows for partial state cancellation, and a trivial outer bound, obtained by providing the channel state to the decoder as well. The uninformed encoders benefit from the state cancellation in terms of achievable rates; however, it appears that GDPC cannot completely eliminate the effect of the channel state on the achievable rate region, in contrast to the case in which all encoders are informed. In the case of infinite state variance, we use the inner bound to analyze how the uninformed encoder benefits from the informed encoder's actions, and we also provide a non-trivial outer bound for this case that improves on the trivial outer bound.
    Comment: Accepted to EURASIP Journal on Wireless Communications and Networking, Feb. 200
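
    For background: the GDPC above generalizes Costa's dirty paper coding, which pre-cancels a Gaussian state S known at the encoder via an auxiliary U = X + alpha*S. Below is a sketch of the classical single-user Costa rate as a function of the cancellation parameter alpha (the standard point-to-point formula, not the paper's multiaccess inner bound; all variable names are ours):

        import math

        def dpc_rate(alpha, P, Q, N):
            """Costa's achievable rate (bits/use) over Y = X + S + Z with
            U = X + alpha*S, X ~ N(0,P), S ~ N(0,Q), Z ~ N(0,N)."""
            num = P * (P + Q + N)
            den = P * Q * (1 - alpha) ** 2 + N * (P + alpha ** 2 * Q)
            return 0.5 * math.log2(num / den)

        P, Q, N = 10.0, 100.0, 1.0
        alpha_star = P / (P + N)  # optimal choice: the state is fully pre-cancelled
        print(dpc_rate(alpha_star, P, Q, N))        # equals 0.5*log2(1 + P/N)
        print(0.5 * math.log2(1 + P / N))           # state-free AWGN capacity
        print(dpc_rate(0.5 * alpha_star, P, Q, N))  # partial cancellation: lower rate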

    Lecture Notes on Network Information Theory

    These lecture notes have been converted into a book, Network Information Theory, recently published by Cambridge University Press. The book provides a significantly expanded exposition of the material in the lecture notes, as well as problems and bibliographic notes at the end of each chapter. The authors are currently preparing a set of slides based on the book, to be posted in the second half of 2012. More information about the book can be found at http://www.cambridge.org/9781107008731/. The previous (and obsolete) version of the lecture notes can be found at http://arxiv.org/abs/1001.3404v4/

    Broadcast Channels with Cooperating Decoders

    We consider the problem of communicating over the general discrete memoryless broadcast channel (BC) with partially cooperating receivers. In our setup, receivers are able to exchange messages over noiseless conference links of finite capacities, prior to decoding the messages sent from the transmitter. In this paper we formulate the general problem of broadcast with cooperation. We first find the capacity region for the case where the BC is physically degraded. Then, we give achievability results for the general broadcast channel, for both the two-independent-messages case and the single-common-message case.
    Comment: Final version, to appear in the IEEE Transactions on Information Theory; contains (very) minor changes based on the last round of review
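
    For the physically degraded case, the region takes a single-auxiliary-variable form. Sketched from memory of this line of work (the paper's exact statement should be consulted), with Y_1 the stronger output and C_{12} the capacity of the conference link from decoder 1 to decoder 2, the capacity region is the closure of all rate pairs (R_1, R_2) satisfying, for some p(u, x):

        \begin{aligned}
        R_1 &\le I(X; Y_1 \mid U), \\
        R_2 &\le \min\{\, I(U; Y_2) + C_{12},\; I(U; Y_1) \,\}
        \end{aligned}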

    Capacity of a Class of Deterministic Relay Channels

    The capacity of a class of deterministic relay channels with transmitter input X, receiver output Y, relay output Y_1 = f(X, Y), and a separate communication link of capacity R_0 from the relay to the receiver, is shown to be C(R_0) = \max_{p(x)} \min \{ I(X;Y) + R_0,\; I(X;Y,Y_1) \}. Thus every bit from the relay is worth exactly one bit to the receiver. Two alternative coding schemes are presented that achieve this capacity. The first scheme, "hash-and-forward", is based on a simple yet novel use of random binning on the space of relay outputs, while the second scheme uses the usual "compress-and-forward". In fact, these two schemes can be combined to give a class of optimal coding schemes. As a corollary, this relay capacity result confirms a conjecture by Ahlswede and Han on the capacity of a channel with rate-limited state information at the decoder, in the special case when the channel state is recoverable from the channel input and output.
    Comment: 17 pages, submitted to IEEE Transactions on Information Theory
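
    To make the capacity formula concrete, a small self-contained sketch (a hypothetical example channel of ours, not one from the paper): let Y = X xor Z with Z ~ Bern(q), and let the relay observe Y_1 = f(X, Y) = X xor Y = Z. Then I(X;Y,Y_1) = H(X), and C(R_0) can be brute-forced over p = P(X = 1):

        import math

        def h2(p):
            """Binary entropy in bits."""
            if p in (0.0, 1.0):
                return 0.0
            return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

        def relay_capacity(R0, q, grid=10_000):
            """C(R0) = max_p min{I(X;Y) + R0, I(X;Y,Y1)} for Y = X xor Z,
            Z ~ Bern(q), relay output Y1 = Z."""
            best = 0.0
            for i in range(grid + 1):
                p = i / grid                    # P(X = 1)
                s = p * (1 - q) + (1 - p) * q   # P(Y = 1)
                ixy = h2(s) - h2(q)             # I(X;Y)
                ixyy1 = h2(p)                   # I(X;Y,Y1) = H(X) in this example
                best = max(best, min(ixy + R0, ixyy1))
            return best

        q = 0.1
        for R0 in (0.0, 0.2, 0.5):
            # the rate grows bit-for-bit in R0 until the I(X;Y,Y1) term binds
            print(R0, round(relay_capacity(R0, q), 4))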

    Monte Carlo Algorithms for the Partition Function and Information Rates of Two-Dimensional Channels

    The paper proposes Monte Carlo algorithms for computing the information rate of two-dimensional source/channel models. The focus is on binary-input channels with constraints on the allowed input configurations. The problem of numerically computing the information rate, and even the noiseless capacity, of such channels has so far remained largely unsolved. Both problems can be reduced to computing a Monte Carlo estimate of a partition function. The proposed algorithms use tree-based Gibbs sampling and multilayer (multitemperature) importance sampling. The viability of the proposed algorithms is demonstrated by simulation results.
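
    As a toy illustration of that reduction (much cruder than the paper's tree-based Gibbs sampling and multilayer importance sampling; the grid size and sample count below are arbitrary), here is a naive Monte Carlo estimate of the partition function Z of the two-dimensional "no two adjacent 1s" (hard-square) constraint, from which the noiseless capacity in bits per site follows as log2(Z)/n^2:

        import math
        import random

        def is_valid(grid, n):
            """Hard-square constraint: no two horizontally or vertically adjacent 1s."""
            for r in range(n):
                for c in range(n):
                    if grid[r][c]:
                        if c + 1 < n and grid[r][c + 1]:
                            return False
                        if r + 1 < n and grid[r + 1][c]:
                            return False
            return True

        def mc_capacity(n=5, samples=200_000, seed=0):
            """Estimate Z = 2^(n*n) * Pr[uniform random grid is valid] by naive
            sampling, then return log2(Z) / n^2. The estimator's variance grows
            quickly with n, which is exactly why smarter samplers are needed."""
            rng = random.Random(seed)
            hits = sum(
                is_valid([[rng.random() < 0.5 for _ in range(n)] for _ in range(n)], n)
                for _ in range(samples)
            )
            return (n * n + math.log2(hits / samples)) / (n * n)

        print(mc_capacity())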