
    The Zero-Undetected-Error Capacity Approaches the Sperner Capacity

    Ahlswede, Cai, and Zhang proved that, in the noise-free limit, the zero-undetected-error capacity is lower bounded by the Sperner capacity of the channel graph, and they conjectured equality. Here we derive an upper bound that proves the conjecture.
    Comment: 8 pages; added a section on the definition of Sperner capacity; accepted for publication in the IEEE Transactions on Information Theory

    A General Formula for the Mismatch Capacity

    The fundamental limits of channels with mismatched decoding are addressed. A general formula is established for the mismatch capacity of a general channel, defined as a sequence of conditional distributions, with a general sequence of decoding metrics. We deduce an identity between the Verdú-Han general channel capacity formula and the mismatch capacity formula applied to the maximum likelihood decoding metric. Further, several upper bounds on the capacity are provided, and a simpler expression for a lower bound is derived for the case of a non-negative decoding metric. The general formula is specialized to the case of finite input and output alphabet channels with a type-dependent metric. The closely related problem of threshold mismatched decoding is also studied, and a general expression for the threshold mismatch capacity is obtained. As an example of threshold mismatch capacity, we state a general expression for the erasures-only capacity of the finite input and output alphabet channel. We observe that for every channel there exists a (matched) threshold decoder which is capacity achieving. Additionally, necessary and sufficient conditions are stated for a channel to have a strong converse. Csiszár and Narayan's conjecture is proved for bounded metrics, providing a positive answer to the open problem introduced in [1], i.e., that the "product-space" improvement of the lower random coding bound, $C_q^{(\infty)}(W)$, is indeed the mismatch capacity of the discrete memoryless channel $W$. We conclude by presenting an identity between the threshold capacity and $C_q^{(\infty)}(W)$ in the DMC case.
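    For context (this formula is standard and not spelled out in the abstract), the Verdú-Han general channel capacity formula referenced above is

    ```latex
    C = \sup_{\mathbf{X}} \underline{I}(\mathbf{X};\mathbf{Y}),
    \qquad
    \underline{I}(\mathbf{X};\mathbf{Y})
      = \operatorname*{p-liminf}_{n\to\infty}
        \frac{1}{n}\log\frac{P_{Y^n \mid X^n}(Y^n \mid X^n)}{P_{Y^n}(Y^n)},
    ```

    where the supremum is over all input processes and p-liminf denotes the limit inferior in probability of the normalized information density. The identity in the paper asserts that this coincides with the mismatch capacity formula when the decoding metric is the maximum likelihood metric.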

    Dilworth rate: a generalization of Witsenhausen's zero-error rate for directed graphs

    Information-Theoretic Foundations of Mismatched Decoding

    Shannon's channel coding theorem characterizes the maximal rate of information that can be reliably transmitted over a communication channel when optimal encoding and decoding strategies are used. In many scenarios, however, practical considerations such as channel uncertainty and implementation constraints rule out the use of an optimal decoder. The mismatched decoding problem addresses such scenarios by considering the case that the decoder cannot be optimized, but is instead fixed as part of the problem statement. This problem is not only of direct interest in its own right, but also has close connections with other long-standing theoretical problems in information theory. In this monograph, we survey both classical literature and recent developments on the mismatched decoding problem, with an emphasis on achievable random-coding rates for memoryless channels. We present two widely considered achievable rates known as the generalized mutual information (GMI) and the LM rate, and overview their derivations and properties. In addition, we survey several improved rates via multi-user coding techniques, as well as recent developments and challenges in establishing upper bounds on the mismatch capacity, and an analogous mismatched encoding problem in rate-distortion theory. Throughout the monograph, we highlight a variety of applications and connections with other prominent information theory problems.
    Comment: Published in Foundations and Trends in Communications and Information Theory (Volume 17, Issue 2-3)
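    The GMI mentioned in the abstract admits a dual (random-coding lower-bound) form that is easy to evaluate numerically for small alphabets. A minimal sketch, using a toy binary symmetric channel and a hypothetical mismatched metric chosen here for illustration (neither is taken from the monograph):

    ```python
    import math

    # Toy example (illustrative only): a BSC with true crossover probability
    # 0.1, decoded with a metric matched to crossover probability 0.3 instead.
    p_true, p_mis = 0.1, 0.3
    X = Y = (0, 1)
    Q = {0: 0.5, 1: 0.5}                 # uniform input distribution Q(x)

    def W(x, y):
        """True channel transition probability W(y|x)."""
        return 1 - p_true if x == y else p_true

    def q(x, y):
        """Mismatched decoding metric q(x, y)."""
        return 1 - p_mis if x == y else p_mis

    def gmi(n_grid=500, s_max=5.0):
        """Dual form of the GMI, in bits:
        GMI = sup_{s>0} E[ log2( q(X,Y)^s / sum_{x'} Q(x') q(x',Y)^s ) ],
        approximated by a grid search over the parameter s."""
        best = float("-inf")
        for i in range(1, n_grid + 1):
            s = s_max * i / n_grid
            val = 0.0
            for x in X:
                for y in Y:
                    denom = sum(Q[xp] * q(xp, y) ** s for xp in X)
                    val += Q[x] * W(x, y) * math.log2(q(x, y) ** s / denom)
            best = max(best, val)
        return best

    rate = gmi()   # ~0.531 bits for this symmetric example
    ```

    Because the mismatched metric here is symmetric with crossover below 1/2, it induces the same decision regions as maximum likelihood, so the optimized GMI recovers the matched capacity 1 - h_b(0.1) ≈ 0.531 bits; an asymmetric or order-reversing metric can give a strictly smaller rate.
    
    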