7,631 research outputs found

    A General Formula for the Mismatch Capacity

    The fundamental limits of channels with mismatched decoding are addressed. A general formula is established for the mismatch capacity of a general channel, defined as a sequence of conditional distributions with a general sequence of decoding metrics. We deduce an identity between the Verd\'{u}-Han general channel capacity formula and the mismatch capacity formula applied to the maximum likelihood decoding metric. Further, several upper bounds on the capacity are provided, and a simpler expression for a lower bound is derived for the case of a non-negative decoding metric. The general formula is specialized to the case of finite input and output alphabet channels with a type-dependent metric. The closely related problem of threshold mismatched decoding is also studied, and a general expression for the threshold mismatch capacity is obtained. As an example of threshold mismatch capacity, we state a general expression for the erasures-only capacity of the finite input and output alphabet channel. We observe that for every channel there exists a (matched) threshold decoder which is capacity achieving. Additionally, necessary and sufficient conditions are stated for a channel to have a strong converse. Csisz\'{a}r and Narayan's conjecture is proved for bounded metrics, providing a positive answer to the open problem introduced in [1], i.e., that the "product-space" improvement of the lower random coding bound, $C_q^{(\infty)}(W)$, is indeed the mismatch capacity of the discrete memoryless channel $W$. We conclude by presenting an identity between the threshold capacity and $C_q^{(\infty)}(W)$ in the DMC case.
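
    For orientation (a standard formulation of the setting, not quoted from the abstract), the mismatched decoder for a blocklength-$n$ code with codewords $x^n(1),\ldots,x^n(M)$ and an additive metric $q$ outputs
    \[
    \hat{m} = \arg\max_{1 \le m \le M} \sum_{i=1}^{n} q\big(x_i(m), y_i\big),
    \]
    and $C_q^{(\infty)}(W)$ is the limit, as $K \to \infty$, of $\tfrac{1}{K}$ times the random coding (LM) lower bound evaluated for the $K$-fold product channel $W^K$ with the product metric $q^K$.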

    Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities

    This monograph presents a unified treatment of single- and multi-user problems in Shannon's information theory where we depart from the requirement that the error probability decays asymptotically in the blocklength. Instead, the error probabilities for various problems are bounded above by a non-vanishing constant, and the spotlight is shone on achievable coding rates as functions of the growing blocklengths. This represents the study of asymptotic estimates with non-vanishing error probabilities. In Part I, after reviewing the fundamentals of information theory, we discuss Strassen's seminal result for binary hypothesis testing, where the type-I error probability is non-vanishing and the rate of decay of the type-II error probability with a growing number of independent observations is characterized. In Part II, we use this basic hypothesis testing result to develop second- and, sometimes, even third-order asymptotic expansions for point-to-point communication. In Part III, we consider network information theory problems for which the second-order asymptotics are known. These problems include some classes of channels with random state, the multiple-encoder distributed lossless source coding (Slepian-Wolf) problem, and special cases of the Gaussian interference and multiple-access channels. Finally, we discuss avenues for further research. Comment: Further comments welcome.
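
    For reference (the standard normal-approximation form treated in this line of work, stated here under the assumption of a discrete memoryless channel), the second-order expansion for a DMC $W$ with capacity $C$ and dispersion $V > 0$ reads
    \[
    \log M^*(n,\epsilon) = nC - \sqrt{nV}\, Q^{-1}(\epsilon) + O(\log n),
    \]
    where $M^*(n,\epsilon)$ is the maximum code size at blocklength $n$ and error probability $\epsilon \in (0,1)$, and $Q^{-1}$ is the inverse of the Gaussian complementary distribution function.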

    Collective Particle Flow through Random Media

    A simple model for the nonlinear collective transport of interacting particles in a random medium with strong disorder is introduced and analyzed. A finite threshold for the driving force divides the behavior into two regimes characterized by the presence or absence of a steady-state particle current. Below this threshold, transient motion is found in response to an increase in the force, while above threshold the flow approaches a steady state with motion only on a network of channels which is sparse near threshold. Some of the critical behavior near threshold is analyzed via mean field theory, and analytic results on the statistics of the moving phase are derived. Many of the results should apply, at least qualitatively, to the motion of magnetic bubble arrays and to the driven motion of vortices in thin film superconductors when the randomness is strong enough to destroy the tendencies to lattice order even on short length scales. Various history-dependent phenomena are also discussed. Comment: 63 preprint pages plus 6 figures. Submitted to Phys Rev.
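
    As background (standard depinning phenomenology, not a result quoted from the abstract), the critical behavior near the threshold force $F_c$ is often summarized by a power law for the mean particle velocity,
    \[
    v = 0 \ \text{for } F < F_c, \qquad v \sim (F - F_c)^{\beta} \ \text{for } F \gtrsim F_c,
    \]
    with $\beta$ a depinning exponent that such mean-field analyses aim to determine.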

    Bits About the Channel: Multi-round Protocols for Two-way Fading Channels

    Most communication systems use some form of feedback, often related to channel state information. In this paper, we study the diversity-multiplexing tradeoff for both FDD and TDD systems when both the receiver's and the transmitter's knowledge about the channel is noisy and potentially mismatched. For FDD systems, we first extend the achievable tradeoff region for 1.5 rounds of message passing to obtain higher diversity than the best known scheme in the regime of higher multiplexing gains. We then break the mold of all current channel-state-based protocols by using multiple rounds of conferencing to extract more bits about the actual channel. This iterative refinement of the channel knowledge increases the diversity order with every round of communication. The protocols are on-demand in nature, using high powers for training and feedback only when the channel is in poor states. The key result is that the diversity-multiplexing tradeoff with perfect training and K levels of perfect feedback can be achieved, even when there are errors in training the receiver and errors in the feedback link, with a multi-round protocol which has K rounds of training and K-1 rounds of binary feedback. This result can be viewed as a generalization of Zheng and Tse, and of Aggarwal and Sabharwal, where the result was shown to hold for K=1 and K=2, respectively. For TDD systems, we also develop new achievable strategies with multiple rounds of communication between the transmitter and the receiver, which use the reciprocity of the forward and feedback channels. The multi-round TDD protocol achieves a diversity-multiplexing tradeoff which uniformly dominates its FDD counterparts, where no channel reciprocity is available. Comment: Submitted to IEEE Transactions on Information Theory.
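
    For reference (the usual Zheng-Tse definition, assumed here rather than quoted from the abstract), a scheme with rate $R(\mathrm{SNR}) = r \log \mathrm{SNR}$ and error probability $P_e(\mathrm{SNR})$ achieves the diversity-multiplexing tradeoff
    \[
    d(r) = -\lim_{\mathrm{SNR}\to\infty} \frac{\log P_e(\mathrm{SNR})}{\log \mathrm{SNR}},
    \]
    so a larger $d(r)$ at a fixed multiplexing gain $r$ corresponds to a faster polynomial decay of the error probability in SNR.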

    The Sender-Excited Secret Key Agreement Model: Capacity, Reliability and Secrecy Exponents

    We consider the secret key generation problem when sources are randomly excited by the sender and there is a noiseless public discussion channel. Our setting is thus similar to recent works on channels with action-dependent states, where the channel state may be influenced by some of the parties involved. We derive single-letter expressions for the secret key capacity through a type of source emulation analysis. We also derive lower bounds on the achievable reliability and secrecy exponents, i.e., the exponential rates of decay of the probability of decoding error and of the information leakage. These exponents allow us to determine a set of strongly-achievable secret key rates. For degraded eavesdroppers, the maximum strongly-achievable rate equals the secret key capacity; our exponents can also be specialized to previously known results. In deriving our strong achievability results, we introduce a coding scheme that combines wiretap coding (to excite the channel) and key extraction (to distill keys from residual randomness). The secret key capacity is naturally seen to be a combination of both source- and channel-type randomness. Through examples, we illustrate a fundamental interplay between the portion of the secret key rate due to each type of randomness. We also illustrate inherent tradeoffs between the achievable reliability and secrecy exponents. Our new scheme also naturally accommodates rate limits on the public discussion. We show that under rate constraints we are able to achieve larger rates than those that can be attained through a pure source emulation strategy. Comment: 18 pages, 8 figures; Submitted to the IEEE Transactions on Information Theory; Revised in Oct 201
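
    For orientation (notation assumed for illustration, not taken from the abstract), a reliability-secrecy exponent pair $(E_r, E_s)$ is achievable at key rate $R$ if there are schemes of blocklength $n$ with
    \[
    \Pr[\hat{K} \neq K] \le e^{-n(E_r - o(1))}, \qquad I(K; Z^n, F) \le e^{-n(E_s - o(1))},
    \]
    where $K$ is the generated key, $\hat{K}$ its reconstruction at the other terminal, $Z^n$ the eavesdropper's observations, and $F$ the public discussion; strong achievability of a rate corresponds to both exponents being positive.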