Bhattacharyya parameter of monomials codes for the Binary Erasure Channel: from pointwise to average reliability
Monomial codes were recently equipped with partial order relations, a fact that
allowed researchers to discover structural properties and efficient algorithms
for constructing polar codes. Here, we refine the existing order relations in
the particular case of the Binary Erasure Channel. The new order relation takes
us closer to the ultimate order relation induced by the pointwise evaluation of
the Bhattacharyya parameter of the synthetic channels. The best we can hope for
is still a partial order relation. To overcome this issue we appeal to a
related technique from network theory. Reliability network theory was recently
used in the context of polar coding and, more generally, in connection with
decreasing monomial codes. In this article, we investigate how the concept of
average reliability applies to polar codes designed for the binary erasure
channel. Instead of minimizing the error probability of the synthetic channels
for a particular value of the erasure parameter p, our codes minimize the
average error probability of the synthetic channels. By means of basic network
theory results, we determine a closed formula for the average reliability of a
particular synthetic channel that has recently gained the attention of
researchers.
Comment: 21 pages, 5 figures, 3 tables. Submitted for possible publication
Information-Theoretic Foundations of Mismatched Decoding
Shannon's channel coding theorem characterizes the maximal rate of
information that can be reliably transmitted over a communication channel when
optimal encoding and decoding strategies are used. In many scenarios, however,
practical considerations such as channel uncertainty and implementation
constraints rule out the use of an optimal decoder. The mismatched decoding
problem addresses such scenarios by considering the case that the decoder
cannot be optimized, but is instead fixed as part of the problem statement.
This problem is not only of direct interest in its own right, but also has
close connections with other long-standing theoretical problems in information
theory. In this monograph, we survey both classical literature and recent
developments on the mismatched decoding problem, with an emphasis on achievable
random-coding rates for memoryless channels. We present two widely-considered
achievable rates known as the generalized mutual information (GMI) and the LM
rate, and overview their derivations and properties. In addition, we survey
several improved rates via multi-user coding techniques, as well as recent
developments and challenges in establishing upper bounds on the mismatch
capacity, and an analogous mismatched encoding problem in rate-distortion
theory. Throughout the monograph, we highlight a variety of applications and
connections with other prominent information theory problems.
Comment: Published in Foundations and Trends in Communications and Information
Theory (Volume 17, Issue 2-3)
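The GMI mentioned in this abstract admits a direct numerical evaluation for finite alphabets: it is the single-letter expression E[log(q(X,Y)^s / Σ_x' P(x') q(x',Y)^s)], maximized over s ≥ 0. The sketch below evaluates it on a grid of s values; the function name and grid are illustrative, not part of the monograph.

```python
# Sketch: generalized mutual information (GMI) of a decoding metric q
# on a discrete memoryless channel W, with input distribution P.
# Maximization over s is done by a simple grid search (illustrative).
import math

def gmi(P, W, q, s_grid):
    """GMI in bits: max over s of
    sum_{x,y} P(x) W(y|x) log2( q(x,y)^s / sum_{x'} P(x') q(x',y)^s )."""
    X = range(len(P))
    Y = range(len(W[0]))
    best = 0.0
    for s in s_grid:
        val = 0.0
        for y in Y:
            denom = sum(P[xp] * q[xp][y] ** s for xp in X)
            for x in X:
                if P[x] * W[x][y] > 0:
                    val += P[x] * W[x][y] * math.log2(q[x][y] ** s / denom)
        best = max(best, val)
    return best
```

A quick sanity check: with a matched metric q = W, the GMI is achieved at s = 1 and equals the mutual information, e.g. 1 − h(0.1) ≈ 0.531 bits for a BSC with crossover 0.1 and uniform inputs; with a mismatched q it can only be smaller.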