Tree-Structure Expectation Propagation for LDPC Decoding over the BEC
We present the tree-structure expectation propagation (Tree-EP) algorithm to
decode low-density parity-check (LDPC) codes over discrete memoryless channels
(DMCs). EP generalizes belief propagation (BP) in two ways. First, it can be
used with any exponential family distribution over the cliques in the graph.
Second, it can impose additional constraints on the marginal distributions. We
use this second property to impose pair-wise marginal constraints over pairs of
variables connected to a check node of the LDPC code's Tanner graph. Thanks to
these additional constraints, the Tree-EP marginal estimates for each variable
in the graph are more accurate than those provided by BP. We also reformulate
the Tree-EP algorithm for the binary erasure channel (BEC) as a peeling-type
algorithm (TEP), and we show that it has the same computational
complexity as BP while decoding a higher fraction of errors. We describe the
TEP decoding process by a set of differential equations that represent the
expected residual graph evolution as a function of the code parameters. The
solution of these equations is used to predict the TEP decoder performance in
both the asymptotic regime and the finite-length regime over the BEC. While the
asymptotic threshold of the TEP decoder is the same as that of the BP decoder
for regular and optimized codes, we propose a scaling law (SL) for
finite-length LDPC codes that accurately approximates the TEP decoder's
improved performance and facilitates its optimization.
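The peeling formulation over the BEC referred to above can be sketched compactly. The following is a minimal illustration of standard BP peeling only (repeatedly resolving check nodes with a single erased variable), not the Tree-EP/TEP extension, which additionally processes check nodes with two erased variables; the function name `peel_decode` and the use of `None` to mark erasures are illustrative choices, not from the paper:

```python
def peel_decode(H, y):
    """Peeling (BP) decoder for the binary erasure channel (BEC).

    H is a list of parity-check rows (lists of 0/1); y is the received
    word with erased positions marked as None. Repeatedly find a check
    node with exactly one erased variable and recover that bit as the
    mod-2 sum of the known bits participating in the check.
    """
    y = list(y)
    progress = True
    while progress:
        progress = False
        for row in H:
            idx = [i for i, h in enumerate(row) if h]
            erased = [i for i in idx if y[i] is None]
            if len(erased) == 1:
                # Degree-one check: the erased bit is determined by parity.
                y[erased[0]] = sum(y[i] for i in idx if y[i] is not None) % 2
                progress = True
    return y

# Example: Hamming(7,4) parity-check matrix, all-zero codeword, two erasures.
H = [[1, 1, 0, 1, 1, 0, 0],
     [1, 0, 1, 1, 0, 1, 0],
     [0, 1, 1, 1, 0, 0, 1]]
y = [None, 0, 0, 0, None, 0, 0]
print(peel_decode(H, y))  # both erasures resolved: [0, 0, 0, 0, 0, 0, 0]
```

The decoder stalls exactly when every remaining check touches two or more erasures (a stopping set); processing those degree-two checks is what the TEP extension adds at the same asymptotic cost.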
Concentration of Measure Inequalities in Information Theory, Communications and Coding (Second Edition)
During the last two decades, concentration inequalities have been the subject
of exciting developments in various areas, including convex geometry,
functional analysis, statistical physics, high-dimensional statistics, pure and
applied probability theory, information theory, theoretical computer science,
and learning theory. This monograph focuses on some of the key modern
mathematical tools that are used for the derivation of concentration
inequalities, on their links to information theory, and on their various
applications to communications and coding. In addition to being a survey, this
monograph also includes new results recently derived by the authors. The
first part of the monograph introduces classical concentration inequalities for
martingales, as well as some recent refinements and extensions. The power and
versatility of the martingale approach are exemplified in the context of codes
defined on graphs and iterative decoding algorithms, as well as codes for
wireless communication. The second part of the monograph introduces the entropy
method, an information-theoretic technique for deriving concentration
inequalities. The basic ingredients of the entropy method are discussed first
in the context of logarithmic Sobolev inequalities, which underlie the
so-called functional approach to concentration of measure, and then from a
complementary information-theoretic viewpoint based on transportation-cost
inequalities and probability in metric spaces. Some representative results on
concentration for dependent random variables are briefly summarized, with
emphasis on their connections to the entropy method. Finally, we discuss
several applications of the entropy method to problems in communications and
coding, including strong converses, empirical distributions of good channel
codes, and an information-theoretic converse for concentration of measure.

Comment: Foundations and Trends in Communications and Information Theory, vol.
10, no. 1-2, pp. 1-248, 2013. Second edition was published in October 2014.
ISBN of printed book: 978-1-60198-906-
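As a concrete instance of the classical martingale concentration inequalities surveyed in the first part, the Azuma-Hoeffding bound can be checked numerically for a +/-1 random walk, which is a martingale with increments bounded by 1. This is a sketch under that standard setup; the function names are illustrative and not from the monograph:

```python
import math
import random

def azuma_bound(n, t):
    # Azuma-Hoeffding: for a martingale S_n with |S_k - S_{k-1}| <= 1,
    # P(|S_n| >= t) <= 2 * exp(-t^2 / (2n)).
    return 2.0 * math.exp(-t * t / (2.0 * n))

def empirical_tail(n, t, trials=5000, seed=0):
    # Monte Carlo estimate of P(|S_n| >= t) for a +/-1 random walk.
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        s = sum(rng.choice((-1, 1)) for _ in range(n))
        if abs(s) >= t:
            hits += 1
    return hits / trials

n, t = 100, 30
print("bound:", azuma_bound(n, t))
print("empirical:", empirical_tail(n, t))
```

For n = 100 and t = 30 the bound is 2*exp(-4.5), roughly 0.022, while the true tail of the walk is smaller; the bound is valid but not tight, which is one motivation for the refinements the monograph develops.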