On Pseudocodewords and Improved Union Bound of Linear Programming Decoding of HDPC Codes
In this paper, we present an improved union bound on the Linear Programming
(LP) decoding performance of binary linear codes transmitted over an
additive white Gaussian noise channel. The bounding technique is based on a
second-order Bonferroni-type inequality from probability theory, and it is
tightened using Prim's minimum spanning tree algorithm. The bound calculation
requires the fundamental cone generators of a given parity-check matrix rather
than only their weight spectrum, but involves relatively low computational
complexity. It is targeted at high-density parity-check codes, whose
generators are extremely numerous and spread densely in Euclidean space. We
explore the generator density and compare different parity-check matrix
representations; this density affects the improvement of the proposed bound
over the conventional LP union bound. The paper also presents a complete
pseudo-weight distribution of the fundamental cone generators for the
BCH[31,21,5] code.
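A second-order Bonferroni-type union bound can be tightened by choosing which pairwise intersection terms to subtract, and that choice reduces to a spanning tree problem solvable with Prim's algorithm (this is the classical Hunter bound). The sketch below is a toy illustration of that idea only, not the paper's pseudocodeword computation: `p[i]` stands in for the probability of error event `A_i` and `q[i][j]` for the joint probability `P(A_i and A_j)`, both assumed given.

```python
def hunter_bound(p, q):
    """Bound P(A_1 u ... u A_n) <= sum_i p[i] - max_tree sum_{(i,j) in tree} q[i][j].

    Subtracting the weight of a *maximum* spanning tree over the pairwise
    intersection probabilities q gives the tightest bound of this form.
    """
    n = len(p)
    if n == 0:
        return 0.0
    # Prim's algorithm on the complete graph with edge weights q[i][j],
    # grown so as to maximize the total tree weight.
    in_tree = [False] * n
    best = [0.0] * n            # heaviest edge connecting node j to the tree
    in_tree[0] = True
    for j in range(1, n):
        best[j] = q[0][j]
    tree_weight = 0.0
    for _ in range(n - 1):
        # attach the node with the heaviest edge into the current tree
        u = max((j for j in range(n) if not in_tree[j]), key=lambda j: best[j])
        tree_weight += best[u]
        in_tree[u] = True
        for j in range(n):
            if not in_tree[j] and q[u][j] > best[j]:
                best[j] = q[u][j]
    return sum(p) - tree_weight

# Three overlapping error events (made-up numbers): the tree subtracts
# the two heaviest intersections, improving on the plain union bound.
p = [0.10, 0.12, 0.08]
q = [[0.00, 0.03, 0.01],
     [0.03, 0.00, 0.02],
     [0.01, 0.02, 0.00]]
print(hunter_bound(p, q))   # sum(p) minus the 0.03 + 0.02 tree edges
print(sum(p))               # plain first-order union bound
```

The bound is never worse than the plain union bound, since every subtracted `q[i][j]` is nonnegative; maximizing the subtracted tree weight is what Prim's algorithm contributes.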
Random Coding Error Exponents for the Two-User Interference Channel
This paper derives lower bounds on the error exponents of the
two-user interference channel under the random coding regime for several
ensembles. Specifically, we first analyze the standard random coding ensemble,
where the codebooks are comprised of independently and identically distributed
(i.i.d.) codewords. For this ensemble, we focus on optimum decoding, which is
in contrast to other, suboptimal decoding rules that have been used in the
literature (e.g., joint typicality decoding, treating interference as noise,
etc.). The fact that the interfering signal is a codeword, rather than an
i.i.d. noise process, complicates the application of conventional techniques of
performance analysis of the optimum decoder. Also, unfortunately, these
conventional techniques result in loose bounds. Using analytical tools rooted
in statistical physics, as well as advanced union bounds, we derive
single-letter formulas for the random coding error exponents. We compare our
results with the best known lower bound on the error exponent, and show that
our exponents can be strictly better. Then, in the second part of this paper,
we consider more complicated coding ensembles, and find a lower bound on the
error exponent associated with the celebrated Han-Kobayashi (HK) random coding
ensemble, which is based on superposition coding.
Comment: accepted to IEEE Transactions on Information Theory
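For context on the quantity being bounded: the classical single-user random coding error exponent of Gallager is the baseline that channel-specific exponents are compared against. The sketch below computes it for a binary symmetric channel with a uniform input via a grid search over the tilting parameter; it is only this standard single-user formula, not the paper's two-user interference-channel exponents.

```python
import math

def gallager_e0(rho, eps):
    """Gallager's E_0(rho) for BSC(eps) with uniform input, in bits:
    E_0 = -log2( sum_y [ sum_x Q(x) W(y|x)^(1/(1+rho)) ]^(1+rho) )."""
    s = 1.0 / (1.0 + rho)
    inner = 0.5 * ((1.0 - eps) ** s + eps ** s)   # same value for y=0 and y=1
    return -math.log2(2.0 * inner ** (1.0 + rho))

def random_coding_exponent(rate, eps, grid=1000):
    """E_r(R) = max over rho in [0,1] of E_0(rho) - rho*R, by grid search."""
    return max(gallager_e0(k / grid, eps) - (k / grid) * rate
               for k in range(grid + 1))

# The exponent is strictly positive below capacity and zero above it.
eps = 0.1
capacity = 1.0 + eps * math.log2(eps) + (1.0 - eps) * math.log2(1.0 - eps)
print(random_coding_exponent(0.2, eps))   # positive: 0.2 < capacity
print(random_coding_exponent(0.6, eps))   # zero: 0.6 > capacity
```

Since `E_0(0) = 0`, the maximum is always nonnegative, and for rates above capacity the maximizing `rho` is 0, so the exponent vanishes; that threshold behavior is what makes error exponents a finer performance measure than capacity alone.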