40,877 research outputs found
Scaling Exponent and Moderate Deviations Asymptotics of Polar Codes for the AWGN Channel
This paper investigates polar codes for the additive white Gaussian noise
(AWGN) channel. The scaling exponent $\mu$ of polar codes for a memoryless
channel $q_{Y|X}$ with capacity $I(q_{Y|X})$ characterizes the closest gap
between the capacity and non-asymptotic achievable rates in the following way:
For a fixed $\varepsilon \in (0, 1)$, the gap between the capacity $I(q_{Y|X})$
and the maximum non-asymptotic rate $R_n^*$ achieved by a length-$n$ polar code
with average error probability $\varepsilon$ scales as $n^{-1/\mu}$, i.e.,
$I(q_{Y|X}) - R_n^* = \Theta(n^{-1/\mu})$.
It is well known that the scaling exponent $\mu$ for any binary-input
memoryless channel (BMC) with $I(q_{Y|X}) \in (0, 1)$ is bounded above by $4.714$,
which was shown by an explicit construction of polar codes. Our main result
shows that $4.714$ remains a valid upper bound on the scaling exponent
for the AWGN channel. Our proof technique involves the following two ideas: (i)
The capacity of the AWGN channel can be achieved within a gap of
$O(n^{-1/\mu}\sqrt{\log n})$ by using an input alphabet consisting of $n$
constellations and restricting the input distribution to be uniform; (ii) The
capacity of a multiple access channel (MAC) with an input alphabet consisting
of $n$ constellations can be achieved within a gap of $O(n^{-1/\mu}\log n)$ by
using a superposition of $\log n$ binary-input polar codes. In addition, we
investigate the performance of polar codes in the moderate deviations regime
where both the gap to capacity and the error probability vanish as $n$ grows.
An explicit construction of polar codes is proposed to obey a certain tradeoff
between the gap to capacity and the decay rate of the error probability for the
AWGN channel.
Comment: 24 pages
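The relation $I(q_{Y|X}) - R_n^* = \Theta(n^{-1/\mu})$ means that on a log-log plot the gap to capacity decays with slope $-1/\mu$ in the blocklength. A minimal numerical sketch of this scaling, assuming the bound $\mu = 4.714$ from the abstract and a hypothetical unit constant (`gap_to_capacity` models only the order of decay, not the paper's actual code construction):

```python
import math

MU = 4.714  # upper bound on the scaling exponent for BMCs and, per this paper, the AWGN channel

def gap_to_capacity(n, c=1.0, mu=MU):
    """Model the gap I - R_n* = Theta(n^{-1/mu}); c is an unspecified constant."""
    return c * n ** (-1.0 / mu)

# On a log-log plot, the slope between two blocklengths recovers -1/mu:
g1, g2 = gap_to_capacity(1024), gap_to_capacity(4096)
mu_estimate = -math.log(4096 / 1024) / math.log(g2 / g1)
```

Quadrupling the blocklength thus shrinks the gap only by the modest factor $4^{-1/4.714} \approx 0.745$, which is why the scaling exponent, rather than the error exponent alone, governs the finite-blocklength behavior of polar codes.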
Strong Converse Theorems for Classes of Multimessage Multicast Networks: A Rényi Divergence Approach
This paper establishes that the strong converse holds for some classes of
discrete memoryless multimessage multicast networks (DM-MMNs) whose
corresponding cut-set bounds are tight, i.e., coincide with the set of
achievable rate tuples. The strong converse for these classes of DM-MMNs
implies that all sequences of codes with rate tuples belonging to the exterior
of the cut-set bound have average error probabilities that necessarily tend to
one (and are not simply bounded away from zero). Examples in the classes of
DM-MMNs include wireless erasure networks, DM-MMNs consisting of independent
discrete memoryless channels (DMCs) as well as single-destination DM-MMNs
consisting of independent DMCs with destination feedback. Our elementary proof
technique leverages properties of the Rényi divergence.
Comment: Submitted to IEEE Transactions on Information Theory, Jul 18, 2014. Revised on Jul 31, 201
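For finite alphabets, the Rényi divergence of order $\alpha \neq 1$ that the proof leverages is $D_\alpha(P\|Q) = \frac{1}{\alpha-1}\log \sum_x P(x)^\alpha Q(x)^{1-\alpha}$, with the Kullback-Leibler divergence recovered in the limit $\alpha \to 1$. A small self-contained sketch of the definition (the function name is illustrative, not from the paper):

```python
import math

def renyi_divergence(p, q, alpha):
    """Renyi divergence D_alpha(P||Q) for finite distributions given as lists.

    For alpha != 1: (1 / (alpha - 1)) * log sum_x p(x)^alpha * q(x)^(1 - alpha).
    For alpha == 1: the KL divergence, its limit as alpha -> 1.
    """
    if alpha == 1:
        return sum(px * math.log(px / qx) for px, qx in zip(p, q) if px > 0)
    s = sum(px ** alpha * qx ** (1 - alpha) for px, qx in zip(p, q))
    return math.log(s) / (alpha - 1)
```

One property that makes the quantity useful in converse proofs is monotonicity in the order: $D_\alpha$ is nondecreasing in $\alpha$, so bounds obtained at one order transfer to smaller orders.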