
The Dispersion of Nearest-Neighbor Decoding for Additive Non-Gaussian Channels

Abstract

We study the second-order asymptotics of information transmission using random Gaussian codebooks and nearest neighbor (NN) decoding over a power-limited stationary memoryless additive non-Gaussian noise channel. We show that the dispersion term depends on the non-Gaussian noise only through its second and fourth moments, thus complementing the capacity result (Lapidoth, 1996), which depends only on the second moment. Furthermore, we characterize the second-order asymptotics of point-to-point codes over K-sender interference networks with non-Gaussian additive noise. Specifically, we assume that each user's codebook is Gaussian and that NN decoding is employed, i.e., that interference from the K-1 unintended users (Gaussian interfering signals) is treated as noise at each decoder. We show that while the first-order term in the asymptotic expansion of the maximum number of messages depends on the powers of the interfering codewords only through their sum, this does not hold for the second-order term.

Comment: 12 pages, 3 figures, IEEE Transactions on Information Theory
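For orientation, the LaTeX sketch below records the standard form of the expansions referred to in the abstract, under assumed notation that is not defined above: M^*(n, epsilon) for the maximum number of messages at blocklength n and error probability epsilon, P and P_k for transmit powers, sigma^2 for the noise variance, and Q^{-1} for the inverse Gaussian tail function. It is an illustration of the general form only; the paper's exact dispersion expression V is not reproduced here.

% Minimal compilable sketch of the expansions discussed in the abstract.
% Notation (M^*, P, P_k, \sigma^2, Q^{-1}) is assumed, not taken from the paper.
\documentclass{article}
\usepackage{amsmath}
\begin{document}

% First-order term: Lapidoth (1996) showed that Gaussian codebooks with NN
% decoding achieve the Gaussian capacity expression over any additive noise
% of variance \sigma^2:
\[
  C(P) = \tfrac{1}{2}\log\!\left(1 + \frac{P}{\sigma^2}\right).
\]

% Generic second-order (dispersion) expansion; per the abstract, the
% dispersion V depends on the non-Gaussian noise only through its second
% and fourth moments:
\[
  \log M^*(n,\epsilon) = nC(P) - \sqrt{nV}\,Q^{-1}(\epsilon) + O(\log n).
\]

% K-sender network, interference treated as noise at decoder k: the
% first-order term depends on the interfering powers P_j only through
% their sum, as stated in the abstract:
\[
  C_k = \tfrac{1}{2}\log\!\left(1 + \frac{P_k}{\sigma^2 + \sum_{j \neq k} P_j}\right).
\]

\end{document}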
