It is well known that for Gaussian channels, a nearest neighbor decoding
rule, which seeks the minimum Euclidean distance between a codeword and the
received channel output vector, is the maximum likelihood solution and hence
capacity-achieving. Nearest neighbor decoding remains a convenient and yet
mismatched solution for general channels, and the key message of this paper is
that the performance of nearest neighbor decoding can be improved by
generalizing its decoding metric to incorporate channel state dependent output
processing and codeword scaling. Using the generalized mutual information,
which is a lower bound to the mismatched capacity under an independent and
identically distributed codebook ensemble, as the performance measure, this paper
establishes the optimal generalized nearest neighbor decoding rule under
Gaussian channel inputs. Several restricted forms of the generalized nearest
neighbor decoding rule are also derived and compared with existing solutions.
The results are illustrated through several case studies for fading channels
with imperfect receiver channel state information and for channels with
quantization effects.

Comment: 30 pages, 8 figures