
Lossy joint source-channel coding in the finite blocklength regime

Abstract

This paper finds new tight finite-blocklength bounds for the best achievable lossy joint source-channel code rate, and demonstrates that joint source-channel code design offers a considerable performance advantage over separate source-channel coding in the non-asymptotic regime. A joint source-channel code maps a block of $k$ source symbols onto a length-$n$ channel codeword, and the fidelity of reproduction at the receiver end is measured by the probability $\epsilon$ that the distortion exceeds a given threshold $d$. For memoryless sources and channels, it is demonstrated that the parameters of the best joint source-channel code must satisfy $nC - kR(d) \approx \sqrt{nV + k\mathcal{V}(d)}\, Q^{-1}(\epsilon)$, where $C$ and $V$ are the channel capacity and channel dispersion, respectively; $R(d)$ and $\mathcal{V}(d)$ are the source rate-distortion and rate-dispersion functions; and $Q^{-1}$ is the inverse of the standard Gaussian complementary cdf. Symbol-by-symbol (uncoded) transmission is known to achieve the Shannon limit when the source and channel satisfy a certain probabilistic matching condition. In this paper we show that even when this condition is not satisfied, symbol-by-symbol transmission is, in some cases, the best known strategy in the non-asymptotic regime.
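
To make the approximation concrete, the following minimal numerical sketch (not from the paper) uses the displayed relation to estimate the smallest channel blocklength $n$ that can carry $k$ source symbols at excess-distortion probability $\epsilon$. The binary symmetric channel capacity and dispersion formulas are standard; the source parameters $R(d)$ and $\mathcal{V}(d)$ and all numeric values below are illustrative assumptions, not results from the paper.

import math
from statistics import NormalDist

def h2(p):
    # Binary entropy in bits.
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Channel: BSC with crossover probability delta.
# Capacity C and dispersion V in bits (standard formulas).
delta = 0.11
C = 1 - h2(delta)
V = delta * (1 - delta) * (math.log2((1 - delta) / delta)) ** 2

# Source: assumed values for R(d) and the rate-dispersion V(d);
# in practice these come from a separate rate-distortion computation.
Rd = 0.5      # R(d), bits per source symbol (assumed)
Vd = 0.6      # V(d), bits^2 per source symbol (assumed)

k = 1000      # source symbols per block
eps = 1e-3    # target excess-distortion probability
Qinv = NormalDist().inv_cdf(1 - eps)   # Q^{-1}(eps)

# Smallest n with  n*C - k*R(d) >= sqrt(n*V + k*V(d)) * Q^{-1}(eps).
n = math.ceil(k * Rd / C)
while n * C - k * Rd < math.sqrt(n * V + k * Vd) * Qinv:
    n += 1

print("approximate minimum channel blocklength n:", n)
print("asymptotic (Shannon) limit k*R(d)/C      :", round(k * Rd / C, 1))

With these illustrative numbers the finite-blocklength penalty amounts to a few hundred channel uses beyond the asymptotic limit $kR(d)/C$, which is the kind of gap the paper's bounds quantify.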
