Recently, a new decoding rule called jar decoding was proposed; under jar
decoding, a non-asymptotic achievable tradeoff between the coding rate and word
error probability was also established for any discrete input memoryless
channel with discrete or continuous output (DIMC). Along this line of non-asymptotic analysis, it is further shown in this paper that jar decoding is in fact optimal up to the second-order coding performance; this is done by establishing new non-asymptotic converse coding theorems and by determining, up to the second order, the Taylor-type expansion of the best coding rate $R_n(\epsilon)$ for any finite block length $n$ and word error probability $\epsilon$. Finally, based on the Taylor-type expansion and the new converses, two
approximation formulas for $R_n(\epsilon)$ (dubbed "SO" and "NEP") are provided; they are further evaluated and compared against some of the best bounds known so far, as well as against the normal approximation of $R_n(\epsilon)$ revisited recently in the literature. It turns out that while the normal approximation is all over the map, i.e., sometimes below achievable bounds and sometimes above converse bounds, the SO approximation is much more reliable, as it always stays below the converse bounds; meanwhile, the NEP approximation is the best of the three and always provides an accurate estimate of $R_n(\epsilon)$. An important implication arising from the Taylor-type expansion of $R_n(\epsilon)$ is that in the practical non-asymptotic regime, the optimal marginal codeword symbol distribution is not necessarily a capacity-achieving distribution.

Comment: submitted to IEEE Transactions on Information Theory in April, 201
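
For concreteness, the normal approximation of $R_n(\epsilon)$ referred to above is commonly computed as $C - \sqrt{V/n}\,Q^{-1}(\epsilon) + \log_2 n/(2n)$. The Python sketch below is an illustration under that assumption only (the channel, parameters, and function name are chosen here for illustration, and the paper's SO and NEP formulas are not reproduced); it evaluates the approximation for a binary symmetric channel.

```python
# Illustrative sketch (not from the paper): the standard normal approximation
# of R_n(eps), evaluated for a binary symmetric channel BSC(p).
# C is the channel capacity, V the channel dispersion, Q^{-1} the inverse
# Gaussian tail, and log2(n)/(2n) the usual third-order refinement.
import numpy as np
from scipy.stats import norm


def bsc_normal_approx(p: float, n: int, eps: float) -> float:
    """Normal approximation to the best coding rate R_n(eps) of a BSC(p), in bits/use."""
    h = -p * np.log2(p) - (1 - p) * np.log2(1 - p)          # binary entropy
    capacity = 1.0 - h                                       # C
    dispersion = p * (1 - p) * np.log2((1 - p) / p) ** 2     # V
    return capacity - np.sqrt(dispersion / n) * norm.isf(eps) + np.log2(n) / (2 * n)


if __name__ == "__main__":
    # e.g. BSC(0.11), n = 1000, eps = 1e-3  ->  roughly 0.41 bits per channel use
    print(bsc_normal_approx(p=0.11, n=1000, eps=1e-3))
```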