A Lower Bound on the Expected Distortion of Joint Source-Channel Coding
We consider the classic joint source-channel coding problem of transmitting a
memoryless source over a memoryless channel. The focus of this work is on the
long-standing open problem of finding the rate of convergence of the smallest
attainable expected distortion to its asymptotic value, as a function of the
blocklength. Our main result is that in general the convergence rate is not
faster than . In particular, we show that for the problem of
transmitting i.i.d. uniform bits over a binary symmetric channel with Hamming
distortion, the smallest attainable distortion (bit error rate) is at least
 above the asymptotic value, if the "bandwidth expansion
ratio" is above
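The asymptotic value referenced in this abstract follows from the classical source-channel separation theorem: the optimal distortion D satisfies R(D) = rho * C, i.e. 1 - h(D) = rho * (1 - h(p)) for a BSC with crossover probability p and bandwidth expansion ratio rho. The sketch below solves this numerically; the symbols p and rho and the function names are our own notation, not taken from the paper.

```python
import math

def h2(x):
    # Binary entropy in bits; h2(0) = h2(1) = 0 by convention.
    if x <= 0.0 or x >= 1.0:
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def h2_inv(y):
    # Inverse of h2 on [0, 1/2] via bisection (h2 is increasing there).
    lo, hi = 0.0, 0.5
    for _ in range(60):
        mid = (lo + hi) / 2
        if h2(mid) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def asymptotic_distortion(p, rho):
    """Optimal asymptotic bit error rate for sending i.i.d. uniform bits
    over a BSC(p) at bandwidth expansion ratio rho (channel uses per
    source bit), from separation: 1 - h2(D) = rho * (1 - h2(p))."""
    rate = rho * (1 - h2(p))  # bits conveyable per source bit
    if rate >= 1.0:
        return 0.0  # capacity exceeds source entropy: vanishing distortion
    return h2_inv(1.0 - rate)
```

For example, at rho = 1 the equation reduces to D = p, i.e. uncoded transmission is already asymptotically optimal in this matched case.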
A Source-Channel Separation Theorem with Application to the Source Broadcast Problem
A converse method is developed for the source broadcast problem.
Specifically, it is shown that the separation architecture is optimal for a
variant of the source broadcast problem and the associated source-channel
separation theorem can be leveraged, via a reduction argument, to establish a
necessary condition for the original problem, which unifies several existing
results in the literature. Somewhat surprisingly, this method, albeit based on
the source-channel separation theorem, can be used to prove the optimality of
non-separation based schemes and determine the performance limits in certain
scenarios where the separation architecture is suboptimal.
Comment: 10 pages
On Low Density Majority Codes
We study a problem of constructing codes that transform a channel with high
bit error rate (BER) into one with low BER (at the expense of rate). Our focus
is on obtaining codes with smooth ("graceful") input-output BER curves (as
opposed to threshold-like curves typical for long error-correcting codes). To
that end we introduce the notion of Low Density Majority Codes (LDMCs). These
codes are non-linear sparse-graph codes whose outputs are majority functions
evaluated on randomly chosen small subsets of the data bits. This is similar to
Low Density Generator Matrix codes (LDGMs), except that the XOR function is
replaced with the majority. We show that even with a few iterations of belief
propagation (BP) the attained input-output curves provably improve upon the
performance of any linear systematic code. The effect of non-linearity
bootstrapping the initial iterations of BP suggests that LDMCs should improve
performance in various applications where LDGMs have been used traditionally
(e.g., pre-coding for optics, tornado raptor codes, protograph constructions).
As a side result of separate interest, we establish a lower (impossibility)
bound on the achievable BER of a systematic linear code at one value of
erasure noise given its BER at another value. We show that this new bound is
superior to the results inferred from the area theorem for EXIT functions.
Comment: This article has been replaced by the updated version
arXiv:1911.1226
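To make the LDMC construction concrete, here is a toy encoder in the spirit of the description above: each output bit is the majority of a small, randomly chosen subset of the data bits. The subset size, seeding, and omission of the systematic part are illustrative assumptions, not the paper's exact construction; replacing the majority with XOR (sum mod 2) would give the LDGM analogue.

```python
import random

def ldmc_encode(data_bits, n_out, degree=3, seed=0):
    """Toy LDMC-style encoder sketch: each of the n_out output bits is the
    majority vote over `degree` data bits chosen at random (odd `degree`
    avoids ties). Illustrative only, not the paper's exact construction."""
    rng = random.Random(seed)
    out = []
    for _ in range(n_out):
        subset = rng.sample(range(len(data_bits)), degree)
        s = sum(data_bits[i] for i in subset)
        out.append(1 if 2 * s > degree else 0)  # majority of the subset
    return out
```

Note the non-linearity: unlike an XOR (LDGM) output, a majority output bit already agrees with most of its inputs, which is what lets early BP iterations extract information before any decoding structure kicks in.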