On privacy amplification, lossy compression, and their duality to channel coding
We examine the task of privacy amplification from information-theoretic and
coding-theoretic points of view. In the former, we give a one-shot
characterization of the optimal rate of privacy amplification against classical
adversaries in terms of the optimal type-II error in asymmetric hypothesis
testing. This formulation can be easily computed to give finite-blocklength
bounds and turns out to be equivalent to smooth min-entropy bounds by Renner
and Wolf [Asiacrypt 2005] and Watanabe and Hayashi [ISIT 2013], as well as a
bound in terms of the divergence by Yang, Schaefer, and Poor
[arXiv:1706.03866 [cs.IT]]. In the latter, we show that protocols for privacy
amplification based on linear codes can be easily repurposed for channel
simulation. Combined with known relations between channel simulation and lossy
source coding, this implies that privacy amplification can be understood as a
basic primitive for both channel simulation and lossy compression. Applied to
symmetric channels or lossy compression settings, our construction leads to
proto- cols of optimal rate in the asymptotic i.i.d. limit. Finally, appealing
to the notion of channel duality recently detailed by us in [IEEE Trans. Info.
Theory 64, 577 (2018)], we show that linear error-correcting codes for
symmetric channels with quantum output can be transformed into linear lossy
source coding schemes for classical variables arising from the dual channel.
This explains a "curious duality" in these problems for the (self-dual) erasure
channel observed by Martinian and Yedidia [Allerton 2003; arXiv:cs/0408008] and
partly anticipates recent results on optimal lossy compression by polar and
low-density generator matrix codes.
Comment: v3: updated to include equivalence of the converse bound with smooth
entropy formulations. v2: updated to include comparison with the one-shot
bounds of arXiv:1706.03866. v1: 11 pages, 4 figures
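As a purely illustrative picture of the coding-theoretic view, privacy amplification is commonly realized by applying a seeded linear hash to the raw key, i.e., multiplying it by a uniformly random binary matrix, which forms a two-universal family. The Python sketch below shows only that primitive; the sizes n and m, the function name linear_hash, and the random seed matrix are invented for illustration and are not the protocol or bounds analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def linear_hash(x_bits, seed_matrix):
    # Apply a binary linear map (the public seed) to the raw key, arithmetic over GF(2).
    return seed_matrix.dot(x_bits) % 2

n, m = 16, 4                            # raw key length and extracted key length (m < n)
M = rng.integers(0, 2, size=(m, n))     # public seed: uniformly random m x n binary matrix
x = rng.integers(0, 2, size=n)          # raw key, partially known to the adversary
key = linear_hash(x, M)                 # extracted (shorter) key
print(key)
```

For distinct inputs x != x', the outputs collide only if M(x - x') = 0, which happens with probability 2^{-m} over the random seed; this two-universality is what standard leftover-hash arguments rely on.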
Sharp Bounds for Optimal Decoding of Low Density Parity Check Codes
Consider communication over a binary-input memoryless output-symmetric
channel with low density parity check (LDPC) codes and maximum a posteriori
(MAP) decoding. The replica method of spin glass theory allows one to conjecture an
analytic formula for the average input-output conditional entropy per bit in
the infinite block length limit. Montanari proved a lower bound for this
entropy, in the case of LDPC ensembles with convex check degree polynomial,
which matches the replica formula. Here we extend this lower bound to any
irregular LDPC ensemble. The new feature of our work is an analysis of the
second derivative of the conditional input-output entropy with respect to
noise. A close relation arises between this second derivative and the correlation
or mutual information of code bits. This allows us to extend the realm of the
interpolation method; in particular, we show how channel symmetry allows us to
control the fluctuations of the overlap parameters.
Comment: 40 pages, submitted to IEEE Transactions on Information Theory
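To make the central quantity concrete, the average input-output conditional entropy per bit H(X|Y)/n can be computed exactly by brute force for a toy parity-check code over a binary symmetric channel, using H(X|Y) = H(X) + H(Y|X) - H(Y) with X uniform over the codewords. The Python sketch below does this for a small hand-picked matrix; the matrix and crossover probability are illustrative choices only, not an LDPC ensemble sample or the replica formula discussed in the paper.

```python
import itertools
import numpy as np

def codewords(H):
    # Enumerate all binary words x with H x = 0 (mod 2) by exhaustive search.
    n = H.shape[1]
    return [np.array(x) for x in itertools.product([0, 1], repeat=n)
            if not (H.dot(x) % 2).any()]

def conditional_entropy_per_bit(H, p):
    # H(X|Y)/n for X uniform over the code and Y the output of a BSC(p),
    # via H(X|Y) = H(X) + H(Y|X) - H(Y); feasible only for tiny block lengths.
    C = codewords(H)
    n = H.shape[1]
    h2 = -p * np.log2(p) - (1 - p) * np.log2(1 - p)   # per-use channel entropy H(Y|X)/n
    HY = 0.0
    for y in itertools.product([0, 1], repeat=n):
        y = np.array(y)
        d = np.array([np.sum(x != y) for x in C])      # Hamming distances to codewords
        Py = np.mean(p**d * (1 - p)**(n - d))          # output probability P(y)
        HY -= Py * np.log2(Py)
    HX = np.log2(len(C))
    return (HX + n * h2 - HY) / n

# Toy 3 x 6 parity-check matrix, chosen by hand for illustration.
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])
print(conditional_entropy_per_bit(H, 0.1))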