Convexity and Operational Interpretation of the Quantum Information Bottleneck Function
In classical information theory, the information bottleneck method (IBM) can
be regarded as a method of lossy data compression which focusses on preserving
meaningful (or relevant) information. As such it has recently gained a lot of
attention, primarily for its applications in machine learning and neural
networks. A quantum analogue of the IBM has recently been defined, and Salek et
al. subsequently attempted to provide an operational interpretation of the
so-called quantum IB function as the optimal rate of an information-theoretic
task. However, the interpretation given in that paper has two drawbacks:
firstly, its proof relies on the conjecture that the quantum IB function is
convex; secondly, the expression for the rate function involves certain
entropic quantities which occur explicitly in the very definition of the
underlying information-theoretic task, making the latter somewhat contrived. We
overcome both of these drawbacks by first proving
the convexity of the quantum IB function, and then giving an alternative
operational interpretation of it as the optimal rate of a bona fide
information-theoretic task, namely that of quantum source coding with quantum
side information at the decoder, and relate the quantum IB function to the rate
region of this task. We similarly show that the related privacy funnel function
is convex (both in the classical and quantum case). However, we comment that it
is unlikely that the quantum privacy funnel function can characterize the
optimal asymptotic rate of an information-theoretic task, since even its
classical version lacks a certain additivity property which turns out to be
essential.

Comment: 17 pages, 7 figures; v2: improved presentation and explanations, one
new figure; v3: restructured manuscript. Theorem 2 has been found previously
in work by Hsieh and Watanabe; it is now correctly attributed.
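
For context, the classical IB function referenced in the abstract admits a standard formulation (this is the textbook definition, not an expression taken from this paper): given a joint distribution of random variables $X$ (the source) and $Y$ (the relevant information), and a threshold $t \ge 0$,

```latex
\mathrm{IB}(t) \;=\; \inf_{\substack{W \,:\; Y - X - W, \\ I(Y;W) \,\ge\, t}} I(X;W),
```

where the infimum is over auxiliary random variables $W$ (the compressed representation) satisfying the Markov chain condition $Y - X - W$. Roughly speaking, the quantum IB function studied here is obtained by replacing these classical mutual informations with their quantum counterparts; the convexity of $\mathrm{IB}(t)$ in $t$ is the property whose quantum analogue the paper establishes.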