On contraction coefficients, partial orders and approximation of capacities for quantum channels
The data processing inequality is the most basic requirement for any
meaningful measure of information. It essentially states that
distinguishability measures between states decrease if we apply a quantum
channel. It is the centerpiece of many results in information theory and
justifies the operational interpretation of most entropic quantities. In this
work, we revisit the notion of contraction coefficients of quantum channels,
which provide sharper and specialized versions of the data processing
inequality. A concept closely related to data processing is that of partial orders
on quantum channels. We discuss several quantum extensions of the well-known
less noisy ordering and then relate them to contraction coefficients. We further
define approximate versions of the partial orders and show how they can give
strengthened and conceptually simple proofs of several results on approximating
capacities. Moreover, we investigate the relation to other partial orders in
the literature and their properties, particularly with regard to
tensorization. We then study further properties of contraction
coefficients and their relation to other properties of quantum channels, such
as hypercontractivity. Next, we extend the framework of contraction
coefficients to general f-divergences and prove several structural results.
Finally, we consider two important classes of quantum channels, namely
Weyl-covariant and bosonic Gaussian channels. For those, we determine new
contraction coefficients and relations for various partial orders.
Comment: 47 pages, 2 figures
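As background for the quantities named in the abstract above: for a quantum channel $\mathcal{N}$ and states $\rho, \sigma$, the data processing inequality and the contraction coefficient for the relative entropy are commonly written as

\[
D\bigl(\mathcal{N}(\rho)\,\|\,\mathcal{N}(\sigma)\bigr) \;\le\; D(\rho\,\|\,\sigma),
\qquad
\eta(\mathcal{N}) \;=\; \sup_{\substack{\rho \neq \sigma \\ 0 < D(\rho\,\|\,\sigma) < \infty}} \frac{D\bigl(\mathcal{N}(\rho)\,\|\,\mathcal{N}(\sigma)\bigr)}{D(\rho\,\|\,\sigma)} \;\le\; 1,
\]

so a channel with $\eta(\mathcal{N}) < 1$ satisfies a strictly sharper, channel-specific version of data processing. This is a standard formulation added here for orientation; conventions (e.g. the exact domain of the supremum) vary in the literature, and the paper's own definitions may differ in detail.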
A Note on the Probability of Rectangles for Correlated Binary Strings
Consider two sequences of $n$ independent and identically distributed fair
coin tosses, $X = (X_1, \ldots, X_n)$ and $Y = (Y_1, \ldots, Y_n)$, which are
$\rho$-correlated for each $j$, i.e. $\mathbb{P}[X_j = Y_j] = \frac{1+\rho}{2}$.
We study the question of how large (small) the probability
$\mathbb{P}[X \in A, Y \in B]$ can be among all sets $A, B \subseteq \{0,1\}^n$
of a given cardinality. For sets $|A| = |B| = \Theta(2^n)$ it is well known
that the largest (smallest) probability is approximately attained by
concentric (anti-concentric) Hamming balls, and this can be proved via the
hypercontractive inequality (reverse hypercontractivity). Here we consider the
case of $|A| = |B| = 2^{\Theta(n)}$. By applying a recent extension of the
hypercontractive inequality of Polyanskiy-Samorodnitsky (J. Functional
Analysis, 2019), we show that Hamming balls of the same size approximately
maximize $\mathbb{P}[X \in A, Y \in B]$ in the regime of $\rho \to 1$. We also
prove a similar tight lower bound, i.e. show that for $\rho \to 0$ the pair of
opposite Hamming balls approximately minimizes the probability
$\mathbb{P}[X \in A, Y \in B]$.
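To make the rectangle probability above concrete, the following is a minimal Monte Carlo sketch (our illustration, not the paper's method, which is analytic and based on hypercontractivity) that estimates $\mathbb{P}[X \in A, Y \in B]$ when $A$ and $B$ are concentric Hamming balls around the all-zeros string; the values of n, rho and radius are arbitrary illustrative choices.

import numpy as np

rng = np.random.default_rng(0)

def sample_correlated_pair(n, rho, trials):
    # Draw `trials` pairs (X, Y) of n fair coin tosses with P[X_j = Y_j] = (1 + rho) / 2.
    x = rng.integers(0, 2, size=(trials, n))
    flip = rng.random((trials, n)) > (1 + rho) / 2  # each coordinate flips with prob. (1 - rho) / 2
    y = np.where(flip, 1 - x, x)
    return x, y

def in_hamming_ball(bits, radius):
    # Membership in the Hamming ball of the given radius centred at the all-zeros string.
    return bits.sum(axis=1) <= radius

n, rho, radius, trials = 20, 0.9, 6, 200_000
x, y = sample_correlated_pair(n, rho, trials)
estimate = np.mean(in_hamming_ball(x, radius) & in_hamming_ball(y, radius))
print(f"Estimated P[X in A, Y in B] for concentric Hamming balls: {estimate:.4f}")

Increasing rho pushes the estimate toward the measure of a single ball (since Y nearly coincides with X), which is the regime in which the paper shows Hamming balls are approximately optimal.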
Information Extraction Under Privacy Constraints
A privacy-constrained information extraction problem is considered where for
a pair of correlated discrete random variables $(X, Y)$ governed by a given
joint distribution, an agent observes $Y$ and wants to convey to a potentially
public user as much information about $Y$ as possible without compromising the
amount of information revealed about $X$. To this end, the so-called
rate-privacy function is introduced to quantify the maximal amount of
information (measured in terms of mutual information) that can be extracted
from $Y$ under a privacy constraint between $X$ and the extracted information,
where privacy is measured using either mutual information or maximal
correlation. Properties of the rate-privacy function are analyzed and
information-theoretic and estimation-theoretic interpretations of it are
presented for both the mutual information and maximal correlation privacy
measures. It is also shown that the rate-privacy function admits a closed-form
expression for a large family of joint distributions of $(X, Y)$. Finally, the
rate-privacy function under the mutual information privacy measure is
considered for the case where $(X, Y)$ has a joint probability density function
by studying the problem where the extracted information is a uniform
quantization of $Y$ corrupted by additive Gaussian noise. The asymptotic
behavior of the rate-privacy function is studied as the quantization resolution
grows without bound and it is observed that not all of the properties of the
rate-privacy function carry over from the discrete to the continuous case.
Comment: 55 pages, 6 figures. Improved the organization and added detailed
literature review.
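For orientation, the quantity described in the abstract above can be written as follows; the symbol $g_\epsilon$, the auxiliary variable $Z$ for the extracted information, and the Markov-chain constraint are our reading of the description rather than a quotation of the paper's exact definition. With $Z$ obtained from $Y$ through a randomized mapping, so that $X - Y - Z$ forms a Markov chain, the rate-privacy function under the mutual information privacy measure would be

\[
g_\epsilon(X; Y) \;=\; \sup_{\substack{P_{Z \mid Y}\,:\; X - Y - Z \\ I(X; Z) \,\le\, \epsilon}} I(Y; Z),
\]

i.e. the largest amount of information about $Y$ that can be released while the information leaked about $X$ stays within the privacy budget $\epsilon$; the maximal correlation variant replaces the constraint $I(X; Z) \le \epsilon$ by a bound on the maximal correlation between $X$ and $Z$.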
Concentration of Measure Inequalities in Information Theory, Communications and Coding (Second Edition)
During the last two decades, concentration inequalities have been the subject
of exciting developments in various areas, including convex geometry,
functional analysis, statistical physics, high-dimensional statistics, pure and
applied probability theory, information theory, theoretical computer science,
and learning theory. This monograph focuses on some of the key modern
mathematical tools that are used for the derivation of concentration
inequalities, on their links to information theory, and on their various
applications to communications and coding. In addition to being a survey, this
monograph also includes various new results derived by the authors. The
first part of the monograph introduces classical concentration inequalities for
martingales, as well as some recent refinements and extensions. The power and
versatility of the martingale approach are exemplified in the context of codes
defined on graphs and iterative decoding algorithms, as well as codes for
wireless communication. The second part of the monograph introduces the entropy
method, an information-theoretic technique for deriving concentration
inequalities. The basic ingredients of the entropy method are discussed first
in the context of logarithmic Sobolev inequalities, which underlie the
so-called functional approach to concentration of measure, and then from a
complementary information-theoretic viewpoint based on transportation-cost
inequalities and probability in metric spaces. Some representative results on
concentration for dependent random variables are briefly summarized, with
emphasis on their connections to the entropy method. Finally, we discuss
several applications of the entropy method to problems in communications and
coding, including strong converses, empirical distributions of good channel
codes, and an information-theoretic converse for concentration of measure.
Comment: Foundations and Trends in Communications and Information Theory,
vol. 10, no. 1-2, pp. 1-248, 2013. The second edition was published in October 2014.
ISBN of the printed book: 978-1-60198-906-
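As one concrete instance of the classical martingale concentration inequalities covered in the first part of the monograph (the choice of statement here is ours, added for orientation), the Azuma-Hoeffding inequality states that if $(M_k)_{k=0}^{n}$ is a martingale with bounded differences $|M_k - M_{k-1}| \le d_k$ almost surely, then

\[
\mathbb{P}\bigl(|M_n - M_0| \ge r\bigr) \;\le\; 2 \exp\!\left(-\frac{r^2}{2 \sum_{k=1}^{n} d_k^2}\right)
\qquad \text{for every } r > 0.
\]

Refinements of bounds of exactly this type, together with their information-theoretic counterparts obtained via the entropy method, are the kind of results the monograph surveys.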