A Tight Upper Bound on Mutual Information
We derive a tight lower bound on equivocation (conditional entropy), or
equivalently a tight upper bound on mutual information between a signal
variable and channel outputs. The bound is in terms of the joint distribution
of the signals and maximum a posteriori decodes (most probable signals given
channel output). As part of our derivation, we describe the key properties of
the distribution of signals, channel outputs, and decodes that minimizes
equivocation and maximizes mutual information. This work addresses a problem in
data analysis, where mutual information between signals and decodes is
sometimes used to lower bound the mutual information between signals and
channel outputs. Our result provides a corresponding upper bound.
Comment: 6 pages, 3 figures; proof illustration added
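The data-analysis practice the abstract alludes to rests on the data-processing inequality; a minimal sketch, in our own notation (signal $S$, channel output $R$, MAP decode $\hat S = f(R)$, not necessarily the paper's symbols):
$$ I(S;\hat S)\;\le\;I(S;R)\qquad\Longleftrightarrow\qquad H(S\mid R)\;\le\;H(S\mid \hat S), $$
so $I(S;\hat S)$ is always a valid lower bound on $I(S;R)$. The result described above supplies the missing direction: an upper bound on $I(S;R)$ (equivalently a lower bound on $H(S\mid R)$) computable from the joint distribution of $S$ and $\hat S$ alone.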
Correlation in Hard Distributions in Communication Complexity
We study the effect that the amount of correlation in a bipartite
distribution has on the communication complexity of a problem under that
distribution. We introduce a new family of complexity measures that
interpolates between the two previously studied extreme cases: the (standard)
randomised communication complexity and the case of distributional complexity
under product distributions.
We give a tight characterisation of the randomised complexity of Disjointness
under distributions with mutual information $k$, showing that it is
$\Theta(\sqrt{n(k+1)})$ for all $0 \le k \le n$. This smoothly interpolates
between the lower bounds of Babai, Frankl and Simon for the product
distribution case ($k=0$), and the bound of Razborov for the randomised case.
The upper bounds improve and generalise what was known for product
distributions, and imply that any tight bound for Disjointness needs
$\Omega(n)$ bits of mutual information in the corresponding distribution.
We study the same question in the distributional quantum setting, and show a
lower bound of $\Omega((n(k+1))^{1/4})$, and an upper bound, matching up to a
logarithmic factor.
We show that there are total Boolean functions on inputs that have
distributional communication complexity under all distributions of
information up to , while the (interactive) distributional complexity
maximised over all distributions is for .
We show that in the setting of one-way communication under product
distributions, the dependence of communication cost on the allowed error
$\epsilon$ is multiplicative in $\log(1/\epsilon)$ -- the previous upper bounds
had a dependence of more than $1/\epsilon$.
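A schematic way to write the interpolating family of measures (the symbols $D_\mu$, $I_\mu$, and the information budget $k$ are illustrative and may not match the paper's notation):
$$ D^{k}(f)\;=\;\max_{\mu\,:\,I_\mu(X;Y)\le k}\; D_\mu(f), $$
where $D_\mu(f)$ is the distributional complexity of $f$ under input distribution $\mu$ and $I_\mu(X;Y)$ is the mutual information between the two players' inputs under $\mu$. The case $k=0$ restricts the maximum to product distributions, while an unbounded $k$ recovers, via Yao's minimax principle, the standard randomised communication complexity.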
On the Capacity of the Wiener Phase-Noise Channel: Bounds and Capacity Achieving Distributions
In this paper, the capacity of the additive white Gaussian noise (AWGN)
channel, affected by time-varying Wiener phase noise, is investigated. Tight
upper and lower bounds on the capacity of this channel are developed. The upper
bound is obtained by using the duality approach, and considering a specific
distribution over the output of the channel. In order to lower-bound the
capacity, first a family of capacity-achieving input distributions is found by
solving a functional optimization of the channel mutual information. Then,
lower bounds on the capacity are obtained by drawing samples from the proposed
distributions through Monte Carlo simulations. The proposed capacity-achieving
input distributions are circularly symmetric, non-Gaussian, and the input
amplitudes are correlated over time. The evaluated capacity bounds are tight
for a wide range of signal-to-noise ratio (SNR) values, and thus they can be
used to quantify the capacity. Specifically, the bounds follow the well-known
AWGN capacity curve at low SNR, while at high SNR, they coincide with the
high-SNR capacity result available in the literature for the phase-noise
channel.
Comment: IEEE Transactions on Communications, 201
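For context, the channel described here is usually written in discrete time as follows; the normalisation is illustrative and may differ from the paper's:
$$ y_k \;=\; x_k\,e^{j\theta_k} + n_k,\qquad \theta_k \;=\; \theta_{k-1} + \Delta_k,\quad \Delta_k\sim\mathcal N(0,\sigma_\Delta^2),\quad n_k\sim\mathcal{CN}(0,\sigma_n^2), $$
so the phase $\theta_k$ evolves as a Wiener (random-walk) process on top of the additive white Gaussian noise $n_k$; the memory in $\theta_k$ is what makes temporally correlated input amplitudes relevant, as described above.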
Demystifying Fixed k-Nearest Neighbor Information Estimators
Estimating mutual information from i.i.d. samples drawn from an unknown joint
density function is a basic statistical problem of broad interest with
multitudinous applications. The most popular estimator is one proposed by
Kraskov, Stögbauer, and Grassberger (KSG) in 2004, and is nonparametric and
based on the distances of each sample to its $k$-th nearest neighboring
sample, where $k$ is a fixed small integer. Despite its widespread use (part of
scientific software packages), theoretical properties of this estimator have
been largely unexplored. In this paper we demonstrate that the estimator is
consistent and also identify an upper bound on the rate of convergence of the
bias as a function of the number of samples. We argue that the superior
performance of the KSG estimator stems from a curious "correlation boosting"
effect and build on this intuition to modify the KSG estimator in novel ways to
construct a superior estimator. As a byproduct of our investigations, we obtain
nearly tight rates of convergence of the error of the well-known fixed
$k$-nearest neighbor estimator of differential entropy by Kozachenko and
Leonenko.
Comment: 55 pages, 8 figures
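Since the abstract only names the estimator, here is a minimal sketch of the KSG estimator (algorithm 1 of the 2004 paper) for scalar $x$ and $y$, using the usual max-norm construction; the function name and implementation details are ours, not the authors' reference code:

import numpy as np
from scipy.special import digamma

def ksg_mi(x, y, k=3):
    """KSG mutual information estimate (in nats) for scalar samples x, y."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    n = len(x)
    dx = np.abs(x[:, None] - x[None, :])       # pairwise distances in x
    dy = np.abs(y[:, None] - y[None, :])       # pairwise distances in y
    dz = np.maximum(dx, dy)                    # max-norm distance in the joint space
    np.fill_diagonal(dz, np.inf)               # exclude each point from its own neighbours
    eps = np.sort(dz, axis=1)[:, k - 1]        # distance to the k-th nearest neighbour
    # number of marginal neighbours strictly inside eps (self excluded)
    nx = np.sum(dx < eps[:, None], axis=1) - 1
    ny = np.sum(dy < eps[:, None], axis=1) - 1
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))

On jointly Gaussian samples with correlation $\rho$, this should recover the closed-form value $-\tfrac12\ln(1-\rho^2)$ up to the bias and variance analysed in the paper.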
Best Information is Most Successful
Using information-theoretic tools, this paper establishes a mathematical link between the probability of success of a side-channel attack and the minimum number of queries to reach a given success rate, valid for any possible distinguishing rule and with the best possible knowledge on the attacker's side. This link is a lower bound on the number of queries that highly depends on Shannon's mutual information between the traces and the secret key. This leads us to derive upper bounds on the mutual information that are as tight as possible and can be easily calculated. It turns out that, in the case of an additive white Gaussian noise, the bound on the probability of success of any attack is directly related to the signal-to-noise ratio.
This leads to very easy computations and predictions of the success rate in any leakage model.
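The SNR connection in the Gaussian case can be illustrated as follows (our notation, not the paper's exact statement): if each trace is $X = f(K) + N$ with $N\sim\mathcal N(0,\sigma^2)$ independent of the key $K$, then the per-query information is capped by
$$ I(X;K)\;\le\;\tfrac12\log_2\!\bigl(1+\mathrm{SNR}\bigr),\qquad \mathrm{SNR}=\frac{\operatorname{Var}(f(K))}{\sigma^2}, $$
so a query-count bound driven by the mutual information between traces and key translates directly into one driven by the signal-to-noise ratio.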
Contraction of Locally Differentially Private Mechanisms
We investigate the contraction properties of locally differentially private
mechanisms. More specifically, we derive tight upper bounds on the divergence
between the output distributions $P\mathsf{K}$ and $Q\mathsf{K}$ of an
$\varepsilon$-LDP mechanism $\mathsf{K}$ in terms of a divergence between the
corresponding input distributions $P$ and $Q$, respectively. Our first main
technical result presents a sharp upper bound on the $\chi^2$-divergence
$\chi^2(P\mathsf{K}\|Q\mathsf{K})$ in terms of $\chi^2(P\|Q)$ and
$\varepsilon$. We also show that the same result holds for a large family of
divergences, including KL-divergence and squared Hellinger distance. The second
main technical result gives an upper bound on $\chi^2(P\mathsf{K}\|Q\mathsf{K})$
in terms of the total variation distance $\mathsf{TV}(P,Q)$
and $\varepsilon$. We then utilize these bounds to
establish locally private versions of the van Trees inequality, Le Cam's,
Assouad's, and the mutual information methods, which are powerful tools for
bounding minimax estimation risks. These results are shown to lead to better
privacy analyses than the state of the art in several statistical problems
such as entropy and discrete distribution estimation, non-parametric density
estimation, and hypothesis testing.
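For intuition, a classical contraction bound of this flavour (a well-known consequence of the $\varepsilon$-LDP likelihood-ratio constraint, stated here only as an illustration rather than as one of the paper's theorems) is
$$ \mathsf{TV}(P\mathsf K, Q\mathsf K)\;\le\;\frac{e^{\varepsilon}-1}{e^{\varepsilon}+1}\,\mathsf{TV}(P,Q), $$
so in the high-privacy regime the mechanism shrinks statistical distances by a factor of roughly $\varepsilon/2$; it is precisely this shrinkage that feeds into Le Cam-, Assouad- and mutual-information-style minimax lower bounds.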
Multi-User Privacy Mechanism Design with Non-zero Leakage
A privacy mechanism design problem is studied through the lens of information
theory. In this work, an agent observes useful data $Y$ that is
correlated with private data $X$ which is assumed to be also
accessible by the agent. Here, we consider $K$ users where user $i$ demands a
sub-vector of $Y$, denoted by $C_i$. The agent wishes to disclose $C_i$ to
user $i$. Since $C_i$ is correlated with $X$ it cannot be disclosed
directly. A privacy mechanism is designed to generate disclosed data $U$ which
maximizes a linear combination of the users' utilities while satisfying a
bounded privacy constraint in terms of mutual information. In a similar work it
has been assumed that $X$ is a deterministic function of $Y$, however in
this work we let $X$ and $Y$ be arbitrarily correlated. First, an upper
bound on the privacy-utility trade-off is obtained by using a specific
transformation, the Functional Representation Lemma, and the Strong Functional
Representation Lemma; then we show that the upper bound can be decomposed into
$K$ parallel problems. Next, lower bounds on the privacy-utility trade-off are
derived using the Functional Representation Lemma and the Strong Functional
Representation Lemma. The upper bound is tight within a constant, and the lower
bounds assert that the disclosed data is independent of all
except one, to which we allocate the maximum allowed leakage. Finally, the
obtained bounds are studied in special cases.
Comment: arXiv admin note: text overlap with arXiv:2205.04881,
arXiv:2201.0873
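Schematically, and in the illustrative notation used above (which may not be the paper's exact formulation), the design problem reads
$$ \max_{P_{U\mid Y,X}}\;\sum_{i=1}^{K} w_i\, I(U_i;C_i)\quad\text{subject to}\quad I(U;X)\le\epsilon, $$
where $U=(U_1,\dots,U_K)$ is the disclosed data, $w_i\ge 0$ are the weights of the linear combination, and $\epsilon$ is the permitted mutual-information leakage about the private data $X$.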