3 research outputs found
Beyond the Central Limit Theorem: Universal and Non-universal Simulations of Random Variables by General Mappings
Motivated by the Central Limit Theorem, in this paper, we study both
universal and non-universal simulations of random variables with an arbitrary
target distribution by general mappings, not limited to linear ones (as
in the Central Limit Theorem). We derive the fastest convergence rate of the
approximation errors for such problems. Interestingly, we show that for
discontinuous or absolutely continuous target distributions, the approximation error for the
universal simulation is almost as small as that for the non-universal one; and
moreover, for both universal and non-universal simulations, the approximation
errors by general mappings are strictly smaller than those by linear mappings.
Furthermore, we also generalize these results to simulation from Markov
processes, and simulation of random elements (or general random variables).
Comment: 25 pages.
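To make the abstract's contrast concrete, here is a minimal Python sketch (illustrative only, not the paper's construction; function names are ours): a general non-linear mapping (the inverse CDF) carries a single uniform seed exactly onto an exponential target, whereas a linear, CLT-style mapping of i.i.d. uniform seeds only approximates a Gaussian target.

```python
import math
import random

def simulate_exponential(n, lam=1.0, rng=random):
    """General (non-linear) mapping: inverse-CDF transform of a uniform seed.

    F_inv(u) = -ln(1 - u) / lam maps Uniform(0,1) exactly onto Exp(lam),
    so the target distribution is simulated with no approximation error.
    """
    return [-math.log(1.0 - rng.random()) / lam for _ in range(n)]

def simulate_gaussian_clt(n, k=256, rng=random):
    """Linear mapping: standardized sum of k i.i.d. Uniform(0,1) seeds.

    (S_k - k/2) / sqrt(k/12) is only approximately N(0,1) by the CLT,
    illustrating the restricted class of linear mappings.
    """
    out = []
    for _ in range(n):
        s = sum(rng.random() for _ in range(k))
        out.append((s - k / 2.0) / math.sqrt(k / 12.0))
    return out

if __name__ == "__main__":
    rng = random.Random(0)
    xs = simulate_exponential(20000, rng=rng)
    zs = simulate_gaussian_clt(2000, rng=rng)
    print(sum(xs) / len(xs))  # sample mean, close to 1 (mean of Exp(1))
    print(sum(zs) / len(zs))  # sample mean, close to 0 (mean of N(0,1))
```

The inverse-CDF map is one example of the "general mappings" the abstract refers to; the standardized sum is the linear special case whose approximation error the paper shows to be strictly larger.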
On Exact and $\infty$-R\'enyi Common Informations
Recently, two extensions of Wyner's common information\textemdash exact and
R\'enyi common informations\textemdash were introduced respectively by Kumar,
Li, and El Gamal (KLE), and the present authors. The class of common
information problems involves determining the minimum rate of the common input
to two independent processors needed to exactly or approximately generate a
target joint distribution. For the exact common information problem, exact
generation of the target distribution is required, while for Wyner's and
$\infty$-R\'enyi common informations, the relative entropy and the R\'enyi
divergence with order~$\infty$ were respectively used to quantify the
discrepancy between the synthesized and target distributions. The exact common
information is larger than or equal to Wyner's common information. However, it
was hitherto unknown whether the former is strictly larger than the latter for
some joint distributions. In this paper, we first establish the equivalence
between the exact and $\infty$-R\'enyi common informations, and then provide
single-letter upper and lower bounds for these two quantities. For doubly
symmetric binary sources, we show that the upper and lower bounds coincide,
which implies that for such sources, the exact and $\infty$-R\'enyi common
informations are completely characterized. Interestingly, we observe that for
such sources, these two common informations are strictly larger than Wyner's.
This answers an open problem posed by KLE. Furthermore, we extend Wyner's,
$\infty$-R\'enyi, and exact common informations to sources with countably
infinite or continuous alphabets, including Gaussian sources.
Comment: 42 pages.
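For reference, the quantities compared in the abstract can be written in standard notation (these are the textbook definitions, not reproduced from the paper; notation is ours). Wyner's common information is the minimum rate of a common input $W$ from which two independent processors can approximately generate $(X,Y)$, and the R\'enyi divergence of order $\alpha$ tends, as $\alpha \to \infty$, to the max-divergence underlying the $\infty$-R\'enyi common information:

```latex
% Wyner's common information: minimize over W making X - W - Y a Markov chain.
\[
  C_{\mathrm{W}}(X;Y) \;=\; \min_{P_{W\mid XY}\,:\; X - W - Y} I(XY;\,W).
\]
% Renyi divergence of order \alpha, and its order-\infty (max-divergence) limit:
\[
  D_{\alpha}(P \,\|\, Q) \;=\; \frac{1}{\alpha-1}
    \log \sum_{x} P(x)^{\alpha}\, Q(x)^{1-\alpha},
  \qquad
  D_{\infty}(P \,\|\, Q) \;=\; \log \max_{x} \frac{P(x)}{Q(x)}.
\]
```

Replacing the relative entropy in Wyner's formulation by $D_{\infty}$ yields the $\infty$-R\'enyi common information, the quantity the paper shows to coincide with the exact common information.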
R\'enyi Resolvability and Its Applications to the Wiretap Channel
The conventional channel resolvability problem refers to the determination of
the minimum rate required for an input process so that the output distribution
approximates a target distribution in either the total variation distance or
the relative entropy. In contrast to previous works, in this paper, we use the
(normalized or unnormalized) R\'enyi divergence (with the R\'enyi parameter in
$[0,2]\cup\{\infty\}$) to measure the level of approximation. We also provide
asymptotic expressions for the normalized R\'enyi divergence when the R\'enyi
parameter is larger than or equal to~$1$, as well as (lower and upper) bounds
for the case when the same parameter is smaller than~$1$. We characterize the
R\'enyi resolvability, which is defined as the minimum rate required to ensure
that the R\'enyi divergence vanishes asymptotically. The R\'enyi
resolvabilities are the same for both the normalized and unnormalized
divergence cases. In addition, when the R\'enyi parameter is smaller than~$1$,
consistent with the traditional case where the R\'enyi parameter is equal
to~$1$, the R\'enyi resolvability equals the minimum mutual information over
all input distributions that induce the target output distribution. When the
R\'enyi parameter is larger than~$1$, the R\'enyi resolvability is, in general,
larger than the mutual information. The optimal R\'enyi divergence is proven to
vanish at least exponentially fast in both of these cases, as long as the
code rate is larger than the R\'enyi resolvability. The optimal exponential
rate of decay for i.i.d.\ random codes is also characterized exactly. We apply
these results to the wiretap channel, and completely characterize the optimal
tradeoff between the rates of the secret and non-secret messages when the
leakage measure is given by the (unnormalized) R\'enyi divergence.
Comment: 37 pages. To appear in IEEE Transactions on Information Theory.
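The approximation measure in this abstract can be sketched in standard notation (ours, not quoted from the paper): the normalized R\'enyi divergence between the synthesized output distribution $P_{Y^n}$ and the $n$-fold target $\pi_Y^n$ is required to vanish, and the order-$1$ case recovers the classical relative-entropy formulation:

```latex
% Normalized Renyi approximation criterion, and the order-1 (KL) limit:
\[
  \frac{1}{n}\, D_{\alpha}\!\left( P_{Y^n} \,\middle\|\, \pi_Y^{n} \right)
  \;\longrightarrow\; 0,
  \qquad
  \lim_{\alpha \to 1} D_{\alpha}(P \,\|\, Q) \;=\; D(P \,\|\, Q).
\]
% For Renyi parameters at most 1, the resolvability takes the familiar form
\[
  R^{*} \;=\; \min_{P_X \,:\; P_Y = \pi_Y} I(X;Y),
\]
% the minimum mutual information over inputs inducing the target output.
```

For orders larger than $1$, the abstract notes that the resolvability is in general strictly larger than this mutual-information quantity.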