On the Corner Points of the Capacity Region of a Two-User Gaussian Interference Channel
This work considers the corner points of the capacity region of a two-user
Gaussian interference channel (GIC). In a two-user GIC, the rate pairs where
one user transmits its data at the single-user capacity (without interference),
and the other at the largest rate for which reliable communication is still
possible are called corner points. This paper relies on existing outer bounds
on the capacity region of a two-user GIC that are used to derive informative
bounds on the corner points of the capacity region. The new bounds refer to a
weak two-user GIC (i.e., when both cross-link gains in standard form are
positive and below 1), and a refinement of these bounds is obtained for the
case where the transmission rate of one user is within ε > 0 of the
single-user capacity. The bounds on the corner points are asymptotically tight
as the transmitted powers tend to infinity, and they are also useful for the
case of moderate SNR and INR. Upper and lower bounds on the gap (denoted by
Δ) between the sum-rate and the maximal achievable total rate at the two
corner points are derived. This is followed by an asymptotic analysis analogous
to the study of the generalized degrees of freedom (where the SNR and INR
scalings are coupled such that log(INR)/log(SNR) = α for a fixed α ≥ 0), leading
to an asymptotic characterization of this gap which is exact for the whole range
of α. The upper and lower bounds on Δ are asymptotically tight in the sense that
they achieve the exact asymptotic characterization. Improved bounds on Δ are
derived for finite SNR and
INR, and their improved tightness is exemplified numerically.
Comment: Submitted to the IEEE Trans. on Information Theory on July 17, 2014,
and revised on April 5, 2015. Presented in part at Allerton 2013, and also
presented in part with improved results at ISIT 2014.
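For context, a brief sketch of the standard two-user GIC setup underlying the abstract may help; the notation below (gains a and b, exponent α) follows common convention and is an assumption here, not taken verbatim from the paper:

```latex
% Two-user GIC in standard form (a sketch under common conventions):
%   Y_1 = X_1 + \sqrt{a}\, X_2 + Z_1,   Y_2 = \sqrt{b}\, X_1 + X_2 + Z_2,
% with Z_1, Z_2 \sim \mathcal{N}(0,1), power constraints P_1, P_2, and
% cross-link gains a, b \in (0,1) in the weak-interference regime.
% The single-user (interference-free) capacities are
\[
  C_i = \tfrac{1}{2}\log\bigl(1 + P_i\bigr), \qquad i \in \{1, 2\},
\]
% and the generalized-degrees-of-freedom analysis couples the scalings via
\[
  \alpha = \frac{\log \mathrm{INR}}{\log \mathrm{SNR}} .
\]
```

A corner point is then a rate pair at which one user operates at its C_i while the other transmits at the largest rate still allowing reliable communication.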
Tight Bounds on the R\'enyi Entropy via Majorization with Applications to Guessing and Compression
This paper provides tight bounds on the R\'enyi entropy of a function of a
discrete random variable with a finite number of possible values, where the
considered function is not one-to-one. To that end, a tight lower bound on the
R\'enyi entropy of a discrete random variable with a finite support is derived
as a function of the size of the support, and the ratio of the maximal to
minimal probability masses. This work was inspired by the recently published
paper by Cicalese et al., which is focused on the Shannon entropy, and it
strengthens and generalizes the results of that paper to R\'enyi entropies of
arbitrary positive orders. In view of these generalized bounds and the works by
Arikan and Campbell, non-asymptotic bounds are derived for guessing moments and
lossless data compression of discrete memoryless sources.
Comment: The paper was published in the Entropy journal (special issue on
Probabilistic Methods in Information Theory, Hypothesis Testing, and Coding),
vol. 20, no. 12, paper no. 896, November 22, 2018. Online available at
https://www.mdpi.com/1099-4300/20/12/89
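As a concrete illustration of the quantities involved, here is a minimal sketch using the standard definition of the Rényi entropy (not the paper's bounds); the distributions are made up, and the non-injective function f simply merges two outcomes:

```python
import math

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha (in bits) of a probability vector p."""
    if abs(alpha - 1.0) < 1e-12:  # the order-1 limit is the Shannon entropy
        return -sum(x * math.log2(x) for x in p if x > 0)
    s = sum(x ** alpha for x in p if x > 0)
    return math.log2(s) / (1.0 - alpha)

# X uniform over 4 values; f maps two of them to one output (not one-to-one)
p_x = [0.25, 0.25, 0.25, 0.25]
p_fx = [0.5, 0.25, 0.25]

for a in (0.5, 2.0, 3.0):
    hx, hfx = renyi_entropy(p_x, a), renyi_entropy(p_fx, a)
    assert hfx <= hx  # a non-injective function can only lose Renyi entropy
    print(f"alpha={a}: H(X)={hx:.3f} bits, H(f(X))={hfx:.3f} bits")
```

The paper's lower bound controls how much entropy such a merge can lose, in terms of the support size and the ratio of the maximal to minimal probability masses.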
On Universal Properties of Capacity-Approaching LDPC Ensembles
This paper is focused on the derivation of some universal properties of
capacity-approaching low-density parity-check (LDPC) code ensembles whose
transmission takes place over memoryless binary-input output-symmetric (MBIOS)
channels. Properties of the degree distributions, graphical complexity and the
number of fundamental cycles in the bipartite graphs are considered via the
derivation of information-theoretic bounds. These bounds are expressed in terms
of the target block/bit error probability and the gap (in rate) to capacity.
Most of the bounds are general for any decoding algorithm, and some others are
proved under belief propagation (BP) decoding. Proving these bounds under a
certain decoding algorithm automatically validates them under any sub-optimal
decoding algorithm. A proper modification of these bounds makes
them universal for the set of all MBIOS channels which exhibit a given
capacity. Bounds on the degree distributions and graphical complexity apply to
finite-length LDPC codes and to the asymptotic case of an infinite block
length. The bounds are compared with capacity-approaching LDPC code ensembles
under BP decoding, and they are shown to be informative and easy to
calculate. Finally, some interesting open problems are considered.
Comment: Published in the IEEE Trans. on Information Theory, vol. 55, no. 7,
pp. 2956-2990, July 2009.
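To make the ensemble parameters concrete, here is a minimal sketch of how the design rate of an LDPC ensemble follows from its degree distributions; this is the standard formula, and the (3,6)-regular example is illustrative rather than taken from the paper:

```python
def design_rate(lam, rho):
    """Design rate of an LDPC ensemble with edge-perspective degree
    distributions lambda(x) = sum lam_i x^(i-1), rho(x) = sum rho_i x^(i-1),
    given as dicts degree -> edge fraction: R = 1 - (int rho) / (int lam)."""
    int_lam = sum(f / d for d, f in lam.items())  # integral of lambda on [0,1]
    int_rho = sum(f / d for d, f in rho.items())  # integral of rho on [0,1]
    return 1.0 - int_rho / int_lam

# (3,6)-regular ensemble: all variable nodes of degree 3, all checks of degree 6
r = design_rate({3: 1.0}, {6: 1.0})
print(r)  # the familiar rate-1/2 ensemble
```

The paper's bounds relate such degree distributions (and the resulting graphical complexity) to the achievable gap to capacity over MBIOS channels.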
On the Entropy of Sums of Bernoulli Random Variables via the Chen-Stein Method
This paper considers the entropy of the sum of (possibly dependent and
non-identically distributed) Bernoulli random variables. Upper bounds on the
error that follows from an approximation of this entropy by the entropy of a
Poisson random variable with the same mean are derived. The derivation of these
bounds combines elements of information theory with the Chen-Stein method for
Poisson approximation. The resulting bounds are easy to compute, and their
applicability is exemplified. This conference paper presents in part the first
half of the paper entitled "An information-theoretic perspective of the Poisson
approximation via the Chen-Stein method" (see arXiv:1206.6811). A
generalization of the bounds that considers the accuracy of the Poisson
approximation for the entropy of a sum of non-negative, integer-valued and
bounded random variables is introduced in the full paper. It also derives lower
bounds on the total variation distance, relative entropy and other measures
that are not considered in this conference paper.
Comment: A conference paper of 5 pages that appears in the Proceedings of the
2012 IEEE International Workshop on Information Theory (ITW 2012), pp.
542-546, Lausanne, Switzerland, September 2012.
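The flavor of the approximation can be checked numerically; the sketch below compares the entropy of an i.i.d. Bernoulli sum (a binomial, the simplest special case, not the paper's dependent setting) with that of a Poisson variable of the same mean, with made-up parameters:

```python
import math

def shannon_entropy(pmf):
    """Shannon entropy (in bits) of a probability mass function."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# Sum of n = 100 i.i.d. Bernoulli(0.03) variables, so the mean is lam = 3
n, p = 100, 0.03
binom = [math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

# Poisson(lam) pmf, truncated where the tail mass is negligible
lam = n * p
poiss = [math.exp(-lam) * lam**k / math.factorial(k) for k in range(60)]

h_binom = shannon_entropy(binom)
h_poiss = shannon_entropy(poiss)
print(f"H(sum of Bernoullis) = {h_binom:.4f} bits, H(Poisson) = {h_poiss:.4f} bits")
```

The two entropies agree closely here; the paper's contribution is explicit upper bounds on this approximation error, derived via the Chen-Stein method, which also cover dependent and non-identically distributed summands.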