
    Entropy and the Law of Small Numbers

    Two new information-theoretic methods are introduced for establishing Poisson approximation inequalities. First, using only elementary information-theoretic techniques it is shown that, when $S_n=\sum_{i=1}^n X_i$ is the sum of the (possibly dependent) binary random variables $X_1,X_2,\dots,X_n$, with $E(X_i)=p_i$ and $E(S_n)=\lambda$, then
    $$D(P_{S_n}\|\mathrm{Po}(\lambda)) \le \sum_{i=1}^n p_i^2 + \Big[\sum_{i=1}^n H(X_i) - H(X_1,X_2,\dots,X_n)\Big],$$
    where $D(P_{S_n}\|\mathrm{Po}(\lambda))$ is the relative entropy between the distribution of $S_n$ and the Poisson($\lambda$) distribution. The first term in this bound measures the individual smallness of the $X_i$ and the second term measures their dependence. A general method is outlined for obtaining corresponding bounds when approximating the distribution of a sum of general discrete random variables by an infinitely divisible distribution. Second, in the particular case when the $X_i$ are independent, the following sharper bound is established:
    $$D(P_{S_n}\|\mathrm{Po}(\lambda)) \le \frac{1}{\lambda} \sum_{i=1}^n \frac{p_i^3}{1-p_i},$$
    and it is also generalized to the case when the $X_i$ are general integer-valued random variables. Its proof is based on the derivation of a subadditivity property for a new discrete version of the Fisher information, and uses a recent logarithmic Sobolev inequality for the Poisson distribution.
    Comment: 15 pages. To appear, IEEE Trans. Inform. Theory.
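    As a quick numerical illustration of both bounds in the independent case (where the dependence term vanishes), the Python sketch below computes $D(P_{S_n}\|\mathrm{Po}(\lambda))$ exactly for a handful of Bernoulli parameters; the parameter values are arbitrary and logarithms are natural.

    import math

    def poisson_binomial_pmf(ps):
        # Exact law of S_n = sum of independent Bernoulli(p_i), by convolution.
        pmf = [1.0]
        for p in ps:
            new = [0.0] * (len(pmf) + 1)
            for k, q in enumerate(pmf):
                new[k] += q * (1 - p)   # X_i = 0
                new[k + 1] += q * p     # X_i = 1
            pmf = new
        return pmf

    def kl_vs_poisson(pmf, lam):
        # D(P || Po(lam)) in nats; the sum runs over the finite support of P.
        return sum(pk * math.log(pk / (math.exp(-lam) * lam**k / math.factorial(k)))
                   for k, pk in enumerate(pmf) if pk > 0)

    ps = [0.1, 0.05, 0.2, 0.15]                          # illustrative p_i
    lam = sum(ps)
    print(kl_vs_poisson(poisson_binomial_pmf(ps), lam))  # exact relative entropy
    print(sum(p**2 for p in ps))                         # first bound (H-term is zero here)
    print(sum(p**3 / (1 - p) for p in ps) / lam)         # sharper bound for independent X_i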

    Independence clustering (without a matrix)

    The independence clustering problem is considered in the following formulation: given a set $S$ of random variables, it is required to find the finest partitioning $\{U_1,\dots,U_k\}$ of $S$ into clusters such that the clusters $U_1,\dots,U_k$ are mutually independent. Since mutual independence is the target, pairwise similarity measurements are of no use, and thus traditional clustering algorithms are inapplicable. The distribution of the random variables in $S$ is, in general, unknown, but a sample is available. Thus, the problem is cast in terms of time series. Two forms of sampling are considered: i.i.d. and stationary time series, with the main emphasis being on the latter, more general, case. A consistent, computationally tractable algorithm for each of the settings is proposed, and a number of open directions for further research are outlined.
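    What separates mutual from pairwise independence can be made concrete for discrete i.i.d. samples: the clusters $U_1,\dots,U_k$ are mutually independent exactly when $\sum_j H(U_j) - H(S) = 0$. The Python sketch below estimates this gap with plug-in entropies; it only illustrates the objective such a clustering drives to zero, and is not the consistent algorithm proposed in the paper.

    from collections import Counter
    import math

    def plugin_entropy(samples):
        # Empirical (plug-in) entropy, in nats, of a list of hashable outcomes.
        n = len(samples)
        return -sum(c / n * math.log(c / n) for c in Counter(samples).values())

    def independence_gap(data, partition):
        # data: joint observations, one tuple over all variables in S per row.
        # partition: list of index lists. The gap sum_j H(U_j) - H(S) is >= 0,
        # and zero iff the clusters are (empirically) mutually independent.
        joint = plugin_entropy([tuple(row) for row in data])
        clusters = sum(plugin_entropy([tuple(row[i] for i in idx) for row in data])
                       for idx in partition)
        return clusters - joint

    # Variables 0 and 1 are copies; variable 2 varies freely, so the partition
    # {0,1},{2} is the finest one with (empirical) gap zero.
    data = [(0, 0, 1), (1, 1, 0), (0, 0, 0), (1, 1, 1)]
    print(independence_gap(data, [[0, 1], [2]]))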

    The Arbitrarily Varying Broadcast Channel with Degraded Message Sets with Causal Side Information at the Encoder

    In this work, we study the arbitrarily varying broadcast channel (AVBC) when state information is available at the transmitter in a causal manner. We establish inner and outer bounds on both the random code capacity region and the deterministic code capacity region with degraded message sets. The capacity region is then determined for a class of channels satisfying a condition on the mutual informations between the strategy variables and the channel outputs. As an example, we consider the arbitrarily varying binary symmetric broadcast channel with correlated noises. We show cases where the condition holds, hence the capacity region is determined, and other cases where there is a gap between the bounds.
    Comment: arXiv admin note: substantial text overlap with arXiv:1701.0334
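    As a minimal model of that example (an assumption for illustration, not the paper's construction), the Python sketch below simulates one use of a binary symmetric broadcast channel whose two noise bits are correlated; in the arbitrarily varying setting, the state sequence would modulate these parameters from use to use.

    import random

    def bsbc_correlated(x, p1, p2, copy_prob):
        # One channel use: Y1 = X xor N1 and Y2 = X xor N2, where N1 ~ Bern(p1)
        # and N2 copies N1 with probability copy_prob, else is a fresh Bern(p2).
        # copy_prob sets the noise correlation; all parameters are illustrative.
        n1 = 1 if random.random() < p1 else 0
        if random.random() < copy_prob:
            n2 = n1
        else:
            n2 = 1 if random.random() < p2 else 0
        return x ^ n1, x ^ n2

    y1, y2 = bsbc_correlated(x=0, p1=0.1, p2=0.2, copy_prob=0.5)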

    On Multistage Successive Refinement for Wyner-Ziv Source Coding with Degraded Side Informations

    We provide a complete characterization of the rate-distortion region for the multistage successive refinement of the Wyner-Ziv source coding problem with degraded side informations at the decoder. Necessary and sufficient conditions for a source to be successively refinable along a distortion vector are subsequently derived. A source-channel separation theorem is provided when the descriptions are sent over independent channels in the multistage case. Furthermore, we introduce the notion of generalized successive refinability with multiple degraded side informations. This notion captures whether progressive encoding to satisfy multiple distortion constraints for different side informations is as good as encoding without the progressive requirement. Necessary and sufficient conditions for generalized successive refinability are given. It is shown that the following two sources are generalized successively refinable: (1) the Gaussian source with degraded Gaussian side informations; (2) the doubly symmetric binary source when the worse side information is a constant. Thus for both cases, the failure to be successively refinable is due only to the inherent uncertainty about which side information will occur at the decoder, not to the progressive encoding requirement.
    Comment: Submitted to IEEE Trans. Information Theory Apr. 200
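    For reference, the single-stage benchmark behind example (1) is the classical quadratic-Gaussian Wyner-Ziv rate-distortion function: with source $X$ and side information $Y$ jointly Gaussian,
    $$R_{WZ}(D) = \frac{1}{2}\log\frac{\sigma^2_{X|Y}}{D}, \qquad 0 < D \le \sigma^2_{X|Y},$$
    so generalized successive refinability of the Gaussian source amounts to the cumulative rate after each stage matching this function evaluated at that stage's distortion with that stage's (degraded) side information. The staging description here is an illustrative reading, not the paper's notation.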

    A Formula for the Capacity of the General Gel'fand-Pinsker Channel

    We consider the Gel'fand-Pinsker problem in which the channel and state are general, i.e., possibly non-stationary, non-memoryless and non-ergodic. Using the information spectrum method and a non-trivial modification of the piggyback coding lemma by Wyner, we prove that the capacity can be expressed as an optimization over the difference of a spectral inf-mutual information rate and a spectral sup-mutual information rate. We consider various specializations, including the case where the channel and state are memoryless but not necessarily stationary.
    Comment: Accepted to the IEEE Transactions on Communications
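    In the stationary memoryless specialization, the spectral formula collapses to the classical Gel'fand-Pinsker single-letter capacity,
    $$C = \max_{P_{U|S},\; x(u,s)} \big[ I(U;Y) - I(U;S) \big],$$
    where $U$ is an auxiliary random variable and $x(u,s)$ a deterministic encoding map; the difference of mutual informations is the memoryless counterpart of the difference between the spectral inf- and sup-mutual information rates in the general formula.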