
    Cayley's hyperdeterminant, the principal minors of a symmetric matrix and the entropy region of 4 Gaussian random variables

    It has recently been shown that there is a connection between Cayley's hyperdeterminant and the principal minors of a symmetric matrix. With an eye towards characterizing the entropy region of jointly Gaussian random variables, we obtain three new results on the relationship between Gaussian random variables and the hyperdeterminant. The first is a new (determinant) formula for the 2×2×2 hyperdeterminant. The second is a new (transparent) proof of the fact that the principal minors of an n×n symmetric matrix satisfy the 2 × 2 × ⋯ × 2 (n times) hyperdeterminant relations. The third is a minimal set of 5 equations that 15 real numbers must satisfy to be the principal minors of a 4×4 symmetric matrix.
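    A quick numerical illustration of the two objects this abstract relates (not the paper's new formula): Cayley's 2×2×2 hyperdeterminant in its standard expansion, evaluated on the 2×2×2 tensor of principal minors of a random 3×3 symmetric matrix, where the relation stated in the abstract says it must vanish. The subset-to-index convention and helper names are illustrative choices.

```python
import itertools
import numpy as np

def hyperdet_222(a):
    """Cayley's 2x2x2 hyperdeterminant of a tensor a[i, j, k], i, j, k in {0, 1}."""
    return (
        a[0,0,0]**2 * a[1,1,1]**2 + a[0,0,1]**2 * a[1,1,0]**2
        + a[0,1,0]**2 * a[1,0,1]**2 + a[1,0,0]**2 * a[0,1,1]**2
        - 2 * (a[0,0,0]*a[0,0,1]*a[1,1,0]*a[1,1,1]
             + a[0,0,0]*a[0,1,0]*a[1,0,1]*a[1,1,1]
             + a[0,0,0]*a[1,0,0]*a[0,1,1]*a[1,1,1]
             + a[0,0,1]*a[0,1,0]*a[1,0,1]*a[1,1,0]
             + a[0,0,1]*a[1,0,0]*a[0,1,1]*a[1,1,0]
             + a[0,1,0]*a[1,0,0]*a[0,1,1]*a[1,0,1])
        + 4 * (a[0,0,0]*a[0,1,1]*a[1,0,1]*a[1,1,0]
             + a[0,0,1]*a[0,1,0]*a[1,0,0]*a[1,1,1])
    )

# Random 3x3 symmetric matrix; its principal minors, indexed by subsets
# of {0, 1, 2} (empty minor = 1), fill a 2x2x2 tensor.
rng = np.random.default_rng(0)
M = rng.standard_normal((3, 3))
M = (M + M.T) / 2

a = np.empty((2, 2, 2))
for idx in itertools.product((0, 1), repeat=3):
    S = [i for i, bit in enumerate(idx) if bit]
    a[idx] = np.linalg.det(M[np.ix_(S, S)]) if S else 1.0

print(hyperdet_222(a))  # ~0 up to floating-point error
```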

    On the Entropy Region of Discrete and Continuous Random Variables and Network Information Theory

    We show that a large class of network information theory problems can be cast as convex optimization over the convex space of entropy vectors. A vector in 2^n − 1 dimensional space is called entropic if each of its entries can be regarded as the joint entropy of a particular subset of n random variables (note that any set of size n has 2^n − 1 nonempty subsets). While an explicit characterization of the space of entropy vectors is well known for n = 2, 3 random variables, it is unknown for n > 3 (which is why most network information theory problems are open). We construct inner bounds to the space of entropic vectors using tools such as quasi-uniform distributions, lattices, and Cayley's hyperdeterminant.
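    For concreteness, a minimal sketch of what an entropy vector is: given a joint pmf of n discrete random variables, compute the joint entropy of every nonempty subset, yielding a point in 2^n − 1 dimensions. This is just the definition, not the paper's inner-bound constructions; function names are illustrative.

```python
import itertools
import numpy as np

def entropy_vector(p):
    """Entropy vector of n discrete variables with joint pmf p (an
    n-dimensional array): one joint entropy per nonempty subset."""
    n = p.ndim
    h = {}
    for r in range(1, n + 1):
        for S in itertools.combinations(range(n), r):
            # Marginalize out the variables not in S, then take H(X_S).
            marg = p.sum(axis=tuple(i for i in range(n) if i not in S))
            q = marg[marg > 0]
            h[S] = -np.sum(q * np.log2(q))
    return h

rng = np.random.default_rng(1)
p = rng.random((2, 2, 2))
p /= p.sum()                 # random joint pmf of 3 binary variables
for S, val in entropy_vector(p).items():
    print(S, round(val, 4))  # 2^3 - 1 = 7 entries
```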

    Nonextensive statistical mechanics and central limit theorems I - Convolution of independent random variables and q-product

    In this article we review the standard versions of the Central and of the Lévy-Gnedenko Limit Theorems, and illustrate their application to the convolution of independent random variables associated with the distribution known as q-Gaussian. This distribution emerges upon extremisation of the nonadditive entropy, basis of nonextensive statistical mechanics. It has a finite variance for q < 5/3, and an infinite one for 5/3 ≤ q < 3. We exhibit that, in the case of (standard) independence, the q-Gaussian has either the Gaussian (if q < 5/3) or the α-stable Lévy distributions (if q > 5/3) as its attractor in probability space. Moreover, we review a generalisation of the product, the q-product, which plays a central role in the approach to the specially correlated variables emerging within the nonextensive theory. Comment: 13 pages, 4 figures. To appear in the Proceedings of the conference CTNEXT07, Complexity, Metastability and Nonextensivity, Catania, Italy, 1-5 July 2007, Eds. S. Abe, H.J. Herrmann, P. Quarati, A. Rapisarda and C. Tsallis (American Institute of Physics, 2008), in press.
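    The q-product and the q-exponential (whose profile e_q(−βx²) is the q-Gaussian) have standard closed forms, sketched below; the identity e_q(a) ⊗_q e_q(b) = e_q(a + b) is what makes the q-product central here. Parameter values are illustrative.

```python
import numpy as np

def q_product(x, y, q):
    """Tsallis q-product: x (x)_q y = [x^(1-q) + y^(1-q) - 1]_+^(1/(1-q)).
    Reduces to the ordinary product as q -> 1."""
    if np.isclose(q, 1.0):
        return x * y
    base = x**(1 - q) + y**(1 - q) - 1
    return np.maximum(base, 0.0) ** (1 / (1 - q))

def q_exp(x, q):
    """q-exponential e_q(x) = [1 + (1-q) x]_+^(1/(1-q))."""
    if np.isclose(q, 1.0):
        return np.exp(x)
    return np.maximum(1 + (1 - q) * x, 0.0) ** (1 / (1 - q))

# Unnormalized q-Gaussian profile e_q(-x^2); for q < 5/3 its variance
# is finite, as stated in the abstract.
x = np.linspace(-5, 5, 11)
print(q_exp(-x**2, q=1.5))

# Key identity behind the q-product: e_q(a) (x)_q e_q(b) = e_q(a + b).
a, b, q = 0.3, 0.2, 1.5
print(q_product(q_exp(a, q), q_exp(b, q), q), q_exp(a + b, q))
```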

    Dependence Balance Based Outer Bounds for Gaussian Networks with Cooperation and Feedback

    We obtain new outer bounds on the capacity regions of the two-user multiple access channel with generalized feedback (MAC-GF) and the two-user interference channel with generalized feedback (IC-GF). These outer bounds are based on the idea of dependence balance, which was proposed by Hekstra and Willems [1]. To illustrate the usefulness of our outer bounds, we investigate three different channel models. We first consider a Gaussian MAC with noisy feedback (MAC-NF), where transmitter k, k = 1, 2, receives a feedback Y_{F_k}, which is the channel output Y corrupted with additive white Gaussian noise Z_k. As the feedback noise variances become large, one would expect the feedback to become useless, which is not reflected by the cut-set bound. We demonstrate that our outer bound improves upon the cut-set bound for all non-zero values of the feedback noise variances. Moreover, in the limit as σ²_{Z_k} → ∞, k = 1, 2, our outer bound collapses to the capacity region of the Gaussian MAC without feedback. Secondly, we investigate a Gaussian MAC with user cooperation (MAC-UC), where each transmitter receives an additive white Gaussian noise corrupted version of the channel input of the other transmitter [2]. For this channel model, the cut-set bound is sensitive to the cooperation noises, but not sensitive enough. For all non-zero values of the cooperation noise variances, our outer bound strictly improves upon the cut-set bound. Thirdly, we investigate a Gaussian IC with user cooperation (IC-UC). For this channel model, the cut-set bound is again sensitive to the cooperation noise variances, but not sensitive enough. We demonstrate that our outer bound strictly improves upon the cut-set bound for all non-zero values of the cooperation noise variances. Comment: Submitted to IEEE Transactions on Information Theory.
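    A minimal numerical sketch of the two baseline regions the abstract compares, using only textbook formulas (this is not the paper's dependence-balance bound): the no-feedback Gaussian MAC pentagon, which the paper's outer bound recovers as the feedback noise grows, and the full-cooperation cut-set sum-rate cut, which does not weaken with the feedback noise. All powers and variances are illustrative.

```python
import numpy as np

def C(snr):
    """Gaussian capacity function C(x) = 0.5 * log2(1 + x), bits/channel use."""
    return 0.5 * np.log2(1 + snr)

# Two-user Gaussian MAC: transmit powers P1, P2, receiver noise variance N.
P1, P2, N = 1.0, 1.0, 1.0

# No-feedback capacity region (the pentagon): the limit the paper's
# outer bound collapses to as the feedback noise variances grow.
print("R1      <=", C(P1 / N))
print("R2      <=", C(P2 / N))
print("R1 + R2 <=", C((P1 + P2) / N))

# Cut-set sum-rate cut with fully correlated inputs (rho = 1): this value
# does not depend on the feedback noise variances at all, which is the
# insensitivity of the cut-set bound that the abstract points out.
print("cut-set <=", C((np.sqrt(P1) + np.sqrt(P2))**2 / N))
```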

    Secure Multiterminal Source Coding with Side Information at the Eavesdropper

    The problem of secure multiterminal source coding with side information at the eavesdropper is investigated. This scenario consists of a main encoder (referred to as Alice) that wishes to compress a single source while simultaneously satisfying the desired requirements on the distortion level at a legitimate receiver (referred to as Bob) and the equivocation rate --average uncertainty-- at an eavesdropper (referred to as Eve). A (public) rate-limited link between Alice and Bob is further assumed. In this setting, Eve perfectly observes the information bits sent by Alice to Bob and also has access to a correlated source which can be used as side information. A second encoder (referred to as Charlie) helps Bob in estimating Alice's source by sending a compressed version of its own correlated observation via a (private) rate-limited link, which is only observed by Bob. The problem at hand can thus be seen as the unification of the Berger-Tung and secure source coding setups. Inner and outer bounds on the so-called rates-distortion-equivocation region are derived. The inner region turns out to be tight for two cases: (i) uncoded side information at Bob and (ii) lossless reconstruction of both sources at Bob --secure distributed lossless compression. Application examples to secure lossy source coding of Gaussian and binary sources in the presence of Gaussian and binary/ternary (resp.) side information are also considered. Optimal coding schemes are characterized for some cases of interest where the statistical differences between the side information at the decoders and the presence of a non-zero distortion at Bob can be fully exploited to guarantee secrecy. Comment: 26 pages, 16 figures, 2 tables.
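    As background for the Gaussian example, a minimal sketch of two textbook quantities that underlie such rate computations (not the paper's rates-distortion-equivocation region): the Gaussian rate-distortion function, and the Wyner-Ziv rate when a decoder observes the source through additive Gaussian noise as side information. All variances are illustrative.

```python
import numpy as np

def gaussian_rd(var, D):
    """Rate-distortion function of a Gaussian source,
    R(D) = 0.5 * log2(var / D) bits per sample, for 0 < D <= var."""
    return max(0.5 * np.log2(var / D), 0.0)

# Source X ~ N(0, s2); side information at a decoder: Y = X + W, W ~ N(0, w2).
s2, w2, D = 1.0, 0.5, 0.1

# MMSE residual variance of X given Y (jointly Gaussian).
cond_var = s2 * w2 / (s2 + w2)

print("R(D) without side information:", gaussian_rd(s2, D))
print("R(D) with side information   :", gaussian_rd(cond_var, D))  # Wyner-Ziv rate
```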