
    A Lower Bound on the Differential Entropy of Log-Concave Random Vectors with Applications

    We derive a lower bound on the differential entropy of a log-concave random variable X in terms of the p-th absolute moment of X. The new bound leads to a reverse entropy power inequality with an explicit constant, and to new bounds on the rate-distortion function and the channel capacity. Specifically, we study the rate-distortion function for log-concave sources and the distortion measure d(x, x̂) = |x − x̂|^r, with r ≥ 1, and we establish that the difference between the rate-distortion function and the Shannon lower bound is at most log(√(πe)) ≈ 1.5 bits, independently of r and the target distortion d. For mean-square error distortion, the difference is at most log(√(πe/2)) ≈ 1 bit, regardless of d. We also provide bounds on the capacity of memoryless additive noise channels when the noise is log-concave. We show that the difference between the capacity of such channels and the capacity of the Gaussian channel with the same noise power is at most log(√(πe/2)) ≈ 1 bit. Our results generalize to the case of a random vector X with possibly dependent coordinates. Our proof technique leverages tools from convex geometry.
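
    The constants above are easy to check numerically. The sketch below (Python; an illustration under standard textbook facts, not the paper's derivation) evaluates the two constants in bits and, for a unit-variance Laplace source as one concrete log-concave example, computes the non-Gaussianness gap ½log(2πeσ²) − h(X), which upper-bounds the distance between the rate-distortion function and the Shannon lower bound under mean-square error.

        import math

        # Universal constants from the abstract, in bits.
        gap_general = 0.5 * math.log2(math.pi * math.e)        # log(sqrt(pi*e))   ~ 1.5 bits
        gap_mse     = 0.5 * math.log2(math.pi * math.e / 2.0)  # log(sqrt(pi*e/2)) ~ 1.0 bit

        # Concrete log-concave example: unit-variance Laplace source.
        # Laplace(b) has variance 2*b^2 and differential entropy log2(2*e*b) bits.
        b = 1.0 / math.sqrt(2.0)                               # scale giving unit variance
        h_laplace  = math.log2(2.0 * math.e * b)
        h_gaussian = 0.5 * math.log2(2.0 * math.pi * math.e)   # maximum entropy at unit variance

        # Standard chain (not necessarily the paper's argument): under mean-square error,
        # R(D) - SLB(D) <= h_gaussian - h_laplace, which the paper bounds by gap_mse.
        print(f"general constant : {gap_general:.3f} bits")
        print(f"MSE constant     : {gap_mse:.3f} bits")
        print(f"Laplace gap      : {h_gaussian - h_laplace:.3f} bits  (<= {gap_mse:.3f})")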

    Information Extraction Under Privacy Constraints

    A privacy-constrained information extraction problem is considered where, for a pair of correlated discrete random variables (X,Y) governed by a given joint distribution, an agent observes Y and wants to convey to a potentially public user as much information about Y as possible without compromising the amount of information revealed about X. To this end, the so-called rate-privacy function is introduced to quantify the maximal amount of information (measured in terms of mutual information) that can be extracted from Y under a privacy constraint between X and the extracted information, where privacy is measured using either mutual information or maximal correlation. Properties of the rate-privacy function are analyzed, and information-theoretic and estimation-theoretic interpretations of it are presented for both the mutual information and maximal correlation privacy measures. It is also shown that the rate-privacy function admits a closed-form expression for a large family of joint distributions of (X,Y). Finally, the rate-privacy function under the mutual information privacy measure is considered for the case where (X,Y) has a joint probability density function, by studying the problem where the extracted information is a uniform quantization of Y corrupted by additive Gaussian noise. The asymptotic behavior of the rate-privacy function is studied as the quantization resolution grows without bound, and it is observed that not all of the properties of the rate-privacy function carry over from the discrete to the continuous case. Comment: 55 pages, 6 figures. Improved the organization and added a detailed literature review.
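
    As a toy illustration of the trade-off the rate-privacy function captures, the sketch below builds a small joint pmf for (X,Y), passes Y through a randomized-response mechanism (a simple placeholder, not the optimal mechanism characterized in the paper), and reports the disclosed information I(Y;Z) against the leakage I(X;Z); the joint distribution and the flip probability are arbitrary choices for the example.

        import numpy as np

        def mutual_information(pab):
            """I(A;B) in bits for a joint pmf given as a 2-D array."""
            pa = pab.sum(axis=1, keepdims=True)
            pb = pab.sum(axis=0, keepdims=True)
            mask = pab > 0
            return float(np.sum(pab[mask] * np.log2(pab[mask] / (pa @ pb)[mask])))

        # Arbitrary joint pmf of correlated binary (X, Y); rows index X, columns index Y.
        p_xy = np.array([[0.40, 0.10],
                         [0.10, 0.40]])

        # Randomized response on Y: Z = Y with probability 1 - eps, flipped with probability eps.
        eps = 0.2
        channel = np.array([[1 - eps, eps],
                            [eps, 1 - eps]])        # rows index Y, columns index Z

        p_yz = p_xy.sum(axis=0)[:, None] * channel  # joint pmf of (Y, Z)
        p_xz = p_xy @ channel                       # joint pmf of (X, Z), via the chain X - Y - Z

        print(f"disclosed I(Y;Z) = {mutual_information(p_yz):.3f} bits")
        print(f"leaked    I(X;Z) = {mutual_information(p_xz):.3f} bits")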

    Information Theoretic Proofs of Entropy Power Inequalities

    While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information, up to now Shannon's entropy power inequality (EPI) has been an exception: existing information theoretic proofs of the EPI hinge on representations of differential entropy using either Fisher information or minimum mean-square error (MMSE), which are derived from de Bruijn's identity. In this paper, we first present a unified view of these proofs, showing that they share two essential ingredients: 1) a data processing argument applied to a covariance-preserving linear transformation; 2) an integration over a path of a continuous Gaussian perturbation. Using these ingredients, we develop a new and brief proof of the EPI through a mutual information inequality, which replaces Stam and Blachman's Fisher information inequality (FII) and an inequality for MMSE by Guo, Shamai and Verdú used in earlier proofs. The result has the advantage of being very simple in that it relies only on the basic properties of mutual information. These ideas are then generalized to various extended versions of the EPI: Zamir and Feder's generalized EPI for linear transformations of the random variables, Takano and Johnson's EPI for dependent variables, Liu and Viswanath's covariance-constrained EPI, and Costa's concavity inequality for the entropy power. Comment: submitted for publication in the IEEE Transactions on Information Theory, revised version.
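
    For a concrete feel of the inequality being proved, the sketch below checks Shannon's EPI, N(X+Y) ≥ N(X) + N(Y) with N(X) = e^{2h(X)}/(2πe), numerically for one pair of independent variables; the choice of densities (a standard Gaussian and a unit-scale Laplace), the grid, and the step size are illustrative.

        import numpy as np

        def diff_entropy(p, dx):
            """Differential entropy in nats of a density sampled on a uniform grid."""
            p = np.clip(p, 1e-300, None)
            return float(-np.sum(p * np.log(p)) * dx)

        dx = 0.01
        x = np.arange(-30.0, 30.0, dx)

        # Independent summands: standard Gaussian and unit-scale Laplace densities.
        gauss = np.exp(-x**2 / 2.0) / np.sqrt(2.0 * np.pi)
        laplace = 0.5 * np.exp(-np.abs(x))

        # The density of a sum of independent variables is the convolution of their densities.
        p_sum = np.convolve(gauss, laplace, mode="same") * dx

        entropy_power = lambda h: np.exp(2.0 * h) / (2.0 * np.pi * np.e)
        N_x, N_y, N_sum = (entropy_power(diff_entropy(p, dx)) for p in (gauss, laplace, p_sum))

        # Shannon's EPI: N(X + Y) >= N(X) + N(Y), with equality iff X and Y are Gaussian.
        print(f"N(X+Y) = {N_sum:.3f}  >=  N(X) + N(Y) = {N_x + N_y:.3f}")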

    Privacy-Preserving Anomaly Detection in Stochastic Dynamical Systems: Synthesis of Optimal Gaussian Mechanisms

    We present a framework for the design of distorting mechanisms that allow remotely operating anomaly detectors in a privacy-preserving fashion. We consider a problem setting in which a remote station seeks to identify anomalies using system input-output signals transmitted over communication networks. In such a networked setting, however, disclosing the true data of the system operation is undesirable, as it can be used to infer private information -- modeled here as a system private output. To prevent accurate estimation of private outputs by adversaries, we pass the original signals through distorting (privacy-preserving) mechanisms and send the distorted data to the remote station (which inevitably leads to degraded monitoring performance). The design of these mechanisms is formulated as a privacy-utility (tradeoff) problem, where system utility is characterized by anomaly detection performance and privacy is quantified using information-theoretic metrics (mutual information and differential entropy). We cast the synthesis of dependent Gaussian mechanisms as the solution of a convex program (log-determinant cost with linear matrix inequality constraints) in which we seek to maximize privacy over a finite window of realizations while guaranteeing a bound on monitoring performance degradation. We provide simulation results to illustrate the performance of the developed tools.
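
    The log-determinant program described above can be prototyped with an off-the-shelf solver. The sketch below is a simplified stand-in rather than the paper's exact formulation: it maximizes log det of the covariance Σ of the additive Gaussian distortion (equivalently, its differential entropy ½ log det(2πeΣ)) subject to a linear matrix inequality that caps Σ by a budget matrix standing in for the monitoring-degradation bound; cvxpy, the dimension, and the budget matrix are assumptions of the example.

        import cvxpy as cp
        import numpy as np

        n = 3                                    # dimension of the transmitted signal (example value)
        Sigma_max = np.diag([1.0, 2.0, 0.5])     # placeholder budget on the allowable distortion

        # Covariance of the additive Gaussian distorting mechanism.
        Sigma = cp.Variable((n, n), symmetric=True)

        # Maximizing log det(Sigma) maximizes the differential entropy
        # (1/2) log det(2*pi*e*Sigma) of the injected Gaussian distortion.
        objective = cp.Maximize(cp.log_det(Sigma))

        # Linear matrix inequalities: Sigma is PSD and dominated by the budget matrix.
        constraints = [Sigma >> 0, Sigma_max - Sigma >> 0]

        cp.Problem(objective, constraints).solve()
        print("optimal distortion covariance:\n", np.round(Sigma.value, 3))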

    An entropy inequality for symmetric random variables

    We establish a lower bound on the entropy of weighted sums of (possibly dependent) random variables (X_1, X_2, \dots, X_n) possessing a symmetric joint distribution. Our lower bound is in terms of the joint entropy of (X_1, X_2, \dots, X_n). We show that for n \geq 3, the lower bound is tight if and only if the X_i's are i.i.d. Gaussian random variables. For n = 2 there are numerous other cases of equality apart from i.i.d. Gaussians, which we completely characterize. Going beyond sums, we also present an inequality for certain linear transformations of (X_1, \dots, X_n). Our primary technical contribution lies in the analysis of the equality cases, and our approach relies on the geometry and the symmetry of the problem. Comment: submitted to ISIT 201