
    Distributed Vector Gaussian Source-Coding And Distributed Hypothesis Testing

    Distributed compression is desired in applications in which data is collected in a distributed manner by several sensors and information about the data is sent to a processing center, which uses this information to meet an end goal. In this work, we focus on two such applications: (1) distributed source coding and (2) distributed hypothesis testing. In distributed source coding, we determine the rate region of the vector Gaussian one-helper source-coding problem under a covariance matrix distortion constraint. The rate region is achieved by a simple scheme that separates the lossy vector quantization from the lossless spatial compression. We introduce a novel analysis technique, namely distortion projection. The converse is established by combining distortion projection with two other analysis techniques that have been employed in the past to obtain partial results for the problem. We also study an extension to a special case of the problem in which the primary source is a vector and the helper's observation is a scalar, and we consider separate distortion constraints on both sources. We provide an outer bound to the rate region of this problem and show that it is partially tight in general and completely tight in some nontrivial cases.

    In distributed hypothesis testing, we study a problem in which data is compressed distributively and sent to a detector that seeks to decide between two possible distributions for the data. The aim is to characterize all achievable encoding rates and exponents of the type 2 error probability when the type 1 error probability is at most a fixed value. For related problems in distributed source coding, schemes based on random binning perform well and are often optimal. For distributed hypothesis testing, however, the use of binning is hindered by the fact that the overall error probability may be dominated by errors in the binning process. We show that despite this complication, binning is optimal for a class of problems in which the goal is to "test against conditional independence." We then use this optimality result to give an outer bound for a more general class of instances of the problem. We also extend the "test against independence" result of Ahlswede and Csiszár to the vector Gaussian case.
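    For orientation (standard background, not a statement taken from the abstract), the single-sensor "test against independence" result of Ahlswede and Csiszár referenced above can be written as follows: with null hypothesis \(P_{XY}\) and alternative \(P_X P_Y\), the source \(X\) compressed at rate \(R\), and \(Y\) observed directly at the detector, the optimal type 2 error exponent is

    \[
      \theta(R) \;=\; \max_{\substack{P_{U|X}:\ U - X - Y \\ I(U;X) \le R}} I(U;Y).
    \]

    The work summarized here extends this characterization to testing against conditional independence and to the vector Gaussian case; the exact statements of those results are not reproduced in the abstract.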

    Principles of Physical Layer Security in Multiuser Wireless Networks: A Survey

    This paper provides a comprehensive review of the domain of physical layer security in multiuser wireless networks. The essential premise of physical-layer security is to enable the exchange of confidential messages over a wireless medium in the presence of unauthorized eavesdroppers without relying on higher-layer encryption. This can be achieved primarily in two ways: without the need for a secret key by intelligently designing transmit coding strategies, or by exploiting the wireless communication medium to develop secret keys over public channels. The survey begins with an overview of the foundations dating back to the pioneering work of Shannon and Wyner on information-theoretic security. We then describe the evolution of secure transmission strategies from point-to-point channels to multiple-antenna systems, followed by generalizations to multiuser broadcast, multiple-access, interference, and relay networks. Secret-key generation and establishment protocols based on physical layer mechanisms are subsequently covered. Approaches for secrecy based on channel coding design are then examined, along with a description of inter-disciplinary approaches based on game theory and stochastic geometry. The associated problem of physical-layer message authentication is also introduced briefly. The survey concludes with observations on potential research directions in this area.

    Comment: 23 pages, 10 figures, 303 refs. arXiv admin note: text overlap with arXiv:1303.1609 by other authors. IEEE Communications Surveys and Tutorials, 201
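    As a brief reminder of the information-theoretic baseline such surveys build on (classical results of Wyner and of Leung-Yan-Cheong and Hellman, not statements drawn from this survey's text), the secrecy capacity of the degraded wiretap channel with legitimate receiver output \(Y\) and eavesdropper output \(Z\) is

    \[
      C_s \;=\; \max_{P_X}\,\bigl[I(X;Y) - I(X;Z)\bigr],
    \]

    which for the Gaussian wiretap channel evaluates to \(\bigl[\tfrac{1}{2}\log(1+\mathrm{SNR}_B) - \tfrac{1}{2}\log(1+\mathrm{SNR}_E)\bigr]^{+}\); a positive secrecy rate therefore requires the legitimate receiver to have the stronger channel.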

    Source Coding in Networks with Covariance Distortion Constraints

    We consider a source coding problem with a network scenario in mind, and formulate it as a remote vector Gaussian Wyner-Ziv problem under covariance matrix distortions. We define a notion of minimum for two positive-definite matrices, based on which we derive an explicit formula for the rate-distortion function (RDF). We then study special cases and applications of this result. We show that two well-studied source coding problems, namely the remote vector Gaussian Wyner-Ziv problems with mean-squared error and mutual information constraints, are in fact special cases of our result. Finally, we apply our results to a joint source coding and denoising problem. We consider a network with a centralized topology and a given weighted sum-rate constraint, where the received signals at the center are to be fused to maximize the output SNR while enforcing no linear distortion. We show that one can design the distortion matrices at the nodes in order to maximize the output SNR at the fusion center. We thereby bridge denoising and source coding within this setup.
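    To give the flavor of such RDF formulas (standard background under the usual quadratic Gaussian assumptions; this is not the matrix formula derived in the paper), the simplest instance of the side-information setting being generalized here is the scalar Gaussian Wyner-Ziv problem, whose rate-distortion function is

    \[
      R_{\mathrm{WZ}}(D) \;=\; \frac{1}{2}\,\log^{+}\frac{\sigma^2_{X|Y}}{D},
    \]

    where \(\sigma^2_{X|Y}\) is the conditional variance of the source given the decoder's side information and \(\log^{+} x = \max\{\log x, 0\}\). The result described above replaces the scalar distortion level \(D\) with a covariance-matrix constraint and a remote (noisy) source observation.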

    Distributed Remote Vector Gaussian Source Coding for Wireless Acoustic Sensor Networks

    In this paper, we consider the problem of remote vector Gaussian source coding for a wireless acoustic sensor network. Each node receives messages from multiple nodes in the network and decodes these messages using its own measurement of the sound field as side information. The node's measurement and the estimates of the source resulting from decoding the received messages are then jointly encoded and transmitted to a neighboring node in the network. We show that for this distributed source coding scenario, one can encode a so-called conditional sufficient statistic of the sources instead of jointly encoding multiple sources. We focus on the case where node measurements are noisy, linearly mixed combinations of the sources and the acoustic channel mixing matrices are invertible. For this problem, we derive the rate-distortion function for vector Gaussian sources under covariance distortion constraints.

    Comment: 10 pages, to be presented at the IEEE DCC'1
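    A minimal sketch of the measurement model described above, with notation (\(\mathbf{A}_k\), \(\mathbf{s}\), \(\mathbf{n}_k\)) that is ours rather than the paper's: node \(k\) observes

    \[
      \mathbf{y}_k \;=\; \mathbf{A}_k \mathbf{s} + \mathbf{n}_k,
    \]

    where \(\mathbf{s}\) is the vector of acoustic sources, \(\mathbf{A}_k\) is the acoustic mixing matrix at node \(k\), and \(\mathbf{n}_k\) is measurement noise. When \(\mathbf{A}_k\) is invertible, \(\mathbf{A}_k^{-1}\mathbf{y}_k = \mathbf{s} + \mathbf{A}_k^{-1}\mathbf{n}_k\) is again a noisy observation of the sources themselves, which is plausibly the role invertibility plays in the conditional-sufficient-statistic argument outlined in the abstract.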

    Gaussian Secure Source Coding and Wyner's Common Information

    We study secure source-coding with causal disclosure, under the Gaussian distribution. The optimality of Gaussian auxiliary random variables is shown in various scenarios. We explicitly characterize the tradeoff between the rates of communication and secret key. This tradeoff is the result of a mutual information optimization under Markov constraints. As a corollary, we deduce a general formula for Wyner's Common Information in the Gaussian setting.

    Comment: ISIT 2015, 5 pages, uses IEEEtran.cl
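    For reference (known results, not reproduced from the abstract), Wyner's common information is defined as

    \[
      C(X;Y) \;=\; \min_{W:\; X - W - Y} I(X,Y;\,W),
    \]

    and for a bivariate Gaussian pair with correlation coefficient \(\rho\) it is known to equal \(\tfrac{1}{2}\log\frac{1+|\rho|}{1-|\rho|}\). The general Gaussian formula deduced in the paper presumably generalizes this scalar expression; the abstract itself does not state it explicitly.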