
    Distributed Remote Vector Gaussian Source Coding for Wireless Acoustic Sensor Networks

    In this paper, we consider the problem of remote vector Gaussian source coding for a wireless acoustic sensor network. Each node receives messages from multiple nodes in the network and decodes these messages using its own measurement of the sound field as side information. The node's measurement and the estimates of the source resulting from decoding the received messages are then jointly encoded and transmitted to a neighboring node in the network. We show that for this distributed source coding scenario, one can encode a so-called conditional sufficient statistic of the sources instead of jointly encoding multiple sources. We focus on the case where the node measurements are noisy linear combinations of the sources and the acoustic channel mixing matrices are invertible. For this problem, we derive the rate-distortion function for vector Gaussian sources under covariance distortion constraints. Comment: 10 pages, to be presented at the IEEE DCC'1
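    A hedged toy sketch of the invertible-mixing idea (my illustration; the construction, dimensions, and all variable names are assumptions, not from the paper): when a node's measurement is y = A s + n with an invertible mixing matrix A, the transform t = A⁻¹ y is itself invertible, so y is exactly recoverable from t and nothing is lost by encoding t instead of the raw measurement.

```python
# Toy 2x2 illustration (assumed setup, not the paper's): an invertible
# transform of the measurement is information-equivalent to the measurement.

def mat2_inv(A):
    """Inverse of a 2x2 matrix [[a, b], [c, d]]."""
    (a, b), (c, d) = A
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def mat2_vec(A, v):
    """2x2 matrix-vector product."""
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

A = [[2.0, 1.0], [1.0, 3.0]]   # invertible acoustic mixing matrix
s = [0.5, -1.0]                # source vector realization
n = [0.01, -0.02]              # additive measurement noise
y = [m + e for m, e in zip(mat2_vec(A, s), n)]   # y = A s + n

t = mat2_vec(mat2_inv(A), y)   # t = A^{-1} y = s + A^{-1} n
y_back = mat2_vec(A, t)        # undo the transform
print(all(abs(u - v) < 1e-12 for u, v in zip(y, y_back)))  # True
```

    Because the map y ↦ t is a bijection, any code built on t achieves the same rate-distortion trade-off as one built on y, which is what makes encoding a sufficient statistic attractive in this setting.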

    Distributed Remote Vector Gaussian Source Coding with Covariance Distortion Constraints

    In this paper, we consider a distributed remote source coding problem, where a sequence of observations of source vectors is available at the encoder. The problem is to specify the optimal rate for encoding the observations subject to a covariance matrix distortion constraint and in the presence of side information at the decoder. For this problem, we derive lower and upper bounds on the rate-distortion function (RDF) for the Gaussian case, which in general do not coincide. We then identify cases where the RDF can be derived exactly. We also show that previous results on specific instances of this problem can be generalized using our results. Finally, we show that if the distortion measure is the mean squared error, or if it is replaced by a certain mutual information constraint, the optimal rate can be derived from our main result. Comment: This is the final version accepted at ISIT'1
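    For intuition only, here is the classical scalar special case of the mean-squared-error setting (a well-known identity, not the paper's vector/covariance-matrix result): for a Gaussian source X with side information Y at the decoder, the Wyner-Ziv RDF under MSE distortion D is R(D) = max(0, ½ log₂(var(X|Y)/D)).

```python
import math

def gaussian_wyner_ziv_rate(var_x_given_y, D):
    """Scalar Gaussian Wyner-Ziv RDF under MSE (classical identity, shown
    here only for intuition): rate in bits per sample, zero once the
    allowed distortion D exceeds the conditional variance var(X|Y)."""
    return max(0.0, 0.5 * math.log2(var_x_given_y / D))

# Side information Y = X + N with var(X) = 1 and var(N) = 0.25:
var_x_given_y = 1.0 * 0.25 / (1.0 + 0.25)   # conditional variance = 0.2
print(gaussian_wyner_ziv_rate(var_x_given_y, 0.05))   # 1.0 bit/sample
```

    The covariance-constrained vector problem studied in the paper generalizes this scalar formula, which is why the MSE and mutual-information cases drop out of the main result as corollaries.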

    Source Coding in Networks with Covariance Distortion Constraints

    We consider a source coding problem with a network scenario in mind and formulate it as a remote vector Gaussian Wyner-Ziv problem under covariance matrix distortions. We define a notion of minimum for two positive-definite matrices, based on which we derive an explicit formula for the rate-distortion function (RDF). We then study special cases and applications of this result. We show that two well-studied source coding problems, namely the remote vector Gaussian Wyner-Ziv problems with mean-squared-error and mutual-information constraints, are in fact special cases of our result. Finally, we apply our results to a joint source coding and denoising problem. We consider a network with a centralized topology and a given weighted sum-rate constraint, where the received signals at the center are to be fused to maximize the output SNR while enforcing no linear distortion. We show that one can design the distortion matrices at the nodes so as to maximize the output SNR at the fusion center. We thereby build a bridge between denoising and source coding within this setup.
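    A hedged sketch of the fusion step (my illustration, not the paper's construction): maximizing output SNR subject to no linear distortion is the classical MVDR / distortionless-beamforming problem, minimize wᵀR w subject to wᵀd = 1, solved by w = R⁻¹d / (dᵀR⁻¹d). The covariance R and gain vector d below are made-up numbers.

```python
# Distortionless SNR-maximizing fusion of two node signals (MVDR form);
# all numerical values here are illustrative assumptions.

def mat2_inv(A):
    """Inverse of a 2x2 matrix [[a, b], [c, d]]."""
    (a, b), (c, d) = A
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def mat2_vec(A, v):
    """2x2 matrix-vector product."""
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

R = [[1.0, 0.2], [0.2, 0.5]]   # noise covariance of the two fused signals
d = [1.0, 1.0]                 # signal gain ("steering") vector at the nodes

Ri_d = mat2_vec(mat2_inv(R), d)                      # R^{-1} d
w = [x / (d[0] * Ri_d[0] + d[1] * Ri_d[1]) for x in Ri_d]

# The constraint w^T d = 1 means the desired signal passes undistorted:
print(abs(w[0] * d[0] + w[1] * d[1] - 1.0) < 1e-12)  # True
```

    In the paper's setup the knob is the per-node distortion matrices rather than the fusion weights directly, but the same no-linear-distortion constraint is what ties the denoising objective to the source coding one.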

    Distributed Vector Gaussian Source-Coding And Distributed Hypothesis Testing

    Distributed compression is desired in applications in which data is collected in a distributed manner by several sensors, and information about the data is sent to a processing center, which uses this information to meet an end goal. In this work, we focus on two such applications: (1) distributed source coding and (2) distributed hypothesis testing. In distributed source coding, we determine the rate region of the vector Gaussian one-helper source-coding problem under a covariance matrix distortion constraint. The rate region is achieved by a simple scheme that separates the lossy vector quantization from the lossless spatial compression. We introduce a novel analysis technique, namely distortion projection. The converse is established by combining distortion projection with two other analysis techniques that have been employed in the past to obtain partial results for the problem. We also study an extension to a special case of the problem in which the primary source is a vector and the helper's observation is a scalar, and we consider separate distortion constraints on both sources. We provide an outer bound to the rate region of this problem and show that it is partially tight in general and completely tight in some nontrivial cases. In distributed hypothesis testing, we study a problem in which data is compressed distributively and sent to a detector that seeks to decide between two possible distributions for the data. The aim is to characterize all achievable encoding rates and exponents of the type 2 error probability when the type 1 error probability is at most a fixed value. For related problems in distributed source coding, schemes based on random binning perform well and are often optimal. For distributed hypothesis testing, however, the use of binning is hindered by the fact that the overall error probability may be dominated by errors in the binning process. We show that despite this complication, binning is optimal for a class of problems in which the goal is to "test against conditional independence." We then use this optimality result to give an outer bound for a more general class of instances of the problem. We also extend the "test against independence" result of Ahlswede and Csiszár to the vector Gaussian case.
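    To make the test-against-independence exponent concrete (an illustration under my own assumptions, not the thesis' vector result): in the Ahlswede-Csiszár setting, the best achievable type-2 error exponent at sufficient encoding rate equals the mutual information I(X;Y), and for jointly Gaussian (X, Y) with correlation coefficient ρ the standard identity I(X;Y) = -½ log₂(1 - ρ²) makes that exponent explicit.

```python
import math

def gaussian_mi_bits(rho):
    """I(X;Y) in bits for jointly Gaussian X, Y with correlation rho
    (standard identity, used here only to illustrate the exponent)."""
    return -0.5 * math.log2(1.0 - rho * rho)

# Stronger correlation -> larger exponent -> independence is easier
# to reject from compressed observations:
for rho in (0.3, 0.6, 0.9):
    print(rho, round(gaussian_mi_bits(rho), 3))
```

    The vector Gaussian extension in the thesis replaces this scalar correlation by the joint covariance structure, but the role of mutual information as the error exponent is the same.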