Source Coding in Networks with Covariance Distortion Constraints
We consider a source coding problem with a network scenario in mind, and
formulate it as a remote vector Gaussian Wyner-Ziv problem under covariance
matrix distortions. We define a notion of minimum for two positive-definite matrices, based on which we derive an explicit formula for the rate-distortion
function (RDF). We then study special cases and applications of this result. We show that two well-studied source coding problems, namely the remote vector Gaussian Wyner-Ziv problems with mean-squared error and mutual information constraints, are in fact special cases of our result. Finally, we
apply our results to a joint source coding and denoising problem. We consider a
network with a centralized topology and a given weighted sum-rate constraint,
where the received signals at the center are to be fused to maximize the output
SNR while enforcing no linear distortion. We show that one can design the
distortion matrices at the nodes in order to maximize the output SNR at the
fusion center. We thereby bridge denoising and source coding within this setup.
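The criterion in this abstract of maximizing output SNR while enforcing no linear distortion matches the classical minimum-variance distortionless response (MVDR) idea. The following is a minimal numerical sketch of that generic criterion, not the paper's covariance-distortion construction; all matrices and dimensions below are hypothetical toy values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: M sensors observe a common source through steering vector d,
# plus spatially correlated noise with covariance Rn (hypothetical values).
M = 4
d = np.ones(M)                        # steering vector (assumed known)
A = rng.standard_normal((M, M))
Rn = A @ A.T + np.eye(M)              # positive-definite noise covariance
sigma_s2 = 2.0                        # source power

# Distortionless (MVDR) weights: minimize noise power subject to w^T d = 1,
# which maximizes output SNR among all distortionless fusion weights.
Rn_inv_d = np.linalg.solve(Rn, d)
w = Rn_inv_d / (d @ Rn_inv_d)

def out_snr(w):
    """Output SNR of fusion weights w under the toy model above."""
    return sigma_s2 * (w @ d) ** 2 / (w @ Rn @ w)

# Compare with a naive distortionless choice w0 = d / (d^T d).
w0 = d / (d @ d)
print(out_snr(w), out_snr(w0))
```

Both weight vectors satisfy the no-linear-distortion constraint w^T d = 1; the MVDR choice attains the larger output SNR because it additionally minimizes the fused noise power.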
Zero-Delay Rate Distortion via Filtering for Vector-Valued Gaussian Sources
We deal with zero-delay source coding of a vector-valued Gauss-Markov source
subject to a mean-squared error (MSE) fidelity criterion characterized by the
operational zero-delay vector-valued Gaussian rate distortion function (RDF).
We address this problem by considering the nonanticipative RDF (NRDF) which is
a lower bound to the causal optimal performance theoretically attainable (OPTA)
function and operational zero-delay RDF. We recall the realization that
corresponds to the optimal "test-channel" of the Gaussian NRDF, when
considering a vector Gauss-Markov source subject to a MSE distortion in the
finite time horizon. Then, we introduce sufficient conditions for the existence
of a solution to this problem in the infinite time horizon. For the asymptotic
regime, we use the asymptotic characterization of the Gaussian NRDF to provide
a new equivalent realization scheme with feedback which is characterized by a
resource allocation (reverse-waterfilling) problem across the dimension of the
vector source. We leverage the new realization to derive a predictive coding
scheme via lattice quantization with subtractive dither and joint memoryless
entropy coding. This coding scheme offers an upper bound to the operational
zero-delay vector-valued Gaussian RDF. When scalar quantization is used, then for "r" active dimensions of the vector Gauss-Markov source, the gap between the obtained lower and theoretical upper bounds is at most 0.254r + 1 bits/vector. We further show that with vector quantization and infinite-dimensional Gauss-Markov sources, this gap becomes negligible, i.e., the Gaussian NRDF approximates the operational zero-delay Gaussian RDF. We also extend our results to vector-valued Gaussian
sources of any finite memory under mild conditions. Our theoretical framework
is demonstrated with illustrative numerical experiments.
Comment: 32 pages, 9 figures, published in IEEE Journal of Selected Topics in Signal Processing
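The reverse-waterfilling allocation mentioned in this abstract is, in its classical form, a one-parameter search for a water level theta with per-dimension distortions D_i = min(theta, lambda_i). A minimal sketch of that classical allocation (not the paper's NRDF-specific variant; the eigenvalues and distortion budget below are made-up examples):

```python
import numpy as np

def reverse_waterfill(lam, D, iters=100):
    """Classical reverse-waterfilling for a Gaussian vector source.

    lam : eigenvalues of the source covariance (per-dimension variances)
    D   : total MSE distortion budget, 0 < D < sum(lam)
    Returns (rate in bits, per-dimension distortions)."""
    lam = np.asarray(lam, dtype=float)
    lo, hi = 0.0, lam.max()
    for _ in range(iters):                  # bisect on the water level theta
        theta = 0.5 * (lo + hi)
        if np.minimum(theta, lam).sum() > D:
            hi = theta                      # water level too high
        else:
            lo = theta
    theta = 0.5 * (lo + hi)
    Di = np.minimum(theta, lam)
    rate = 0.5 * np.log2(lam / Di).sum()    # inactive dimensions contribute 0
    return rate, Di

rate, Di = reverse_waterfill([4.0, 1.0, 0.25], D=1.5)
print(rate, Di)
```

With this toy input, only the two largest eigenvalues stay "active" (lambda_i > theta); the third dimension is allocated its full variance as distortion and spends no rate, which is exactly the active-dimension count "r" appearing in the abstract's gap bound.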
Distributed Remote Vector Gaussian Source Coding for Wireless Acoustic Sensor Networks
In this paper, we consider the problem of remote vector Gaussian source
coding for a wireless acoustic sensor network. Each node receives messages from
multiple nodes in the network and decodes these messages using its own
measurement of the sound field as side information. The node's measurement and
the estimates of the source resulting from decoding the received messages are
then jointly encoded and transmitted to a neighboring node in the network. We
show that for this distributed source coding scenario, one can encode a
so-called conditional sufficient statistic of the sources instead of jointly
encoding multiple sources. We focus on the case where the node measurements are noisy, linearly mixed combinations of the sources and the acoustic channel mixing matrices are invertible. For this problem, we derive the rate-distortion function for vector Gaussian sources under covariance distortion constraints.
Comment: 10 pages, to be presented at the IEEE DCC'1
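The premise that an invertible mixing loses no information can be checked numerically in a toy linear-Gaussian model: the MMSE estimate of the sources from y = A s + n equals the estimate from the transformed observation A^{-1} y. A small sketch (hypothetical toy covariances and dimensions, not the paper's sensor-network setup):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model: y = A s + n with an invertible mixing matrix A,
# Gaussian source s ~ N(0, Rs) and noise n ~ N(0, Rn).
k = 3
A = rng.standard_normal((k, k)) + 3 * np.eye(k)   # well-conditioned, invertible
Rs = 2.0 * np.eye(k)                              # source covariance
Rn = 0.5 * np.eye(k)                              # noise covariance
y = rng.standard_normal(k)                        # one observed vector

# MMSE estimate of s directly from y: Rs A^T (A Rs A^T + Rn)^{-1} y.
s_from_y = Rs @ A.T @ np.linalg.solve(A @ Rs @ A.T + Rn, y)

# MMSE estimate from t = A^{-1} y = s + A^{-1} n, with noise cov A^{-1} Rn A^{-T}.
Ainv = np.linalg.inv(A)
t = Ainv @ y
Rn_t = Ainv @ Rn @ Ainv.T
s_from_t = Rs @ np.linalg.solve(Rs + Rn_t, t)

print(np.allclose(s_from_y, s_from_t))  # the invertible transform loses nothing
```

The two estimates agree because (Rs + A^{-1} Rn A^{-T})^{-1} = A^T (A Rs A^T + Rn)^{-1} A, so conditioning on t is equivalent to conditioning on y whenever A is invertible.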
Cooperative Transmission for a Vector Gaussian Parallel Relay Network
In this paper, we consider a parallel relay network where two relays
cooperatively help a source transmit to a destination. We assume the source and
the destination nodes are equipped with multiple antennas. Three basic schemes
and their achievable rates are studied: Decode-and-Forward (DF),
Amplify-and-Forward (AF), and Compress-and-Forward (CF). For the DF scheme, the source transmits two private signals, one for each relay, and a common signal for both relays, where dirty paper coding (DPC) is used between the two private streams. The relays make efficient use of the common information to
introduce a proper amount of correlation in the transmission to the
destination. We show that the DF scheme achieves the capacity under certain
conditions. We also show that the CF scheme is asymptotically optimal in the
high relay power limit, regardless of channel ranks. It turns out that the AF scheme also achieves this asymptotic optimality, but only when the relays-to-destination channel is full rank. The relative advantages of the
three schemes are discussed with numerical results.
Comment: 35 pages, 10 figures, submitted to IEEE Transactions on Information Theory
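The DF/AF trade-off described in this abstract can be illustrated with the textbook scalar two-hop relay formulas, a much simpler setting than the paper's MIMO parallel-relay network; `snr1` and `snr2` below are hypothetical source-to-relay and relay-to-destination link SNRs:

```python
import math

def af_rate(snr1, snr2):
    """Textbook two-hop amplify-and-forward rate for scalar links.

    End-to-end SNR of AF is snr1*snr2 / (snr1 + snr2 + 1); the factor
    1/2 accounts for the two-slot (half-duplex) relaying protocol."""
    snr = snr1 * snr2 / (snr1 + snr2 + 1.0)
    return 0.5 * math.log2(1.0 + snr)

def df_rate(snr1, snr2):
    """Two-hop decode-and-forward: limited by the weaker of the two links."""
    return 0.5 * min(math.log2(1.0 + snr1), math.log2(1.0 + snr2))

# As the relay-to-destination SNR grows, both schemes approach the first-hop
# capacity, echoing the high-relay-power asymptotics noted in the abstract.
for snr2 in (1.0, 10.0, 1000.0):
    print(snr2, af_rate(10.0, snr2), df_rate(10.0, snr2))
```

In this scalar two-hop case AF never beats DF, since its cascaded SNR is below min(snr1, snr2), but the gap between the two vanishes as the second-hop SNR grows, which is the regime the abstract's asymptotic claims concern.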