Joint Source-Channel Coding with Time-Varying Channel and Side-Information
Transmission of a Gaussian source over a time-varying Gaussian channel is
studied in the presence of time-varying correlated side information at the
receiver. A block fading model is considered for both the channel and the side
information, whose states are assumed to be known only at the receiver. The
optimality of separate source and channel coding in terms of average end-to-end
distortion is shown when the channel is static while the side information state
follows a discrete or a continuous and quasiconcave distribution. When both the
channel and side information states are time-varying, separate source and
channel coding is suboptimal in general. A partially informed encoder lower
bound is studied by providing the channel state information to the encoder.
Several achievable transmission schemes are proposed based on uncoded
transmission, separate source and channel coding, joint decoding as well as
hybrid digital-analog transmission. Uncoded transmission is shown to be optimal
for a class of continuous and quasiconcave side information state
distributions, while the channel gain may have an arbitrary distribution. To
the best of our knowledge, this is the first example in which uncoded
transmission achieves the optimal performance thanks to the time-varying nature
of the states, while it is suboptimal in the static version of the same
problem. Then, the optimal \emph{distortion exponent}, which quantifies the
exponential decay rate of the expected distortion in the high-SNR regime, is
characterized for Nakagami distributed channel and side information states, and
it is shown to be achieved by hybrid digital-analog and joint decoding schemes
in certain cases, illustrating the suboptimality of pure digital or analog
transmission in general.
Comment: Submitted to the IEEE Transactions on Information Theory
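To make the uncoded figure of merit concrete: for a unit-variance Gaussian source sent uncoded, the receiver's MMSE estimate combines the channel output and the side information, giving per-realization distortion 1/(1 + |h|^2 SNR + 1/sigma_v^2). A minimal Monte Carlo sketch of the average end-to-end distortion follows; the exponential state distributions are illustrative assumptions, not the paper's models:

```python
import numpy as np

rng = np.random.default_rng(0)

def uncoded_expected_distortion(snr, n_samples=100_000):
    """Monte Carlo average end-to-end distortion of uncoded transmission
    of a unit-variance Gaussian source over a block-fading channel, with
    Gaussian side information at the receiver. The channel power gain and
    the side-information quality (inverse noise variance) are drawn from
    illustrative exponential distributions (an assumption for this sketch)."""
    h2 = rng.exponential(size=n_samples)   # channel power gain |h|^2
    q = rng.exponential(size=n_samples)    # side-info quality 1/sigma_v^2
    # MMSE of X ~ N(0,1) from two independent Gaussian observations:
    d = 1.0 / (1.0 + h2 * snr + q)
    return d.mean()

for snr_db in (0, 10, 20):
    snr = 10 ** (snr_db / 10)
    print(f"SNR {snr_db:2d} dB: E[D] = {uncoded_expected_distortion(snr):.4f}")
```

The averaging over both fading states is what distinguishes this quantity from the static problem, where uncoded transmission is suboptimal.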
Wyner-Ziv Coding over Broadcast Channels: Digital Schemes
This paper addresses lossy transmission of a common source over a broadcast
channel when there is correlated side information at the receivers, with
emphasis on the quadratic Gaussian and binary Hamming cases. A digital scheme
that combines ideas from the lossless version of the problem, i.e.,
Slepian-Wolf coding over broadcast channels, and dirty paper coding, is
presented and analyzed. This scheme uses layered coding where the common layer
information is intended for both receivers and the refinement information is
destined only for one receiver. For the quadratic Gaussian case, a quantity
characterizing the overall quality of each receiver is identified in terms of
channel and side information parameters. It is shown that it is more
advantageous to send the refinement information to the receiver with "better"
overall quality. In the case where all receivers have the same overall quality,
the presented scheme becomes optimal. Unlike its lossless counterpart, however,
the problem eludes a complete characterization.
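For background on the quadratic Gaussian case, the single-receiver Wyner-Ziv distortion-rate function D(R) = Var(X|Y) 2^{-2R} underlies schemes of this kind; a small sketch, with parameter values that are illustrative assumptions:

```python
def wyner_ziv_distortion(rate, sigma_x2=1.0, sigma_n2=0.5):
    """Quadratic Gaussian Wyner-Ziv distortion-rate function for a source
    X ~ N(0, sigma_x2) with receiver side information Y = X + N,
    N ~ N(0, sigma_n2): D(R) = Var(X|Y) * 2^(-2R). This is the standard
    single-receiver result; the broadcast setting layers such codes
    across receivers."""
    var_x_given_y = sigma_x2 * sigma_n2 / (sigma_x2 + sigma_n2)
    return var_x_given_y * 2.0 ** (-2.0 * rate)

for r in (0.5, 1.0, 2.0):
    print(f"R = {r}: D = {wyner_ziv_distortion(r):.4f}")
```

At rate zero the distortion equals the conditional variance Var(X|Y), i.e., the receiver falls back on estimating the source from its side information alone.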
Precoded Integer-Forcing Universally Achieves the MIMO Capacity to Within a Constant Gap
An open-loop single-user multiple-input multiple-output communication scheme
is considered where a transmitter, equipped with multiple antennas, encodes the
data into independent streams all taken from the same linear code. The coded
streams are then linearly precoded using the encoding matrix of a perfect
linear dispersion space-time code. At the receiver side, integer-forcing
equalization is applied, followed by standard single-stream decoding. It is
shown that this communication architecture achieves the capacity of any
Gaussian multiple-input multiple-output channel up to a gap that depends only
on the number of transmit antennas.
Comment: To appear in the IEEE Transactions on Information Theory
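The integer-forcing step can be illustrated numerically: each integer coefficient vector a supports a computation rate R(a) = max(0, 0.5 log2(1 / (a^T (I + snr H^T H)^{-1} a))), and a full-rank integer matrix A achieves M times the worst row's rate. The brute-force search over small coefficients below is a toy sketch, not an efficient lattice-reduction implementation:

```python
import itertools

import numpy as np

def integer_forcing_rate(H, snr, coeff_range=2):
    """Brute-force sketch of the integer-forcing achievable sum rate for a
    real M x M channel H. Each integer vector a has effective noise
    a^T (I + snr H^T H)^{-1} a; a full-rank integer matrix A achieves
    M * min over its rows of the corresponding computation rates."""
    M = H.shape[0]
    G = np.linalg.inv(np.eye(M) + snr * H.T @ H)
    cand = [np.array(a)
            for a in itertools.product(range(-coeff_range, coeff_range + 1), repeat=M)
            if any(a)]
    # Rank candidate vectors by effective noise a^T G a (smaller is better).
    cand.sort(key=lambda a: a @ G @ a)
    best = 0.0
    for rows in itertools.combinations(cand[:12], M):  # small shortlist
        A = np.stack(rows)
        if abs(np.linalg.det(A)) < 0.5:  # need a full-rank integer matrix
            continue
        rates = [max(0.0, 0.5 * np.log2(1.0 / (a @ G @ a))) for a in A]
        best = max(best, M * min(rates))
    return best

H = np.array([[1.0, 0.7], [0.3, 1.2]])
snr = 100.0
print(f"IF sum rate: {integer_forcing_rate(H, snr):.2f} bits/use")
print(f"capacity:    {0.5 * np.log2(np.linalg.det(np.eye(2) + snr * H.T @ H)):.2f} bits/use")
```

The precoding by a perfect space-time code, which the paper uses to make the gap to capacity universal over channels, is omitted here; this sketch only shows the equalization-side rate computation.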
Secure Multiterminal Source Coding with Side Information at the Eavesdropper
The problem of secure multiterminal source coding with side information at
the eavesdropper is investigated. This scenario consists of a main encoder
(referred to as Alice) that wishes to compress a single source while
simultaneously satisfying the desired requirements on the distortion level at a
legitimate receiver (referred to as Bob) and the equivocation rate (average
uncertainty) at an eavesdropper (referred to as Eve). It is further assumed
that a (public) rate-limited link exists between Alice and Bob. In this
setting, Eve perfectly observes the information bits sent by Alice to Bob and
has also access to a correlated source which can be used as side information. A
second encoder (referred to as Charlie) helps Bob in estimating Alice's source
by sending a compressed version of its own correlated observation via a
(private) rate-limited link, which is observed only by Bob. The problem at
hand can thus be seen as a unification of the Berger-Tung and
secure source coding setups. Inner and outer bounds on the so-called
rates-distortion-equivocation region are derived. The inner region turns out to be
tight for two cases: (i) uncoded side information at Bob and (ii) lossless
reconstruction of both sources at Bob --secure distributed lossless
compression. Application examples to secure lossy source coding of Gaussian and
binary sources in the presence of Gaussian and binary/ternary (resp.) side
information are also considered. Optimal coding schemes are characterized for
some cases of interest where the statistical differences between the side
information at the decoders and the presence of a non-zero distortion at Bob
can be fully exploited to guarantee secrecy.
Comment: 26 pages, 16 figures, 2 tables
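As a toy illustration of the equivocation metric (average uncertainty): if a Bernoulli(1/2) source bit X reaches Eve through a binary symmetric channel, her residual uncertainty is the binary entropy of the crossover probability. This schematic, which is an assumption for illustration and not one of the paper's examples, shows how a noisier eavesdropper observation raises the equivocation:

```python
import numpy as np

def h2(p):
    """Binary entropy in bits, clipped away from 0 and 1 for stability."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

# Bernoulli(1/2) source X; Eve observes Z = X xor E with E ~ Bernoulli(delta).
# Since Z is uniform and independent of E, H(X|Z) = H(E) = h2(delta):
# the equivocation grows with the noise on Eve's observation.
for delta in (0.05, 0.1, 0.25, 0.5):
    print(f"delta = {delta:4.2f}: equivocation H(X|Z) = {h2(delta):.3f} bits")
```

At delta = 1/2 Eve's observation is useless and the equivocation reaches the full one bit of source entropy, the perfect-secrecy extreme.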