Joint Source-Channel Coding with Time-Varying Channel and Side-Information
Transmission of a Gaussian source over a time-varying Gaussian channel is
studied in the presence of time-varying correlated side information at the
receiver. A block fading model is considered for both the channel and the side
information, whose states are assumed to be known only at the receiver. The
optimality of separate source and channel coding in terms of average end-to-end
distortion is shown when the channel is static while the side information state
follows a discrete or a continuous and quasiconcave distribution. When both the
channel and side information states are time-varying, separate source and
channel coding is suboptimal in general. A partially informed encoder lower
bound is studied by providing the channel state information to the encoder.
Several achievable transmission schemes are proposed based on uncoded
transmission, separate source and channel coding, joint decoding as well as
hybrid digital-analog transmission. Uncoded transmission is shown to be optimal
for a class of continuous and quasiconcave side information state
distributions, while the channel gain may have an arbitrary distribution. To
the best of our knowledge, this is the first example in which uncoded
transmission achieves the optimal performance thanks to the time-varying nature
of the states, while being suboptimal in the static version of the same
problem. Then, the optimal \emph{distortion exponent}, that quantifies the
exponential decay rate of the expected distortion in the high SNR regime, is
characterized for Nakagami distributed channel and side information states, and
it is shown to be achieved by hybrid digital-analog and joint decoding schemes
in certain cases, illustrating the suboptimality of pure digital or analog
transmission in general.
Comment: Submitted to IEEE Transactions on Information Theory.
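The uncoded-transmission results above have a well-known static counterpart: for a unit-variance Gaussian source sent over an AWGN channel with matched bandwidth, simply scaling the source to the power constraint and applying the linear MMSE estimator at the receiver attains distortion 1/(1+SNR). The sketch below simulates only this classical baseline; the time-varying states and receiver side information studied in the paper are deliberately omitted.

```python
import math
import random

def uncoded_transmission_mse(snr, n=200_000, seed=0):
    """Simulate uncoded (analog) transmission of a unit-variance Gaussian
    source over a static AWGN channel with matched bandwidth.

    Each source sample is scaled to meet the power constraint P = snr
    (unit noise variance), sent directly, and estimated at the receiver
    with the linear MMSE filter. The resulting distortion should match
    the classical value 1 / (1 + SNR).
    """
    rng = random.Random(seed)
    scale = math.sqrt(snr)           # amplifier gain: transmit power = snr
    mmse_gain = scale / (snr + 1.0)  # E[S | Y] coefficient for Y = scale*S + N
    total_sq_err = 0.0
    for _ in range(n):
        s = rng.gauss(0.0, 1.0)              # unit-variance Gaussian source
        y = scale * s + rng.gauss(0.0, 1.0)  # AWGN channel, unit noise power
        total_sq_err += (s - mmse_gain * y) ** 2
    return total_sq_err / n

snr = 9.0
print("simulated distortion:", uncoded_transmission_mse(snr))
print("theoretical 1/(1+SNR):", 1.0 / (1.0 + snr))
```

In this matched Gaussian setting the uncoded scheme already meets the separation-based optimum, which is what makes its optimality (or suboptimality) in the time-varying settings of the paper a nontrivial question.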
Side information aware source and channel coding in wireless networks
Signals in communication networks exhibit significant correlation, which can stem from the physical nature of the underlying sources, or can be created within the system. Current layered network architectures, in which, based on Shannon’s separation theorem, data is compressed and transmitted over independent bit-pipes, are in general not able to exploit such correlation efficiently. Moreover, this strictly layered architecture was developed for wired networks and ignores the broadcast and highly dynamic nature of the wireless medium, creating a bottleneck in wireless network design. Technologies that exploit correlated information and go beyond the layered network architecture can become a key feature of future wireless networks, as information theory promises significant gains. In this thesis, we study, from an information-theoretic perspective, three distinct, yet fundamental, problems involving the availability of correlated information in wireless networks and develop novel communication techniques to exploit it efficiently. We first look at two joint source-channel coding problems involving the lossy transmission of Gaussian sources in a multi-terminal and a time-varying setting in which correlated side information is present in the network. In these two problems, the optimality of Shannon’s separation breaks down and separate source and channel coding is shown to perform poorly compared to the proposed joint source-channel coding designs, which are shown to achieve the optimal performance in some setups. Then, we characterize the capacity of a class of orthogonal relay channels in the presence of channel side information at the destination, and show that joint decoding and compression of the received signal at the relay is required to optimally exploit the available side information.
Our results in these three different scenarios emphasize the benefits of exploiting correlated side information at the destination when designing a communication system, even though the nature of the side information and the performance measure in the three scenarios are quite different.
Secure Transmission of Sources over Noisy Channels with Side Information at the Receivers
This paper investigates the problem of source-channel coding for secure
transmission with arbitrarily correlated side informations at both receivers.
This scenario consists of an encoder (referred to as Alice) that wishes to
compress a source and send it through a noisy channel to a legitimate receiver
(referred to as Bob). In this context, Alice must simultaneously satisfy the
desired requirements on the distortion level at Bob, and the equivocation rate
at the eavesdropper (referred to as Eve). This setting can be seen as a
generalization of the problems of secure source coding with (uncoded) side
information at the decoders, and the wiretap channel. A general outer bound on
the rate-distortion-equivocation region, as well as an inner bound based on a
pure digital scheme, is derived for arbitrary channels and side informations.
In some special cases of interest, it is proved that this digital scheme is
optimal and that separation holds. However, it is also shown through a simple
counterexample with a binary source that a pure analog scheme can be optimal
and outperform the digital one. Based on these observations, and
assuming matched bandwidth, a novel hybrid digital/analog scheme that aims to
gather the advantages of both digital and analog ones is then presented. In the
quadratic Gaussian setup when side information is only present at the
eavesdropper, this strategy is proved to be optimal. Furthermore, it
outperforms both digital and analog schemes, and cannot be achieved via
time-sharing. By means of appropriate coding, the presence of any
statistical difference among the side informations, the channel noises, and the
distortion at Bob can be fully exploited in terms of secrecy.
Comment: To appear in IEEE Transactions on Information Theory.
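As background for the quadratic Gaussian setting mentioned above: when a decoder has side information $Y$ jointly Gaussian with the source $S$, the classical (non-secure) Wyner-Ziv rate-distortion function is

```latex
R_{\mathrm{WZ}}(D) \;=\; \frac{1}{2}\,\log^{+}\!\frac{\sigma^2_{S|Y}}{D},
\qquad
\sigma^2_{S|Y} \;=\; \sigma_S^2\,\bigl(1-\rho^2\bigr),
```

where $\rho$ is the correlation between $S$ and $Y$, and $\log^{+}(x)=\max(\log x,0)$. The paper's rate-distortion-equivocation region adds a secrecy dimension (the equivocation rate at Eve) on top of this baseline trade-off; the formula above is standard material, not the paper's region itself.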
Robust Signaling for Bursty Interference
This paper studies a bursty interference channel, where the presence/absence
of interference is modeled by a block-i.i.d.\ Bernoulli process that stays
constant for a fixed number of symbols (referred to as a coherence block) and
then changes independently to a new state. We consider both a quasi-static
setup, where the interference state remains constant during the whole
transmission of the codeword, and an ergodic setup, where a codeword spans
several coherence blocks. For the quasi-static setup, we study the largest rate
of a coding strategy that provides reliable communication at a basic rate and
allows an increased (opportunistic) rate when there is no interference. For the
ergodic setup, we study the largest achievable rate. We study how non-causal
knowledge of the interference state, referred to as channel-state information
(CSI), affects the achievable rates. We derive converse and achievability
bounds for (i) local CSI at the receiver-side only; (ii) local CSI at the
transmitter- and receiver-side, and (iii) global CSI at all nodes. Our bounds
allow us to identify when interference burstiness is beneficial and in which
scenarios global CSI outperforms local CSI. The joint treatment of the
quasi-static and ergodic setup further allows for a thorough comparison of
these two setups.
Comment: 67 pages, 39 figures.
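As a toy illustration of why burstiness can help (a back-of-the-envelope rate calculation, not one of the paper's bounds): with receiver CSI, a decoder that treats interference as extra Gaussian noise only in the blocks where it is actually present averages a strictly higher rate than a conservative design that assumes interference is always on. Assuming unit noise power, signal power `s`, and interference of power `i` present with probability `p`:

```python
import math

def rate_state_aware(p, s, i):
    """Ergodic rate with receiver CSI: interference is treated as extra
    Gaussian noise only in the fraction p of blocks where it is present."""
    return (1 - p) * math.log2(1 + s) + p * math.log2(1 + s / (1 + i))

def rate_always_on(s, i):
    """Conservative design assuming interference in every block."""
    return math.log2(1 + s / (1 + i))

p, s, i = 0.5, 10.0, 10.0
print("state-aware rate:", rate_state_aware(p, s, i))  # strictly larger
print("always-on rate:  ", rate_always_on(s, i))
```

This simple gap is what CSI lets the system capture; the paper's converse and achievability bounds quantify how much of it survives under the various local/global CSI assumptions.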
Information theoretic treatment of tripartite systems and quantum channels
A Holevo measure is used to discuss how much information about a given POVM
on one system is present in a second system, and how this influences the
presence or absence of information about a different POVM on the first system
in a third system. The main goal is to extend information theorems for mutually
unbiased bases or general bases to arbitrary POVMs, and especially to
generalize "all-or-nothing" theorems about information located in tripartite
systems to the case of \emph{partial information}, in the form of quantitative
inequalities. Some of the inequalities can be viewed as entropic uncertainty
relations that apply in the presence of quantum side information, as in recent
work by Berta et al. [Nature Physics 6, 659 (2010)]. All of the results also
apply to quantum channels: e.g., if a channel accurately transmits certain
POVMs, its complementary channel will necessarily be noisy for certain other
POVMs. While the inequalities are valid for mixed states of tripartite systems,
restricting to pure states leads to the basis-invariance of the difference
between the information about the first system contained in the other two.
Comment: 21 pages. An earlier version of this paper attempted to prove our
main uncertainty relation, Theorem 5, using the achievability of the Holevo
quantity in a coding task, an approach that ultimately failed because it did
not account for locking of classical correlations, e.g. see [DiVincenzo et
al. PRL. 92, 067902 (2004)]. In the latest version, we use a very different
approach to prove Theorem 5.
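The entropic uncertainty relation with quantum side information cited above (Berta et al., in its tripartite form) can be stated as follows: for measurements in bases $\{|\phi_x\rangle\}$ and $\{|\psi_z\rangle\}$ on system $A$ of a state $\rho_{ABC}$,

```latex
H(X|B) + H(Z|C) \;\ge\; \log_2 \frac{1}{c},
\qquad
c \;=\; \max_{x,z}\,\bigl|\langle \phi_x | \psi_z \rangle\bigr|^2 .
```

The more information system $B$ holds about the $X$ outcome, the less system $C$ can hold about the $Z$ outcome. The paper's contribution is to generalize this type of trade-off from measurement bases to arbitrary POVMs in the form of quantitative inequalities.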
Cross Layer Coding Schemes for Broadcasting and Relaying
This dissertation is divided into two main topics. In the first topic, we study the
joint source-channel coding problem of transmitting an analog source over a Gaussian
channel in two cases: (i) the presence of interference known only to the transmitter, and (ii) the presence of side information about the source known only to the
receiver. We introduce hybrid digital-analog forms of the Costa and Wyner-Ziv coding schemes. We present random-coding-based schemes in contrast to the lattice-based
schemes proposed by Kochman and Zamir. We also discuss superimposed digital and
analog schemes for the above problems, which show that there are infinitely many
schemes for achieving the optimal distortion for these problems. This provides an
extension of the schemes proposed by Bross and others to the interference/source
side information case. The result of this study shows that the proposed hybrid digital-analog schemes are more robust to a mismatch in channel signal-to-noise ratio
(SNR) than solutions based on separate source coding followed by channel coding. We
then discuss applications of the hybrid digital-analog schemes for transmitting under
a channel SNR mismatch and for broadcasting a Gaussian source with bandwidth
compression. We also study applications of joint source-channel coding schemes for
a cognitive setup and also for the setup of transmitting an analog Gaussian source
over a Gaussian channel, in the presence of an eavesdropper.
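The Costa result underlying case (i) above is the classical dirty-paper coding capacity: for a Gaussian channel $Y = X + V + N$ with input power constraint $P$, noise variance $N_0$, and interference $V$ known non-causally at the transmitter only,

```latex
C \;=\; \frac{1}{2}\,\log_2\!\Bigl(1 + \frac{P}{N_0}\Bigr),
```

independent of the interference power, i.e. the same capacity as if $V$ were absent. The dissertation develops hybrid digital-analog forms of this scheme (and of its Wyner-Ziv dual for receiver side information), which is where the robustness to SNR mismatch comes from.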
In the next topic, we consider joint physical layer coding and network coding
solutions for bi-directional relaying. We consider a communication system where two transmitters wish to exchange information through a central relay. The transmitter
and relay nodes exchange data over synchronized, average power constrained additive
white Gaussian noise channels. We propose structured coding schemes using lattices
for this problem. We study two decoding approaches, namely lattice decoding and
minimum angle decoding. Both decoding schemes can be shown to achieve the
upper bound at high SNRs. The proposed scheme can be thought of as a joint
physical-layer/network-layer code which outperforms other recently proposed analog network
coding schemes. We also study extensions of the bi-directional relay for the case with
asymmetric channel links and also for the multi-hop case. The result of this study
shows that structured coding schemes using lattices perform close to the upper bound
for the above communication system models.
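A quick numeric check of the "close to the upper bound at high SNRs" claim. The per-user exchange rate ½log₂(½+SNR) used below is the expression reported in the related nested-lattice two-way relaying literature; treating it as the rate of the dissertation's scheme is an assumption here, not a quotation. It is compared against the per-user cut-set bound ½log₂(1+SNR):

```python
import math

def lattice_rate(snr):
    """Assumed per-user exchange rate for nested-lattice coding over the
    Gaussian two-way relay channel (from related literature, not quoted
    from the dissertation)."""
    return 0.5 * math.log2(0.5 + snr)

def cutset_bound(snr):
    """Per-user cut-set upper bound for the same exchange."""
    return 0.5 * math.log2(1.0 + snr)

# The gap is at most 1/2 bit at any SNR and vanishes as SNR grows.
for snr_db in (0, 10, 20, 30):
    snr = 10.0 ** (snr_db / 10.0)
    print(snr_db, "dB  gap (bits):", cutset_bound(snr) - lattice_rate(snr))
```

Under this assumption the gap never exceeds half a bit and shrinks to zero at high SNR, consistent with the claim that both decoding approaches meet the upper bound asymptotically.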