Correlated Sources In Distributed Networks - Data Transmission, Common Information Characterization and Inferencing
Correlation is often present among observations in a distributed system. This thesis deals with various design issues when correlated data are observed at distributed terminals, including: communicating correlated sources over interference channels, characterizing the common information among dependent random variables, and testing the presence of dependence among observations.
It is well known that separate source and channel coding is optimal for point-to-point communication. However, this is not the case for multi-terminal communication. In this thesis, we study the problem of communicating correlated sources over interference channels (ICs), for both the lossless and the lossy case. For the lossless case, a sufficient condition is found using the techniques of random source partition and correlation-preserving codeword generation. The sufficient condition reduces to the Han-Kobayashi achievable rate region for the IC with independent observations. Moreover, the proposed coding scheme is optimal for transmitting a special class of correlated sources over a class of deterministic interference channels. We then study the general case of lossy transmission of two correlated sources over a two-user discrete memoryless interference channel (DMIC). An achievable distortion region is obtained and Gaussian examples are studied.
The second topic is the generalization of Wyner's definition of the common information of a pair of random variables to that of N random variables. Coding theorems are obtained to show that the operational meanings of the common information of two random variables carry over to that of N random variables. We establish a monotone property of Wyner's common information which is in contrast to other notions of common information, specifically Shannon's mutual information and Gács and Körner's common randomness. We then extend Wyner's common information to continuous random variables and provide an operational meaning using the Gray-Wyner network with lossy source coding. We show that Wyner's common information equals the smallest common message rate when the total rate is arbitrarily close to the rate-distortion function with joint decoding.
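For context: in the bivariate Gaussian case, Wyner's common information has a known closed form, C(X;Y) = ½ log((1+ρ)/(1−ρ)), which always dominates the mutual information I(X;Y) = −½ log(1−ρ²). The snippet below (an illustration added here, not part of the abstract) verifies this dominance numerically:

```python
import math

def wyner_common_info_gaussian(rho):
    """Known closed form of Wyner's common information for a bivariate
    Gaussian pair with correlation coefficient rho, in nats."""
    return 0.5 * math.log((1 + rho) / (1 - rho))

def mutual_info_gaussian(rho):
    """Shannon mutual information I(X;Y) = -0.5*log(1 - rho^2), in nats."""
    return -0.5 * math.log(1 - rho ** 2)

for rho in (0.1, 0.5, 0.9):
    C = wyner_common_info_gaussian(rho)
    I = mutual_info_gaussian(rho)
    # Wyner's common information never falls below the mutual information:
    # their ratio of arguments is (1+rho)^2 >= 1.
    assert C >= I
    print(f"rho={rho}: C={C:.4f} nats, I={I:.4f} nats")
```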
Finally, we consider the problem of distributed testing of statistical independence under communication constraints. Focusing on the Gaussian case because of its tractability, we characterize optimal scalar quantizers for distributed independence testing, where optimality is considered both in the finite-sample regime and in the asymptotic regime.
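The abstract gives no construction, but the simplest scalar quantizer one might consider is a one-bit sign quantizer at each terminal. For a bivariate Gaussian pair, the textbook arcsine law (a standard Gaussian fact, not a claim of the thesis) ties the probability that the two quantized bits agree to the correlation coefficient ρ, which is exactly the kind of dependence a distributed test can exploit:

```python
import math

def sign_agreement_prob(rho):
    """For a zero-mean bivariate Gaussian pair with correlation rho,
    P(sign X = sign Y) = 1/2 + arcsin(rho)/pi (Gaussian arcsine law)."""
    return 0.5 + math.asin(rho) / math.pi

# Under independence (rho = 0) agreement is exactly 1/2; it grows
# monotonically with rho, so the agreement rate separates the hypotheses.
probs = [sign_agreement_prob(r) for r in (0.0, 0.3, 0.6, 0.9)]
assert probs[0] == 0.5
assert all(a < b for a, b in zip(probs, probs[1:]))
```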
Communicating Correlated Sources Over an Interference Channel
A new coding technique, based on fixed block-length codes, is
proposed for the problem of communicating a pair of correlated sources over a
two-user interference channel. Its performance is analyzed to derive a new set
of sufficient conditions, which are proven to be strictly less binding than
the best currently known, due to Liu and Chen [Dec. 2011]. Our findings
are inspired by Dueck's example [Mar. 1981].
Source-Channel Coding Theorems for the Multiple-Access Relay Channel
We study reliable transmission of arbitrarily correlated sources over
multiple-access relay channels (MARCs) and multiple-access broadcast relay
channels (MABRCs). In MARCs only the destination is interested in
reconstructing the sources, while in MABRCs both the relay and the destination
want to reconstruct them. In addition to arbitrary correlation among the source
signals at the users, both the relay and the destination have side information
correlated with the source signals. Our objective is to determine whether a
given pair of sources can be losslessly transmitted to the destination for a
given number of channel symbols per source sample, defined as the
source-channel rate. Sufficient conditions for reliable communication based on
operational separation, as well as necessary conditions on the achievable
source-channel rates are characterized. Since operational separation is
generally not optimal for MARCs and MABRCs, sufficient conditions for reliable
communication using joint source-channel coding schemes based on a combination
of the correlation preserving mapping technique with Slepian-Wolf source coding
are also derived. For correlated sources transmitted over fading Gaussian MARCs
and MABRCs, we present conditions under which separation (i.e., separate and
stand-alone source and channel codes) is optimal. This is the first time
optimality of separation is proved for MARCs and MABRCs.
Comment: Accepted to the IEEE Transactions on Information Theory.
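The Slepian-Wolf component of the scheme relies on the classical lossless distributed-coding region: R1 ≥ H(U1|U2), R2 ≥ H(U2|U1), R1 + R2 ≥ H(U1, U2). A minimal feasibility check, using a made-up joint pmf (not from the paper), can be sketched as:

```python
import math

def H(probs):
    """Entropy in bits of a probability list."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint pmf of (U1, U2) on {0,1}^2, chosen for illustration.
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

H12 = H(p.values())
p1 = [sum(v for (a, b), v in p.items() if a == x) for x in (0, 1)]
p2 = [sum(v for (a, b), v in p.items() if b == y) for y in (0, 1)]
H1_given_2 = H12 - H(p2)  # H(U1|U2) = H(U1,U2) - H(U2)
H2_given_1 = H12 - H(p1)  # H(U2|U1) = H(U1,U2) - H(U1)

def in_slepian_wolf_region(R1, R2):
    """Rate-pair feasibility for lossless distributed source coding."""
    return R1 >= H1_given_2 and R2 >= H2_given_1 and R1 + R2 >= H12

assert in_slepian_wolf_region(1.0, 1.0)      # generous rates are feasible
assert not in_slepian_wolf_region(0.1, 0.1)  # violates the sum-rate bound
```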
Network Information Flow with Correlated Sources
In this paper, we consider a network communications problem in which multiple
correlated sources must be delivered to a single data collector node, over a
network of noisy independent point-to-point channels. We prove that perfect
reconstruction of all the sources at the sink is possible if and only if, for
all partitions of the network nodes into two subsets S and S^c such that the
sink is always in S^c, we have that H(U_S|U_{S^c}) < \sum_{i\in S,j\in S^c}
C_{ij}. Our main finding is that in this setup a general source/channel
separation theorem holds, and that Shannon information behaves as a classical
network flow, identical in nature to the flow of water in pipes. At first
glance, it might seem surprising that separation holds in a fairly general
network situation like the one we study. A closer look, however, reveals that
the reason for this is that our model allows only for independent
point-to-point channels between pairs of nodes, and not multiple-access and/or
broadcast channels, for which separation is well known not to hold. This
``information as flow'' view provides an algorithmic interpretation for our
results, among which perhaps the most important one is the optimality of
implementing codes using a layered protocol stack.
Comment: Final version, to appear in the IEEE Transactions on Information Theory; contains (very) minor changes based on the last round of review.
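The cut-set condition H(U_S|U_{S^c}) < \sum_{i\in S, j\in S^c} C_{ij} is easy to evaluate numerically for a small network. The sketch below uses hypothetical capacities and conditional source entropies (illustrative values, not from the paper) and checks every cut that separates source nodes from the sink:

```python
# Hypothetical 3-node network: sources at nodes 1 and 2, sink at node 3.
# C[(i, j)] is the capacity of the independent channel from i to j.
C = {(1, 2): 0.5, (1, 3): 1.0, (2, 3): 1.2, (2, 1): 0.5}

# Hypothetical conditional source entropies H(U_S | U_{S^c}), one per cut S
# with the sink in the complement S^c (illustrative values).
H_cond = {frozenset({1}): 0.8, frozenset({2}): 0.9, frozenset({1, 2}): 1.6}

def cut_capacity(S, nodes=frozenset({1, 2, 3})):
    """Total capacity of channels crossing from S into its complement."""
    Sc = nodes - set(S)
    return sum(C.get((i, j), 0.0) for i in S for j in Sc)

# The theorem: perfect reconstruction is possible iff, for every cut S with
# the sink in S^c, H(U_S|U_{S^c}) is below the crossing capacity.
feasible = all(H_cond[S] < cut_capacity(S) for S in H_cond)
print("all cut-set conditions satisfied:", feasible)
```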
A Unified Approach for Network Information Theory
In this paper, we take a unified approach for network information theory and
prove a coding theorem, which can recover most of the achievability results in
network information theory that are based on random coding. The final
single-letter expression has a very simple form, which was made possible by
many novel elements such as a unified framework that represents various network
problems in a simple and unified way, a unified coding strategy that consists
of a few basic ingredients but can emulate many known coding techniques if
needed, and new proof techniques beyond the use of standard covering and
packing lemmas. For example, in our framework, sources, channels, states and
side information are treated in a unified way and various constraints such as
cost and distortion constraints are unified as a single joint-typicality
constraint.
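The "single joint-typicality constraint" above refers to the standard notion of strong joint typicality: the empirical type of a sequence pair must be close to the target joint pmf. A minimal membership check (with a made-up pmf and sequences, purely for illustration) can be sketched as:

```python
from collections import Counter

def jointly_typical(x_seq, y_seq, pmf, eps):
    """Strong (eps-) joint typicality: the empirical joint type of
    (x_seq, y_seq) must be within eps of pmf on every support symbol,
    and no symbol outside the support of pmf may occur."""
    n = len(x_seq)
    counts = Counter(zip(x_seq, y_seq))
    within_eps = all(abs(counts.get(sym, 0) / n - p) <= eps
                     for sym, p in pmf.items())
    on_support = all(sym in pmf for sym in counts)
    return within_eps and on_support

# Hypothetical joint pmf, used only for illustration.
pmf = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.25}
x = [0, 0, 0, 0, 1, 1, 0, 0]
y = [0, 0, 1, 1, 0, 0, 0, 0]
# Empirical type: (0,0) -> 0.5, (0,1) -> 0.25, (1,0) -> 0.25, so typical.
assert jointly_typical(x, y, pmf, eps=0.05)
```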
Our theorem can be useful in proving many new achievability results easily
and in some cases gives simpler rate expressions than those obtained using
conventional approaches. Furthermore, our unified coding can strictly
outperform existing schemes. For example, we obtain a generalized
decode-compress-amplify-and-forward bound as a simple corollary of our main
theorem and show it strictly outperforms previously known coding schemes. Using
our unified framework, we formally define and characterize three types of
network duality based on channel input-output reversal and network flow
reversal combined with packing-covering duality.
Comment: 52 pages, 7 figures, submitted to the IEEE Transactions on Information Theory; a shorter version will appear in Proc. IEEE ISIT 201