Network Information Flow with Correlated Sources
In this paper, we consider a network communications problem in which multiple
correlated sources must be delivered to a single data collector node, over a
network of noisy independent point-to-point channels. We prove that perfect
reconstruction of all the sources at the sink is possible if and only if, for
all partitions of the network nodes into two subsets S and S^c such that the
sink is always in S^c, we have that H(U_S|U_{S^c}) < \sum_{i\in S,j\in S^c}
C_{ij}. Our main finding is that in this setup a general source/channel
separation theorem holds, and that Shannon information behaves as a classical
network flow, identical in nature to the flow of water in pipes. At first
glance, it might seem surprising that separation holds in a fairly general
network situation like the one we study. A closer look, however, reveals that
the reason for this is that our model allows only for independent
point-to-point channels between pairs of nodes, and not multiple-access and/or
broadcast channels, for which separation is well known not to hold. This
``information as flow'' view provides an algorithmic interpretation for our
results, among which perhaps the most important one is the optimality of
implementing codes using a layered protocol stack.
Comment: Final version, to appear in the IEEE Transactions on Information
Theory -- contains (very) minor changes based on the last round of review.
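The cut-set condition above can be checked mechanically for a small network. The following is a minimal sketch, not code from the paper: the joint pmf, node labels, and capacity values are assumed toy numbers, and the entropies are computed directly from the pmf.

```python
from itertools import combinations
import math

# Toy joint pmf of two correlated binary sources U1, U2 (assumed values).
pmf = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def H(p):
    # Shannon entropy in bits of a pmf given as {outcome: prob}.
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def marginal(p, srcs):
    # Marginal pmf of the sources listed in srcs (source i -> tuple index i-1).
    out = {}
    for u, q in p.items():
        key = tuple(u[i - 1] for i in srcs)
        out[key] = out.get(key, 0.0) + q
    return out

def cond_entropy(p, S, Sc):
    # H(U_S | U_Sc) = H(U_S, U_Sc) - H(U_Sc); the sink observes no source.
    if not Sc:
        return H(marginal(p, S))
    return H(marginal(p, sorted(list(S) + list(Sc)))) - H(marginal(p, Sc))

# Independent point-to-point capacities C[(i, j)] in bits per use (assumed).
C = {(1, 0): 1.0, (2, 0): 1.0, (1, 2): 0.3, (2, 1): 0.3}

sources = [1, 2]  # node 0 is the sink and must lie in S^c
feasible = True
for r in range(1, len(sources) + 1):
    for S in combinations(sources, r):
        Sc = [n for n in sources if n not in S]
        # Capacity of the cut: all channels crossing from S into S^c (incl. the sink).
        cut = sum(c for (i, j), c in C.items() if i in S and j not in S)
        if cond_entropy(pmf, S, Sc) >= cut:
            feasible = False
print("perfect reconstruction possible:", feasible)
```

With these numbers every cut condition holds, so the check reports feasibility; shrinking any sink-facing capacity below the joint entropy of both sources makes the grand cut S = {1, 2} fail first.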
Remote Source Coding under Gaussian Noise : Dueling Roles of Power and Entropy Power
The distributed remote source coding (so-called CEO) problem is studied in
the case where the underlying source, not necessarily Gaussian, has finite
differential entropy and the observation noise is Gaussian. The main result is
a new lower bound for the sum-rate-distortion function under arbitrary
distortion measures. When specialized to the case of mean-squared error, it is
shown that the bound exactly mirrors a corresponding upper bound, except that
the upper bound has the source power (variance) whereas the lower bound has the
source entropy power. Bounds exhibiting this pleasing duality of power and
entropy power have been well known for direct and centralized source coding
since Shannon's work. While the bounds hold generally, their value is most
pronounced when interpreted as a function of the number of agents in the CEO
problem.
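The power/entropy-power duality referred to above rests on the standard inequality N(X) = e^{2h(X)}/(2*pi*e) <= Var(X), with equality if and only if X is Gaussian. A quick numerical check, using assumed example distributions rather than anything from the paper:

```python
import math

def entropy_power(h):
    # N(X) = exp(2 h(X)) / (2*pi*e), with h in nats.
    return math.exp(2 * h) / (2 * math.pi * math.e)

# Gaussian with variance s2: h = 0.5*ln(2*pi*e*s2), so N(X) = s2 (equality).
s2 = 2.0
h_gauss = 0.5 * math.log(2 * math.pi * math.e * s2)
print(entropy_power(h_gauss), s2)

# Uniform on [0, 1]: h = ln(1) = 0 nats, variance = 1/12.
# Entropy power falls strictly below the variance, as the inequality requires.
print(entropy_power(0.0), 1 / 12)
```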
Sending a Bi-Variate Gaussian over a Gaussian MAC
We study the power versus distortion trade-off for the distributed
transmission of a memoryless bi-variate Gaussian source over a two-to-one
average-power limited Gaussian multiple-access channel. In this problem, each
of two separate transmitters observes a different component of a memoryless
bi-variate Gaussian source. The two transmitters then describe their source
component to a common receiver via an average-power constrained Gaussian
multiple-access channel. From the output of the multiple-access channel, the
receiver wishes to reconstruct each source component with the least possible
expected squared-error distortion. Our interest is in characterizing the
distortion pairs that are simultaneously achievable on the two source
components.
We present sufficient conditions and necessary conditions for the
achievability of a distortion pair. These conditions are expressed as a
function of the channel signal-to-noise ratio (SNR) and of the source
correlation. In several cases the necessary conditions and sufficient
conditions are shown to agree. In particular, we show that if the channel SNR
is below a certain threshold, then an uncoded transmission scheme is optimal.
We also derive the precise high-SNR asymptotics of an optimal scheme.
Comment: Submitted to the IEEE Transactions on Information Theory.
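To see why uncoded transmission is a natural candidate at low SNR, its distortion can be computed in closed form. A minimal sketch under assumed unit-variance sources and a scalar linear-MMSE receiver (the threshold analysis itself is in the paper):

```python
import math

def uncoded_distortion(P, N, rho, sigma2=1.0):
    # Sender i transmits X_i = sqrt(P/sigma2) * S_i, meeting the power limit P;
    # the channel output is Y = X1 + X2 + Z with Z ~ N(0, N).
    a = math.sqrt(P / sigma2)
    cov_s1_y = a * sigma2 * (1 + rho)      # E[S1 * Y]
    var_y = 2 * P * (1 + rho) + N          # E[Y^2]
    # Linear-MMSE estimate of S1 from Y; return its mean-squared error.
    return sigma2 - cov_s1_y ** 2 / var_y

# Distortion falls as the source correlation rises: each sender's signal
# also carries information about the other component.
print(uncoded_distortion(P=1.0, N=1.0, rho=0.0))  # 2/3
print(uncoded_distortion(P=1.0, N=1.0, rho=0.5))  # 0.4375
```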
Competitive Privacy in the Smart Grid: An Information-theoretic Approach
Advances in sensing and communication capabilities as well as power industry
deregulation are driving the need for distributed state estimation in the smart
grid at the level of the regional transmission organizations (RTOs). This leads
to a new competitive privacy problem amongst the RTOs since there is a tension
between sharing data to ensure network reliability (utility/benefit to all
RTOs) and withholding data for profitability and privacy reasons. The resulting
tradeoff between utility, quantified via fidelity of its state estimate at each
RTO, and privacy, quantified via the leakage of the state of one RTO at other
RTOs, is captured precisely using a lossy source coding formulation. For a
two-RTO network, it is shown that the set of all
feasible utility-privacy pairs can be achieved via a single round of
communication when each RTO communicates taking into account the correlation
between the measured data at both RTOs. The lossy source coding problem and
solution developed here are also of independent interest.
Comment: Accepted for publication and presentation at the IEEE SmartGridComm 201
Lecture Notes on Network Information Theory
These lecture notes have been converted to a book titled Network Information
Theory published recently by Cambridge University Press. This book provides a
significantly expanded exposition of the material in the lecture notes as well
as problems and bibliographic notes at the end of each chapter. The authors are
currently preparing a set of slides based on the book that will be posted in
the second half of 2012. More information about the book can be found at
http://www.cambridge.org/9781107008731/. The previous (and obsolete) version of
the lecture notes can be found at http://arxiv.org/abs/1001.3404v4/