Slepian-Wolf Coding Over Cooperative Relay Networks
This paper deals with the problem of multicasting a set of discrete
memoryless correlated sources (DMCS) over a cooperative relay network.
Necessary conditions with cut-set interpretation are presented. A \emph{Joint
source-Wyner-Ziv encoding/sliding window decoding} scheme is proposed, in which
decoding at each receiver is done with respect to an ordered partition of other
nodes. For each ordered partition a set of feasibility constraints is derived.
Then, utilizing the sub-modular property of the entropy function and a novel
geometrical approach, the results of the different ordered partitions are
consolidated, leading to sufficient conditions for our problem. The proposed
scheme achieves operational separation between source coding and channel
coding. It is shown that the sufficient conditions are also necessary in two
special cooperative networks, namely, the Aref network and the finite-field
deterministic network. Also, in Gaussian cooperative networks, it is shown that
reliable transmission is feasible for all DMCS whose Slepian-Wolf region
intersects the cut-set bound region within a constant number of bits. In
particular, all results of the paper are specialized to obtain an achievable
rate region for cooperative relay networks, which include relay networks and
two-way relay networks as special cases.
Comment: IEEE Transactions on Information Theory, accepted
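The Slepian-Wolf region mentioned above can be illustrated with a small numeric sketch. The joint pmf below is hypothetical (not from the paper); the check tests whether a rate pair covers the conditional entropies and the joint entropy of two correlated binary sources:

```python
import math

def H(pmf):
    """Shannon entropy in bits of a probability mass function."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# Hypothetical joint pmf p[x][y] of two correlated binary sources X and Y.
p = [[0.4, 0.1],
     [0.1, 0.4]]

pX = [sum(row) for row in p]             # marginal of X
pY = [sum(col) for col in zip(*p)]       # marginal of Y
H_XY = H([q for row in p for q in row])  # joint entropy H(X,Y)
H_X_given_Y = H_XY - H(pY)               # H(X|Y)
H_Y_given_X = H_XY - H(pX)               # H(Y|X)

def in_slepian_wolf_region(Rx, Ry):
    """A rate pair is achievable iff each rate covers the conditional
    entropy of its source and the sum rate covers the joint entropy."""
    return Rx >= H_X_given_Y and Ry >= H_Y_given_X and Rx + Ry >= H_XY
```

With this pmf, H(X|Y) is about 0.72 bits, so (1.0, 1.0) is feasible while (0.5, 0.5) is not.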
Cooperative Transmission for a Vector Gaussian Parallel Relay Network
In this paper, we consider a parallel relay network where two relays
cooperatively help a source transmit to a destination. We assume the source and
the destination nodes are equipped with multiple antennas. Three basic schemes
and their achievable rates are studied: Decode-and-Forward (DF),
Amplify-and-Forward (AF), and Compress-and-Forward (CF). For the DF scheme, the
source transmits a common signal for both relays and two private signals, one
for each relay, with dirty paper coding (DPC) applied between the two private
streams. The relays make efficient use of the common information to
introduce a proper amount of correlation in the transmission to the
destination. We show that the DF scheme achieves the capacity under certain
conditions. We also show that the CF scheme is asymptotically optimal in the
high relay power limit, regardless of channel ranks. The AF scheme also
achieves this asymptotic optimality, but only when the relay-to-destination
channel is full rank. The relative advantages of the
three schemes are discussed with numerical results.
Comment: 35 pages, 10 figures, submitted to IEEE Transactions on Information Theory
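For intuition on the high-relay-power asymptotics, here is a minimal sketch for a single scalar half-duplex relay (a simplification of the paper's vector parallel relay setting; the SNR values are made up):

```python
import math

def af_rate(snr_sr, snr_rd):
    """Half-duplex amplify-and-forward: the two hops compose into an
    effective end-to-end SNR of g1*g2 / (g1 + g2 + 1)."""
    snr_eff = snr_sr * snr_rd / (snr_sr + snr_rd + 1)
    return 0.5 * math.log2(1 + snr_eff)

def df_rate(snr_sr, snr_rd):
    """Half-duplex decode-and-forward: limited by the weaker hop."""
    return 0.5 * min(math.log2(1 + snr_sr), math.log2(1 + snr_rd))

# As relay power (snr_rd) grows, AF approaches the source-to-relay
# bottleneck rate 0.5 * log2(1 + snr_sr), mirroring the asymptotic claim.
for g2 in (10.0, 100.0, 10000.0):
    print(af_rate(10.0, g2))
```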
Computation Over Gaussian Networks With Orthogonal Components
Function computation of arbitrarily correlated discrete sources over Gaussian
networks with orthogonal components is studied. Two classes of functions are
considered: the arithmetic sum function and the type function. The arithmetic
sum function in this paper is defined as a set of multiple weighted arithmetic
sums, which includes averaging of the sources and estimating each of the
sources as special cases. The type or frequency histogram function counts the
number of occurrences of each argument, which yields many important statistics
such as mean, variance, maximum, minimum, median, and so on. The proposed
computation coding first abstracts Gaussian networks into the corresponding
modulo-sum multiple-access channels via nested lattice codes and linear network
coding, and then computes the desired function using linear Slepian-Wolf
source coding. For orthogonal Gaussian networks (with no broadcast or
multiple-access components), the computation capacity is characterized for a
class of networks. For Gaussian networks with multiple-access components (but
no broadcast), an approximate computation capacity is characterized for a class
of networks.
Comment: 30 pages, 12 figures, submitted to IEEE Transactions on Information Theory
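The modulo-sum abstraction admits a toy illustration (the parameters are hypothetical, and real nested-lattice constructions operate on codewords, not raw values): if K sources take values in {0, ..., A-1} and the channel delivers their sum modulo q with q > K*(A-1), the arithmetic sum is recovered with no wrap-around ambiguity:

```python
from itertools import product

K, A = 3, 4          # number of sources, alphabet size (made-up values)
q = K * (A - 1) + 1  # smallest modulus that avoids wrap-around

def modulo_sum_channel(values, q):
    """Abstraction of the modulo-sum multiple-access channel."""
    return sum(values) % q

# With q large enough, the channel output equals the arithmetic sum
# for every possible combination of source values.
for values in product(range(A), repeat=K):
    assert modulo_sum_channel(values, q) == sum(values)
```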
Wireless Network Information Flow: A Deterministic Approach
In a wireless network with a single source and a single destination and an
arbitrary number of relay nodes, what is the maximum rate of information flow
achievable? We make progress on this long-standing problem through a two-step
approach. First we propose a deterministic channel model which captures the key
wireless properties of signal strength, broadcast and superposition. We obtain
an exact characterization of the capacity of a network with nodes connected by
such deterministic channels. This result is a natural generalization of the
celebrated max-flow min-cut theorem for wired networks. Second, we use the
insights obtained from the deterministic analysis to design a new
quantize-map-and-forward scheme for Gaussian networks. In this scheme, each
relay quantizes the received signal at the noise level and maps it to a random
Gaussian codeword for forwarding, and the final destination decodes the
source's message based on the received signal. We show that, in contrast to
existing schemes, this scheme can achieve the cut-set upper bound to within a
gap which is independent of the channel parameters. In the case of the relay
channel with a single relay as well as the two-relay Gaussian diamond network,
the gap is 1 bit/s/Hz. Moreover, the scheme is universal in the sense that the
relays need no knowledge of the values of the channel parameters to
(approximately) achieve the rate supportable by the network. We also present
extensions of the results to multicast networks, half-duplex networks and
ergodic networks.
Comment: To appear in IEEE Transactions on Information Theory, Vol. 57, No. 4, April 2011
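The wired-network baseline that the deterministic result generalizes is the max-flow min-cut theorem. A minimal Edmonds-Karp sketch on a hypothetical two-relay diamond network (node names and capacities are made up for illustration):

```python
from collections import deque

def max_flow(capacity, s, t):
    """Edmonds-Karp max-flow; `capacity` maps node -> {neighbor: capacity}."""
    res = {u: dict(nbrs) for u, nbrs in capacity.items()}  # residual graph
    for u, nbrs in capacity.items():
        for v in nbrs:
            res.setdefault(v, {}).setdefault(u, 0)         # reverse edges
    flow = 0
    while True:
        parent = {s: None}                                 # BFS for a path
        queue = deque([s])
        while queue and t not in parent:
            u = queue.popleft()
            for v, c in res[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            return flow                                    # no augmenting path
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        aug = min(res[u][v] for u, v in path)              # bottleneck capacity
        for u, v in path:
            res[u][v] -= aug
            res[v][u] += aug
        flow += aug

# Hypothetical diamond network: source s, two relays r1/r2, destination t.
caps = {'s': {'r1': 3, 'r2': 2}, 'r1': {'t': 2}, 'r2': {'t': 4}}
```

Here the max flow is 4, matching the min cut {s->r2, r1->t} of capacity 2 + 2; the deterministic-channel result extends this cut characterization to networks with broadcast and superposition.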