Capacity of Sum-networks for Different Message Alphabets
A sum-network is a directed acyclic network in which all terminal nodes
demand the 'sum' of the independent information observed at the source nodes.
Many characteristics of the well-studied multiple-unicast network communication
problem also hold for sum-networks due to a known reduction between instances
of these two problems. Our main result is that, unlike a multiple-unicast
network, the coding capacity of a sum-network depends on the message
alphabet. We demonstrate this using a construction procedure and show that the
choice of a message alphabet can reduce the coding capacity of a sum-network
from 1 to close to 0.
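The sum-network model described above can be illustrated with a toy instance. The topology (three sources, one relay, terminals fed only by the relay) and the alphabet Z_q are illustrative choices for this sketch, not the paper's construction:

```python
import random

# Toy sum-network: three sources, one relay, and terminals fed only by the
# relay. Every terminal demands the sum (over Z_q) of all source messages,
# so the relay forwards the single network-coded symbol (x1 + x2 + x3) mod q,
# where routing the three messages separately would need three edges.

def relay_encode(messages, q):
    """Network code at the relay: send the modular sum of its inputs."""
    return sum(messages) % q

def terminal_decode(relay_symbol):
    """Each terminal demands the sum itself, so decoding is the identity."""
    return relay_symbol

q = 5  # message alphabet Z_5 (illustrative choice)
msgs = [random.randrange(q) for _ in range(3)]
received = terminal_decode(relay_encode(msgs, q))
assert received == sum(msgs) % q  # every terminal recovers the demanded sum
```

The point of the abstract is subtler than this sketch: for some sum-networks, how well such codes can do (the coding capacity) changes with the choice of alphabet.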
Network coding for non-uniform demands
Non-uniform demand networks are defined as a useful connection model, in between multicasts and general connections. In these networks, each sink demands a certain number of messages, without specifying their identities. We study the solvability of such networks and give a tight bound on the number of sinks for which the min-cut condition is sufficient. This sufficiency result is unique to the non-uniform demand model and does not apply to general connection networks. We propose constructions to solve networks at, or slightly below, capacity, and investigate the effect large alphabets have on the solvability of such networks. We also show that our efficient constructions are suboptimal when used in networks with more sinks, yet this comes as little surprise given that the general problem is shown to be NP-hard.
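The min-cut condition referenced above requires each sink demanding d messages to have a source-to-sink min cut of at least d, which can be checked with any max-flow routine. A sketch using BFS-based augmenting paths (Edmonds-Karp); the tiny example network and its node names are hypothetical, chosen only to exercise the check:

```python
from collections import deque

def max_flow(cap, s, t):
    """cap: dict-of-dicts of edge capacities; returns the max s-t flow."""
    flow = 0
    res = {u: dict(vs) for u, vs in cap.items()}   # residual capacities
    for u, vs in cap.items():                      # add reverse residual edges
        for v in vs:
            res.setdefault(v, {}).setdefault(u, 0)
    while True:
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:           # BFS for an augmenting path
            u = queue.popleft()
            for v, c in res[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            return flow
        path, v = [], t                            # recover the path, find bottleneck
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        aug = min(res[u][v] for u, v in path)
        for u, v in path:                          # augment along the path
            res[u][v] -= aug
            res[v][u] += aug
        flow += aug

# Super-source S feeds unit-capacity message edges; sink t1 demands 2 messages.
cap = {'S': {'m1': 1, 'm2': 1, 'm3': 1},
       'm1': {'t1': 1}, 'm2': {'t1': 1}, 'm3': {}}
demand_t1 = 2
assert max_flow(cap, 'S', 't1') >= demand_t1  # min-cut condition holds for t1
```

By max-flow/min-cut duality, the flow value equals the min cut, so this check is exactly the necessary condition the abstract discusses; the paper's contribution is about when it is also sufficient.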
Graph Entropy, Network Coding and Guessing games
We introduce the (private) entropy of a directed graph (in a new network coding sense) as well as a number of related concepts. We show that the entropy of a directed graph is identical to its guessing number and can be bounded from below by the number of vertices minus the size of the graph's shortest index code. We show that the network coding solvability of each specific multiple-unicast network is completely determined by the entropy (as well as by the shortest index code) of the directed graph that occurs by identifying each source node with its corresponding target node. Shannon's information inequalities can be used to calculate upper bounds on a graph's entropy as well as to calculate the size of the minimal index code. Recently, a number of new families of so-called non-Shannon-type information inequalities have been discovered. It has been shown that there exist communication networks with a capacity strictly less than required for solvability, but where this fact cannot be derived using Shannon's classical information inequalities. Based on this result we show that there exist graphs with an entropy that cannot be calculated using only Shannon's classical information inequalities, and that better estimates can be obtained by use of certain non-Shannon-type information inequalities.
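For very small graphs, the guessing number mentioned above can be computed by brute force: each vertex guesses its own value as a function of its in-neighbours' values, and the guessing number is log_s of the maximum number of assignments on which every vertex guesses correctly. A sketch, using the directed 3-cycle as an example (for which the guessing number is known to be 1); the function names are ours:

```python
from itertools import product
from math import log

def guessing_number(n, in_nbrs, s=2):
    """Brute-force guessing number over alphabet {0,...,s-1}.

    A strategy for vertex v is a lookup table over the s^deg(v) valuations
    of its in-neighbours. Exponential search -- toy graphs only.
    """
    tables = [list(product(range(s), repeat=s ** len(in_nbrs[v])))
              for v in range(n)]

    def idx(vals):  # index of an in-neighbour valuation in a table
        i = 0
        for x in vals:
            i = i * s + x
        return i

    best = 0
    for strat in product(*tables):                 # all strategy tuples
        fixed = 0
        for x in product(range(s), repeat=n):      # all assignments
            if all(strat[v][idx(tuple(x[u] for u in in_nbrs[v]))] == x[v]
                   for v in range(n)):
                fixed += 1                         # everyone guessed right
        best = max(best, fixed)
    return log(best, s)

# Directed 3-cycle 0 -> 1 -> 2 -> 0: each vertex sees only its predecessor.
g = guessing_number(3, in_nbrs=[[2], [0], [1]], s=2)
assert abs(g - 1.0) < 1e-9  # a directed cycle has guessing number 1
```

On the 3-cycle the best strategies chain the guesses around the cycle, so the correct-for-all assignments are exactly the fixed points of the composed maps, of which there are at most s.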
On Discrete Alphabets for the Two-user Gaussian Interference Channel with One Receiver Lacking Knowledge of the Interfering Codebook
In multi-user information theory it is often assumed that every node in the
network possesses all codebooks used in the network. This assumption is however
impractical in distributed ad-hoc and cognitive networks. This work considers
the two-user Gaussian Interference Channel with one Oblivious Receiver
(G-IC-OR), i.e., one receiver lacks knowledge of the interfering codebook while
the other receiver knows both codebooks. We ask whether, and if so by how much,
the channel capacity of the G-IC-OR is reduced compared to that of the
classical G-IC where both receivers know all codebooks. Intuitively, the
oblivious receiver should not be able to jointly decode its intended message
along with the unintended interfering message whose codebook is unavailable. We
demonstrate that in strong and very strong interference, where joint decoding
is capacity achieving for the classical G-IC, lack of codebook knowledge does
not reduce performance in terms of generalized degrees of freedom (gDoF).
Moreover, we show that the sum-capacity of the symmetric G-IC-OR is to within
O(log(log(SNR))) of that of the classical G-IC. The key novelty of the proposed
achievable scheme is the use of a discrete input alphabet for the non-oblivious
transmitter, whose cardinality is appropriately chosen as a function of SNR.
- …
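The key ingredient of that last abstract, a discrete input whose cardinality grows with SNR, can be sketched numerically: the mutual information I(X;Y) of a uniform PAM input over a unit-variance Gaussian noise channel, computed by integrating the output density. The sizing rule N = ceil(sqrt(1 + SNR)) used here is a common heuristic for this sketch, not the paper's exact choice:

```python
import math

def pam_mutual_information(snr, n_points):
    """I(X;Y) in bits for Y = X + Z, Z ~ N(0,1), X uniform over an
    n_points-PAM constellation scaled to average power `snr`."""
    d = math.sqrt(12.0 * snr / (n_points ** 2 - 1))       # point spacing
    xs = [d * (i - (n_points - 1) / 2.0) for i in range(n_points)]
    phi = lambda t: math.exp(-t * t / 2.0) / math.sqrt(2.0 * math.pi)
    lo, hi, step = xs[0] - 8.0, xs[-1] + 8.0, 0.01
    h_y, y = 0.0, lo                                      # differential entropy of Y
    while y <= hi:
        p = sum(phi(y - x) for x in xs) / n_points        # Gaussian-mixture density
        if p > 0:
            h_y -= p * math.log(p) * step
        y += step
    h_y_given_x = 0.5 * math.log(2.0 * math.pi * math.e)  # entropy of the noise
    return (h_y - h_y_given_x) / math.log(2.0)

snr = 100.0                                               # 20 dB
n = math.ceil(math.sqrt(1.0 + snr))                       # 11 PAM points
i_bits = pam_mutual_information(snr, n)
capacity = 0.5 * math.log2(1.0 + snr)                     # Gaussian-input capacity
assert 0.0 < i_bits < capacity                            # discrete input stays below C
```

At 20 dB this lands within a fraction of a bit of the Gaussian-input capacity, consistent with the abstract's claim that a well-sized discrete alphabet loses little.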