
    On Approximating the Sum-Rate for Multiple-Unicasts

    We study upper bounds on the sum-rate of multiple-unicasts. We approximate the Generalized Network Sharing Bound (GNS cut) of the multiple-unicasts network coding problem with $k$ independent sources. Our approximation algorithm runs in polynomial time and yields an upper bound on the joint source entropy rate, which is within an $O(\log^2 k)$ factor from the GNS cut. It further yields a vector-linear network code that achieves joint source entropy rate within an $O(\log^2 k)$ factor from the GNS cut, but \emph{not} with independent sources: the code induces a correlation pattern among the sources. Our second contribution is establishing a separation result for vector-linear network codes: for any given field $\mathbb{F}$ there exist networks for which the optimum sum-rate supported by vector-linear codes over $\mathbb{F}$ for independent sources can be multiplicatively separated by a factor of $k^{1-\delta}$, for any constant $\delta>0$, from the optimum joint entropy rate supported by a code that allows correlation between sources. Finally, we establish a similar separation result for the asymmetric optimum vector-linear sum-rates achieved over two distinct fields $\mathbb{F}_{p}$ and $\mathbb{F}_{q}$ for independent sources, revealing that the choice of field can heavily impact the performance of a linear network code.
    Comment: 10 pages; a shorter version appeared at ISIT (International Symposium on Information Theory) 2015; some typos corrected.
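    As a hedged reading of the two main claims in symbols (the names $\mathrm{GNS}$, $R_{\mathrm{code}}$, $R^{\mathrm{lin}}_{\mathbb{F},\mathrm{ind}}$, and $R_{\mathrm{corr}}$ are introduced here only for illustration and are not the paper's notation): writing $\mathrm{GNS}$ for the GNS-cut value, the constructed vector-linear code with correlated sources achieves a joint source entropy rate $R_{\mathrm{code}}$ with
    \[ R_{\mathrm{code}} \;\ge\; \frac{\mathrm{GNS}}{O(\log^2 k)}, \]
    while the separation result says that for every field $\mathbb{F}$ and every constant $\delta>0$ there are $k$-unicast networks on which
    \[ R^{\mathrm{lin}}_{\mathbb{F},\mathrm{ind}} \;\le\; \frac{R_{\mathrm{corr}}}{k^{1-\delta}}, \]
    where $R^{\mathrm{lin}}_{\mathbb{F},\mathrm{ind}}$ denotes the optimum vector-linear sum-rate over $\mathbb{F}$ with independent sources and $R_{\mathrm{corr}}$ the optimum joint entropy rate when correlation among the sources is allowed.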

    A study of some problems in network information theory

    Shannon theory has been very successful in studying fundamental limits of communication in the classical setting, where one sender wishes to communicate a message to one receiver over an unreliable medium. The theory has also been successful in studying networks of small to moderate sizes, with multiple senders and multiple receivers. However, it has become well known recently that understanding the fundamental limits of communication in a general network is a hard problem on numerous accounts. In this dissertation, we suggest that a significant aspect of the difficulty in studying limits of communication over networks lies in the unidirectional aspect of the problem. Under different assumptions that rid the problem of this particular aspect by introducing a suitable symmetry, either in the underlying network or in the traffic model, we find that simple schemes are approximately optimal in achieving these fundamental limits. We demonstrate this as a meta-theorem in the class of wireline networks and Gaussian networks. The key contribution driving these results is a new outer bound that we call the Generalized Network Sharing bound. We also study a problem of simulation of joint distributions in a non-interactive setup. Two agents observe correlated random variables and wish to simulate a certain joint distribution. We consider a non-asymptotic formulation of this problem and study tools that help prove impossibility results. We also study connections of this problem to existing problems in the literature.
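    A minimal sketch of the non-interactive simulation setup described above, in notation that is ours rather than the dissertation's: the two agents observe correlated random variables $X$ and $Y$ with a known joint law $p_{XY}$ and, without communicating (possibly using private randomness, depending on the exact formulation), produce outputs $U = f(X)$ and $V = g(Y)$; the question is whether
    \[ (U, V) = \bigl(f(X),\, g(Y)\bigr) \]
    can be made to follow a prescribed target distribution $q_{UV}$, exactly or approximately, in a non-asymptotic (one-shot) sense, and the tools studied serve to prove that certain targets $q_{UV}$ cannot be simulated.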