
    Generalized Cut-Set Bounds for Broadcast Networks

    A broadcast network is a classical network with all source messages collocated at a single source node. For broadcast networks, the standard cut-set bounds, which are known to be loose in general, are closely related to the union as a specific set operation used to combine the basic cuts of the network. This paper provides a new set of network coding bounds for general broadcast networks. These bounds combine the basic cuts of the network via a variety of set operations (not just the union) and are established using only the submodularity of Shannon entropy. The tightness of these bounds is demonstrated via applications to combination networks. Comment: 30 pages, 4 figures, submitted to the IEEE Transactions on Information Theory.
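    As background (a standard fact rather than a claim specific to this paper), the submodularity of Shannon entropy invoked above is the inequality

        H(X_A) + H(X_B) \ge H(X_{A \cup B}) + H(X_{A \cap B})

    valid for any two subsets A and B of the network's random variables. Applying it with A and B taken to be basic cuts is what permits combining cuts through intersections as well as unions, instead of through the union alone.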

    On the Fundamental Limits and Symmetric Designs for Distributed Information Systems

    Many multi-terminal communication networks, content delivery networks, cache networks, and distributed storage systems can be modeled as broadcast networks. An explicit characterization of the capacity region of the general network coding problem is one of the best-known open problems in network information theory. A simple set of bounds often used in the literature to show that certain rate tuples are infeasible is based on the graph-theoretic notion of a cut. The standard cut-set bounds, however, are known to be loose in general when there are multiple messages to be communicated in the network. This dissertation focuses on broadcast networks, for which the standard cut-set bounds are closely related to the union as a specific set operation for combining different simple cuts of the network. A new set of explicit network coding bounds, which combine different simple cuts of the network via a variety of set operations (not just the union), is established via their connections to extremal inequalities for submodular functions. The tightness of these bounds is demonstrated via applications to combination networks.

    The tightness of the generalized cut-set bounds is further explored by studying the “latency capacity region” of a broadcast channel. An implicit characterization of this region was proved by Tian, where a rate-splitting-based scheme was shown to be optimal; however, an explicit characterization was previously available only when the number of receivers is less than three. In this dissertation, a precise polyhedral description of this region is established for a symmetric broadcast channel with a complete message set and an arbitrary number of users. It is shown that a set of generalized cut-set bounds characterizes the entire symmetric multicast region. The achievability part is proved by showing that every maximum rate vector is feasible under a successive encoding scheme. The achievability framework relies strongly on polyhedral combinatorics and can be useful in network information theory problems whenever a polyhedral description of a region is needed.

    Moreover, there is a known direct relationship between network coding solutions and the characterization of the entropy region. This dissertation also studies symmetric structures in network coding problems and their relation to symmetrical projections of the entropy region, and introduces new aspects of entropy inequalities. First, inequalities relating average joint entropies, rather than entropies over individual subsets, are studied. Second, the existence of non-Shannon-type inequalities under partial symmetry is studied using the concepts of Shannon and non-Shannon groups. Finally, motivated by the relationship between linear entropic vectors and the representability of integer polymatroids, the construction of such vectors is discussed. Specifically, it is shown that representability of a particularly constructed matroid is a sufficient condition for an integer polymatroid to be linearly representable over the real numbers. Furthermore, it is shown that any real-valued submodular function (such as Shannon entropy) can be approximated arbitrarily closely by an integer polymatroid.
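    As a concrete (and purely illustrative, not dissertation-specific) rendering of the polymatroid terminology above, the sketch below checks whether an integer-valued set function, given as a table over all subsets of a ground set, satisfies the three polymatroid axioms: normalization, monotonicity, and submodularity. The example function is the rank function of the uniform matroid U_{2,3}; Shannon entropy, viewed as a set function over subsets of random variables, satisfies the same three axioms with real values.

    from itertools import combinations

    def is_integer_polymatroid(f, ground_set):
        """Check the polymatroid axioms for a set function f.

        f maps frozensets of elements of ground_set to non-negative integers.
        """
        subsets = [frozenset(c) for r in range(len(ground_set) + 1)
                   for c in combinations(ground_set, r)]
        if f[frozenset()] != 0:                        # normalization: f(empty set) = 0
            return False
        for a in subsets:
            for b in subsets:
                if a <= b and f[a] > f[b]:             # monotonicity: A subset of B => f(A) <= f(B)
                    return False
                if f[a] + f[b] < f[a | b] + f[a & b]:  # submodularity
                    return False
        return True

    # Illustrative example: the rank function of the uniform matroid U_{2,3}.
    ground = {1, 2, 3}
    rank = {frozenset(s): min(2, len(s))
            for r in range(len(ground) + 1) for s in combinations(ground, r)}
    print(is_integer_polymatroid(rank, ground))  # prints True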

    On Approximating the Sum-Rate for Multiple-Unicasts

    We study upper bounds on the sum-rate of multiple-unicasts. We approximate the Generalized Network Sharing Bound (GNS cut) of the multiple-unicasts network coding problem with $k$ independent sources. Our approximation algorithm runs in polynomial time and yields an upper bound on the joint source entropy rate, which is within an $O(\log^2 k)$ factor from the GNS cut. It further yields a vector-linear network code that achieves joint source entropy rate within an $O(\log^2 k)$ factor from the GNS cut, but \emph{not} with independent sources: the code induces a correlation pattern among the sources. Our second contribution is establishing a separation result for vector-linear network codes: for any given field $\mathbb{F}$ there exist networks for which the optimum sum-rate supported by vector-linear codes over $\mathbb{F}$ for independent sources can be multiplicatively separated by a factor of $k^{1-\delta}$, for any constant $\delta > 0$, from the optimum joint entropy rate supported by a code that allows correlation between sources. Finally, we establish a similar separation result for the asymmetric optimum vector-linear sum-rates achieved over two distinct fields $\mathbb{F}_{p}$ and $\mathbb{F}_{q}$ for independent sources, revealing that the choice of field can heavily impact the performance of a linear network code. Comment: 10 pages; shorter version appeared at ISIT (International Symposium on Information Theory) 2015; some typos corrected.
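    The field-dependence in the last result can be previewed in a much smaller setting than the networks constructed in the paper: the same integer matrix can have different ranks over different prime fields, so linear operations built from it behave differently over $\mathbb{F}_2$ than over $\mathbb{F}_3$. The sketch below (an illustration only, not the paper's construction) computes rank over $\mathbb{F}_p$ by Gaussian elimination; the all-ones-minus-identity 3x3 matrix has rank 2 over $\mathbb{F}_2$ but rank 3 over $\mathbb{F}_3$.

    def rank_mod_p(matrix, p):
        """Rank of an integer matrix over the prime field F_p, via Gaussian elimination."""
        m = [[x % p for x in row] for row in matrix]
        rows, cols = len(m), len(m[0])
        rank, pivot_row = 0, 0
        for col in range(cols):
            # Find a row at or below pivot_row with a nonzero entry in this column.
            pivot = next((r for r in range(pivot_row, rows) if m[r][col] != 0), None)
            if pivot is None:
                continue
            m[pivot_row], m[pivot] = m[pivot], m[pivot_row]
            inv = pow(m[pivot_row][col], p - 2, p)  # multiplicative inverse mod p (p prime)
            m[pivot_row] = [(x * inv) % p for x in m[pivot_row]]
            for r in range(rows):
                if r != pivot_row and m[r][col] != 0:
                    factor = m[r][col]
                    m[r] = [(a - factor * b) % p for a, b in zip(m[r], m[pivot_row])]
            pivot_row += 1
            rank += 1
        return rank

    A = [[0, 1, 1],
         [1, 0, 1],
         [1, 1, 0]]
    print(rank_mod_p(A, 2), rank_mod_p(A, 3))  # prints: 2 3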

    The Approximate Capacity of the Gaussian N-Relay Diamond Network

    We consider the Gaussian "diamond" or parallel relay network, in which a source node transmits a message to a destination node with the help of N relays. Even for the symmetric setting, in which the channel gains to the relays are identical and the channel gains from the relays are identical, the capacity of this channel is unknown in general. The best known capacity approximation is up to an additive gap of order N bits and up to a multiplicative gap of order N^2, with both gaps independent of the channel gains. In this paper, we approximate the capacity of the symmetric Gaussian N-relay diamond network up to an additive gap of 1.8 bits and up to a multiplicative gap of a factor of 14. Both gaps are independent of the channel gains and, unlike the best previously known result, are also independent of the number of relays N in the network. Achievability is based on bursty amplify-and-forward, showing that this simple scheme is uniformly approximately optimal, both in the low-rate and in the high-rate regimes. The upper bound on capacity is based on a careful evaluation of the cut-set bound. We also present approximation results for the asymmetric Gaussian N-relay diamond network. In particular, we show that bursty amplify-and-forward combined with optimal relay selection achieves a rate within a factor O(log^4(N)) of capacity, with the pre-constant in the order notation independent of the channel gains. Comment: 23 pages, to appear in IEEE Transactions on Information Theory.
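    For orientation, the sketch below evaluates a crude relaxation of the cut-set bound for the symmetric N-relay diamond network: each cut that places k relays on the source side is bounded by a SIMO term (source to the remaining N-k relays) plus a fully cooperative MISO term (those k relays to the destination). It assumes unit noise power at every receiver, a per-node power constraint P, and identical real gains h (source-to-relay) and g (relay-to-destination); it is only a loose back-of-the-envelope relaxation, not the careful cut-set evaluation carried out in the paper.

    import math

    def diamond_cutset_relaxation(N, P, h, g):
        """Loose cut-set relaxation for the symmetric Gaussian N-relay diamond network.

        For each k = 0..N (number of relays grouped with the source), the cut value
        is bounded by a SIMO bound to the other N-k relays plus a fully cooperative
        MISO bound from the k relays to the destination; the minimum over k is an
        upper bound on capacity.
        """
        def c_awgn(snr):
            return 0.5 * math.log2(1.0 + snr)

        best = float("inf")
        for k in range(N + 1):
            simo = c_awgn((N - k) * P * h ** 2)   # source -> remaining N-k relays
            miso = c_awgn((k ** 2) * P * g ** 2)  # k relays -> destination, full cooperation
            best = min(best, simo + miso)
        return best

    # Hypothetical numbers, purely for illustration.
    print(diamond_cutset_relaxation(N=4, P=1.0, h=1.0, g=1.0))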

    Capacity of wireless erasure networks

    In this paper, a special class of wireless networks, called wireless erasure networks, is considered. In these networks, each node is connected to a set of nodes by possibly correlated erasure channels. The network model incorporates the broadcast nature of the wireless environment by requiring each node to send the same signal on all outgoing channels. However, we assume there is no interference in reception. Such models are therefore appropriate for wireless networks where all information transmission is packetized and where some mechanism for interference avoidance is already built in. This paper looks at multicast problems over these networks. The capacity is obtained under the assumption that erasure locations on all links of the network are provided to the destinations. It turns out that the capacity region has a nice max-flow min-cut interpretation, where the definition of cut-capacity incorporates the broadcast property of the wireless medium. It is further shown that linear coding at the nodes of the network suffices to achieve the capacity region. Finally, the performance of different coding schemes in these networks when no side information is available to the destinations is analyzed.
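    A toy rendering of the max-flow min-cut interpretation mentioned above, in the special case of erasures that are independent across links (the paper also allows correlated erasures): the contribution of a node on the source side of a cut is the probability that at least one of its outgoing links crossing the cut is not erased, reflecting that the node broadcasts the same signal on all outgoing links. The graph, erasure probabilities, and function names below are hypothetical.

    from itertools import combinations

    def min_cut_value(nodes, source, dest, erasure):
        """Brute-force min-cut for a wireless erasure network with independent erasures.

        erasure[(i, j)] is the erasure probability of link i -> j.  A node i on the
        source side of a cut contributes 1 - prod(erasure probs of its links crossing
        the cut): the probability that at least one crossing link delivers its packet.
        """
        others = [v for v in nodes if v not in (source, dest)]
        best = float("inf")
        # Enumerate every partition with the source on one side and the destination on the other.
        for r in range(len(others) + 1):
            for extra in combinations(others, r):
                s_side = {source, *extra}
                value = 0.0
                for i in s_side:
                    crossing = [erasure[(a, j)] for (a, j) in erasure if a == i and j not in s_side]
                    if crossing:
                        prod = 1.0
                        for e in crossing:
                            prod *= e
                        value += 1.0 - prod
                best = min(best, value)
        return best

    # Hypothetical 4-node example: source s, relays a and b, destination d.
    links = {("s", "a"): 0.2, ("s", "b"): 0.3, ("a", "d"): 0.1, ("b", "d"): 0.4}
    print(min_cut_value({"s", "a", "b", "d"}, "s", "d", links))  # prints 0.94 (the broadcast cut at s)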