
    Quantum network communication -- the butterfly and beyond

    We study the k-pair communication problem for quantum information in networks of quantum channels. We consider the asymptotic rates of high-fidelity quantum communication between specific sender-receiver pairs. Four scenarios of classical communication assistance (none, forward, backward, and two-way) are considered. (i) We obtain outer and inner bounds of the achievable rate regions in the most general directed networks. (ii) For two particular networks (including the butterfly network), routing is proved optimal, and the free assisting classical communication can at best be used to modify the directions of quantum channels in the network. Consequently, the achievable rate regions are given by counting edge-avoiding paths, and precise achievable rate regions in all four assisting scenarios can be obtained. (iii) Optimality of routing can also be proved in classes of networks. The first class consists of directed unassisted networks in which (1) the receivers are information sinks, (2) the maximum distance from senders to receivers is small, and (3) a certain type of 4-cycle is absent, but without further constraints (such as on the number of communicating and intermediate parties). The second class consists of arbitrary backward-assisted networks with 2 sender-receiver pairs. (iv) Beyond the k-pair communication problem, observations are made on quantum multicasting and a static version of network communication related to the entanglement of assistance. Comment: 15 pages, 17 figures. Final version.
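    A small illustration of the routing picture described above: in a unit-capacity drawing of the butterfly network, the rate achievable by routing for each sender-receiver pair can be probed by counting edge-disjoint paths (here via max-flow). The node labels, edge orientation, and demand pairs below are assumptions made for this sketch, not taken from the paper, and per-pair max-flow alone does not capture the joint rate region, since both flows contend for the same bottleneck edge u -> v.

        # Illustrative sketch only: one common drawing of the butterfly network with
        # unit-capacity edges; the demands cross (s1 -> t2 and s2 -> t1).
        import networkx as nx

        G = nx.DiGraph()
        edges = [("s1", "u"), ("s2", "u"), ("u", "v"),   # u -> v is the shared bottleneck
                 ("v", "t1"), ("v", "t2"),
                 ("s1", "t1"), ("s2", "t2")]             # direct side edges
        G.add_edges_from(edges, capacity=1)              # each channel treated as unit capacity

        for src, dst in [("s1", "t2"), ("s2", "t1")]:
            paths = nx.maximum_flow_value(G, src, dst)   # = number of edge-disjoint paths
            print(f"edge-disjoint paths {src} -> {dst}: {paths}")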

    Capacity of wireless erasure networks

    In this paper, a special class of wireless networks, called wireless erasure networks, is considered. In these networks, each node is connected to a set of nodes by possibly correlated erasure channels. The network model incorporates the broadcast nature of the wireless environment by requiring each node to send the same signal on all outgoing channels. However, we assume there is no interference in reception. Such models are therefore appropriate for wireless networks where all information transmission is packetized and where some mechanism for interference avoidance is already built in. This paper looks at multicast problems over these networks. The capacity is obtained under the assumption that the erasure locations on all links of the network are provided to the destinations. It turns out that the capacity region has a nice max-flow min-cut interpretation. The definition of cut capacity in these networks incorporates the broadcast property of the wireless medium. It is further shown that linear coding at the nodes of the network suffices to achieve the capacity region. Finally, the performance of different coding schemes in these networks when no side information is available to the destinations is analyzed.
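    As a concrete reading of the max-flow min-cut interpretation above (under the additional assumption of independent erasures, which is narrower than the possibly correlated case treated in the paper), the sketch below brute-forces the minimum cut of a tiny made-up network. Each source-side node with links crossing the cut contributes the probability that at least one of those links is not erased, reflecting the broadcast constraint that the node sends the same signal on all outgoing channels.

        # Illustrative sketch with invented numbers, not code from the paper.
        from itertools import combinations
        import math

        # erasure probabilities eps[(i, j)] for each directed link i -> j
        eps = {("s", "a"): 0.2, ("s", "b"): 0.5, ("a", "b"): 0.4,
               ("a", "d"): 0.1, ("b", "d"): 0.3}
        intermediate = ["a", "b"]                        # source "s", destination "d"

        def cut_capacity(source_side):
            # Broadcast cut value: each source-side node contributes the probability
            # that at least one of its links crossing the cut survives (independence assumed).
            total = 0.0
            for i in source_side:
                crossing = [p for (u, v), p in eps.items() if u == i and v not in source_side]
                if crossing:
                    total += 1.0 - math.prod(crossing)
            return total

        cuts = (set(side) | {"s"} for r in range(len(intermediate) + 1)
                for side in combinations(intermediate, r))
        print("min-cut capacity:", round(min(cut_capacity(c) for c in cuts), 4))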

    A Cut Principle for Information Flow

    We view a distributed system as a graph of active locations with unidirectional channels between them, through which they pass messages. In this context, the graph structure of a system constrains the propagation of information through it. Suppose a set of channels is a cut set between an information source and a potential sink. We prove that, if there is no disclosure from the source to the cut set, then there can be no disclosure to the sink. We introduce a new formalization of partial disclosure, called blur operators, and show that the same cut property is preserved for disclosure to within a blur operator. This cut-blur property also implies a compositional principle, which ensures limited disclosure for a class of systems that differ only beyond the cut. Comment: 31 pages.
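    The cut-set hypothesis in the statement above is purely graph-theoretic, so it can be checked mechanically. The sketch below (an invented toy graph, not the paper's formal model) verifies that a candidate set of channels really does separate a source location from a sink, i.e. that removing those channels leaves no directed path between them.

        # Minimal sketch: is the candidate channel set a cut between "src" and "sink"?
        import networkx as nx

        G = nx.DiGraph([("src", "m1"), ("m1", "m2"), ("m2", "sink"),
                        ("src", "m3"), ("m3", "m2")])
        cut = {("m1", "m2"), ("m3", "m2")}        # candidate cut set of channels

        H = G.copy()
        H.remove_edges_from(cut)
        is_cut = not nx.has_path(H, "src", "sink")
        print("channels form a cut between src and sink:", is_cut)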

    Normalized Entropy Vectors, Network Information Theory and Convex Optimization

    We introduce the notion of normalized entropic vectors -- slightly different from the standard definition in the literature in that we normalize entropy by the logarithm of the alphabet size. We argue that this definition is more natural for determining the capacity region of networks and, in particular, that it smooths out the irregularities of the space of non-normalized entropy vectors and renders the closure of the resulting space convex (and compact). Furthermore, the closure of the space remains convex even under constraints imposed by memoryless channels internal to the network. It therefore follows that, for a large class of acyclic memoryless networks, the capacity region for an arbitrary set of sources and destinations can be found by maximization of a linear function over the convex set of channel-constrained normalized entropic vectors and some linear constraints. While this may not necessarily make the problem simpler, it certainly circumvents the "infinite-letter characterization" issue, as well as the nonconvexity of earlier formulations, and exposes the core of the problem. We show that the approach allows one to obtain the classical cut-set bounds via a duality argument. Furthermore, the approach readily shows that, for acyclic memoryless wired networks, one need only consider the space of unconstrained normalized entropic vectors, thus separating channel and network coding -- a result very recently recognized in the literature.
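    A minimal numerical example of the normalization being described (my reading of the abstract, with an invented joint distribution): for two correlated binary variables, collect the joint entropies of every nonempty subset of the variables and divide each by the logarithm of the alphabet size.

        # Illustrative sketch only; the joint pmf is made up.
        import math

        alphabet = [0, 1]
        # joint pmf p[(x1, x2)] of two correlated binary variables
        p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

        def H(subset):
            # Shannon entropy (bits) of the marginal on the given coordinate subset.
            marg = {}
            for outcome, prob in p.items():
                key = tuple(outcome[i] for i in subset)
                marg[key] = marg.get(key, 0.0) + prob
            return -sum(q * math.log2(q) for q in marg.values() if q > 0)

        log_alphabet = math.log2(len(alphabet))
        normalized = {s: H(s) / log_alphabet for s in [(0,), (1,), (0, 1)]}
        print(normalized)   # {(0,): 1.0, (1,): 1.0, (0, 1): ~1.722}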

    Nested Lattice Codes for Gaussian Relay Networks with Interference

    In this paper, a class of relay networks is considered. We assume that, at a node, outgoing channels to its neighbors are orthogonal, while incoming signals from neighbors can interfere with each other. We are interested in the multicast capacity of these networks. As a subclass, we first focus on Gaussian relay networks with interference and find an achievable rate using a lattice coding scheme. It is shown that there is a constant gap between our achievable rate and the information-theoretic cut-set bound. This is similar to the recent result by Avestimehr, Diggavi, and Tse, who showed such an approximate characterization of the capacity of general Gaussian relay networks. However, our achievability uses a structured code instead of a random one. Using the same idea as in the Gaussian case, we also consider linear finite-field symmetric networks with interference and characterize the capacity using a linear coding scheme. Comment: 23 pages, 5 figures, submitted to IEEE Transactions on Information Theory.
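    To illustrate why a structured code can help in the presence of interference, here is a toy one-dimensional nested-lattice sketch (an assumption-laden simplification, not the paper's scheme): the fine lattice of integers is nested in the coarse lattice qZ, and the modulo-q sum of two codewords is again a codeword, so a relay observing a superposition can decode a function of the messages rather than each message separately.

        # Toy sketch with made-up parameters; noise-free for clarity.
        q = 8                                   # nesting ratio (coarse lattice = q * Z)
        codebook = list(range(q))               # fine-lattice points inside [0, q)

        def mod_lattice(x, q=q):
            # Reduce x modulo the coarse lattice q*Z into the fundamental region [0, q).
            return x % q

        c1, c2 = 3, 6                           # two transmitted codewords
        superposition = c1 + c2                 # what a relay hears in this toy model
        decoded_sum = mod_lattice(superposition)
        print(decoded_sum, decoded_sum in codebook)   # 1 True -> the sum is still a codeword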