    Strong Coordination over Multi-hop Line Networks

    We analyze the problem of strong coordination over a multi-hop line network in which the node initiating the coordination is a terminal node of the network. We assume that each node has access to a certain amount of randomness local to that node and that the nodes also share some common randomness, both of which are used, together with explicit hop-by-hop communication, to achieve strong coordination. We derive the trade-offs among the required rates of communication on the network links, the rates of local randomness available to the network nodes, and the rate of common randomness needed to realize strong coordination. We present an achievable coding scheme built from multiple layers of channel resolvability codes and establish several settings in which this scheme is proven to offer the best possible trade-offs.
    Comment: 35 pages, 9 figures, 4 tables. Part of this work was published in the 2015 IEEE Information Theory Workshop, and part was accepted for publication in the 50th Annual Conference on Information Sciences and Systems.
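
    For reference, strong coordination in this setting is usually formalized by requiring the joint distribution that the scheme induces on the nodes' action sequences to be asymptotically indistinguishable, in total variation, from the target i.i.d. distribution. A generic statement for an m-node line network, with notation assumed here rather than taken from the paper, is

        \[ \lim_{n\to\infty} \Big\| P_{X_1^n X_2^n \cdots X_m^n} - \prod_{t=1}^{n} q_{X_1 X_2 \cdots X_m} \Big\|_{\mathrm{TV}} = 0, \]

    where q_{X_1 ... X_m} is the prescribed joint distribution of the actions and n is the block length.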

    Strong Coordination over Noisy Channels: Is Separation Sufficient?

    We study the problem of strong coordination of the actions of two agents, X and Y, that communicate over a noisy channel so that their actions follow a given joint probability distribution. We propose two novel schemes for this noisy strong coordination problem and derive inner bounds for the underlying strong coordination capacity region. The first scheme is a joint coordination-channel coding scheme that utilizes the randomness provided by the communication channel to reduce the local randomness required to generate the action sequence at agent Y. The second scheme exploits separate coordination and channel coding, where local randomness is extracted from the channel after decoding. Finally, we present an example in which the joint scheme outperforms the separate scheme in terms of coordination rate.
    Comment: 9 pages, 4 figures. An extended version of a paper accepted for the IEEE International Symposium on Information Theory (ISIT), 201
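
    A rough intuition for the first scheme, stated here as standard background rather than as a result from the paper: a discrete memoryless channel P_{B|A} injects randomness at a per-use rate given by the conditional entropy of its output given its input,

        \[ H(B \mid A) = -\sum_{a,b} p(a)\, P_{B|A}(b \mid a) \log P_{B|A}(b \mid a), \]

    and a joint coordination-channel code can harvest part of this randomness in place of local randomness at agent Y, whereas a separation-based scheme can only extract it from the channel output after decoding.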

    Empirical and Strong Coordination via Soft Covering with Polar Codes

    We design polar codes for empirical coordination and strong coordination in two-node networks. Our constructions hinge on the fact that polar codes enable explicit low-complexity schemes for soft covering. We leverage this property to propose explicit and low-complexity coding schemes that achieve the capacity regions of both empirical coordination and strong coordination for sequences of actions taking values in an alphabet of prime cardinality. Our results improve on previously known polar coding schemes, which (i) were restricted to uniform distributions and to actions obtained via binary symmetric channels for strong coordination, (ii) required a non-negligible amount of common randomness for empirical coordination, and (iii) assumed that the simulation of discrete memoryless channels could be perfectly implemented. As a by-product of our results, we obtain a polar coding scheme that achieves channel resolvability for an arbitrary discrete memoryless channel whose input alphabet has prime cardinality.
    Comment: 14 pages, two-column, 5 figures, accepted to IEEE Transactions on Information Theory.
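
    For context, channel resolvability as used above is typically stated as follows (standard formulation; notation assumed): given a discrete memoryless channel P_{Y|X} and an input distribution q_X with corresponding output distribution q_Y, a sequence of codes of rate R driven by a uniform message achieves resolvability if the induced output distribution satisfies

        \[ D\Big( P_{Y^n} \,\Big\|\, \prod_{t=1}^{n} q_Y \Big) \longrightarrow 0 \quad \text{as } n \to \infty, \]

    which is achievable whenever R > I(X;Y); soft covering refers to the same phenomenon, often measured in total variation instead of divergence.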

    Source-channel coding for coordination over a noisy two-node network

    Recently, the concept of coordinating actions between distributed agents has emerged in the information theory literature. It was first introduced by Cuff in 2008 for the point-to-point case. However, Cuff’s work and the vast majority of the follow-up research are based on establishing coordination over noise-free communication links. In contrast, this thesis investigates the open problem of coordination over noisy point-to-point links. The aim of this study is to examine Shannon’s source-channel separation theorem in the context of coordination. To that end, a general joint scheme for achieving the strong notion of coordination over a discrete memoryless channel is introduced. The strong coordination notion requires that the L1 distance between the induced joint distribution of the action sequences selected by the nodes and a prescribed joint distribution vanish exponentially fast with the sequence block length. From the general joint scheme, three special cases are constructed, one of which resembles Shannon’s separation scheme. Surprisingly, the proposed joint scheme can outperform a strictly separate scheme. Finally, the thesis provides simulation results that support this conclusion by comparing the achievable rate regions of the separation-like scheme and a special case of the general joint scheme.
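
    As a minimal illustration of the coordination metric used in the thesis, the hypothetical sketch below (not taken from the thesis; the distributions are arbitrary) computes the L1 distance between an induced joint distribution of two binary actions and a prescribed target. Strong coordination requires that this quantity, evaluated on the n-letter distributions produced by a scheme, vanish exponentially in the block length.

        # Hypothetical sketch: L1 distance between an induced joint pmf and a
        # prescribed target pmf for two binary actions X and Y. In the thesis
        # setting this would be evaluated on n-letter distributions produced by
        # a coordination scheme; here the induced pmf is just a perturbed copy
        # of the target.
        import numpy as np

        def l1_distance(p, q):
            """Sum of absolute differences between two pmfs of equal shape."""
            return float(np.abs(p - q).sum())

        q_xy = np.array([[0.40, 0.10],   # prescribed joint distribution q_{XY}
                         [0.10, 0.40]])
        p_xy = np.array([[0.42, 0.08],   # induced joint distribution (example)
                         [0.08, 0.42]])

        print(l1_distance(p_xy, q_xy))   # 0.08; strong coordination needs this -> 0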

    Secure Cascade Channel Synthesis

    We consider the problem of generating correlated random variables in a distributed fashion, where communication is constrained to a cascade network. The first node in the cascade observes an i.i.d. sequence X^n locally before initiating communication along the cascade. All nodes share bits of common randomness that are independent of X^n. We consider secure synthesis: the random variables produced by the system must appear appropriately correlated and i.i.d. even to an eavesdropper who is cognizant of the communication transmissions. We characterize the optimal trade-off between the amount of common randomness used and the required rates of communication. We find that not only does common randomness help; its usage exceeds the communication rate requirements. The most efficient scheme is based on a superposition codebook, with the first node selecting messages for all downstream nodes. We also provide a fleeting view of related problems, demonstrating how the optimal rate region may shrink or expand.
    Comment: Submitted to IEEE Transactions on Information Theory.
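
    One natural way to formalize the secure synthesis requirement above, given here as an illustration rather than the paper's exact condition (notation assumed, with J denoting the communication observed by the eavesdropper), is

        \[ \Big\| P_{X^n Y^n Z^n J} - \Big( \prod_{t=1}^{n} q_{XYZ} \Big) \times P_J \Big\|_{\mathrm{TV}} \longrightarrow 0, \]

    i.e., the synthesized sequences should look i.i.d. with the prescribed correlation and be nearly independent of everything the eavesdropper observes.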