Network Decoding Against Restricted Adversaries
We initiate the study of the one-shot capacity of communication (coded) networks with an adversary who has access only to a proper subset of the network edges. We introduce the Diamond Network as a minimal example showing that known cut-set bounds are not sharp in general, and that their non-sharpness stems precisely from restricting the adversary's action to a region of the network. We give a capacity-achieving scheme for the Diamond Network that implements an adversary detection strategy. We also show that linear network coding does not suffice in general to achieve capacity, proving a strong separation result between the one-shot capacity and its linear version. We then give a sufficient condition for tightness of the Singleton Cut-Set Bound in a family of two-level networks. Finally, we discuss whether and how the presence of nodes that do not allow local encoding and decoding affects the one-shot capacity.
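For reference, the Singleton Cut-Set Bound mentioned above takes, in its standard formulation for an adversary who can corrupt up to t edges anywhere in the network, roughly the following form; the notation ($\mu$ for the minimum edge-cut between source and terminal, capacity measured in alphabet symbols) is an assumption on our part rather than a quotation from the paper:

\[
  C_1(\mathcal{N}) \;\le\; \mu(\mathcal{N}) - 2t,
\]

i.e., at most $|\mathcal{A}|^{\mu - 2t}$ messages can be delivered with zero error in one use of the network over the alphabet $\mathcal{A}$. The Diamond Network is used to show that a bound of this type can fail to be sharp once the adversary is confined to a proper subset of the edges.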
The Role of the Alphabet in Network Coding: An Optimization Approach
We consider the problem of determining the one-shot, zero-error capacity of a coded, multicast network over a small alphabet. We introduce a novel approach to this problem based on a mixed-integer program, which computes the size of the largest unambiguous codebook for a given alphabet size. As an application of our approach, we recover, extend and refine various results that were previously obtained with case-by-case analyses or specialized arguments, giving evidence of the wide applicability of our approach. We also provide two simple ideas that reduce the complexity of our method for some families of networks. We conclude the paper by outlining a research program we wish to pursue to investigate the one-shot capacity of large networks affected by adversarial noise and, more generally, the role played by the alphabet size in network coding
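To make the optimization viewpoint concrete, here is a minimal sketch of the kind of mixed-integer program involved: choosing the largest set of pairwise-unambiguous codewords is a maximum-independent-set problem over a confusability relation, written below as a 0/1 program with PuLP. The toy network model, the confusable predicate and all parameters are illustrative placeholders and are not taken from the paper.

# Sketch only: largest unambiguous codebook as a 0/1 integer program.
# The confusability rule below is a stand-in, not the paper's network model.
from itertools import product, combinations
import pulp

alphabet_size = 2          # |A|, the alphabet size under study
source_out_edges = 3       # toy parameter for the stand-in network

codewords = list(product(range(alphabet_size), repeat=source_out_edges))

def confusable(c1, c2):
    # Placeholder rule: inputs differing in at most 2 positions are treated as
    # confusable (mimicking an adversary overwriting one edge symbol).
    return sum(a != b for a, b in zip(c1, c2)) <= 2

prob = pulp.LpProblem("largest_unambiguous_codebook", pulp.LpMaximize)
x = {c: pulp.LpVariable(f"x_{i}", cat="Binary") for i, c in enumerate(codewords)}

# Objective: maximize the codebook size.
prob += pulp.lpSum(x.values())

# Pairwise constraints: no two confusable codewords may both be selected.
for c1, c2 in combinations(codewords, 2):
    if confusable(c1, c2):
        prob += x[c1] + x[c2] <= 1

prob.solve(pulp.PULP_CBC_CMD(msg=0))
codebook = [c for c in codewords if x[c].value() > 0.5]
print(len(codebook), codebook)

The pairwise enumeration above scales poorly; the complexity-reducing ideas mentioned in the abstract would enter as additional structure on a program of this general shape.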