
    Expander Chunked Codes

    Chunked codes are efficient random linear network coding (RLNC) schemes with low computational cost, where the input packets are encoded into small chunks (i.e., subsets of the coded packets). During network transmission, RLNC is performed within each chunk. In this paper, we first introduce a simple transfer matrix model to characterize the transmission of chunks and derive some basic properties of the model to facilitate the performance analysis. We then focus on the design of overlapped chunked codes, a class of chunked codes whose chunks are non-disjoint subsets of the input packets, which are of special interest since they can be encoded with negligible computational cost and in a causal fashion. We propose expander chunked (EC) codes, the first class of overlapped chunked codes with analyzable performance, where the construction of the chunks makes use of regular graphs. Numerical and simulation results show that in some practical settings, EC codes can achieve rates within 91 to 97 percent of the optimum and significantly outperform the state-of-the-art overlapped chunked codes. Comment: 26 pages, 3 figures, submitted for journal publication
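
    As a hedged illustration of the per-chunk encoding step described above, the sketch below performs random linear network coding within a single chunk over GF(2), so a coded packet is simply an XOR of a randomly chosen subset of the chunk's packets. The paper works over general finite fields and builds its chunks from regular (expander) graphs; the function name and the GF(2) choice here are assumptions for illustration only.

# Minimal sketch of RLNC within one chunk over GF(2) (coefficients 0/1),
# not the EC construction from the paper. Each coded packet is an XOR of a
# random subset of the chunk's packets, together with its coefficient vector.
import random

def rlnc_encode_chunk(chunk_packets, rng=random):
    """Return (coefficient_vector, coded_payload) for one coded packet."""
    length = len(chunk_packets[0])
    coeffs = [rng.randint(0, 1) for _ in chunk_packets]
    payload = bytearray(length)
    for c, pkt in zip(coeffs, chunk_packets):
        if c:
            for i, b in enumerate(pkt):
                payload[i] ^= b
    return coeffs, bytes(payload)

# Example: one chunk containing three 4-byte packets.
chunk = [b"\x01\x02\x03\x04", b"\x10\x20\x30\x40", b"\xaa\xbb\xcc\xdd"]
coeffs, coded = rlnc_encode_chunk(chunk)
print(coeffs, coded.hex())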

    In-Order Delivery Delay of Transport Layer Coding

    A large number of streaming applications use reliable transport protocols such as TCP to deliver content over the Internet. However, head-of-line blocking due to packet loss recovery can often result in unwanted behavior and poor application layer performance. Transport layer coding can help mitigate this issue by recovering from lost packets without waiting for retransmissions. We consider the use of an on-line network code that inserts coded packets at strategic locations within the underlying packet stream. If retransmissions are necessary, additional coded packets are transmitted to ensure the receiver's ability to decode. An analysis of this scheme is provided that helps determine both the expected in-order packet delivery delay and its variance. Numerical results are then used to determine when and how many coded packets should be inserted into the packet stream, in addition to determining the trade-offs between reducing the in-order delay and the achievable rate. The analytical results are finally compared with experimental results to provide insight into how to minimize the delay of existing transport layer protocols.
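
    As a rough, hedged sketch of the idea of inserting coded packets at strategic points in a stream (not the paper's analyzed scheme), the code below emits one XOR-coded packet after every k data packets, covering the data packets sent since the last coded packet. The parameter k and the plain-XOR combination are assumptions for illustration; the paper's analysis is what determines where and how many coded packets to insert.

# Hedged sketch: a systematic packet stream with one coded (XOR) packet
# inserted after every `k` data packets. Illustration only; the paper uses
# an on-line network code and analyzes the in-order delivery delay.
def xor_packets(packets):
    out = bytearray(len(packets[0]))
    for pkt in packets:
        for i, b in enumerate(pkt):
            out[i] ^= b
    return bytes(out)

def stream_with_coding(data_packets, k=4):
    window = []
    for pkt in data_packets:
        yield ("data", pkt)
        window.append(pkt)
        if len(window) == k:
            yield ("coded", xor_packets(window))
            window.clear()
    if window:  # flush a final coded packet covering the tail of the stream
        yield ("coded", xor_packets(window))

# Example: 10 data packets of 4 bytes each, one coded packet every 4.
for kind, pkt in stream_with_coding([bytes([i] * 4) for i in range(10)], k=4):
    print(kind, pkt.hex())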

    Sparse Network Coding with Overlapping Classes

    This paper presents a novel approach to network coding for distribution of large files. Instead of the usual approach of splitting packets into disjoint classes (also known as generations), we propose the use of overlapping classes. The overlap allows the decoder to alternate between Gaussian elimination and back substitution, simultaneously boosting performance and reducing decoding complexity. Our approach can be seen as a combination of fountain coding and network coding. Simulation results are presented that demonstrate the promise of our approach. Comment: 15 pages, 5 figures, to be published at NetCod 200
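
    To make the overlapping-class idea concrete, here is a small hedged sketch that assigns input blocks to classes (generations) sharing a fixed number of blocks with their neighbours, in contrast to the usual disjoint split. The class size and the sliding-window overlap pattern are assumptions for illustration, not the construction analyzed in the paper.

# Hedged sketch: assign `num_blocks` input blocks to overlapping classes.
# Consecutive classes share `overlap` blocks (sliding-window pattern).
def overlapping_classes(num_blocks, class_size, overlap):
    assert class_size > overlap, "classes must advance by at least one block"
    step = class_size - overlap
    classes = []
    start = 0
    while start < num_blocks:
        classes.append(list(range(start, min(start + class_size, num_blocks))))
        start += step
    return classes

# Example: 12 blocks, classes of 5 blocks overlapping by 2.
print(overlapping_classes(num_blocks=12, class_size=5, overlap=2))
# [[0, 1, 2, 3, 4], [3, 4, 5, 6, 7], [6, 7, 8, 9, 10], [9, 10, 11]]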

    On Tunable Sparse Network Coding in Commercial Devices for Networks and Filesystems


    Effects of the Generation Size and Overlap on Throughput and Complexity in Randomized Linear Network Coding

    To reduce computational complexity and delay in randomized network coded content distribution, and for some other practical reasons, coding is not performed simultaneously over all content blocks but over much smaller, possibly overlapping subsets of these blocks, known as generations. A penalty of this strategy is throughput reduction. To analyze the throughput loss, we model coding over generations with random generation scheduling as a coupon collector's brotherhood problem. This model enables us to derive the expected number of coded packets needed for successful decoding of the entire content, as well as the probability of decoding failure (the latter only when generations do not overlap), and, further, to quantify the tradeoff between computational complexity and throughput. Interestingly, with a moderate increase in the generation size, throughput quickly approaches link capacity. Overlaps between generations can further improve throughput substantially for relatively small generation sizes. Comment: To appear in IEEE Transactions on Information Theory Special Issue: Facets of Coding Theory: From Algorithms to Networks, Feb 201
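
    The coupon-collector view described above lends itself to a quick Monte Carlo check. The hedged sketch below assumes, purely for illustration, that coded packets arrive from generations chosen uniformly at random and that a generation of size g decodes once g packets from it have been received (ignoring rank deficiencies and overlap); it then estimates the expected number of coded packets needed to decode everything. The generation count and size are arbitrary values, not figures from the paper.

# Hedged Monte Carlo sketch of the coupon-collector's-brotherhood view:
# draw generations uniformly at random until every generation has received
# as many packets as its size. Simplified illustration only.
import random

def packets_until_all_decode(num_generations, generation_size, rng=random):
    received = [0] * num_generations
    decoded = 0
    total = 0
    while decoded < num_generations:
        g = rng.randrange(num_generations)
        received[g] += 1
        total += 1
        if received[g] == generation_size:
            decoded += 1
    return total

trials = [packets_until_all_decode(16, 8) for _ in range(2000)]
print("estimated mean coded packets needed:", sum(trials) / len(trials))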