    Effects of the Generation Size and Overlap on Throughput and Complexity in Randomized Linear Network Coding

    To reduce computational complexity and delay in randomized network coded content distribution (and for other practical reasons), coding is performed not simultaneously over all content blocks, but over much smaller, possibly overlapping subsets of these blocks, known as generations. The penalty of this strategy is a throughput reduction. To analyze the throughput loss, we model coding over generations with random generation scheduling as a coupon collector's brotherhood problem. This model enables us to derive the expected number of coded packets needed for successful decoding of the entire content, as well as the probability of decoding failure (the latter only when generations do not overlap), and thereby to quantify the tradeoff between computational complexity and throughput. Interestingly, with a moderate increase in the generation size, throughput quickly approaches link capacity. Overlaps between generations can further improve throughput substantially for relatively small generation sizes.

    Comment: To appear in IEEE Transactions on Information Theory Special Issue: Facets of Coding Theory: From Algorithms to Networks, Feb 2011
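    The coupon-collector view admits a quick numerical check. Below is a minimal Monte Carlo sketch (not from the paper) of the idealized non-overlapping case: coded packets arrive with uniformly random generation labels, and a generation of size g is assumed decodable once g of its packets have arrived, ignoring possible linear dependencies over the field. All function and parameter names are hypothetical.

    import random

    def packets_until_decodable(num_generations: int, gen_size: int) -> int:
        """Coupon collector's brotherhood: draw uniform generation labels
        until every generation has received at least gen_size packets."""
        counts = [0] * num_generations
        short = num_generations          # generations still below gen_size
        total = 0
        while short > 0:
            g = random.randrange(num_generations)
            counts[g] += 1
            total += 1
            if counts[g] == gen_size:
                short -= 1
        return total

    def overhead(num_generations: int, gen_size: int, trials: int = 1000) -> float:
        """Mean packets collected divided by content size; 1.0 corresponds
        to the throughput of coding over all blocks at once."""
        content = num_generations * gen_size
        mean = sum(packets_until_decodable(num_generations, gen_size)
                   for _ in range(trials)) / trials
        return mean / content

    if __name__ == "__main__":
        # fixed content of 1024 blocks, increasing generation size:
        # the reception overhead drops quickly toward 1 (link capacity)
        for g in (1, 4, 16, 64):
            print(f"gen_size={g:3d}  overhead ~ {overhead(1024 // g, g):.2f}")

    The falling overhead as g grows illustrates, in simulation, the complexity/throughput tradeoff that the paper quantifies analytically.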

    Doped Fountain Coding for Minimum Delay Data Collection in Circular Networks

    This paper studies decentralized, Fountain- and network-coding-based strategies for facilitating data collection in circular wireless sensor networks, which rely on the stochastic diversity of data storage. The goal is to allow for reduced-delay collection by a data collector who accesses the network at a random position and random time. Data dissemination is performed by a set of relays which form a circular route to exchange source packets. The storage nodes within the transmission range of the route's relays linearly combine and store overheard relay transmissions using random decentralized strategies. An intelligent data collector first gathers from nearby storage nodes a minimum set of coded packets that might suffice for recovery, and then attempts to recover all original source packets from this set using a message-passing decoder. Whenever the decoder stalls, the source packet that restarts decoding is polled/doped from its original source node. A random-walk-based analysis of the decoding/doping process furnishes the collection delay analysis with a prediction of the number of required doped packets. This number can be surprisingly small when the Ideal Soliton degree distribution is employed; hence, the doping strategy may achieve the least collection delay when the density of source nodes is sufficiently large. Furthermore, we demonstrate that network coding makes dissemination more efficient at the expense of a larger collection delay. Not surprisingly, a circular network allows for significantly more tractable strategies (analytically and otherwise) than a network modeled as a random geometric graph.
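    To make the decode-and-dope loop concrete, here is a small hypothetical Python sketch (not the paper's implementation). It peels LT-style coded packets whose degrees follow the Ideal Soliton distribution and, whenever peeling stalls, dopes one source packet from the smallest remaining coded packet; this is a simplification of the paper's polling rule that still guarantees decoding restarts. All names are illustrative.

    import math
    import random

    def ideal_soliton_degree(k: int) -> int:
        """Ideal Soliton: P(1) = 1/k, P(d) = 1/(d(d-1)) for d = 2..k,
        sampled by inverting the CDF P(degree <= d) = 1/k + 1 - 1/d."""
        u = random.random()
        if u < 1.0 / k:
            return 1
        return min(max(2, math.ceil(1.0 / (1.0 + 1.0 / k - u))), k)

    def doped_packets(k: int, num_coded: int) -> int:
        """Peel coded packets until all k sources are known; count how many
        source packets had to be doped (polled directly) to unstall decoding.
        Only the index sets matter for counting, so XOR payloads are omitted."""
        packets = [set(random.sample(range(k), ideal_soliton_degree(k)))
                   for _ in range(num_coded)]
        known = set()
        doped = 0
        while len(known) < k:
            progress = True
            while progress:                      # message-passing (peeling) pass
                progress = False
                for p in packets:
                    p -= known                   # drop already-decoded sources
                    if len(p) == 1:              # degree-1 packet releases a source
                        known |= p
                        progress = True
            if len(known) < k:                   # decoder stalled
                live = [p for p in packets if p]
                if not live:                     # nothing left to peel:
                    doped += k - len(known)      # dope all remaining sources
                    break
                target = min(live, key=len)      # smallest packet has degree >= 2;
                known.add(next(iter(target)))    # doping one of its sources makes
                doped += 1                       # it degree 1 and restarts peeling
        return doped

    if __name__ == "__main__":
        k = 500                                  # number of source packets
        runs = [doped_packets(k, k) for _ in range(50)]
        print(f"mean doped packets: {sum(runs) / len(runs):.1f} of k={k}")

    Varying num_coded relative to k in this sketch shows the tradeoff the abstract describes: collecting fewer coded packets shifts the burden onto doping, i.e., onto direct polling of source nodes.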