
    Analysis of the Second Moment of the LT Decoder

    We analyze the second moment of the ripple size during the LT decoding process and prove that the standard deviation of the ripple size for an LT-code with length $k$ is of the order of $\sqrt{k}$. Together with a result by Karp et al. stating that the expectation of the ripple size is of the order of $k$ [3], this gives bounds on the error probability of the LT decoder. We also give an analytic expression for the variance of the ripple size up to terms of constant order, and refine the expression in [3] for the expectation of the ripple size up to terms of the order of $1/k$, thus providing a first step towards an analytic finite-length analysis of LT decoding. Comment: 5 pages, 1 figure; submitted to ISIT 200
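    As a rough illustration of the quantity analyzed in this abstract, the following Python sketch (not from the paper) simulates an LT peeling decoder and records the ripple size, i.e. the number of coded symbols of reduced degree one, after each decoding step. The uniform degree distribution and the 20% reception overhead are illustrative assumptions only; the paper's analysis concerns tuned degree distributions.

    import random

    def lt_encode(k, n, degree_weights):
        # Each coded symbol is represented only by the set of source indices
        # it covers; symbol values are irrelevant for ripple statistics.
        return [set(random.sample(range(k),
                                  random.choices(range(1, len(degree_weights) + 1),
                                                 weights=degree_weights)[0]))
                for _ in range(n)]

    def peel_and_track_ripple(k, coded):
        # Run the peeling decoder, recording the ripple size after each step.
        decoded = set()
        ripple_sizes = []
        while True:
            ripple = [s for s in coded if len(s - decoded) == 1]
            ripple_sizes.append(len(ripple))
            if not ripple or len(decoded) == k:
                break
            decoded.add((ripple[0] - decoded).pop())  # release one source symbol
        return decoded, ripple_sizes

    k = 1000
    weights = [0.25, 0.25, 0.25, 0.25]        # assumed uniform degrees 1..4 (illustrative)
    coded = lt_encode(k, int(1.2 * k), weights)
    recovered, ripple = peel_and_track_ripple(k, coded)
    print(len(recovered), "decoded; peak ripple size", max(ripple))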

    Caching at the Edge with LT codes

    We study the performance of caching schemes based on LT codes under the peeling (iterative) decoding algorithm. We assume that users request content from multiple cache-aided transmitters. The transmitters are connected through a backhaul link to a master node, while no direct link exists between the users and the master node. Each content item is fragmented and coded with an LT code. The cache placement at each transmitter is optimized so that transmission over the backhaul link is minimized. We derive a closed-form expression for the backhaul transmission rate and compare the performance of the LT-based caching scheme with that of a caching scheme based on maximum distance separable (MDS) codes. Finally, we show that caching with LT codes performs as well as caching with MDS codes.
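    A minimal Python sketch of the comparison described above, under assumed parameters (it is not the paper's closed-form expression): with an MDS code any k distinct cached fragments reconstruct a content item, while an LT peeling decoder typically needs a small reception overhead, so the backhaul must supply the corresponding shortfall.

    def backhaul_mds(k, cached):
        # MDS: any k distinct coded fragments suffice to reconstruct the item.
        return max(0, k - cached)

    def backhaul_lt(k, cached, overhead=0.05):
        # LT with peeling decoding: assume roughly k*(1+overhead) symbols are
        # needed; the 5% overhead is an illustrative assumption, not a derived value.
        return max(0, int(k * (1 + overhead)) - cached)

    k = 1000              # coded fragments needed per content item
    transmitters = 3      # cache-aided transmitters reachable by the user
    cache_per_tx = 300    # fragments of this item cached at each transmitter
    cached = transmitters * cache_per_tx
    print("MDS backhaul fragments:", backhaul_mds(k, cached))
    print("LT  backhaul fragments:", backhaul_lt(k, cached))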

    Fountain coding with decoder side information

    In this contribution, we consider the application of Digital Fountain (DF) codes to the problem of data transmission when side information is available at the decoder. The side information is modelled as the output of a "virtual" channel whose input is the original information sequence. For two cases of the system model, which model both the virtual and the actual transmission channel either as a binary erasure channel or as a binary-input additive white Gaussian noise (BIAWGN) channel, we propose methods of enhancing the design of standard non-systematic DF codes by optimizing their output degree distribution based on the side-information assumption. In addition, a systematic Raptor design is employed as a possible solution to the problem.
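    For the erasure case, a small Python sketch (an assumption-laden illustration, not the proposed design) shows how decoder side information can be exploited: source symbols that survive the virtual binary erasure channel seed the peeling decoder, so fewer fountain-coded symbols need to be sent over the actual channel.

    import random

    def lt_symbols(k, n, max_deg=4):
        # Stand-in LT encoder with a uniform degree distribution (illustrative);
        # the paper optimizes the output degree distribution instead.
        return [set(random.sample(range(k), random.randint(1, max_deg)))
                for _ in range(n)]

    def peel_with_side_info(k, coded, known):
        # Peeling decoder seeded with the side information: source symbols
        # already known from the virtual channel count as decoded from the start.
        decoded = set(known)
        progress = True
        while progress and len(decoded) < k:
            progress = False
            for s in coded:
                rest = s - decoded
                if len(rest) == 1:
                    decoded.add(rest.pop())
                    progress = True
        return decoded

    k = 1000
    p_virtual = 0.3                              # assumed virtual BEC erasure probability
    known = {i for i in range(k) if random.random() > p_virtual}
    coded = lt_symbols(k, 450)                   # far fewer than k coded symbols are sent
    print(len(peel_with_side_info(k, coded, known)), "of", k, "source symbols recovered")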

    Frameless ALOHA with Reliability-Latency Guarantees

    One of the novelties brought by 5G is that wireless system design has increasingly turned its focus to guaranteeing reliability and latency. This shifts the design objective of random access protocols from throughput optimization towards constraints based on reliability and latency. For this purpose, we use frameless ALOHA, which relies on successive interference cancellation (SIC), and derive an exact finite-length analysis of the statistics of the unresolved users (reliability) as a function of the contention period length (latency). The presented analysis can be used to derive reliability-latency guarantees. We also optimize the scheme parameters in order to maximize the reliability within a given latency. Our approach represents an important step towards the general area of design and analysis of access protocols with reliability-latency guarantees. Comment: Accepted for presentation at IEEE Globecom 201
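    The reliability-latency trade-off described above can be probed with a short Python Monte Carlo sketch (illustrative parameter values, not the paper's exact finite-length analysis): users transmit replicas in each slot of the contention period with a fixed probability, and the receiver resolves singleton slots and cancels the corresponding replicas (SIC) until no further progress is possible.

    import random

    def frameless_aloha_sic(num_users=50, num_slots=80, p_tx=0.06, trials=200):
        # Average number of unresolved users after a contention period of
        # num_slots slots; all parameter values are illustrative assumptions.
        unresolved = 0
        for _ in range(trials):
            # slots[m] = set of users that transmitted a replica in slot m
            slots = [{u for u in range(num_users) if random.random() < p_tx}
                     for _ in range(num_slots)]
            resolved = set()
            progress = True
            while progress:
                progress = False
                for s in slots:
                    remaining = s - resolved
                    if len(remaining) == 1:          # singleton slot: decode this user
                        resolved.add(remaining.pop())
                        progress = True              # its other replicas are now cancelled
            unresolved += num_users - len(resolved)
        return unresolved / trials

    print("average unresolved users:", frameless_aloha_sic())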