
    Improved Decoding of Staircase Codes: The Soft-aided Bit-marking (SABM) Algorithm

    Staircase codes (SCCs) are typically decoded using iterative bounded-distance decoding (BDD) and hard decisions. In this paper, a novel decoding algorithm is proposed which partially uses soft information from the channel. The proposed algorithm is based on marking a certain number of highly reliable and highly unreliable bits. These marked bits are used to improve the miscorrection-detection capability of the SCC decoder and the error-correcting capability of BDD. For SCCs with 2-error-correcting Bose-Chaudhuri-Hocquenghem component codes, our algorithm improves upon standard SCC decoding by up to 0.30 dB at a bit-error rate (BER) of 10^{-7}. The proposed algorithm is shown to achieve almost half of the gain achievable by an idealized decoder with this structure. A complexity analysis based on the number of additional calls to the component BDD decoder shows that the relative complexity increase is only around 4% at a BER of 10^{-4}. This additional complexity is shown to decrease as the channel quality improves. Our algorithm is also extended (with minor modifications) to product codes. The simulation results show that in this case, the algorithm offers gains of up to 0.44 dB at a BER of 10^{-8}.
    Comment: 10 pages, 12 figures
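    A minimal sketch of the bit-marking idea follows; the LLR thresholds and the BPSK/LLR setup are illustrative assumptions, not values from the paper. Bits with large channel LLR magnitude are marked as highly reliable, and a BDD decision that flips any of them is rejected as a likely miscorrection.

```python
import numpy as np

# Sketch only: hypothetical thresholds t_hrb / t_hub, assuming channel LLRs
# are available for every bit of the component codeword.
def mark_bits(llrs, t_hrb=8.0, t_hub=1.0):
    """Return boolean masks of highly reliable bits (HRBs) and
    highly unreliable bits (HUBs) based on |LLR| magnitude."""
    mag = np.abs(llrs)
    hrb = mag >= t_hrb   # very likely correct: BDD should not flip these
    hub = mag <= t_hub   # very likely wrong: candidates for extra bit flips
    return hrb, hub

def accept_bdd_output(hard_in, bdd_out, hrb):
    """Reject a BDD decision as a likely miscorrection if it flips
    any bit that was marked as highly reliable."""
    flipped = hard_in != bdd_out
    return not np.any(flipped & hrb)
```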

    On Achievable Rates for Long-Haul Fiber-Optic Communications

    Lower bounds on the mutual information (MI) of long-haul optical fiber systems with hard-decision and soft-decision decoding are studied. Ready-to-use expressions to calculate the MI are presented. Extensive numerical simulations are used to quantify how changes in the optical transmitter, receiver, and channel affect the achievable transmission rates of the system. Special emphasis is put on the use of different quadrature amplitude modulation formats, channel spacings, digital back-propagation schemes, and probabilistic shaping. The advantages of using the MI over the prevailing Q-factor as a figure of merit for coded optical systems are also highlighted.
    Comment: Hard-decision mutual information analysis added, two typos corrected
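    As a hedged illustration of how such a bound can be evaluated, the Monte Carlo sketch below assumes a uniform 16-QAM input and a memoryless Gaussian auxiliary channel; for the nonlinear fiber channel this mismatched-decoding estimate is a lower bound on the true MI. The SNR value and sample size are illustrative choices, not figures from the paper.

```python
import numpy as np

def mi_lower_bound_16qam(snr_db=12.0, n=100_000, rng=np.random.default_rng(0)):
    """Monte Carlo estimate of an achievable rate (bits/symbol) for uniform
    16-QAM over an assumed Gaussian auxiliary channel q(y|x)."""
    m = np.array([a + 1j * b for a in (-3, -1, 1, 3) for b in (-3, -1, 1, 3)])
    m /= np.sqrt(np.mean(np.abs(m) ** 2))            # unit-energy constellation
    sigma2 = 10 ** (-snr_db / 10)                    # noise variance per symbol
    x = rng.choice(m, size=n)
    y = x + np.sqrt(sigma2 / 2) * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
    # q(y|x) proportional to exp(-|y-x|^2 / sigma2): Gaussian channel law
    num = np.exp(-np.abs(y - x) ** 2 / sigma2)
    den = np.mean(np.exp(-np.abs(y[:, None] - m[None, :]) ** 2 / sigma2), axis=1)
    return np.mean(np.log2(num / den))               # bits per symbol
```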

    Robust streaming in delay tolerant networks

    Delay Tolerant Networks (DTNs) do not provide any end-to-end connectivity guarantee. Transporting data over such networks is therefore a tough challenge, as most Internet applications assume some form of persistent end-to-end connection. While research in DTNs has mainly addressed the problem of routing in various mobility contexts, with the aim of improving bundle delivery delay and data delivery ratio, little attention has been paid to applications. This paper investigates the support of streaming-like applications over DTNs. We identify how DTN characteristics impact the overall performance of these applications and present Tetrys, a transport-layer mechanism that enables robust streaming over DTNs. Tetrys is based on an on-the-fly coding mechanism able to ensure full reliability without retransmission and faster in-order bundle delivery than classical erasure coding schemes. We evaluate our Tetrys prototype on real DTN connectivity traces captured from the Rollerblading tour in Paris. Simulations show that, on average, Tetrys clearly outperforms all other reliability schemes in terms of bundle delivery service
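    A minimal sender-side sketch of the on-the-fly coding idea follows, assuming GF(2) (XOR) combinations and hypothetical packet/ACK structures for readability; Tetrys itself operates over a larger finite field, and the receiver recovers lost packets by Gaussian elimination on the repair coefficients rather than by retransmission.

```python
import random

class OnTheFlyEncoder:
    """Sketch of an elastic coding window: source packets stay in the window
    until acknowledged, and repair packets mix whatever is still un-ACKed."""

    def __init__(self):
        self.window = {}            # seq -> payload, packets not yet ACKed
        self.seq = 0

    def send_source(self, payload: bytes):
        self.window[self.seq] = payload
        pkt = ("SRC", self.seq, payload)
        self.seq += 1
        return pkt

    def send_repair(self):
        """Random XOR combination of every un-ACKed packet: losses are
        repaired by future repair packets, never by retransmission."""
        if not self.window:
            return None
        coeffs = {s: random.randint(0, 1) for s in self.window}
        combo = bytearray(max(len(p) for p in self.window.values()))
        for s, c in coeffs.items():
            if c:
                for i, b in enumerate(self.window[s]):
                    combo[i] ^= b
        return ("REP", coeffs, bytes(combo))

    def on_ack(self, seq):
        # Receiver has decoded this packet; slide the coding window forward.
        self.window.pop(seq, None)
```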

    An Adaptive Entanglement Distillation Scheme Using Quantum Low Density Parity Check Codes

    Quantum low density parity check (QLDPC) codes are useful primitives for quantum information processing because they can be encoded and decoded efficiently. Besides, the error correcting capability of a few QLDPC codes exceeds the quantum Gilbert-Varshamov bound. Here, we report a numerical performance analysis of an adaptive entanglement distillation scheme using QLDPC codes. In particular, we find that the expected yield of our adaptive distillation scheme to combat depolarization errors exceed that of Leung and Shor whenever the error probability is less than about 0.07 or greater than about 0.28. This finding illustrates the effectiveness of using QLDPC codes in entanglement distillation.Comment: 12 pages, 6 figure
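    For context, the sketch below computes the standard one-way hashing-bound yield under depolarization with total error probability p; this is a common baseline against which distillation schemes are compared, not the adaptive QLDPC protocol of the paper.

```python
import numpy as np

def hashing_yield(p):
    """One-way hashing-bound yield (ebits out per noisy pair in) when the
    error probability p is split equally among X, Y and Z errors."""
    probs = np.array([1 - p, p / 3, p / 3, p / 3])
    probs = probs[probs > 0]
    entropy = -np.sum(probs * np.log2(probs))   # Shannon entropy in bits
    return max(0.0, 1.0 - entropy)

# e.g. hashing_yield(0.07) is roughly 0.52; the yield reaches zero near p = 0.19
```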

    HARQ Buffer Management: An Information-Theoretic View

    A key practical constraint on the design of hybrid automatic repeat request (HARQ) schemes is the size of the on-chip buffer that is available at the receiver to store previously received packets. In fact, in modern wireless standards such as LTE and LTE-A, the HARQ buffer size is one of the main drivers of modem area and power consumption. This has recently highlighted the importance of HARQ buffer management, that is, of the use of buffer-aware transmission schemes and of advanced compression policies for the storage of received data. This work investigates HARQ buffer management by leveraging information-theoretic achievability arguments based on random coding. Specifically, standard HARQ schemes, namely Type-I, Chase Combining, and Incremental Redundancy, are first studied under the assumption of a finite-capacity HARQ buffer by considering both coded modulation, via Gaussian signaling, and Bit-Interleaved Coded Modulation (BICM). The analysis sheds light on the impact on the throughput of different compression strategies, namely the conventional compression of log-likelihood ratios and the direct digitization of baseband signals. Then, coding strategies based on layered modulation and optimized coding blocklength are investigated, highlighting the benefits of HARQ buffer-aware transmission schemes. The optimization of baseband compression for multiple-antenna links is also studied, demonstrating the optimality of a transform coding approach.
    Comment: submitted to IEEE International Symposium on Information Theory (ISIT) 2015. 29 pages, 12 figures, submitted for journal publication
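    A minimal sketch of the "compress log-likelihood ratios" storage option follows, assuming a uniform quantizer with an illustrative clipping level and bit width (not LTE values): the buffer holds small integer indices, and approximate LLRs are reconstructed when a retransmission arrives and soft combining is performed.

```python
import numpy as np

def quantize_llrs(llrs, bits=4, clip=8.0):
    """Clip LLRs to [-clip, clip] and store each one on `bits` bits."""
    levels = 2 ** bits
    step = 2 * clip / (levels - 1)
    idx = np.round((np.clip(llrs, -clip, clip) + clip) / step).astype(np.uint8)
    return idx, step, clip                      # what the HARQ buffer holds

def dequantize_llrs(idx, step, clip):
    """Reconstruct approximate LLRs for combining with the retransmission."""
    return idx.astype(float) * step - clip
```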