
    Improved Modeling of the Correlation Between Continuous-Valued Sources in LDPC-Based DSC

    Accurate modeling of the correlation between the sources plays a crucial role in the efficiency of distributed source coding (DSC) systems. This correlation is commonly modeled in the binary domain by a single binary symmetric channel (BSC), for both binary and continuous-valued sources. We show that "one" BSC cannot accurately capture the correlation between continuous-valued sources; an accurate model requires "multiple" BSCs, as many as the number of bits used to represent each sample. We incorporate this new model into a DSC system that uses low-density parity-check (LDPC) codes for compression. The standard Slepian-Wolf LDPC decoder requires only a slight modification so that the parameters of all BSCs are integrated into the log-likelihood ratios (LLRs). Further, an interleaver shuffles the data belonging to different bit-planes to introduce randomness in the binary domain. The new system has the same complexity and delay as the standard one. Simulation results demonstrate the effectiveness of the proposed model and system. Comment: 5 pages, 4 figures; presented at the Asilomar Conference on Signals, Systems, and Computers, Pacific Grove, CA, November 201
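    As an illustration of the idea described above (a sketch under assumed conventions, not the authors' code), per-bit-plane BSC crossover probabilities can enter the channel LLRs of a Slepian-Wolf LDPC decoder as follows: each bit-plane gets its own reliability weight log((1-p)/p), with the sign set by the side-information bit.

```python
import math

def channel_llrs(side_info_bits, crossover_probs):
    """Channel LLRs for a Slepian-Wolf LDPC decoder in which each
    bit-plane i sees its own BSC with crossover probability p_i.

    side_info_bits[i][n] : n-th side-information bit of plane i
    crossover_probs[i]   : crossover probability of plane i's BSC
    """
    llrs = []
    for bits, p in zip(side_info_bits, crossover_probs):
        w = math.log((1.0 - p) / p)  # reliability of this bit-plane
        # LLR = log P(x=0|y) / P(x=1|y): +w if y == 0, -w if y == 1
        llrs.append([w if y == 0 else -w for y in bits])
    return llrs
```

A plane with a smaller crossover probability yields larger-magnitude LLRs, so its side-information bits are trusted more during message passing, which is the point of replacing the single BSC with one BSC per bit-plane.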

    Source Coding with Side Information at the Decoder and Uncertain Knowledge of the Correlation

    This paper considers the problem of lossless source coding with side information at the decoder when the correlation model between the source and the side information is uncertain. Four parametrized models of the correlation between the source and the side information are introduced; the uncertainty appears through the lack of knowledge of the parameter values. For each model, we propose a practical coding scheme, based on non-binary Low-Density Parity-Check codes, that is able to cope with the parameter uncertainty. At the encoder, the choice of the coding rate follows from an information-theoretic analysis. We then propose decoding algorithms that jointly estimate the source vector and the parameters. As the proposed decoder is based on the Expectation-Maximization algorithm, which is very sensitive to initialization, we also propose a method to first produce a coarse estimate of the parameters.
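    To make the joint estimation idea concrete, here is a minimal sketch (an assumed illustration, not the paper's algorithm) of one EM update for a single BSC-style correlation parameter: the M-step re-estimates the crossover probability from the decoder's current bit posteriors.

```python
def em_update_crossover(posteriors_x1, side_info):
    """One EM M-step for the crossover probability p = P(x != y)
    in a source-with-side-information model (illustrative sketch).

    posteriors_x1[n] : current decoder posterior P(x_n = 1)
    side_info[n]     : observed side-information bit y_n
    """
    n = len(side_info)
    # Expected number of positions where x and y disagree,
    # using the soft posteriors in place of hard decisions
    expected_flips = sum(p1 if y == 0 else 1.0 - p1
                         for p1, y in zip(posteriors_x1, side_info))
    return expected_flips / n
```

In a full decoder this update would alternate with LDPC message passing (the E-step), which is why a good initial estimate matters.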

    Orthogonal Multiple Access with Correlated Sources: Feasible Region and Pragmatic Schemes

    In this paper, we consider orthogonal multiple access coding schemes, where correlated sources are encoded in a distributed fashion and transmitted, through additive white Gaussian noise (AWGN) channels, to an access point (AP). At the AP, component decoders, associated with the source encoders, iteratively exchange soft information by taking into account the source correlation. The first goal of this paper is to investigate the ultimate achievable performance limits in terms of a multi-dimensional feasible region in the space of channel parameters, deriving insights on the impact of the number of sources. The second goal is the design of pragmatic schemes, where the sources use "off-the-shelf" channel codes. In order to analyze the performance of given coding schemes, we propose an extrinsic information transfer (EXIT)-based approach, which makes it possible to determine the corresponding multi-dimensional feasible regions. On the basis of the proposed analytical framework, the performance of pragmatic coded schemes, based on serially concatenated convolutional codes (SCCCs), is discussed.
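    A feasibility check of this flavor can be sketched as follows. This is an assumed separation-style formulation (Slepian-Wolf conditions H(X_S | X_{S^c}) <= sum of link capacities over S, for every subset S of sources), not the paper's exact region:

```python
import itertools
import math

def awgn_capacity(snr):
    """Capacity (bits/channel use) of a real-valued AWGN channel."""
    return 0.5 * math.log2(1.0 + snr)

def feasible(snrs, cond_entropies):
    """Check Slepian-Wolf-style conditions over orthogonal AWGN links.

    snrs[i]        : SNR of source i's orthogonal channel
    cond_entropies : dict mapping frozenset S -> H(X_S | X_{S^c})
    Returns True iff every subset condition is satisfied.
    """
    n = len(snrs)
    for r in range(1, n + 1):
        for S in itertools.combinations(range(n), r):
            cap = sum(awgn_capacity(snrs[i]) for i in S)
            if cond_entropies[frozenset(S)] > cap:
                return False
    return True
```

Sweeping the SNRs while holding the source statistics fixed traces out a multi-dimensional feasible region in channel-parameter space, which is the kind of object the paper analyzes.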

    Rate Compatible Protocol for Information Reconciliation: An application to QKD

    Information Reconciliation is a mechanism that makes it possible to weed out the discrepancies between two correlated variables. It is an essential component in every key agreement protocol where the key has to be transmitted through a noisy channel. The typical case is the satellite scenario described by Maurer in the early 1990s. Recently the need has arisen in relation with Quantum Key Distribution (QKD) protocols, where it is very important not to reveal unnecessary information in order to maximize the shared key length. In this paper we present an information reconciliation protocol based on a rate-compatible construction of Low-Density Parity-Check codes. Our protocol improves the efficiency of the reconciliation over the whole range of error rates in the discrete-variable QKD context. Its adaptability, together with its low interactivity, makes it especially well suited for QKD reconciliation.
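    The efficiency figure mentioned above can be computed directly. For syndrome-based reconciliation with a rate-R code on a channel with quantum bit error rate e, a standard measure is f = (1 - R) / h(e), where h is the binary entropy; f = 1 means the disclosure meets the Slepian-Wolf limit (a minimal sketch of this standard metric, not the paper's protocol):

```python
import math

def binary_entropy(e):
    """Binary entropy h(e) in bits."""
    if e in (0.0, 1.0):
        return 0.0
    return -e * math.log2(e) - (1 - e) * math.log2(1 - e)

def reconciliation_efficiency(code_rate, qber):
    """Efficiency f >= 1 of syndrome-based reconciliation with a
    rate-R code at error rate e: f = (1 - R) / h(e).
    The code discloses n(1 - R) bits; the optimum is n*h(e)."""
    return (1.0 - code_rate) / binary_entropy(qber)
```

Rate-compatible codes matter here precisely because a fixed-rate code achieves f close to 1 only near its design error rate; adapting R keeps f low across the whole QBER range.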

    Models, Statistics, and Rates of Binary Correlated Sources

    This paper discusses and analyzes various models of binary correlated sources, which may be relevant in several distributed communication scenarios. These models are statistically characterized in terms of joint Probability Mass Function (PMF) and covariance. Closed-form expressions for the joint entropy of the sources are also presented. The asymptotic entropy rate for a very large number of sources is shown to converge to a common limit for all the considered models. This fact generalizes recent results on the information-theoretic performance limit of communication schemes which exploit the correlation among sources at the receiver. Comment: submitted for publication
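    The joint entropy of such sources follows directly from the joint PMF. Below is a sketch using a toy hidden-common-bit model (an assumed illustration, not necessarily one of the paper's models): each source copies a uniform hidden bit with probability (1 + rho)/2, independently across sources.

```python
import itertools
import math

def joint_entropy(pmf):
    """Joint entropy H(X_1..X_n) in bits, from a joint PMF given as a
    dict mapping binary tuples to probabilities."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def hidden_bit_pmf(n, rho):
    """Toy correlated-source model: each of n sources equals a common
    hidden uniform bit with probability q = (1 + rho) / 2,
    independently across sources."""
    q = (1.0 + rho) / 2.0
    pmf = {}
    for bits in itertools.product((0, 1), repeat=n):
        p = 0.0
        for b in (0, 1):  # hidden common bit, uniform prior 1/2
            pb = 1.0
            for x in bits:
                pb *= q if x == b else 1.0 - q
            p += 0.5 * pb
        pmf[bits] = p
    return pmf
```

At rho = 0 the sources are independent and H equals n bits; at rho = 1 they are identical and H collapses to 1 bit, illustrating how correlation bounds the joint entropy that any distributed scheme must convey.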

    Source and channel coding using Fountain codes

    The invention of Fountain codes is a major advance in the field of error-correcting codes. The goal of this work is to study and develop algorithms for source and channel coding using a family of Fountain codes known as Raptor codes. From an asymptotic point of view, the best currently known sum-product decoding algorithm for non-binary alphabets has a high complexity that limits its use in practice. For binary channels, sum-product decoding algorithms have been extensively studied and are known to perform well. In the first part of this work, we develop a decoding algorithm for binary codes on non-binary channels based on a combination of sum-product and maximum-likelihood decoding. We apply this algorithm to Raptor codes on both symmetric and non-symmetric channels. Our algorithm shows the best performance in terms of complexity and symbol error rate for blocks of finite length on symmetric channels. In the second part, we examine the performance of Raptor codes under sum-product decoding when the transmission takes place over piecewise-stationary memoryless channels and over noisy channels with memory. We develop algorithms for joint estimation and detection, employing expectation-maximization to estimate the noise and the sum-product algorithm to correct errors. We also develop a hard-decision algorithm for Raptor codes on piecewise-stationary memoryless channels. We then generalize our joint LT estimation-decoding algorithms to Markov-modulated channels. In the third part of this work, we develop compression algorithms using Raptor codes. More specifically, we introduce a lossless text compression algorithm, obtaining competitive results compared to existing classical approaches. Moreover, we propose distributed source coding algorithms based on the paradigm proposed by Slepian and Wolf.
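    The Slepian-Wolf paradigm mentioned at the end of this abstract is, at its core, syndrome binning: the source is compressed to its syndrome under a parity-check matrix, and the decoder picks the member of that bin closest to its side information. A minimal brute-force sketch (illustrative only; practical systems replace the search with iterative decoding):

```python
import itertools

def syndrome(H, x):
    """Syndrome s = Hx (mod 2): compresses the n-bit source x into
    len(H) syndrome bits -- the Slepian-Wolf binning step."""
    return [sum(h * xi for h, xi in zip(row, x)) % 2 for row in H]

def decode(H, s, side_info):
    """Toy Slepian-Wolf decoder: among all sequences whose syndrome
    equals s, return the one closest in Hamming distance to the
    decoder's side information."""
    n = len(side_info)
    best = None
    for cand in itertools.product((0, 1), repeat=n):
        if syndrome(H, cand) == s:
            d = sum(a != b for a, b in zip(cand, side_info))
            if best is None or d < best[0]:
                best = (d, list(cand))
    return best[1]
```

With a 2x3 parity-check matrix, a 3-bit source is conveyed in 2 syndrome bits and still recovered exactly when the side information differs from it in few enough positions, which is the compression gain the paradigm provides.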