
    Jointly optimised iterative source-coding, channel-coding and modulation for transmission over wireless channels

    Joint source-coding, channel-coding and modulation schemes based on Variable Length Codes (VLCs), Trellis Coded Modulation (TCM), Turbo TCM (TTCM), Bit-Interleaved Coded Modulation (BICM) and iteratively decoded BICM (BICM-ID) are proposed. A significant coding gain is achieved without bandwidth expansion by iteratively exchanging information between the VLC decoder and the coded modulation decoder. With the aid of independent interleavers for the in-phase and quadrature-phase components of the complex-valued constellation, a further diversity gain may be achieved. The performance of the proposed schemes is evaluated over both AWGN and Rayleigh fading channels. Explicitly, at a BER of 10^-5, most of the proposed schemes operate within 1 dB of the channel capacity limit.
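
    The claimed I/Q diversity gain comes from permuting the in-phase and quadrature components of the transmitted constellation symbols with two independent interleavers, so that the two components of each complex symbol experience independent fading. The following Python sketch only illustrates that idea under assumed random-permutation interleavers; the function names are hypothetical and the paper's actual interleaver design is not reproduced here.

        import numpy as np

        def iq_interleave(symbols, rng):
            """Independently permute the I and Q components of a block of
            complex-valued constellation symbols (illustrative I/Q interleaving)."""
            i_perm = rng.permutation(len(symbols))
            q_perm = rng.permutation(len(symbols))
            return symbols.real[i_perm] + 1j * symbols.imag[q_perm], (i_perm, q_perm)

        def iq_deinterleave(rx, perms):
            """Invert the two permutations at the receiver."""
            i_perm, q_perm = perms
            i = np.empty_like(rx.real)
            q = np.empty_like(rx.imag)
            i[i_perm] = rx.real
            q[q_perm] = rx.imag
            return i + 1j * q

        # Toy usage with a block of QPSK symbols: after de-interleaving, the I and Q
        # components of each recovered symbol have travelled through independent fades.
        rng = np.random.default_rng(0)
        tx = (2 * rng.integers(0, 2, 8) - 1) + 1j * (2 * rng.integers(0, 2, 8) - 1)
        interleaved, perms = iq_interleave(tx, rng)
        assert np.allclose(iq_deinterleave(interleaved, perms), tx)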

    Exponential Golomb and Rice Error Correction codes for generalized near-capacity joint source and channel coding

    The recently proposed Unary Error Correction (UEC) and Elias Gamma Error Correction (EGEC) codes facilitate the near-capacity Joint Source and Channel Coding (JSCC) of symbol values selected from large alphabets at a low complexity. Despite their large alphabet, these codes were only designed for a limited range of symbol value probability distributions. In this paper, we generalize the family of UEC and EGEC codes to the class of Rice and Exponential Golomb (ExpG) Error Correction (RiceEC and ExpGEC) codes, which have a much wider applicability, including the symbols produced by the H.265 video codec, the letters of the English alphabet and indeed any monotonic source distribution over an unbounded alphabet. Furthermore, the practicality of the proposed codes is enhanced to allow a continuous stream of symbol values to be encoded and decoded using only fixed-length system components. We explore the parameter space to offer beneficial trade-offs between error correction capability, decoding complexity, as well as transmission-energy, -duration and -bandwidth over a wide range of operating conditions. In each case, we show that our codes offer significant performance improvements over the best of several state-of-the-art benchmarkers. In particular, our codes achieve the same error correction capability, as well as transmission-energy, -duration and -bandwidth as a Variable Length Error Correction (VLEC) code benchmarker, while reducing the decoding complexity by an order of magnitude. In comparison with the best of the other JSCC and Separate Source and Channel Coding (SSCC) benchmarkers, our codes consistently offer E_b/N_0 gains of between 0.5 dB and 1.0 dB, which only appear modest because the system already operates close to capacity. These improvements come for free, since they do not increase the transmission-energy, -duration, -bandwidth or decoding complexity.
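
    For reference, the underlying Rice and Exponential Golomb source codes that the proposed RiceEC and ExpGEC codes build upon are easy to state; the contribution of the paper lies in wrapping them with trellis-based components that enable near-capacity iterative decoding, which is not shown here. Below is a minimal Python sketch of the two source encoders, assuming one common bit-level convention (unary prefixes written as ones terminated by a zero).

        def unary(q):
            """Unary codeword for q >= 0: q ones terminated by a zero."""
            return "1" * q + "0"

        def rice_encode(x, k):
            """Rice code with parameter k: unary quotient plus k-bit binary remainder."""
            remainder = format(x & ((1 << k) - 1), f"0{k}b") if k > 0 else ""
            return unary(x >> k) + remainder

        def expgolomb_encode(x, k=0):
            """Order-k Exponential Golomb code: write x + 2^k in binary and
            prefix it with (length - k - 1) zeros."""
            b = format(x + (1 << k), "b")
            return "0" * (len(b) - k - 1) + b

        # Codeword lengths grow slowly with the symbol value, which is what makes
        # these codes suitable for monotonic distributions over unbounded alphabets.
        for x in (0, 1, 5, 20):
            print(x, rice_encode(x, 2), expgolomb_encode(x, 0))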

    The Reliability Function of Lossy Source-Channel Coding of Variable-Length Codes with Feedback

    We consider transmission of discrete memoryless sources (DMSes) across discrete memoryless channels (DMCs) using variable-length lossy source-channel codes with feedback. The reliability function (optimum error exponent) is shown to be equal to max{0, B(1 - R(D)/C)}, where R(D) is the rate-distortion function of the source, B is the maximum relative entropy between output distributions of the DMC, and C is the Shannon capacity of the channel. We show that, in this setting and in this asymptotic regime, separate source-channel coding is, in fact, optimal. Comment: Accepted to IEEE Transactions on Information Theory in Apr. 201
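
    To make the reliability function concrete, the following Python sketch evaluates max{0, B(1 - R(D)/C)} for one illustrative instance chosen here (not taken from the paper): a uniform binary source with Hamming distortion transmitted over a binary symmetric channel with crossover probability eps, for which C = 1 - h(eps), R(D) = 1 - h(D) and B = (1 - 2*eps) log2((1 - eps)/eps).

        import math

        def h2(p):
            """Binary entropy in bits."""
            return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

        def reliability_function(eps, D):
            """Error exponent max{0, B(1 - R(D)/C)} for a uniform binary source with
            Hamming distortion over a BSC(eps); all quantities in bits per channel use."""
            C = 1.0 - h2(eps)                               # BSC capacity
            B = (1 - 2 * eps) * math.log2((1 - eps) / eps)  # max relative entropy between BSC output laws
            R = 1.0 - h2(D)                                 # rate-distortion function for 0 <= D <= 1/2
            return max(0.0, B * (1.0 - R / C))

        # Example: a fairly noisy channel and a moderate distortion target.
        print(reliability_function(eps=0.1, D=0.2))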

    Joint source-channel coding with feedback

    This paper quantifies the fundamental limits of variable-length transmission of a general (possibly analog) source over a memoryless channel with noiseless feedback, under a distortion constraint. We consider excess distortion, average distortion and guaranteed distortion (d-semifaithful codes). In contrast to the asymptotic fundamental limit, a general conclusion is that allowing variable-length codes and feedback leads to a sizable improvement in the fundamental delay-distortion tradeoff. In addition, we investigate the minimum energy required to reproduce k source samples with a given fidelity after transmission over a memoryless Gaussian channel, and we show that the required minimum energy is reduced with feedback and an average (rather than maximal) power constraint. Comment: To appear in IEEE Transactions on Information Theory
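
    As a baseline for the delay-distortion and energy-fidelity trade-offs studied here, the classical asymptotic (fixed-length, no-feedback) benchmarks can be written down explicitly. The LaTeX sketch below states them under standard assumptions (k source samples, distortion level D, rate-distortion function R(D) in bits per sample, channel capacity C in bits per channel use, AWGN noise spectral density N_0); this is my framing of the well-known limits rather than a result quoted from the paper.

        % Asymptotic number of channel uses needed to reproduce k samples at distortion D:
        \[
          n \;\approx\; \frac{k\,R(D)}{C}.
        \]
        % Asymptotic minimum energy over the AWGN channel, using the minimum energy per
        % bit of N_0 \ln 2 (i.e. E_b/N_0 = -1.59 dB) in the wideband limit:
        \[
          E_{\min}(k, D) \;\approx\; k\,R(D)\,N_0 \ln 2.
        \]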

    A unary error correction code for the near-capacity joint source and channel coding of symbol values from an infinite set

    A novel Joint Source and Channel Code (JSCC) is proposed, which we refer to as the Unary Error Correction (UEC) code. Unlike existing JSCCs, our UEC facilitates the practical encoding of symbol values that are selected from a set having an infinite cardinality. Conventionally, these symbols are conveyed using Separate Source and Channel Codes (SSCCs), but we demonstrate that the residual redundancy retained after source coding results in a capacity loss, which is found to be 1.11 dB in a particular practical scenario. By contrast, the proposed UEC code can eliminate this capacity loss, or reduce it to an infinitesimally small value. Furthermore, the UEC code has only a moderate complexity, facilitating its employment in practical low-complexity applications.
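
    The unary mapping that gives the UEC code its name is shown below as a minimal Python sketch; only the source-coding layer is illustrated, under one common convention (a value x is written as x - 1 ones followed by a terminating zero), and the error-correcting trellis and iterative decoder of the UEC scheme itself are not reproduced here.

        def unary_encode(x):
            """Unary codeword for a positive integer symbol value x."""
            return "1" * (x - 1) + "0"

        def unary_decode(bits):
            """Greedy decoder: each run of ones up to the next zero yields one symbol."""
            symbols, run = [], 0
            for b in bits:
                if b == "1":
                    run += 1
                else:
                    symbols.append(run + 1)
                    run = 0
            return symbols

        # Any positive integer can be encoded, so the symbol set is effectively infinite.
        stream = "".join(unary_encode(x) for x in (3, 1, 4, 1, 5))
        assert unary_decode(stream) == [3, 1, 4, 1, 5]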