6 research outputs found

    Distributed Source–Channel Coding Using Reduced-Complexity Syndrome-Based TTCM

    Quantum error correction protects quantum search algorithms against decoherence

    When quantum computing becomes a widespread commercial reality, Quantum Search Algorithms (QSAs), and especially Grover’s QSA, will inevitably be one of its main applications, constituting its cornerstone. Most of the literature assumes that the quantum circuits are free from decoherence. In practice, decoherence will remain unavoidable, just as the Gaussian noise imposed on classical circuits by the Brownian motion of electrons is, hence it may have to be mitigated. In this contribution, we investigate the effect of quantum noise on the performance of QSAs, in terms of their success probability as a function of the size of the database to be searched, when decoherence is modelled by the deleterious effects that depolarizing channels impose on the quantum gates. Moreover, we employ quantum error correction codes for limiting the effects of quantum noise and for correcting quantum flips. More specifically, we demonstrate that, when we search for a single solution in a database having 4096 entries using Grover’s QSA at an aggressive depolarizing probability of 10⁻³, the success probability of the search is 0.22 when no quantum coding is used, which is improved to 0.96 when Steane’s quantum error correction code is employed. Finally, apart from Steane’s code, the employment of Quantum Bose–Chaudhuri–Hocquenghem (QBCH) codes is also considered.
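    As a rough illustration of the mechanism described in this abstract (not the paper’s actual simulation), the sketch below computes Grover’s ideal success probability for a 4096-entry database and then degrades it with a crude per-gate depolarizing survival model; the `gates_per_iter` value and the fall-back to a uniformly random outcome are assumptions introduced purely for illustration.

    ```python
    import math

    def grover_ideal_success(N: int, M: int = 1):
        """Ideal (noiseless) Grover success probability and near-optimal iteration count."""
        theta = math.asin(math.sqrt(M / N))        # rotation angle per Grover iteration
        k = math.floor(math.pi / (4 * theta))      # near-optimal number of iterations
        return k, math.sin((2 * k + 1) * theta) ** 2

    def grover_depolarized_success(N: int, p_depol: float, gates_per_iter: int, M: int = 1):
        """Crude toy estimate: the state follows the ideal trajectory only if no
        depolarizing event hits any gate; otherwise assume a uniformly random outcome."""
        k, p_ideal = grover_ideal_success(N, M)
        fidelity = (1 - p_depol) ** (gates_per_iter * k)   # prob. of no depolarizing event
        return fidelity * p_ideal + (1 - fidelity) * (M / N)

    if __name__ == "__main__":
        k, p = grover_ideal_success(4096)
        print(f"ideal: {k} iterations, success probability {p:.4f}")
        # gates_per_iter = 30 is a placeholder; the true depth depends on the oracle circuit
        print(f"toy noisy estimate: {grover_depolarized_success(4096, 1e-3, 30):.4f}")
    ```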

    Fully-parallel quantum turbo decoder

    Quantum Turbo Codes (QTCs) are known to operate close to the achievable Hashing bound. However, the sequential nature of the conventional quantum turbo decoding algorithm imposes a high decoding latency, which increases linearly with the frame length. This poses a potential threat to quantum systems having short coherence times. In this context, we conceive a Fully-Parallel Quantum Turbo Decoder (FPQTD), which eliminates the inherent time dependencies of the conventional decoder by executing all the associated processes concurrently. Owing to its parallel nature, the proposed FPQTD reduces the decoding time by several orders of magnitude, while maintaining the same performance. We have also demonstrated the significance of employing an odd-even interleaver design in conjunction with the proposed FPQTD. More specifically, it is shown that an odd-even interleaver reduces the computational complexity by 50%, without compromising the achievable performance. A minimal sketch of the odd-even constraint is given below.
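    To make the odd-even constraint concrete, here is a minimal sketch of a randomly generated interleaver restricted so that even-indexed positions map to even-indexed positions and odd-indexed positions to odd-indexed ones; the random construction and the seed handling are illustrative assumptions, not the specific interleaver design proposed in the paper.

    ```python
    import random

    def odd_even_interleaver(length: int, seed: int = 0):
        """Random permutation with the odd-even property: even slots only receive
        even source indices, odd slots only receive odd source indices."""
        rng = random.Random(seed)
        even = [i for i in range(length) if i % 2 == 0]
        odd = [i for i in range(length) if i % 2 == 1]
        rng.shuffle(even)
        rng.shuffle(odd)
        pi = [0] * length
        pi[0::2] = even     # even output slots take permuted even indices
        pi[1::2] = odd      # odd output slots take permuted odd indices
        return pi

    def interleave(seq, pi):
        return [seq[p] for p in pi]

    pi = odd_even_interleaver(8)
    # every output slot keeps the parity of the source index it reads from
    assert all((slot % 2) == (src % 2) for slot, src in enumerate(pi))
    ```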

    The Road From Classical to Quantum Codes: A Hashing Bound Approaching Design Procedure

    Powerful Quantum Error Correction Codes (QECCs) are required for stabilizing and protecting fragile qubits against the undesirable effects of quantum decoherence. Similar to classical codes, hashing bound approaching QECCs may be designed by exploiting a concatenated code structure, which invokes iterative decoding. Therefore, in this paper we provide an extensive step-by-step tutorial for designing EXtrinsic Information Transfer (EXIT) chart aided concatenated quantum codes based on the underlying quantum-to-classical isomorphism. These design lessons are then exemplified in the context of our proposed Quantum Irregular Convolutional Code (QIRCC), which constitutes the outer component of a concatenated quantum code. The proposed QIRCC can be dynamically adapted to match any given inner code using EXIT charts, hence achieving a performance close to the hashing bound. It is demonstrated that our QIRCC-based optimized design is capable of operating within 0.4 dB of the noise limit.
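    To give a flavour of how an irregular outer code can be matched to a given inner code on an EXIT chart, the sketch below mixes two hypothetical component EXIT curves with weights summing to one and keeps the combined outer curve on one side of the inner curve so that a decoding tunnel remains open; the curves, the two-component restriction, the axis convention and the gap metric are all placeholder assumptions rather than the paper’s QIRCC design procedure.

    ```python
    import numpy as np

    # Hypothetical EXIT curves sampled on a common a-priori mutual-information grid.
    # In a real design these would come from Monte-Carlo EXIT analysis of each component.
    I_A = np.linspace(0.0, 1.0, 101)
    inner_curve = 0.5 + 0.5 * I_A**2            # placeholder inner-decoder curve
    subcode_curves = np.vstack([
        I_A**0.5,                                # placeholder weak, low-rate subcode
        0.2 + 0.8 * I_A**3,                      # placeholder strong, high-rate subcode
    ])

    def match_outer_curve(subcurves, target, step=0.01):
        """Brute-force search over mixing weights (summing to one) so that the weighted
        outer EXIT curve stays below the inner curve (open tunnel, simplified convention)
        while minimising the average gap between the two curves."""
        best = None
        for a0 in np.arange(0.0, 1.0 + step, step):
            alphas = np.array([a0, 1.0 - a0])
            outer = alphas @ subcurves
            if np.all(outer <= target):                      # tunnel must stay open
                gap = float(np.mean(target - outer))
                if best is None or gap < best[1]:
                    best = (alphas, gap)
        return best

    print(match_outer_curve(subcode_curves, inner_curve))
    ```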

    Reduced-complexity syndrome-based TTCM decoding

    The iterative decoder of Turbo Trellis Coded Modulation (TTCM) exchanges extrinsic information between the constituent TCM decoders, which imposes a high computational complexity at the receiver. Therefore, we conceive the syndrome-based block decoding of TTCM, which is capable of reducing the decoding complexity by disabling the decoder when the syndrome becomes zero. Quantitatively, we demonstrate that a decoding complexity reduction of at least 17% is attained at high SNRs, with at least 20% and 45% reductions in the 5th and 6th iterations, respectively.
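    A minimal sketch of the stopping idea mentioned above, under the assumption of a generic iterative decoder with soft outputs: after each iteration the hard-decided word is checked against a parity-check matrix `H`, and the constituent decoder is disabled once the syndrome vanishes. The `constituent_decoder` callable and the LLR bookkeeping are placeholders, not the syndrome-based TTCM block decoder itself.

    ```python
    import numpy as np

    def syndrome_is_zero(H: np.ndarray, llrs: np.ndarray) -> bool:
        """Hard-decide the current LLRs and check whether the syndrome H @ c vanishes."""
        hard = (llrs < 0).astype(int)          # map LLR sign to bit estimate
        return not np.any(H @ hard % 2)

    def iterative_decode(H, channel_llrs, constituent_decoder, max_iters=6):
        """Generic iterative decoding loop with syndrome-based early termination:
        the constituent decoder is skipped as soon as the syndrome becomes zero."""
        extrinsic = np.zeros_like(channel_llrs)
        for it in range(max_iters):
            extrinsic = constituent_decoder(channel_llrs, extrinsic)
            if syndrome_is_zero(H, channel_llrs + extrinsic):
                return channel_llrs + extrinsic, it + 1    # decoder disabled early
        return channel_llrs + extrinsic, max_iters
    ```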