
    The adversarial joint source-channel problem

    This paper introduces the problem of joint source-channel coding in the setting where channel errors are adversarial and distortion is measured in the worst case. Unlike the stochastic source-channel model, the separation principle does not hold in the adversarial setting. This surprising observation shows that good distortion-correcting codes cannot be designed by serially concatenating good covering codes with good error-correcting codes. The problem of joint code design is addressed and some initial results are offered.
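    To make the worst-case setup concrete, here is a toy brute-force sketch (my own illustration, not from the paper; the encoder, decoder, and parameters are hypothetical) that computes the worst-case Hamming distortion of an encoder/decoder pair against an adversary who may flip up to t channel bits. It illustrates the evaluation criterion only, not the non-separation result itself.

```python
from itertools import combinations, product

def worst_case_distortion(encode, decode, k, n, t):
    """Worst Hamming distortion over all 2^k source blocks and all
    adversarial error patterns of weight at most t."""
    worst = 0
    for src in product([0, 1], repeat=k):          # every source block
        cw = encode(src)
        assert len(cw) == n
        for w in range(t + 1):                     # error weights 0..t
            for flips in combinations(range(n), w):
                rx = list(cw)
                for i in flips:
                    rx[i] ^= 1                     # adversary flips bit i
                est = decode(tuple(rx))
                worst = max(worst, sum(a != b for a, b in zip(src, est)))
    return worst

# Example: rate-1/3 repetition code with majority decoding survives t = 1.
encode = lambda s: s * 3                           # (b,) -> (b, b, b)
decode = lambda r: (int(sum(r) >= 2),)
print(worst_case_distortion(encode, decode, k=1, n=3, t=1))  # prints 0
```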

    Infomax Neural Joint Source-Channel Coding via Adversarial Bit Flip

    Although Shannon theory states that it is asymptotically optimal to separate source and channel coding into two independent processes, in many practical communication scenarios this decomposition is limited by finite bit lengths and the computational power available for decoding. Recently, neural joint source-channel coding (NECST) was proposed to sidestep this problem. While it leverages advances in amortized inference and deep learning to improve the encoding and decoding process, it cannot always achieve compelling compression and error-correction performance due to the limited robustness of its learned coding networks. In this paper, motivated by the inherent connections between neural joint source-channel coding and discrete representation learning, we propose a novel regularization method called Infomax Adversarial-Bit-Flip (IABF) to improve the stability and robustness of the neural joint source-channel coding scheme. More specifically, on the encoder side we explicitly maximize the mutual information between the codeword and the data, while on the decoder side the amortized reconstruction is regularized within an adversarial framework. Extensive experiments on various real-world datasets show that IABF achieves state-of-the-art performance on both compression and error-correction benchmarks and outperforms the baselines by a significant margin.
    Comment: AAAI 2020
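    Reading the abstract mechanically, a minimal PyTorch sketch of the two ingredients, an infomax term on the encoder and an adversarial bit flip on the decoder, might look as follows. This is an assumption-laden reconstruction, not the authors' code: IABFSketch, iabf_step, the straight-through binarization, the first-order flip search, and the entropy-based mutual-information surrogate are all illustrative choices.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def bin_entropy(p, eps=1e-6):
    """Elementwise entropy of Bernoulli(p), in nats."""
    p = p.clamp(eps, 1 - eps)
    return -(p * p.log() + (1 - p) * (1 - p).log())

class IABFSketch(nn.Module):
    def __init__(self, dim=784, code_bits=64):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(dim, 256), nn.ReLU(),
                                 nn.Linear(256, code_bits))
        self.dec = nn.Sequential(nn.Linear(code_bits, 256), nn.ReLU(),
                                 nn.Linear(256, dim))

    def forward(self, x):
        p = torch.sigmoid(self.enc(x))       # per-bit P(y_i = 1 | x)
        hard = (p > 0.5).float()
        y = hard + p - p.detach()            # straight-through binary code
        return p, y

def iabf_step(model, x, beta=0.1):
    p, y = model(x)

    # Clean reconstruction; gradients reach the encoder via straight-through.
    recon = F.mse_loss(model.dec(y), x)

    # Adversarial bit flip: a first-order search for the single bit whose
    # flip most increases the decoder's reconstruction loss.
    y_d = y.detach().clone().requires_grad_(True)
    g, = torch.autograd.grad(F.mse_loss(model.dec(y_d), x), y_d)
    gain = (g * (1 - 2 * y_d)).detach()      # predicted loss increase per flip
    idx = gain.argmax(dim=1)                 # most damaging bit per sample
    y_adv = y.detach().clone()
    rows = torch.arange(x.size(0))
    y_adv[rows, idx] = 1 - y_adv[rows, idx]  # apply the worst flip
    recon_adv = F.mse_loss(model.dec(y_adv), x)  # regularizes the decoder

    # Infomax surrogate: I(X;Y) = H(Y) - H(Y|X); with factorized bits,
    # H(Y) is approximated by the entropy of the batch-mean activations
    # (an assumption of this sketch, not necessarily the paper's estimator).
    infomax = bin_entropy(p.mean(0)).sum() - bin_entropy(p).sum(1).mean()

    return recon + recon_adv - beta * infomax

# Usage on a stand-in batch (random data, purely for shape-checking):
model = IABFSketch()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss = iabf_step(model, torch.rand(32, 784))
opt.zero_grad(); loss.backward(); opt.step()
```

    The adversarial term trains only the decoder (the flipped code is detached), which loosely mirrors the abstract's "amortized reconstruction regularized within an adversarial framework"; how the flips interact with the encoder in the actual method is specified in the paper, not here.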