4,576 research outputs found

    A Note on the Deletion Channel Capacity

    Memoryless channels with deletion errors, as defined by a stochastic channel matrix allowing for bit drop-outs, are considered, in which transmitted bits are either independently deleted with probability $d$ or transmitted unchanged with probability $1-d$. Such channels are information stable, hence their Shannon capacity exists. However, computation of the channel capacity is formidable, and only some upper and lower bounds on the capacity exist. In this paper, we first show a simple result that the parallel concatenation of two different independent deletion channels with deletion probabilities $d_1$ and $d_2$, in which every input bit is either transmitted over the first channel with probability $\lambda$ or over the second one with probability $1-\lambda$, is nothing but another deletion channel with deletion probability $d = \lambda d_1 + (1-\lambda) d_2$. We then provide an upper bound on the concatenated deletion channel capacity $C(d)$ in terms of the weighted average of $C(d_1)$, $C(d_2)$ and the parameters of the three channels. An interesting consequence of this bound is that $C(\lambda d_1 + (1-\lambda)) \leq \lambda C(d_1)$, which enables us to provide an improved upper bound on the capacity of the i.i.d. deletion channel, i.e., $C(d) \leq 0.4143(1-d)$ for $d \geq 0.65$. This generalizes the asymptotic result by Dalai as it remains valid for all $d \geq 0.65$. Using the same approach, we are also able to improve upon existing upper bounds on the capacity of the deletion/substitution channel.
    Comment: Submitted to the IEEE Transactions on Information Theory
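    The step from the concatenation bound to the stated constant can be made explicit. The following is a sketch only, assuming for illustration a numerical upper bound $C(0.65) \leq 0.145$ (a value consistent with the stated constant, since $0.145/0.35 \approx 0.4143$; the exact bound the authors plug in is not quoted in the abstract):

```latex
% Sketch: obtaining C(d) <= 0.4143(1-d) for d >= 0.65 from
% C(\lambda d_1 + (1-\lambda)) <= \lambda C(d_1).
% Assumption (illustrative): C(0.65) <= 0.145.
\begin{align*}
  &\text{Fix } d_1 = 0.65,\ d_2 = 1,\ \text{and for } d \ge 0.65 \text{ set }
   \lambda = \tfrac{1-d}{0.35}, \text{ so that } \lambda d_1 + (1-\lambda) d_2 = d.\\
  &\text{Then } C(d) = C\!\left(\lambda\, 0.65 + (1-\lambda)\right)
   \le \lambda\, C(0.65) \le \tfrac{1-d}{0.35}\cdot 0.145 \approx 0.4143\,(1-d).
\end{align*}
```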

    Write Channel Model for Bit-Patterned Media Recording

    We propose a new write channel model for bit-patterned media recording that reflects the data dependence of write synchronization errors. It is shown that this model accommodates both substitution-like errors and insertion-deletion errors whose statistics are determined by an underlying channel state process. We study information-theoretic properties of the write channel model, including the capacity, symmetric information rate, Markov-1 rate and the zero-error capacity.
    Comment: 11 pages, 12 figures, journal
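    The abstract does not spell out the channel law, so the following is only a toy sketch of a state-dependent write channel in which a two-state Markov "synchronization" process drives substitutions, insertions and deletions; all probabilities and the data-dependence rule below are illustrative placeholders, not the paper's parameters:

```python
import random

def toy_write_channel(bits, p_stay=0.9, p_sub=0.05, seed=None):
    """Toy state-dependent write channel (illustrative only).

    A two-state Markov process ('sync'/'drift') stands in for write
    synchronization. In 'sync', a bit is written correctly except for
    occasional substitutions; in 'drift', the write clock is misaligned,
    so the bit may be deleted or an extra random bit inserted after it.
    """
    rng = random.Random(seed)
    state = "sync"
    out = []
    for b in bits:
        if state == "sync":
            out.append(b ^ 1 if rng.random() < p_sub else b)   # substitution-like error
        elif rng.random() < 0.5:
            pass                                               # deletion
        else:
            out.extend([b, rng.randint(0, 1)])                 # insertion
        # Crude stand-in for data dependence: losing synchronization is
        # assumed slightly more likely after writing a 1.
        p_flip = (1 - p_stay) * (1.5 if b == 1 else 1.0)
        if rng.random() < p_flip:
            state = "drift" if state == "sync" else "sync"
    return out

print(toy_write_channel([0, 1, 1, 0, 1, 0, 0, 1], seed=1))
```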

    Efficiently Decodable Codes for the Binary Deletion Channel

    In the random deletion channel, each bit is deleted independently with probability p. For the random deletion channel, the existence of codes of rate (1-p)/9, and thus bounded away from 0 for any p < 1, is known.
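    For orientation, the random deletion channel itself is straightforward to simulate; the sketch below simply applies i.i.d. deletions with probability p (it is not the paper's code construction):

```python
import random

def deletion_channel(bits, p, seed=None):
    """Binary deletion channel: each bit is deleted independently with
    probability p; surviving bits keep their original order."""
    rng = random.Random(seed)
    return [b for b in bits if rng.random() >= p]

# A rate-(1-p)/9 code on a block of length n carries about n*(1-p)/9
# information bits; here we only check that roughly n*(1-p) bits survive.
n, p = 10_000, 0.3
x = [random.randint(0, 1) for _ in range(n)]
y = deletion_channel(x, p, seed=7)
print(len(y) / n)  # close to 1 - p = 0.7
```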

    Maximum Likelihood Upper Bounds on the Capacities of Discrete Information Stable Channels

    Motivated by generating information stable processes greedily, we prove a universal maximum likelihood (ML) upper bound on the capacities of discrete information stable channels. The bound is derived by leveraging a system of equations obtained via the Karush-Kuhn-Tucker (KKT) conditions. Intriguingly, for some discrete memoryless channels (DMCs), for instance the binary erasure channel (BEC) and the binary symmetric channel (BSC), the associated upper bounds are tight and equal to their capacities. Furthermore, for discrete channels with memory, as a particular example, we apply the ML bound to the binary deletion channel (BDC). The derived upper bound is a sum-max function related to counting the number of ways that a length-m binary subsequence can be obtained by deleting n – m bits (with n – m close to nd, where d denotes the deletion probability) of a length-n binary sequence. A full version of this paper is accessible at [1].
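    The counting quantity in the last sentence, the number of distinct ways a length-m binary string can be obtained by deleting n – m bits of a length-n binary string, is the number of embeddings of the shorter string as a subsequence of the longer one; a standard dynamic program computes it (this is only the textbook recurrence, not the paper's bound):

```python
def num_subsequence_embeddings(x, y):
    """Number of ways y arises from x by deleting len(x) - len(y) symbols,
    i.e. the number of occurrences of y as a (non-contiguous) subsequence of x."""
    m = len(y)
    # dp[j] = number of ways to embed y[:j] into the prefix of x processed so far
    dp = [0] * (m + 1)
    dp[0] = 1
    for c in x:
        # sweep j downwards so each position of x is used at most once per embedding
        for j in range(m, 0, -1):
            if c == y[j - 1]:
                dp[j] += dp[j - 1]
    return dp[m]

print(num_subsequence_embeddings("10101", "101"))  # 4 embeddings
```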