A Note on the Deletion Channel Capacity
Memoryless channels with deletion errors, as defined by a stochastic channel
matrix allowing for bit drop-outs, are considered, in which transmitted bits are
either independently deleted with probability $d$ or transmitted unchanged with
probability $1-d$. Such channels are information stable, hence their Shannon
capacity exists. However, computation of the channel capacity is formidable, and
only some upper and lower bounds on the capacity are known. In this paper, we
first show a simple result: the parallel concatenation of two different
independent deletion channels with deletion probabilities $d_1$ and $d_2$, in
which every input bit is transmitted either over the first channel with
probability $\lambda$ or over the second one with probability $1-\lambda$, is
nothing but another deletion channel with deletion probability
$d = \lambda d_1 + (1-\lambda) d_2$. We then provide an upper bound on the
concatenated deletion channel capacity $C(d)$ in terms of the weighted average
of $C(d_1)$, $C(d_2)$ and the parameters of the three channels. An interesting
consequence of this bound is that
$C(\lambda d_1 + (1-\lambda)) \le \lambda C(d_1)$, which enables us to provide
an improved upper bound on the capacity of the i.i.d. deletion channel, i.e.,
$C(d) \le 0.4143(1-d)$ for $d \ge 0.65$. This generalizes the asymptotic result
by Dalai, as it remains valid for all $d \ge 0.65$. Using the same approach, we
are also able to improve upon existing upper bounds on the capacity of the
deletion/substitution channel.

Comment: Submitted to the IEEE Transactions on Information Theory
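The concatenation identity above is easy to check numerically. The following is a minimal Monte Carlo sketch (the helper names are illustrative, not from the paper): each bit is routed to one of two i.i.d. deletion channels, and the empirical deletion rate is compared against $\lambda d_1 + (1-\lambda) d_2$.

```python
import random

def deletion_channel(bits, d, rng):
    """Pass bits through an i.i.d. deletion channel: each bit is dropped w.p. d."""
    return [b for b in bits if rng.random() >= d]

def parallel_concatenation(bits, d1, d2, lam, rng):
    """Send each bit over channel 1 (deletion prob d1) w.p. lam,
    otherwise over channel 2 (deletion prob d2)."""
    out = []
    for b in bits:
        d = d1 if rng.random() < lam else d2
        if rng.random() >= d:   # the chosen channel keeps the bit w.p. 1 - d
            out.append(b)
    return out

rng = random.Random(0)
n, d1, d2, lam = 200_000, 0.2, 0.5, 0.3
bits = [rng.randint(0, 1) for _ in range(n)]
received = parallel_concatenation(bits, d1, d2, lam, rng)

empirical_d = 1 - len(received) / n
predicted_d = lam * d1 + (1 - lam) * d2   # 0.3*0.2 + 0.7*0.5 = 0.41
```

With these parameters the empirical deletion rate concentrates tightly around the predicted 0.41, consistent with the claim that the mixture is itself a deletion channel with the averaged deletion probability.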
Write Channel Model for Bit-Patterned Media Recording
We propose a new write channel model for bit-patterned media recording that
reflects the data dependence of write synchronization errors. It is shown that
this model accommodates both substitution-like errors and insertion-deletion
errors whose statistics are determined by an underlying channel state process.
We study information theoretic properties of the write channel model, including
the capacity, symmetric information rate, Markov-1 rate and the zero-error
capacity.

Comment: 11 pages, 12 figures, journal
Efficiently Decodable Codes for the Binary Deletion Channel
In the random deletion channel, each bit is deleted independently with probability $p$. For this channel, the existence of codes of rate $(1-p)/9$, and thus of rate bounded away from 0 for any $p < 1$, is known.
Maximum Likelihood Upper Bounds on the Capacities of Discrete Information Stable Channels
Motivated by generating information stable processes greedily, we prove a universal maximum likelihood (ML) upper bound on the capacities of discrete information stable channels. The bound is derived by leveraging a system of equations obtained via the Karush-Kuhn-Tucker (KKT) conditions. Intriguingly, for some discrete memoryless channels (DMCs), for instance the BEC and BSC, the associated upper bounds are tight and equal to their capacities. Furthermore, for discrete channels with memory, as a particular example, we apply the ML bound to the BDC. The derived upper bound is a sum-max function related to counting the number of possible ways that a length-$m$ binary subsequence can be obtained by deleting $n-m$ bits (with $n-m$ close to $nd$, where $d$ denotes the deletion probability) of a length-$n$ binary sequence. A full version of this paper is accessible at [1].
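The counting quantity in this bound, the number of ways a length-$m$ subsequence can be obtained by deleting $n-m$ bits of a length-$n$ sequence, is the number of embeddings of the short string in the long one. It can be computed with a standard dynamic program; this is a sketch (the name `embedding_count` is illustrative, not from the paper):

```python
def embedding_count(x, y):
    """Number of ways y can be obtained from x by deleting len(x) - len(y)
    symbols, i.e., the number of index sets at which y occurs in x as a
    subsequence."""
    m = len(y)
    dp = [0] * (m + 1)   # dp[j] = embeddings of y[:j] in the prefix of x seen so far
    dp[0] = 1            # the empty string embeds in exactly one way
    for c in x:
        # iterate j backwards so each occurrence of c extends each embedding once
        for j in range(m, 0, -1):
            if y[j - 1] == c:
                dp[j] += dp[j - 1]
    return dp[m]

# "11" occurs as a subsequence of "1011" at index sets {0,2}, {0,3}, {2,3}
print(embedding_count("1011", "11"))  # 3
```

The DP runs in $O(nm)$ time, whereas enumerating all $\binom{n}{n-m}$ deletion patterns directly is exponential.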