Infinite-alphabet channels and the method of codes of a fixed composition
A proof of the strong converse of the coding theorem is presented for stationary memoryless infinite-alphabet channels that satisfy a certain assumption on finite coverings. The proof indicates the extent to which the method of fixed-composition codes can be used for infinite-alphabet channels. The assumption required for the proof of the strong converse (though not the most general one; compare Augustin [1]) is of a technical nature and is satisfied in all cases of practical interest.
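For orientation, the strong converse can be stated as follows for a
memoryless channel with capacity C (a standard textbook formulation under
the usual assumptions, not the paper's exact statement): for any sequence
of codes of blocklength n with at least e^{nR} codewords and rate R > C,
\[
\lim_{n \to \infty} P_e^{(n)} = 1,
\]
that is, above capacity the error probability tends to one, whereas the
weak converse only forces it to stay bounded away from zero.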
Elias Bound for General Distances and Stable Sets in Edge-Weighted Graphs
This paper presents an extension of the Elias bound on the minimum distance
of codes for discrete alphabets with general, possibly infinite-valued,
distances. The bound is obtained by combining a previous extension of the Elias
bound, introduced by Blahut, with an extension of a bound previously introduced
by the author which builds upon ideas of Gallager, Lov\'asz and Marton. The
result can in fact be interpreted as a unification of the Elias bound and of
Lov\'asz's bound on graph (or zero-error) capacity, both being recovered as
particular cases of the one presented here. Previous extensions of the Elias
bound by Berlekamp, Blahut and Piret are shown to be included as particular
cases of our bound. Applications to the reliability function are then
discussed. Comment: Accepted, IEEE Transactions on Information Theory
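For comparison, the classical binary-Hamming instance of the Elias bound
that this work generalizes reads, in its standard asymptotic form (with
notation assumed here rather than taken from the paper):
\[
R(\delta) \;\le\; 1 - h\!\left(\frac{1 - \sqrt{1 - 2\delta}}{2}\right),
\qquad 0 \le \delta \le \frac{1}{2},
\]
where R(\delta) is the largest achievable rate at relative minimum
distance \delta and h is the binary entropy function.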
Generalized List Decoding
This paper concerns itself with the question of list decoding for general
adversarial channels, e.g., bit-flip (XOR) channels, erasure channels,
AND (Z-) channels, OR channels, real adder channels, noisy typewriter
channels, etc. We precisely characterize when exponential-sized (or
positive rate) $(L-1)$-list decodable codes (where the list size $L$ is a
universal constant) exist for such channels. Our criterion asserts that:
"For any given general adversarial channel, it is possible to construct
positive rate $(L-1)$-list decodable codes if and only if the set of
completely positive tensors of order-$L$ with admissible marginals is not
entirely contained in the order-$L$ confusability set associated to the
channel."
The sufficiency is shown via random code construction (combined with
expurgation or time-sharing). The necessity is shown by
1. extracting equicoupled subcodes (a generalization of equidistant codes)
from any large code sequence using the hypergraph Ramsey theorem, and
2. significantly extending the classic Plotkin bound in coding theory
(recalled below) to list decoding for general channels using duality
between the completely positive tensor cone and the copositive tensor
cone. In the proof, we also obtain a new fact regarding asymmetry of
joint distributions, which may be of independent
interest.
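To anchor item 2, recall the classic Plotkin bound being extended, in its
textbook binary form (a standard statement, not the paper's generalized
version): any binary code $\mathcal{C}$ of length $n$ and minimum Hamming
distance $d > n/2$ satisfies
\[
|\mathcal{C}| \;\le\; 2\left\lfloor \frac{d}{2d - n} \right\rfloor,
\]
so codes of positive rate cannot exist once the relative distance exceeds
the Plotkin point $\delta = 1/2$; the duality argument above identifies
the analogous thresholds for list decoding over general channels.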
Other results include
1. List decoding capacity with asymptotically large $L$ for general
adversarial channels;
2. A tight list size bound for most constant composition codes
(generalization of constant weight codes);
3. Rederivation and demystification of Blinovsky's [Bli86] characterization
of the list decoding Plotkin points (threshold at which large codes are
impossible);
4. Evaluation of general bounds ([WBBJ]) for unique decoding in the error
correction code setting.
Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities
This monograph presents a unified treatment of single- and multi-user
problems in Shannon's information theory where we depart from the requirement
that the error probability decays asymptotically in the blocklength. Instead,
the error probabilities for various problems are bounded above by a
non-vanishing constant and the spotlight is shone on achievable coding rates as
functions of the growing blocklengths. This represents the study of asymptotic
estimates with non-vanishing error probabilities.
In Part I, after reviewing the fundamentals of information theory, we discuss
Strassen's seminal result for binary hypothesis testing where the type-I error
probability is non-vanishing and the rate of decay of the type-II error
probability with a growing number of independent observations is characterized.
In Part II, we use this basic hypothesis testing result to develop second-
and sometimes even third-order asymptotic expansions for point-to-point
communication. In Part III, we consider network information theory
problems for which the second-order asymptotics are known. These problems
include some classes of channels with random state, the multiple-encoder
distributed lossless source coding (Slepian-Wolf) problem and special cases of
the Gaussian interference and multiple-access channels. Finally, we discuss
avenues for further research. Comment: Further comments welcome
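As a concrete instance of the expansions developed in Part II, the normal
approximation for a discrete memoryless channel takes the following
standard form (the notation M^*, C, V, Q^{-1} is the usual one and is
assumed here, not quoted from the monograph):
\[
\log M^*(n, \varepsilon) \;=\; nC - \sqrt{nV}\, Q^{-1}(\varepsilon) + O(\log n),
\]
where M^*(n, \varepsilon) is the maximum code size at blocklength n and
error probability \varepsilon, C is the capacity, V the channel
dispersion, and Q^{-1} the inverse complementary Gaussian CDF.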
Joint source-channel coding with feedback
This paper quantifies the fundamental limits of variable-length transmission
of a general (possibly analog) source over a memoryless channel with noiseless
feedback, under a distortion constraint. We consider excess distortion, average
distortion and guaranteed distortion ($d$-semifaithful codes). In contrast to
the asymptotic fundamental limit, a general conclusion is that allowing
variable-length codes and feedback leads to a sizable improvement in the
fundamental delay-distortion tradeoff. In addition, we investigate the minimum
energy required to reproduce source samples with a given fidelity after
transmission over a memoryless Gaussian channel, and we show that the required
minimum energy is reduced with feedback and an average (rather than maximal)
power constraint. Comment: To appear in IEEE Transactions on Information Theory
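For context, the asymptotic fundamental limit referred to above is the
classical source-channel separation result, stated here in a standard
form with assumed notation (not the paper's): to convey k source samples
within distortion d over a memoryless channel of capacity C, the minimum
number of channel uses n^*(k, d) obeys
\[
\lim_{k \to \infty} \frac{n^*(k, d)}{k} \;=\; \frac{R(d)}{C},
\]
with R(d) the rate-distortion function of the source; the paper's point
is that feedback and variable-length coding do not alter this limit but
substantially sharpen the non-asymptotic delay-distortion tradeoff.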