Generalizations of Fano's Inequality for Conditional Information Measures via Majorization Theory
Fano's inequality is one of the most elementary, ubiquitous, and important
tools in information theory. Using majorization theory, Fano's inequality is
generalized to a broad class of information measures, which contains those of
Shannon and R\'{e}nyi. When specialized to these measures, it recovers and
generalizes the classical inequalities. Key to the derivation is the
construction of an appropriate conditional distribution inducing a desired
marginal distribution on a countably infinite alphabet. The construction is
based on the infinite-dimensional version of Birkhoff's theorem proven by
R\'{e}v\'{e}sz [Acta Math. Hungar. 1962, 3, 188{\textendash}198], and the
constraint of maintaining a desired marginal distribution is similar to
coupling in probability theory. Using our Fano-type inequalities for Shannon's
and R\'{e}nyi's information measures, we also investigate the asymptotic
behavior of the sequence of Shannon's and R\'{e}nyi's equivocations when the
error probabilities vanish. This asymptotic behavior provides a novel
characterization of the asymptotic equipartition property (AEP) via Fano's
inequality.
Comment: 44 pages, 3 figures
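The classical Shannon form that the generalization recovers can be stated as follows (a standard statement for a finite alphabet; the notation here is not taken from the article itself):

```latex
% Fano's inequality for an estimate \hat{X} of X from Y,
% with error probability P_e = \Pr[\hat{X} \neq X] and
% binary entropy h_b(p) = -p \log p - (1-p) \log (1-p):
H(X \mid Y) \le h_b(P_e) + P_e \log\bigl(|\mathcal{X}| - 1\bigr)
```

The article's contribution is to extend bounds of this type beyond Shannon entropy, to Rényi measures and to countably infinite alphabets.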
Parallel data compression
Data compression schemes remove data redundancy in communicated and stored data and increase the effective capacities of communication and storage devices. Parallel algorithms and implementations for textual data compression are surveyed. Related concepts from parallel computation and information theory are briefly discussed. Static and dynamic methods for codeword construction and transmission on various models of parallel computation are described. Included are parallel methods which boost system speed by coding data concurrently, and approaches which employ multiple compression techniques to improve compression ratios. Theoretical and empirical comparisons are reported and areas for future research are suggested.
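The "coding data concurrently" idea can be sketched as follows. This is a minimal illustration, not any specific scheme from the survey: the input is split into chunks that are compressed in parallel, trading some compression ratio (cross-chunk redundancy is lost) for speed. The function names are made up for this sketch.

```python
import zlib
from concurrent.futures import ThreadPoolExecutor


def parallel_compress(data: bytes, n_chunks: int = 4) -> list[bytes]:
    """Split the input into chunks and compress them concurrently.

    zlib releases the GIL during compression, so threads give real
    parallelism here; each chunk is an independent zlib stream.
    """
    size = -(-len(data) // n_chunks)  # ceiling division
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=n_chunks) as pool:
        return list(pool.map(zlib.compress, chunks))


def parallel_decompress(blocks: list[bytes]) -> bytes:
    """Decompress the independently coded chunks and reassemble them."""
    return b"".join(zlib.decompress(block) for block in blocks)
```

Because each chunk is coded independently, decompression can also proceed chunk-by-chunk in parallel.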
Error bounds for parallel communication channels
Using data compression and randomization to build an unconditionally secure short key cipher
We consider the problem of constructing an unconditionally secure cipher for the case when the key length is less than the length of the encrypted message. (Unconditional security means that a computationally unbounded adversary cannot obtain information about the encrypted message without the key.)
In this article, we propose data compression and randomization techniques combined with entropically-secure encryption. The resulting cipher can be used for encryption in such a way that the key length does not depend on the entropy or the length of the encrypted message; instead, it is determined by the required security level.
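One ingredient of the approach, compressing before encrypting so that less key material is consumed, can be illustrated with a toy compress-then-pad scheme. This sketch is not the article's cipher (it omits the randomization and entropically-secure encryption steps, and its key still scales with the compressed length rather than with a security level); it only shows the compression step at work. All names are invented for the illustration.

```python
import os
import zlib


def compress_then_pad_encrypt(message: bytes) -> tuple[bytes, bytes]:
    """Toy scheme: compress, then one-time-pad the compressed text.

    A one-time pad needs key material as long as the plaintext, so
    compressing first shrinks the required key. NOT the article's
    entropically-secure construction; illustration only.
    """
    compressed = zlib.compress(message)
    key = os.urandom(len(compressed))  # pad as long as the compressed text
    ciphertext = bytes(c ^ k for c, k in zip(compressed, key))
    return ciphertext, key


def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    """Undo the pad, then decompress to recover the message."""
    return zlib.decompress(bytes(c ^ k for c, k in zip(ciphertext, key)))
```

For redundant messages the key is much shorter than the message, which is the effect the article's compression step exploits.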
Secure, reliable, and efficient communication over the wiretap channel
Secure wireless communication between devices is essential for modern communication systems. Physical-layer security over the wiretap channel may provide an additional level of secrecy beyond the current cryptographic approaches. Given a sender Alice, a legitimate receiver Bob, and a malicious eavesdropper Eve, the wiretap channel occurs when Eve experiences a worse signal-to-noise ratio than Bob. Previous studies of the wiretap channel have tended to make assumptions that ignore the reality of wireless communication. This thesis presents a study of short block length codes with the aim of both reliability for Bob and confusion for Eve. The standard approach to wiretap coding is shown to be very inefficient for reliability. Quantifying Eve's confusion in terms of entropy is not solved in many cases, though it is possible for codes with a moderate complexity trellis representation. Using error rate arguments, error correcting codes with steep performance curves turn out to be desirable both for reliability and confusion.
Master's thesis in informatics (INF399, MAMN-INF, MAMN-PRO)
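The setting in which Eve's channel is worse than Bob's is the degraded wiretap channel, whose secrecy capacity (the largest rate at which Alice can reach Bob reliably while Eve learns asymptotically nothing) has the well-known form due to Wyner; the notation below is standard rather than taken from the thesis:

```latex
% X: Alice's input, Y: Bob's output, Z: Eve's output.
% Secrecy capacity of the degraded wiretap channel:
C_s = \max_{p(x)} \bigl[\, I(X;Y) - I(X;Z) \,\bigr]
```

Short block length codes, the subject of the thesis, operate far from this asymptotic limit, which is why both reliability and equivocation must be assessed directly.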