
    Generalizations of Fano's Inequality for Conditional Information Measures via Majorization Theory

    Fano's inequality is one of the most elementary, ubiquitous, and important tools in information theory. Using majorization theory, Fano's inequality is generalized to a broad class of information measures, which contains those of Shannon and Rényi. When specialized to these measures, it recovers and generalizes the classical inequalities. Key to the derivation is the construction of an appropriate conditional distribution inducing a desired marginal distribution on a countably infinite alphabet. The construction is based on the infinite-dimensional version of Birkhoff's theorem proven by Révész [Acta Math. Hungar. 1962, 3, 188–198], and the constraint of maintaining a desired marginal distribution is similar to coupling in probability theory. Using our Fano-type inequalities for Shannon's and Rényi's information measures, we also investigate the asymptotic behavior of the sequence of Shannon's and Rényi's equivocations as the error probabilities vanish. This asymptotic behavior provides a novel characterization of the asymptotic equipartition property (AEP) via Fano's inequality. Comment: 44 pages, 3 figures
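    As a concrete reference point, the classical Shannon-measure form of Fano's inequality that this paper generalizes can be evaluated directly. The sketch below is a minimal illustration of the standard bound H(X|Y) ≤ h(Pe) + Pe·log2(|X|−1), not of the paper's majorization-based generalization; the function name and parameters are illustrative.

    ```python
    import math

    def fano_bound(p_e, alphabet_size):
        """Classical Fano upper bound on the equivocation H(X|Y), in bits:
        H(X|Y) <= h(p_e) + p_e * log2(|X| - 1),
        where h is the binary entropy function and p_e the error probability."""
        if p_e in (0.0, 1.0):
            h = 0.0  # binary entropy vanishes at the endpoints
        else:
            h = -p_e * math.log2(p_e) - (1 - p_e) * math.log2(1 - p_e)
        return h + p_e * math.log2(alphabet_size - 1)

    # As p_e -> 0 the bound vanishes, which forces H(X|Y) -> 0 -- the
    # asymptotic regime the abstract studies for vanishing error probabilities.
    print(fano_bound(0.01, 16))
    ```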

    Operational interpretation of Rényi information measures via composite hypothesis testing against product and Markov distributions

    We revisit the problem of asymmetric binary hypothesis testing against a composite alternative hypothesis. We introduce a general framework to treat such problems when the alternative hypothesis adheres to certain axioms. In this case, we find the threshold rate, the optimal error and strong converse exponents (at large deviations from the threshold), and the second-order asymptotics (at small deviations from the threshold). We apply our results to find the operational interpretations of various Rényi information measures. When the alternative hypothesis consists of bipartite product distributions, we find that the optimal error and strong converse exponents are determined by variations of Rényi mutual information. When the alternative hypothesis consists of tripartite distributions satisfying the Markov property, we find that the optimal exponents are determined by variations of Rényi conditional mutual information. In either case, the relevant notion of Rényi mutual information depends on the precise choice of the alternative hypothesis. As such, this paper also strengthens the view that different definitions of Rényi mutual information, conditional entropy, and conditional mutual information are adequate depending on the context in which the measures are used.
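    The exponent characterizations described above are built on Rényi divergences. As a minimal point of reference, the standard Rényi divergence of order α between two finite distributions can be computed as follows; this is the textbook definition, not the paper's composite-hypothesis-testing framework, and the function name is illustrative.

    ```python
    import math

    def renyi_divergence(p, q, alpha):
        """Rényi divergence of order alpha (alpha > 0, alpha != 1), in bits:
        D_alpha(P||Q) = (1 / (alpha - 1)) * log2( sum_x p(x)^alpha * q(x)^(1 - alpha) ).
        p and q are probability vectors over the same finite alphabet."""
        assert alpha > 0 and alpha != 1
        s = sum(pi ** alpha * qi ** (1 - alpha)
                for pi, qi in zip(p, q) if pi > 0)
        return math.log2(s) / (alpha - 1)

    # The divergence is zero iff the distributions coincide,
    # and strictly positive otherwise.
    print(renyi_divergence([0.7, 0.3], [0.5, 0.5], 2.0))
    ```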

    Generalized Entropies and Metric-Invariant Optimal Countermeasures for Information Leakage Under Symmetric Constraints

    We introduce a novel generalization of entropy and conditional entropy from which most definitions in the literature can be derived as particular cases. Within this general framework, we investigate the problem of designing countermeasures for information leakage. In particular, we seek metric-invariant solutions, i.e., solutions that are robust against the choice of entropy used to quantify the leakage. The problem can be modelled as an information channel from the system to an adversary, and the countermeasures can be seen as modifying this channel in order to minimise the amount of information that the outputs reveal about the inputs. Our main result fully solves the problem under the highly symmetric design constraint that the number of inputs that can produce the same output is capped. Our proof is constructive, and the optimal channels and the minimum leakage are derived in closed form. Comment: Accepted to IEEE Transactions on Information Theory, in November 201
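    One standard instance of the channel-based leakage measures this paper generalizes is min-entropy leakage. Under a uniform prior it reduces to the log of the sum of the column maxima of the row-stochastic channel matrix. The sketch below illustrates only that classical special case, not the paper's unified entropy framework or its optimal countermeasures; names are illustrative.

    ```python
    import math

    def min_entropy_leakage(channel):
        """Min-entropy leakage (in bits) of a row-stochastic channel matrix
        (rows = inputs, columns = outputs) under a uniform input prior:
        leakage = log2( sum over outputs of max over inputs of C[x][y] )."""
        n_outputs = len(channel[0])
        col_max_sum = sum(max(row[y] for row in channel)
                          for y in range(n_outputs))
        return math.log2(col_max_sum)

    # A noiseless channel over two inputs leaks the full 1 bit;
    # adding symmetric noise shrinks the leakage.
    identity = [[1.0, 0.0], [0.0, 1.0]]
    noisy = [[0.75, 0.25], [0.25, 0.75]]
    print(min_entropy_leakage(identity), min_entropy_leakage(noisy))
    ```

    A countermeasure in the paper's sense modifies the channel matrix so that this quantity (and its analogues under other entropies) is minimised subject to the design constraints.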

    Equivocations, Exponents, and Second-Order Coding Rates Under Various Rényi Information Measures
