
    A Cross Entropy Interpretation of Rényi Entropy for $\alpha$-leakage

    This paper proposes an $\alpha$-leakage measure for $\alpha \in [0,\infty)$ via a cross entropy interpretation of Rényi entropy. While Rényi entropy was originally defined as an $f$-mean for $f(t) = \exp((1-\alpha)t)$, we reveal that it is also an $\tilde{f}$-mean cross entropy measure for $\tilde{f}(t) = \exp(\frac{1-\alpha}{\alpha}t)$. Minimizing this Rényi cross entropy yields Rényi entropy, from which the prior and posterior uncertainty measures are defined, corresponding to the adversary's knowledge of the sensitive attribute before and after data release, respectively. The $\alpha$-leakage is proposed as the difference between the $\tilde{f}$-mean prior and posterior uncertainty measures, which is exactly the Arimoto mutual information. This not only extends the existing $\alpha$-leakage from $\alpha \in [1,\infty)$ to the full Rényi order range $\alpha \in [0,\infty)$ in a well-founded way, with $\alpha = 0$ corresponding to nonstochastic leakage, but also reveals that the existing maximal leakage is an $\tilde{f}$-mean of an elementary $\alpha$-leakage over all $\alpha \in [0,\infty)$, which generalizes the existing pointwise maximal leakage. Comment: 7 pages; 1 figure
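
    The code below is not from the paper; it is a minimal numerical sketch, assuming finite alphabets and natural logarithms, of the standard quantities the abstract builds on: the Rényi entropy, the Arimoto conditional Rényi entropy, and the Arimoto mutual information that the abstract identifies with the $\alpha$-leakage. The toy binary-symmetric-channel joint distribution is an illustrative assumption.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(P) = log(sum_x p(x)^alpha) / (1 - alpha), natural log."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):  # Shannon limit
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def arimoto_conditional_entropy(pxy, alpha):
    """Arimoto conditional Rényi entropy
    H_alpha^A(X|Y) = (alpha/(1-alpha)) * log( sum_y ( sum_x p(x,y)^alpha )^(1/alpha) ).
    pxy is the joint pmf with rows indexed by x and columns by y."""
    pxy = np.asarray(pxy, dtype=float)
    if np.isclose(alpha, 1.0):  # Shannon limit: H(X|Y) = H(X,Y) - H(Y)
        py = pxy.sum(axis=0)
        return (-np.sum(pxy[pxy > 0] * np.log(pxy[pxy > 0]))
                + np.sum(py[py > 0] * np.log(py[py > 0])))
    inner = np.sum(pxy ** alpha, axis=0) ** (1.0 / alpha)  # one term per y
    return (alpha / (1.0 - alpha)) * np.log(np.sum(inner))

def arimoto_mutual_information(pxy, alpha):
    """I_alpha^A(X;Y) = H_alpha(X) - H_alpha^A(X|Y), the quantity the abstract
    identifies with the alpha-leakage."""
    px = np.asarray(pxy, dtype=float).sum(axis=1)
    return renyi_entropy(px, alpha) - arimoto_conditional_entropy(pxy, alpha)

# Toy example (an assumption for illustration): X uniform on {0,1}, Y its
# observation through a binary symmetric channel with crossover probability 0.1.
eps = 0.1
pxy = np.array([[0.5 * (1 - eps), 0.5 * eps],
                [0.5 * eps, 0.5 * (1 - eps)]])
for a in (0.5, 1.0, 2.0, 10.0):
    print(f"alpha = {a}: leakage = {arimoto_mutual_information(pxy, a):.4f}")
```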

    Entropic Steering Criteria: Applications to Bipartite and Tripartite Systems

    The effect of quantum steering describes a possible action at a distance via local measurements. Although many approaches to characterizing steerability have been pursued, deciding whether a given state is steerable or not remains a difficult task. Here, we investigate the applicability of a recently proposed method for building steering criteria from generalized entropic uncertainty relations. This method works for any entropy that satisfies the properties of (i) (pseudo-)additivity for independent distributions; (ii) a state-independent entropic uncertainty relation (EUR); and (iii) joint convexity of a corresponding relative entropy. Our study extends the former analysis to Tsallis and Rényi entropies on bipartite and tripartite systems. As examples, we investigate the steerability of the three-qubit GHZ and W states. Comment: 27 pages, 8 figures. Published version. Title changed
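
    Property (i) above can be made concrete for the Tsallis entropy (the definition $S_q(P) = (1 - \sum_i p_i^q)/(q - 1)$ is standard and not taken from the abstract): for independent distributions, $S_q(P \otimes R) = S_q(P) + S_q(R) + (1 - q)\,S_q(P)\,S_q(R)$. A minimal numerical check, with example distributions chosen here purely for illustration:

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(P) = (1 - sum_i p_i^q) / (q - 1)."""
    p = np.asarray(p, dtype=float)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Two independent marginals and the corresponding product (joint) distribution.
p = np.array([0.7, 0.2, 0.1])
r = np.array([0.5, 0.5])
joint = np.outer(p, r).ravel()

q = 2.0
lhs = tsallis_entropy(joint, q)
rhs = (tsallis_entropy(p, q) + tsallis_entropy(r, q)
       + (1 - q) * tsallis_entropy(p, q) * tsallis_entropy(r, q))
print(np.isclose(lhs, rhs))  # True: pseudo-additivity for independent distributions
```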

    Generalizations of Fano's Inequality for Conditional Information Measures via Majorization Theory

    Fano's inequality is one of the most elementary, ubiquitous, and important tools in information theory. Using majorization theory, Fano's inequality is generalized to a broad class of information measures, which contains those of Shannon and Rényi. When specialized to these measures, it recovers and generalizes the classical inequalities. Key to the derivation is the construction of an appropriate conditional distribution inducing a desired marginal distribution on a countably infinite alphabet. The construction is based on the infinite-dimensional version of Birkhoff's theorem proven by Révész [Acta Math. Hungar. 1962, 3, 188–198], and the constraint of maintaining a desired marginal distribution is similar to coupling in probability theory. Using our Fano-type inequalities for Shannon's and Rényi's information measures, we also investigate the asymptotic behavior of the sequence of Shannon's and Rényi's equivocations when the error probabilities vanish. This asymptotic behavior provides a novel characterization of the asymptotic equipartition property (AEP) via Fano's inequality. Comment: 44 pages, 3 figures
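
    The generalized inequalities are not reproduced in the abstract; for reference, the classical Shannon form they specialize to is $H(X|Y) \le h_b(P_e) + P_e \log(|\mathcal{X}| - 1)$, where $P_e$ is the error probability of any estimator of $X$ from $Y$. A minimal numerical check of this classical form (the random joint distribution and the MAP estimator are choices made here for illustration):

```python
import numpy as np

def shannon_entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def conditional_entropy(pxy):
    """H(X|Y) = H(X,Y) - H(Y) for a joint pmf with rows indexed by x, columns by y."""
    return shannon_entropy(pxy.ravel()) - shannon_entropy(pxy.sum(axis=0))

# Random joint distribution on a 4 x 3 alphabet (illustrative only).
rng = np.random.default_rng(0)
pxy = rng.random((4, 3))
pxy /= pxy.sum()

# Error probability of the MAP estimator x_hat(y) = argmax_x p(x, y).
pe = 1.0 - np.sum(pxy.max(axis=0))

# Fano's inequality (natural log): H(X|Y) <= h_b(Pe) + Pe * log(|X| - 1).
hb = -pe * np.log(pe) - (1 - pe) * np.log(1 - pe) if 0 < pe < 1 else 0.0
print(conditional_entropy(pxy), "<=", hb + pe * np.log(pxy.shape[0] - 1))
```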

    Leakage-Minimal Design: Universality, Limitations, and Applications


    Collision Entropy Estimation in a One-Line Formula

    We address the unsolved question of how best to estimate the collision entropy, also called the quadratic or second-order Rényi entropy. Integer-order Rényi entropies are synthetic indices useful for characterizing probability distributions. In recent decades, numerous studies have sought valid estimates of them from experimental data, so as to derive suitable classification methods for the underlying processes, but optimal solutions have not yet been reached. Limited to the estimation of collision entropy, a one-line formula is presented here. The results of some specific Monte Carlo experiments give evidence of the validity of this estimator even for very low densities of data spread in high-dimensional sample spaces. The method's strengths are unbiased consistency, generality, and minimum computational cost.
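
    The paper's one-line formula is not given in the abstract and is not reproduced here. For orientation only, a minimal sketch of the classical coincidence-counting route to the same quantity: the fraction of coincident sample pairs is an unbiased estimator of the collision probability $\sum_i p_i^2$, and $\hat{H}_2 = -\log \hat{C}$ is the corresponding plug-in estimate of the collision entropy. The example distribution and sample size below are arbitrary.

```python
import numpy as np
from collections import Counter

def collision_entropy_estimate(samples):
    """Plug-in collision (order-2 Rényi) entropy estimate H_2 = -log(C_hat),
    where C_hat is the fraction of coincident pairs among the i.i.d. samples
    (an unbiased estimator of the collision probability sum_i p_i^2)."""
    n = len(samples)
    counts = np.array(list(Counter(samples).values()), dtype=float)
    coincident_pairs = np.sum(counts * (counts - 1)) / 2.0
    collision_prob = coincident_pairs / (n * (n - 1) / 2.0)
    return -np.log(collision_prob)

# Illustrative check against a known distribution: true H_2 = -log(sum_i p_i^2).
rng = np.random.default_rng(1)
p = np.array([0.5, 0.2, 0.2, 0.1])
samples = rng.choice(len(p), size=5000, p=p)
print(collision_entropy_estimate(samples), -np.log(np.sum(p ** 2)))
```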

    Guesswork with Quantum Side Information

    What is the minimum number of guesses needed, on average, to correctly guess a realization of a random variable? The answer to this question led to the introduction of a quantity called guesswork by Massey in 1994, which can be viewed as an alternative security criterion to entropy. In this paper, we consider guesswork in the presence of quantum side information, and we show that a general sequential guessing strategy is equivalent to performing a single quantum measurement and choosing a guessing strategy based on the outcome. We use this result to deduce entropic one-shot and asymptotic bounds on the guesswork in the presence of quantum side information, and to formulate a semi-definite program (SDP) to calculate the quantity. We evaluate the guesswork for a simple example involving the BB84 states, both numerically and analytically, and we prove a continuity result that certifies the security of slightly imperfect key states when guesswork is used as the security criterion.
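
    The quantum side information and the SDP formulation are the paper's contribution and are not sketched here; for orientation, a minimal sketch of Massey's classical guesswork and its conditional version given classical side information (the toy joint distribution is chosen here for illustration):

```python
import numpy as np

def guesswork(p):
    """Massey's guesswork G(X): expected number of guesses needed to identify X
    when values are guessed in decreasing order of probability (the optimal order)."""
    p = np.sort(np.asarray(p, dtype=float))[::-1]
    return np.sum(np.arange(1, len(p) + 1) * p)

def guesswork_with_side_info(pxy):
    """Classical G(X|Y): guess optimally within each conditional distribution p(x|y).
    pxy is the joint pmf with rows indexed by x and columns by y."""
    pxy = np.asarray(pxy, dtype=float)
    py = pxy.sum(axis=0)
    return sum(py[y] * guesswork(pxy[:, y] / py[y])
               for y in range(pxy.shape[1]) if py[y] > 0)

# Toy example: X uniform on {0,1,2,3}; Y reveals whether X < 2.
pxy = np.array([[0.25, 0.00],
                [0.25, 0.00],
                [0.00, 0.25],
                [0.00, 0.25]])
print(guesswork(pxy.sum(axis=1)))        # 2.5 guesses without side information
print(guesswork_with_side_info(pxy))     # 1.5 guesses with side information
```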