
    Modulus Computational Entropy

    The so-called leakage-chain rule is an important tool used in many security proofs. It gives an upper bound on the entropy loss of a random variable X when an adversary, having already learned some random variables Z_1,…,Z_ℓ correlated with X, obtains some further information Z_{ℓ+1} about X. Analogously to the information-theoretic case, one might expect that also for the computational variants of entropy the loss depends only on the actual leakage, i.e. on Z_{ℓ+1}. Surprisingly, Krenn et al. have recently shown that for the most commonly used definitions of computational entropy this holds only if the computational quality of the entropy deteriorates exponentially in |(Z_1,…,Z_ℓ)|. This means that the current standard definitions of computational entropy cannot fully capture leakage that occurred "in the past", which severely limits the applicability of this notion. As a remedy for this problem we propose a slightly stronger definition of computational entropy, which we call the modulus computational entropy, and use it as a technical tool to prove the desired chain rule, in which the loss depends only on the actual leakage and not on its history. Moreover, we show that the modulus computational entropy unifies other, sometimes seemingly unrelated, notions already studied in the literature in the context of information leakage and chain rules. Our results indicate that the modulus entropy is, up to now, the weakest restriction that guarantees that the chain rule for computational entropy works. As an example of application we demonstrate a few interesting cases where our restricted definition is fulfilled and the chain rule holds.
    Comment: Accepted at ICTS 201
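    For orientation, the information-theoretic rule that the abstract takes as its model can be stated for average min-entropy as follows (a standard-notation sketch, not a formula taken from the paper itself):

        \tilde{H}_\infty(X \mid Z_1,\ldots,Z_\ell,Z_{\ell+1}) \;\ge\; \tilde{H}_\infty(X \mid Z_1,\ldots,Z_\ell) - |Z_{\ell+1}|

    Here the loss is bounded by the bit-length of the new leakage Z_{ℓ+1} alone. In the computational setting one additionally tracks a quality parameter, typically a pair (s, ε) of distinguisher size and advantage; the cited result of Krenn et al. says that under the standard definitions, s may have to shrink by a factor exponential in |(Z_1,…,Z_ℓ)| for the analogous inequality to hold.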

    A counterexample to the chain rule for conditional HILL entropy

    Most entropy notions H(·), like Shannon entropy or min-entropy, satisfy a chain rule stating that for random variables X, Z, and A we have H(X|Z,A)≥H(X|Z)−|A|. That is, conditioning on A can decrease the entropy of X by at most the bit-length |A| of A. Such chain rules are known to hold for some computational entropy notions, like Yao's and unpredictability entropy. For HILL entropy, the computational analogue of min-entropy, the chain rule is of special interest and has found many applications, including leakage-resilient cryptography, deterministic encryption, and memory delegation. These applications rely on restricted special cases of the chain rule. Whether the chain rule for conditional HILL entropy holds in general was an open problem, for which we give a strong negative answer: we construct joint distributions (X,Z,A), where A is a distribution over a single bit, such that the HILL entropy H_HILL(X|Z) is large but H_HILL(X|Z,A) is basically zero. Our counterexample makes only the minimal assumption that NP⊈P/poly. Under the stronger assumption that injective one-way functions exist, we can make all the distributions efficiently samplable. Finally, we show that some more sophisticated cryptographic objects, like lossy functions, can be used to sample a distribution constituting a counterexample to the chain rule while making only a single invocation of the underlying object.
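    The information-theoretic chain rule that HILL entropy is contrasted against here is easy to verify numerically. Below is a minimal Python sketch (illustrative only; the distribution sizes and variable names are arbitrary choices, not from the paper) that samples random joint distributions (X,Z,A) with A a single bit and checks the average-min-entropy bound H(X|Z,A) ≥ H(X|Z) − 1:

        import numpy as np

        rng = np.random.default_rng(0)

        for trial in range(1000):
            # random joint distribution over (X, Z, A);
            # axis 0 is X, axis 1 is Z, axis 2 is the single-bit A
            p = rng.random((8, 4, 2))
            p /= p.sum()

            # average conditional min-entropy:
            #   H(X|Z)   = -log2( sum_z   max_x P(X=x, Z=z) )
            #   H(X|Z,A) = -log2( sum_z,a max_x P(X=x, Z=z, A=a) )
            h_xz = -np.log2(p.sum(axis=2).max(axis=0).sum())
            h_xza = -np.log2(p.max(axis=0).sum())

            # conditioning on one extra bit costs at most one bit
            assert h_xza >= h_xz - 1 - 1e-9

        print("H(X|Z,A) >= H(X|Z) - 1 held in all trials")

    The paper's point is that no analogous bound survives for HILL entropy in general: after conditioning on even a single extra bit, the computational entropy can collapse.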

    A Unified Approach to Deterministic Encryption: New Constructions and a Connection to Computational Entropy

    This paper addresses deterministic public-key encryption schemes (DE), which are designed to provide meaningful security when the only source of randomness in the encryption process comes from the message itself. We propose a general construction of DE that unifies prior work and gives novel schemes. Specifically, its instantiations include:
    - The first construction from any trapdoor function that has sufficiently many hardcore bits.
    - The first construction that provides bounded multi-message security (assuming lossy trapdoor functions).
    The security proofs for these schemes are enabled by three tools that are of broader interest:
    - A weaker and more precise sufficient condition for semantic security on a high-entropy message distribution. Namely, we show that to establish semantic security on a distribution M of messages, it suffices to establish indistinguishability for all conditional distributions M|E, where E is an event of probability at least 1/4. (Prior work required indistinguishability on all distributions of a given entropy.)
    - A result about the computational entropy of conditional distributions. Namely, we show that conditioning on an event E of probability p reduces the quality of computational entropy by a factor of p and its quantity by log_2(1/p).
    - A generalization of the leftover hash lemma to correlated distributions.
    We also extend our result about computational entropy to the average case, which is useful in reasoning about leakage-resilient cryptography: leaking λ bits of information reduces the quality of computational entropy by a factor of 2^λ and its quantity by λ.
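    The information-theoretic analogue of the quantity part of this conditioning result (min-entropy drops by at most log_2(1/p) when conditioning on an event of probability p) can likewise be checked numerically. A minimal Python sketch, with arbitrary support size and random events (illustrative choices, not from the paper):

        import numpy as np

        rng = np.random.default_rng(1)

        def min_entropy(p):
            # H_inf(X) = -log2 max_x P(X = x)
            return -np.log2(p.max())

        for trial in range(1000):
            p = rng.random(16)
            p /= p.sum()

            # random event E over the support of X; skip empty events
            e = rng.random(16) < 0.5
            if not e.any():
                continue
            prob_e = p[e].sum()

            # distribution of X conditioned on E
            p_cond = np.zeros_like(p)
            p_cond[e] = p[e] / prob_e

            # quantity loss from conditioning is at most log2(1/Pr[E])
            assert min_entropy(p_cond) >= min_entropy(p) - np.log2(1 / prob_e) - 1e-9

        print("H_inf(X|E) >= H_inf(X) - log2(1/Pr[E]) held in all trials")

    The computational statement in the paper additionally charges the quality of the entropy by a factor of p, a degradation that has no counterpart in this information-theoretic sketch.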