
    Modulus Computational Entropy

    The so-called {\em leakage-chain rule} is a very important tool used in many security proofs. It gives an upper bound on the entropy loss of a random variable $X$ when an adversary who has already learned some random variables $Z_{1},\ldots,Z_{\ell}$ correlated with $X$ obtains some further information $Z_{\ell+1}$ about $X$. Analogously to the information-theoretic case, one might expect that also for the \emph{computational} variants of entropy the loss depends only on the actual leakage, i.e. on $Z_{\ell+1}$. Surprisingly, Krenn et al.\ have recently shown that for the most commonly used definitions of computational entropy this holds only if the computational quality of the entropy deteriorates exponentially in $|(Z_{1},\ldots,Z_{\ell})|$. This means that the current standard definitions of computational entropy do not allow one to fully capture leakage that occurred "in the past", which severely limits the applicability of this notion. As a remedy for this problem we propose a slightly stronger definition of computational entropy, which we call the \emph{modulus computational entropy}, and use it as a technical tool that allows us to prove a desired chain rule that depends only on the actual leakage and not on its history. Moreover, we show that the modulus computational entropy unifies other, sometimes seemingly unrelated, notions already studied in the literature in the context of information leakage and chain rules. Our results indicate that the modulus entropy is, up to now, the weakest restriction that guarantees that the chain rule for computational entropy works. As an example application we demonstrate a few interesting cases where our restricted definition is fulfilled and the chain rule holds. Comment: Accepted at ICTS 201
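
    For context, the information-theoretic chain rule the abstract alludes to can be stated as follows, using standard notation for average min-entropy (this recap is not taken from the paper itself):

    \[ \widetilde{H}_{\infty}\bigl(X \mid Z_{1},\ldots,Z_{\ell},Z_{\ell+1}\bigr) \;\ge\; \widetilde{H}_{\infty}\bigl(X \mid Z_{1},\ldots,Z_{\ell}\bigr) \;-\; |Z_{\ell+1}|, \]

    i.e. an additional leakage of $|Z_{\ell+1}|$ bits costs at most $|Z_{\ell+1}|$ bits of entropy, regardless of the length of the earlier leakage $(Z_{1},\ldots,Z_{\ell})$. The computational analogue sought here is the same inequality for a computational (HILL-style) entropy notion, with the security loss of the distinguisher depending only on $|Z_{\ell+1}|$ and not on $|(Z_{1},\ldots,Z_{\ell})|$.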

    Quantitative Information Flow and Applications to Differential Privacy

    Secure information flow is the problem of ensuring that the information made publicly available by a computational system does not leak information that should be kept secret. Since it is practically impossible to avoid leakage entirely, in recent years there has been a growing interest in considering the quantitative aspects of information flow, in order to measure and compare the amount of leakage. Information theory is widely regarded as a natural framework to provide firm foundations to quantitative information flow. In these notes we review the two main information-theoretic approaches that have been investigated: the one based on Shannon entropy, and the one based on Rényi min-entropy. Furthermore, we discuss some applications in the area of privacy. In particular, we consider statistical databases and the recently-proposed notion of differential privacy. Using the information-theoretic view, we discuss the bound that differential privacy induces on leakage, and the trade-off between utility and privacy.
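
    As a concrete illustration of the two leakage measures mentioned above (Shannon mutual information and Rényi min-entropy leakage), here is a minimal Python sketch for a finite channel given by a prior distribution and a channel matrix; the toy channel at the bottom is made up purely for illustration and does not come from the paper:

    import math

    def shannon_leakage(prior, C):
        """Mutual information I(X;Y) in bits, for prior[x] and channel matrix C[x][y]."""
        p_y = [sum(prior[x] * C[x][y] for x in range(len(prior)))
               for y in range(len(C[0]))]
        I = 0.0
        for x in range(len(prior)):
            for y in range(len(C[0])):
                p_xy = prior[x] * C[x][y]
                if p_xy > 0:
                    I += p_xy * math.log2(p_xy / (prior[x] * p_y[y]))
        return I

    def min_entropy_leakage(prior, C):
        """Renyi min-entropy leakage: log of posterior over prior vulnerability."""
        prior_vuln = max(prior)
        post_vuln = sum(max(prior[x] * C[x][y] for x in range(len(prior)))
                        for y in range(len(C[0])))
        return math.log2(post_vuln / prior_vuln)

    # Toy channel: two secrets, two observables, slightly biased towards revealing x.
    prior = [0.5, 0.5]
    C = [[0.75, 0.25],
         [0.25, 0.75]]
    print(shannon_leakage(prior, C), min_entropy_leakage(prior, C))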

    Strong Leakage Resilient Encryption: Enhancing Data Confidentiality by Hiding Partial Ciphertext

    Leakage-resilient encryption is a powerful tool to protect data confidentiality against side channel attacks. In this work, we introduce a new and strong leakage setting to counter backdoor (or Trojan horse) plus covert channel attacks, by relaxing the restrictions on leakage. We allow \emph{bounded} leakage at \emph{any time}, \emph{anywhere}, and over \emph{anything}. Our leakage threshold (e.g. 10000 bits) could be much larger than the typical secret key (e.g. AES key or RSA private key) size. Under such a strong leakage setting, we propose an efficient encryption scheme which is semantically secure in the standard setting (i.e. without leakage) and can tolerate strong continuous leakage. We manage to construct such a secure scheme under the strong leakage setting by hiding a small portion (e.g. $1\%$) of the ciphertext as securely as we hide the secret key, using a small amount of more secure hardware resources, so that it is almost equally difficult for any adversary to steal information regarding this well-protected partial ciphertext or the secret key. We remark that the size of this well-protected small portion of the ciphertext is chosen to be much larger than the leakage threshold. We provide concrete and practical examples of such more secure hardware resources for data communication and data storage. Furthermore, we also introduce a new notion of computational entropy, as a sort of computational version of Kolmogorov complexity. Our quantitative analysis shows that hiding partial ciphertext is a powerful countermeasure, which enables us to achieve a higher security level than existing approaches in the case of backdoor plus covert channel attacks. We also show the relationship between our new notion of computational entropy and existing relevant concepts, including Shannon entropy, Yao entropy, HILL entropy, All-or-Nothing Transforms, and Exposure Resilient Functions. This new computational entropy formulation may be of independent interest.
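
    The following toy Python sketch is not the paper's construction; it only illustrates the splitting idea described above: encrypt normally, then keep a small fraction (here $1\%$) of the ciphertext in a better-protected location and store the bulk elsewhere. AESGCM from the cryptography package is used merely as a stand-in cipher, and the function names are my own.

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def encrypt_and_split(key, plaintext, frac=0.01):
        """Encrypt, then split off a small 'well-protected' slice of the ciphertext."""
        nonce = os.urandom(12)
        ct = AESGCM(key).encrypt(nonce, plaintext, None)
        k = max(1, int(len(ct) * frac))          # size of the hidden portion
        return (nonce, ct[:k]), ct[k:]           # (protected part, bulk part)

    def recombine_and_decrypt(key, protected, bulk):
        nonce, head = protected
        return AESGCM(key).decrypt(nonce, head + bulk, None)

    key = AESGCM.generate_key(bit_length=128)
    protected, bulk = encrypt_and_split(key, b"secret message" * 100)
    assert recombine_and_decrypt(key, protected, bulk) == b"secret message" * 100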

    Information-Theoretic Meaning of Quantum Information Flow and Its Applications to Amplitude Amplification Algorithms

    The advantages of quantum information processing are in many cases obtained as consequences of quantum interactions, especially for computational tasks where two-qubit interactions are essential. In this work, we establish a framework for analyzing and quantifying the loss or gain of information on a quantum system when the system interacts with its environment. We show that the information flow, the theoretical method of characterizing the (non-)Markovianity of quantum dynamics, corresponds to the rate of the minimum uncertainty about the system given quantum side information. Thereafter, we analyze the information exchange among subsystems during the execution of quantum algorithms, in particular the amplitude amplification algorithms, where the computational process relies fully on quantum evolution. Different realizations of the algorithm are considered: i) quantum circuits, ii) analog computation, and iii) adiabatic computation. It is shown that, in all the cases, our formalism provides insight into the process of amplifying the amplitude in terms of the information flow or leakage on the subsystems. Comment: 7 pages, 5 figures, close to the published version
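
    For the circuit realization (case i), the amplitude-amplification step can be simulated classically at small sizes. The minimal numpy sketch below only tracks how the marked amplitude grows per standard Grover iteration; it does not compute the information-flow measure introduced in the paper.

    import numpy as np

    N, marked = 16, 3                       # search space size and marked index
    s = np.ones(N) / np.sqrt(N)             # uniform superposition |s>
    psi = s.copy()
    for k in range(1, 4):
        psi[marked] *= -1                   # oracle: phase flip on the marked item
        psi = 2 * s * (s @ psi) - psi       # diffusion: reflection about |s>
        print(f"iteration {k}: success probability {abs(psi[marked])**2:.3f}")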

    Towards an Information Theoretic Analysis of Searchable Encryption (Extended Version)

    Searchable encryption is a technique that allows a client to store data in encrypted form on a curious server, such that data can be retrieved while leaking a minimal amount of information to the server. Many searchable encryption schemes have been proposed and proved secure in their own computational model. In this paper we propose a generic model for the analysis of searchable encryption. We then identify the security parameters of searchable encryption schemes and prove information-theoretic bounds on the security of the parameters. We argue that perfectly secure searchable encryption schemes cannot be efficient. We classify the seminal schemes into two categories: the schemes that leak information upfront during the storage phase, and the schemes that leak some information at every search. This helps designers choose the right scheme for a given application.
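
    To make the two leakage categories concrete, here is a toy Python sketch (not any of the surveyed schemes) of a deterministic-token index: identical keywords already collide in the stored index, so some information leaks upfront during the storage phase, and every query additionally reveals which posting list was accessed, i.e. leakage at search time.

    import hmac, hashlib

    def token(key: bytes, word: str) -> str:
        """Deterministic search token; equal words yield equal tokens."""
        return hmac.new(key, word.encode(), hashlib.sha256).hexdigest()

    class Server:
        def __init__(self):
            self.index = {}                      # token -> list of document ids
        def store(self, tok, doc_id):
            self.index.setdefault(tok, []).append(doc_id)
        def search(self, tok):
            return self.index.get(tok, [])       # access pattern is visible

    key = b"client-secret-key"
    server = Server()
    for doc_id, words in enumerate([["tax", "2024"], ["tax", "draft"]]):
        for w in words:
            server.store(token(key, w), doc_id)
    print(server.search(token(key, "tax")))      # -> [0, 1]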

    Fully leakage-resilient signatures revisited: Graceful degradation, noisy leakage, and construction in the bounded-retrieval model

    We construct new leakage-resilient signature schemes. Our schemes remain unforgeable against an adversary leaking arbitrary (yet bounded) information on the entire state of the signer (sometimes known as fully leakage resilience), including the random coin tosses of the signing algorithm. The main feature of our constructions is that they offer a graceful degradation of security in situations where standard existential unforgeability is impossible.