
    On the Commitment Capacity of Unfair Noisy Channels

    Noisy channels are a valuable resource from a cryptographic point of view. They can be used for exchanging secret keys as well as for realizing other cryptographic primitives such as commitment and oblivious transfer. To be truly useful, noisy channels have to be considered in a scenario where a cheating party has some degree of control over the channel characteristics. Damgård et al. (EUROCRYPT 1999) proposed a more realistic model in which an adversary is permitted such control, the so-called unfair noisy channels, and proved that they can be used to obtain commitment and oblivious transfer protocols. Given that noisy channels are a precious resource for cryptographic purposes, one important question is determining the optimal rate at which they can be used. The commitment capacity has already been determined for discrete memoryless channels and Gaussian channels. In this work we address the problem of determining the commitment capacity of unfair noisy channels and compute a single-letter characterization of it. In the case where the adversary has no control over the channel (the fair case), our capacity reduces to the well-known commitment capacity of a discrete memoryless binary symmetric channel.
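
    For context, the fair-case benchmark mentioned above has a simple closed form in the literature; the formulas below are our hedged recollection of the standard single-letter characterization (Winter, Nascimento, and Imai), not an excerpt from this paper.

        % Commitment capacity of a discrete memoryless channel W : X -> Y
        % (standard characterization; our assumption, not quoted from the paper):
        \[
          C_{\mathrm{com}} = \max_{P_X} H(X \mid Y),
        \]
        % which for a binary symmetric channel with crossover probability p,
        % evaluated at the uniform input distribution, is the binary entropy
        \[
          C_{\mathrm{com}}^{\mathrm{BSC}(p)} = h(p) = -p \log_2 p - (1 - p) \log_2 (1 - p).
        \]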

    Commitment and Oblivious Transfer in the Bounded Storage Model with Errors

    The bounded storage model restricts the memory of an adversary in a cryptographic protocol, rather than restricting its computational power, making information-theoretically secure protocols feasible. We present the first protocols for commitment and oblivious transfer in the bounded storage model with errors, i.e., the model where the public random sources available to the two parties are not exactly the same but are only required to have a small Hamming distance between them. Commitment and oblivious transfer protocols were previously known only for the error-free variant of the bounded storage model, which is harder to realize.
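
    The setting itself is easy to picture in code. The Python sketch below (ours; the parameter names N and DELTA are illustrative) models only the error condition of the two parties' views of the public random source, not the paper's protocols.

        # Sketch of the model only (not the paper's protocols): two parties
        # read a long public random source, but their views may differ in a
        # small fraction of positions (bounded relative Hamming distance).
        import random

        N = 10**6       # length of the public random source (illustrative)
        DELTA = 0.01    # assumed bound on the fraction of differing bits

        alice_view = [random.getrandbits(1) for _ in range(N)]
        # Bob's view is a noisy copy: each bit flips with probability DELTA.
        bob_view = [b ^ (random.random() < DELTA) for b in alice_view]

        distance = sum(a != b for a, b in zip(alice_view, bob_view))
        print(f"relative Hamming distance: {distance / N:.4f}")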

    On the Efficiency of Classical and Quantum Secure Function Evaluation

    We provide bounds on the efficiency of secure one-sided-output two-party computation of arbitrary finite functions from trusted distributed randomness in the statistical case. From these results we derive bounds on the efficiency of protocols that use different variants of OT as a black box. When applied to implementations of OT, these bounds generalize most known results to the statistical case. Our results hold, in particular, for transformations between a finite number of primitives and for any error. In the second part we study the efficiency of quantum protocols implementing OT. While most classical lower bounds for perfectly secure reductions of OT to distributed randomness still hold in the quantum setting, we present a statistically secure protocol that violates these bounds by an arbitrarily large factor. We then prove a weaker lower bound that does hold in the statistical quantum setting and implies that even quantum protocols cannot extend OT. Finally, we present two lower bounds for reductions of OT to commitments, and a protocol based on string commitments that is optimal with respect to both of these bounds.
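
    To make the "cannot extend OT" claim concrete, it can be stated as follows (our paraphrase of the standard formulation, not the paper's exact theorem):

        % "OT cannot be extended": no statistically secure protocol turns
        % n given instances of oblivious transfer (plus noiseless
        % communication) into m > n instances, for any n:
        \[
          n \;\text{OTs} \;\not\Longrightarrow\; m \;\text{OTs}
          \qquad \text{for all } m > n \ge 0,
        \]
        % information-theoretically; with computational assumptions,
        % OT extension is known to be possible.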

    Modulus Computational Entropy

    The so-called leakage chain rule is an important tool used in many security proofs. It gives an upper bound on the entropy loss of a random variable $X$ when an adversary who has already learned some random variables $Z_1, \ldots, Z_\ell$ correlated with $X$ obtains some further information $Z_{\ell+1}$ about $X$. Analogously to the information-theoretic case, one might expect that for the computational variants of entropy, too, the loss depends only on the actual leakage, i.e., on $Z_{\ell+1}$. Surprisingly, Krenn et al. have recently shown that for the most commonly used definitions of computational entropy this holds only if the computational quality of the entropy deteriorates exponentially in $|(Z_1, \ldots, Z_\ell)|$. This means that the current standard definitions of computational entropy cannot fully capture leakage that occurred "in the past", which severely limits the applicability of this notion. As a remedy, we propose a slightly stronger definition of computational entropy, which we call the modulus computational entropy, and use it as a technical tool to prove the desired chain rule, one that depends only on the actual leakage and not on its history. Moreover, we show that the modulus computational entropy unifies other, sometimes seemingly unrelated, notions already studied in the literature in the context of information leakage and chain rules. Our results indicate that the modulus entropy is, so far, the weakest restriction that guarantees that the chain rule for computational entropy works. As an example of application, we demonstrate a few interesting cases where our restricted definition is fulfilled and the chain rule holds.
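
    For reference, the information-theoretic chain rule that the computational variants are measured against can be written as follows (a standard fact about average conditional min-entropy; the exact formulation below is ours):

        % Information-theoretic leakage chain rule (standard; our wording):
        % conditioning on additional leakage of bit-length |Z_{l+1}| costs
        % at most that many bits of average min-entropy, regardless of how
        % much leakage Z_1, ..., Z_l happened before.
        \[
          \widetilde{H}_{\infty}\bigl(X \mid Z_1, \ldots, Z_\ell, Z_{\ell+1}\bigr)
          \;\ge\;
          \widetilde{H}_{\infty}\bigl(X \mid Z_1, \ldots, Z_\ell\bigr)
          \;-\; |Z_{\ell+1}|.
        \]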

    Physical Unclonable Functions and Their Applications to Vehicle System Security


    Privacy-aware Security Applications in the Era of Internet of Things

    In this dissertation, we introduce several novel privacy-aware security applications. We split these contributions into three main categories. First, to strengthen current authentication mechanisms, we designed two novel privacy-aware complementary authentication mechanisms: Continuous Authentication (CA) and Multi-Factor Authentication (MFA). Our first system is Wearable-Assisted Continuous Authentication (WACA), in which we use sensor data collected from a wrist-worn device to authenticate users continuously. We then improved WACA by integrating a noise-tolerant template matching technique called NTT-Sec to make it privacy-aware, since the collected data can be sensitive. We also designed a novel, lightweight, Privacy-aware Continuous Authentication (PACA) protocol. PACA is easily applicable to other biometric authentication mechanisms whose feature vectors are represented as fixed-length real-valued vectors. In addition to CA, we introduced a privacy-aware multi-factor authentication method called PINTA. In PINTA, we use fuzzy hashing and homomorphic encryption to protect the users' sensitive profiles while providing privacy-preserving authentication. For the second privacy-aware contribution, we designed a multi-stage privacy attack on smart home users that uses the wireless network traffic generated during the devices' communication. The attack works even on encrypted data, as it relies only on the metadata of the network traffic. We also designed a novel countermeasure based on the generation of spoofed traffic. Finally, we introduced two privacy-aware secure data exchange mechanisms, which allow data to be shared among multiple parties (e.g., companies, hospitals) while preserving the privacy of the individuals in the dataset. These mechanisms were realized by combining Secure Multiparty Computation (SMC) and Differential Privacy (DP) techniques. In addition, we designed a policy language, called Curie Policy Language (CPL), to handle conflicting relationships among parties. The novel methods, attacks, and countermeasures in this dissertation were verified with theoretical analysis and extensive experiments with real devices and users. We believe this research has far-reaching implications for privacy-aware complementary authentication methods, smart home user privacy research, and privacy-aware secure data exchange methods.
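
    The abstract does not detail the SMC-plus-DP construction, but the DP ingredient is standard enough to sketch. Below is a textbook Laplace-mechanism example in Python (ours; the function name, parameters, and epsilon value are illustrative, not the dissertation's):

        # Textbook Laplace mechanism, the standard DP building block; a
        # generic illustration, not the dissertation's actual design.
        import random

        def laplace_mechanism(true_value: float, sensitivity: float,
                              epsilon: float) -> float:
            """Release true_value plus Laplace(sensitivity/epsilon) noise."""
            scale = sensitivity / epsilon
            # The difference of two i.i.d. exponentials is Laplace-distributed.
            noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
            return true_value + noise

        # Example: releasing a count query (sensitivity 1) with epsilon = 0.5.
        print(laplace_mechanism(true_value=42.0, sensitivity=1.0, epsilon=0.5))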

    Cryptography in the Bounded Quantum-Storage Model

    We initiate the study of two-party cryptographic primitives with unconditional security, assuming that the adversary’s quantum memory is of bounded size. We show that oblivious transfer and bit commitment can be implemented in this model using protocols where honest parties need no quantum memory, whereas an adversarial player needs quantum memory of size at least n/2 in order to break the protocol, where n is the number of qubits transmitted. This is in sharp contrast to the classical bounded-memory model, where we can only tolerate adversaries with memory at most quadratic in the honest players’ memory size. Our protocols are efficient and noninteractive and can be implemented using today’s technology. On the technical side, a new entropic uncertainty relation involving min-entropy is established.
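
    The abstract only names the new uncertainty relation; the general flavor of min-entropy uncertainty relations in this setting can be sketched as follows (our hedged paraphrase tying it to the n/2 memory bound above, not the paper's exact statement):

        % Flavor of a min-entropy uncertainty relation for BB84-type
        % encodings (our paraphrase, not the paper's exact theorem):
        % measuring n qubits in a uniformly random basis string Theta
        % leaves the outcome X with roughly
        \[
          H_{\infty}(X \mid \Theta) \;\gtrsim\; \frac{n}{2},
        \]
        % so an adversary whose quantum memory holds fewer than n/2
        % qubits cannot retain enough information to determine X.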

    Naturally Rehearsing Passwords

    We introduce quantitative usability and security models to guide the design of password management schemes: systematic strategies to help users create and remember multiple passwords. In the same way that security proofs in cryptography are based on complexity-theoretic assumptions (e.g., hardness of factoring and discrete logarithm), we quantify usability by introducing usability assumptions. In particular, password management relies on assumptions about human memory, e.g., that a user who follows a particular rehearsal schedule will successfully maintain the corresponding memory. These assumptions are informed by research in cognitive science and validated through empirical studies. Given rehearsal requirements and a user's visitation schedule for each account, we use the total number of extra rehearsals that the user would have to do to remember all of his passwords as a measure of the usability of the password scheme. Our usability model leads us to a key observation: password reuse benefits users not only by reducing the number of passwords that the user has to memorize, but more importantly by increasing the natural rehearsal rate for each password. We also present a security model which accounts for the complexity of password management with multiple accounts and associated threats, including online, offline, and plaintext password leak attacks. Observing that current password management schemes are either insecure or unusable, we present Shared Cues, a new scheme in which the underlying secret is strategically shared across accounts to ensure that most rehearsal requirements are satisfied naturally while simultaneously providing strong security. The construction uses the Chinese Remainder Theorem to achieve these competing goals.
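
    The abstract names the Chinese Remainder Theorem but not how Shared Cues deploys it, so the Python sketch below (ours) illustrates only the underlying tool: splitting an integer secret into residues modulo pairwise-coprime moduli and reconstructing it. The moduli and secret are illustrative values.

        # CRT reconstruction: the mathematical tool the abstract names,
        # illustrated generically (not the Shared Cues scheme itself).
        from math import prod

        def crt_reconstruct(residues, moduli):
            """Return the unique x mod prod(moduli) with x = r_i (mod m_i)."""
            M = prod(moduli)
            x = 0
            for r, m in zip(residues, moduli):
                M_i = M // m
                x += r * M_i * pow(M_i, -1, m)  # pow(.., -1, m) = inverse mod m
            return x % M

        secret = 123456
        moduli = [101, 103, 107]                 # pairwise coprime, illustrative
        shares = [secret % m for m in moduli]    # one residue per "account"
        assert crt_reconstruct(shares, moduli) == secret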

    Maintaining secrecy when information leakage is unavoidable

    Thesis (Ph.D.), Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2004. Includes bibliographical references (p. 109-115). Sharing and maintaining long, random keys is one of the central problems in cryptography. This thesis is about ensuring the security of a cryptographic key when partial information about it has been, or must be, leaked to an adversary. We consider two basic approaches: 1. Extracting a new, shorter secret key from one that has been partially compromised. Specifically, we study the use of noisy data, such as biometrics and personal information, as cryptographic keys. Such data can vary drastically from one measurement to the next. We would like to store enough information to handle these variations, without relying on any secure storage; in particular, without storing the key itself in the clear. We solve the problem by casting it in terms of key extraction. We give a precise definition of what "security" should mean in this setting, and design practical, general solutions with rigorous analyses. Prior to this work, no solutions were known with satisfactory provable security guarantees. 2. Ensuring that whatever is revealed is not actually useful. This is most relevant when the key itself is sensitive, for example when it is based on a person's iris scan or Social Security Number. This second approach requires the user to have some control over exactly what information is revealed, but this is often the case: for example, if the user must reveal enough information to allow another user to correct errors in a corrupted key. How can the user ensure that whatever information the adversary learns is not useful to her? We answer by developing a theoretical framework for separating leaked information from useful information. Our definition strengthens the notion of entropic security, considered before in a few different contexts. We apply the framework to get new results, creating (a) encryption schemes with very short keys, and (b) hash functions that leak no information about their input, yet, paradoxically, allow testing whether a candidate vector is close to the input. One of the technical contributions of this research is to provide new cryptographic uses of mathematical tools from complexity theory known as randomness extractors. By Adam Davison Smith, Ph.D.
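
    The "handle variations without secure storage" idea in approach 1 corresponds to what the literature calls a secure sketch. The Python toy below (ours, with an illustrative 3x repetition code; not the thesis's construction verbatim) shows the code-offset method: publish s = w XOR c for a random codeword c, then use the code's error correction to recover w from a noisy reading.

        # Toy code-offset secure sketch with a 3x repetition code (ours,
        # illustrative only): the public sketch s hides w behind a random
        # codeword, and decoding absorbs a bounded number of bit flips.
        import random

        def majority(block):                       # decode one 3-bit block
            return int(sum(block) >= 2)

        def recover(w_noisy, s):
            offset = [wi ^ si for wi, si in zip(w_noisy, s)]   # noisy codeword
            bits = [majority(offset[i:i + 3]) for i in range(0, len(offset), 3)]
            codeword = [b for b in bits for _ in range(3)]     # re-encode
            return [ci ^ si for ci, si in zip(codeword, s)]    # w = c XOR s

        k = [random.getrandbits(1) for _ in range(4)]          # random bits
        c = [b for b in k for _ in range(3)]                   # 3x repetition
        w = [random.getrandbits(1) for _ in range(12)]         # enrollment reading
        s = [wi ^ ci for wi, ci in zip(w, c)]                  # public sketch

        w_noisy = list(w); w_noisy[5] ^= 1                     # one later bit flip
        assert recover(w_noisy, s) == w                        # exact w recovered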

    Security and privacy issues in implantable medical devices: A comprehensive survey

    Bioengineering is an expanding field, with new technologies appearing to provide more effective treatment of diseases and human deficiencies. Implantable Medical Devices (IMDs) are one example: devices with ever more computing, decision-making, and communication capabilities. Several research works in the computer security field have identified serious security and privacy risks in IMDs that could compromise the implant and even the health of the patient who carries it. This article surveys the main security goals for the next generation of IMDs and analyzes the most relevant protection mechanisms proposed so far. On the one hand, security proposals must take into consideration the inherent constraints of these small, implanted devices: energy, storage, and computing power. On the other hand, proposed solutions must achieve an adequate balance between the safety of the patient and the security level offered, with battery lifetime being another critical parameter in the design phase.