
    PROPYLA: Privacy Preserving Long-Term Secure Storage

    An increasing amount of sensitive information today is stored electronically, and a substantial part of it (e.g., health records, tax data, legal documents) must be retained over long time periods (e.g., several decades or even centuries). When sensitive data is stored, integrity and confidentiality must be protected to ensure reliability and privacy. Commonly used cryptographic schemes, however, are not designed to protect data over such long time periods. Recently, the first storage architecture combining long-term integrity with long-term confidentiality protection was proposed (AsiaCCS'17). That architecture, however, deals only with a simplified storage scenario in which parts of the stored data cannot be accessed and verified individually. If such access is allowed, then not only the data content itself but also the access pattern (i.e., which data items are accessed at which times) may be sensitive information. Here we present the first long-term secure storage architecture that provides long-term access pattern hiding security in addition to long-term integrity and long-term confidentiality protection. To achieve this, we combine information-theoretic secret sharing, renewable timestamps, and renewable commitments with an information-theoretic oblivious random access machine. Our performance analysis of the proposed architecture shows that achieving long-term integrity, confidentiality, and access pattern hiding security is feasible.
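
    The architecture's building blocks are standard. Below is a minimal sketch of one of them, information-theoretic (Shamir) secret sharing; the field, threshold, and share count are illustrative choices, not parameters from the paper.

```python
# Minimal Shamir secret sharing sketch: any k of n shares reconstruct the
# secret, fewer reveal nothing (information-theoretically). Illustrative only.
import secrets

P = 2**127 - 1  # a Mersenne prime; any prime larger than the secret works

def share(secret: int, k: int, n: int) -> list[tuple[int, int]]:
    """Split `secret` into n points on a random degree-(k-1) polynomial."""
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares: list[tuple[int, int]]) -> int:
    """Lagrange interpolation at x = 0 recovers the constant term."""
    total = 0
    for j, (xj, yj) in enumerate(shares):
        num, den = 1, 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        total = (total + yj * num * pow(den, -1, P)) % P
    return total

shares = share(123456789, k=3, n=5)
assert reconstruct(shares[:3]) == 123456789
```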

    Optimal Forgeries Against Polynomial-Based MACs and GCM

    Polynomial-based authentication algorithms, such as GCM and Poly1305, have seen widespread adoption in practice. Due to their importance, a significant amount of attention has been given to understanding and improving both proofs of and attacks against such schemes. At EUROCRYPT 2005, Bernstein published the best known analysis of the schemes when instantiated with PRPs, thereby establishing the most lenient limits on the amount of data the schemes can process per key. A long line of work, initiated by Handschuh and Preneel at CRYPTO 2008, finds the best known attacks, advancing our understanding of the fragility of the schemes. Yet surprisingly, no known attacks perform as well as the predicted worst-case attacks allowed by Bernstein's analysis, nor has there been any advancement in proofs improving Bernstein's bounds, and the gap between attacks and analysis is significant. We settle the issue by finding a novel attack against polynomial-based authentication algorithms using PRPs, and combine it with new analysis to show that Bernstein's bound, and our attacks, are optimal.
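
    For readers unfamiliar with the construction under attack, here is a hedged sketch of the polynomial-evaluation hash underlying GCM's GHASH and Poly1305, written over a prime field for readability; the real schemes work in GF(2^128) and modulo 2^130 - 5 respectively, with additional key clamping and output masking.

```python
# Sketch of a polynomial-evaluation hash: the message blocks are the
# coefficients of a polynomial evaluated at a secret point (the key).
P = 2**130 - 5  # the Poly1305 prime, used here purely as an example field

def poly_hash(key: int, blocks: list[int]) -> int:
    """Horner evaluation of the message polynomial at the secret point."""
    acc = 0
    for b in blocks:
        acc = (acc + b) * key % P
    return acc

key = 2**64 + 12345          # stand-in for the secret evaluation point
tag = poly_hash(key, [1, 2, 3])

# A forgery against such a hash is a second message whose polynomial agrees
# with the first at the unknown key, i.e. their difference has the key as a
# root -- the algebraic structure that attacks in this line of work exploit.
```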

    Low-Complexity Cryptographic Hash Functions

    Cryptographic hash functions are efficiently computable functions that shrink a long input into a shorter output while achieving some of the useful security properties of a random function. The most common type of such hash function is the collision resistant hash function (CRH), which prevents an efficient attacker from finding a pair of inputs on which the function has the same output.
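
    As a small illustration of why collision resistance dictates output length, the generic birthday search below finds a collision in a hash truncated to t bits after roughly 2^(t/2) evaluations; the 32-bit truncation is an illustrative choice so the demo finishes quickly.

```python
# Generic birthday attack on a truncated hash: store seen digests and stop
# at the first repeat. Expected work is about 2^(bits/2) evaluations.
import hashlib
from itertools import count

def trunc_hash(data: bytes, bits: int = 32) -> int:
    return int.from_bytes(hashlib.sha256(data).digest(), "big") >> (256 - bits)

seen: dict[int, bytes] = {}
for i in count():
    msg = i.to_bytes(8, "big")
    h = trunc_hash(msg)
    if h in seen and seen[h] != msg:
        print(f"collision after {i + 1} tries: {seen[h].hex()} vs {msg.hex()}")
        break
    seen[h] = msg
```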

    Cryptographic Assumptions: A Position Paper

    The mission of theoretical cryptography is to define and construct provably secure cryptographic protocols and schemes. Without proofs of security, cryptographic constructs offer no guarantees whatsoever and no basis for evaluation and comparison. As most security proofs necessarily come in the form of a reduction between the security claim and an intractability assumption, such proofs are ultimately only as good as the assumptions they are based on. Thus, the complexity implications of every assumption we utilize should be of significant substance, and serve as the yardstick for the value of our proposals. Lately, the field of cryptography has seen a sharp increase in the number of new assumptions that are often complex to define and difficult to interpret. At times, these assumptions are hard to untangle from the constructions which utilize them. We believe that the lack of standards for what is accepted as a reasonable cryptographic assumption can be harmful to the credibility of our field. Therefore, there is a great need for measures according to which we classify and compare assumptions, and determine which are safe and which are not. In this paper, we propose such a classification and review recently suggested assumptions in this light. This follows in the footsteps of Naor (Crypto 2003). Our governing principle is relying on hardness assumptions that are independent of the cryptographic constructions.

    A framework for analyzing RFID distance bounding protocols

    Many distance bounding protocols suitable for RFID technology have been proposed recently. Unfortunately, they are commonly designed without any formal approach, which leads to inaccurate analyses and unfair comparisons. Motivated by this need, we introduce a unified framework that aims to improve the analysis and design of distance bounding protocols. Our framework includes a thorough terminology for the frauds, the adversary, and the prover, thus disambiguating many misleading terms. It also explores the adversary's capabilities and strategies, and addresses the impact of the prover's ability to tamper with his device. It thereby introduces some new concepts to the distance bounding domain, such as the black-box and white-box models and the relations between the frauds with respect to these models. The relevance and impact of the framework are finally demonstrated on a case study: the Munilla-Peinado distance bounding protocol.
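
    To make the setting concrete, here is a toy sketch of the rapid-bit-exchange phase common to the distance bounding protocols such a framework analyzes; the names, the two-register response rule, and the timing threshold are illustrative, not taken from any specific protocol in the paper.

```python
# Toy rapid bit exchange: the verifier times each single-bit challenge and
# response, rejecting a prover whose answers arrive too late to be nearby.
import secrets, time

MAX_RTT = 1e-3          # assumed upper bound on a legitimate round trip (s)
N_ROUNDS = 32

def prover_respond(c: int, r0: list[int], r1: list[int], i: int) -> int:
    """Answer from one of two precommitted registers, as in many protocols."""
    return r1[i] if c else r0[i]

r0 = [secrets.randbits(1) for _ in range(N_ROUNDS)]
r1 = [secrets.randbits(1) for _ in range(N_ROUNDS)]

accepted = True
for i in range(N_ROUNDS):
    c = secrets.randbits(1)
    start = time.perf_counter()
    resp = prover_respond(c, r0, r1, i)      # in-process stand-in for radio
    rtt = time.perf_counter() - start
    if rtt > MAX_RTT or resp != (r1[i] if c else r0[i]):
        accepted = False
        break
print("prover accepted" if accepted else "prover rejected")
```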

    LeakyOhm: Secret Bits Extraction using Impedance Analysis

    The threats of physical side-channel attacks and their countermeasures have been widely researched. Most physical side-channel attacks rely on the unavoidable influence of computation or storage on current consumption or voltage drop on a chip. Such data-dependent influence can be exploited by, for instance, power or electromagnetic analysis. In this work, we introduce a novel non-invasive physical side-channel attack, which exploits data-dependent changes in the impedance of the chip. Our attack relies on the fact that the temporarily stored contents of registers alter the physical characteristics of the circuit, which results in changes in the die's impedance. To sense such impedance variations, we deploy a well-known RF/microwave method called scattering parameter analysis, in which we inject high-frequency sine wave signals into the system's power distribution network (PDN) and measure the echo of the signals. We demonstrate that, depending on the content bits and physical location of a register, the reflected signal is modulated differently at various frequency points, enabling the simultaneous and independent probing of individual registers. Such side-channel leakage challenges the t-probing security model assumption used in masking, a prominent side-channel countermeasure. To validate our claims, we mount non-profiled and profiled impedance analysis attacks on hardware implementations of unprotected and high-order masked AES. We show that in the case of the profiled attack, only a single trace is required to recover the secret key. Finally, we discuss how a specific class of hiding countermeasures might be effective against impedance leakage.
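
    The profiled attack style described above can be sketched generically as a template attack: build per-value mean traces from a profiling set, then classify a single attack trace by its nearest mean. The traces below are synthetic stand-ins for measured impedance responses; real attacks use scattering-parameter measurements of the chip's PDN.

```python
# Generic template (nearest-mean) attack sketch on synthetic traces.
import numpy as np

rng = np.random.default_rng(1)
N_POINTS, N_PROFILE = 50, 200

def leak(bit_value: int) -> np.ndarray:
    """Toy model: a register bit shifts the response at a few frequencies."""
    base = rng.normal(0.0, 0.05, N_POINTS)
    base[10:13] += 0.1 * bit_value
    return base

# Profiling phase: average many traces for each possible register value.
templates = {v: np.mean([leak(v) for _ in range(N_PROFILE)], axis=0)
             for v in (0, 1)}

# Attack phase: a single trace suffices if the templates are well separated.
secret_bit = 1
trace = leak(secret_bit)
guess = min(templates, key=lambda v: np.linalg.norm(trace - templates[v]))
print(f"recovered bit: {guess} (actual: {secret_bit})")
```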

    Cryptography for Big Data Security

    As big data collection and analysis becomes prevalent in today’s computing environments, there is a growing need for techniques to ensure the security of the collected data. To make matters worse, due to its large volume and velocity, big data is commonly stored on distributed or shared computing resources not fully controlled by the data owner. Thus, tools are needed to ensure both the confidentiality of the stored data and the integrity of the analytics results, even in untrusted environments. In this chapter, we present several cryptographic approaches for securing big data and discuss the appropriate use scenarios for each. We begin with the problem of securing big data storage. We first address the problem of secure block storage for big data, allowing data owners to store and retrieve their data from an untrusted server. We present techniques that allow a data owner to both control access to their data and ensure that none of their data is modified or lost while in storage. However, in most big data applications, it is not sufficient to simply store and retrieve one’s data, and a search functionality is necessary to allow one to select only the relevant data. Thus, we present several techniques for searchable encryption, allowing database-style queries over encrypted data. We review the performance, functionality, and security provided by each of these schemes and describe appropriate use cases. However, the volume of big data often makes it infeasible for an analyst to retrieve all relevant data. Instead, it is desirable to be able to perform analytics directly on the stored data without compromising the confidentiality of the data or the integrity of the computation results. We describe several recent cryptographic breakthroughs that make such processing possible for varying classes of analytics. We review the performance and security characteristics of each of these schemes and summarize how they can be used to protect big data analytics, especially when deployed in a cloud setting. We hope that the exposition in this chapter will raise awareness of the latest types of tools and protections available for securing big data. We believe better understanding and closer collaboration between the data science and cryptography communities will be critical to enabling the future of big data processing.
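
    As a concrete illustration of the searchable-encryption idea mentioned above, the sketch below builds an index keyed by HMAC tags of keywords so a server can answer keyword queries without learning the keywords themselves; it omits the response encryption and leakage controls that real schemes (and the chapter) treat carefully, and all names and data are illustrative.

```python
# Toy searchable index: the server stores only opaque HMAC tokens, so a
# query reveals a token rather than the keyword it was derived from.
import hashlib, hmac, secrets

key = secrets.token_bytes(32)   # held by the data owner, never the server

def token(word: str) -> bytes:
    return hmac.new(key, word.encode(), hashlib.sha256).digest()

# Owner: build the encrypted index and upload it to the server.
documents = {"doc1": ["tax", "health"], "doc2": ["legal", "tax"]}
index: dict[bytes, list[str]] = {}
for doc_id, words in documents.items():
    for w in words:
        index.setdefault(token(w), []).append(doc_id)

# Client: to search, send only the token; the server never sees "tax".
print(index.get(token("tax"), []))   # -> ['doc1', 'doc2']
```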

    Concurrent Non-Malleable Commitments (and More) in 3 Rounds

    The round complexity of commitment schemes secure against man-in-the-middle attacks has been the focus of extensive research for about 25 years. The recent breakthrough of Goyal et al. [22] showed that 3 rounds are sufficient for (one-left, one-right) non-malleable commitments. This result matches a lower bound of [41]. This state of affairs still leaves open the intriguing problem of constructing a 3-round concurrent non-malleable commitment scheme. In this paper we solve the above open problem by showing how to transform any 3-round (one-left, one-right) non-malleable commitment scheme (with some extractability property) into a 3-round concurrent non-malleable commitment scheme. Our transform makes use of complexity leveraging and, when instantiated with the construction of [22], gives a 3-round concurrent non-malleable commitment scheme from one-way permutations secure w.r.t. subexponential-time adversaries. We also show a 3-round argument of knowledge and a 3-round identification scheme secure against concurrent man-in-the-middle attacks.
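
    For context, the basic object under discussion can be sketched as a simple hash-based commitment, hiding thanks to a random nonce and binding under collision resistance; the paper's contribution, non-malleability against concurrent man-in-the-middle attacks in 3 rounds, requires far more machinery than this toy interface.

```python
# Minimal commit/open sketch. Hiding: the random nonce masks the message.
# Binding: opening to a different message would require a hash collision.
import hashlib, hmac, secrets

def commit(message: bytes) -> tuple[bytes, bytes]:
    """Return (commitment, opening nonce)."""
    nonce = secrets.token_bytes(32)
    return hashlib.sha256(nonce + message).digest(), nonce

def verify(com: bytes, nonce: bytes, message: bytes) -> bool:
    return hmac.compare_digest(com, hashlib.sha256(nonce + message).digest())

com, nonce = commit(b"bid: 42")
assert verify(com, nonce, b"bid: 42")
```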

    The Design Space of Lightweight Cryptography

    For constrained devices, standard cryptographic algorithms can be too big, too slow, or too energy-consuming. The area of lightweight cryptography studies new algorithms to overcome these problems. In this paper, we will focus on symmetric-key encryption, authentication, and hashing. Instead of providing a full overview of this area of research, we will highlight three interesting topics. Firstly, we will explore the generic security of lightweight constructions. In particular, we will discuss considerations for key, block, and tag sizes, and explore the topic of instantiating a pseudorandom permutation (PRP) with a non-ideal block cipher construction. This is inspired by the increasing prevalence of lightweight designs that are not secure against related-key attacks, such as PRINCE, PRIDE, or Chaskey. Secondly, we explore the efficiency of cryptographic primitives. In particular, we investigate the impact on efficiency when the input size of a primitive doubles. Lastly, we provide some considerations for cryptographic design. We observe that applications do not always use cryptographic algorithms as they were intended, which negatively impacts the security and/or efficiency of the resulting implementations.
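
    The key/block/tag-size considerations can be made concrete with a standard back-of-the-envelope calculation: the birthday bound caps how much data a mode built on an n-bit block cipher can safely process per key. The advantage threshold below is an illustrative choice, not a figure from the paper.

```python
# Birthday-bound data limit: distinguishing advantage is roughly q^2 / 2^n
# after q blocks under an n-bit block cipher, so q ~ sqrt(adv * 2^n).
import math

def max_blocks(block_bits: int, advantage: float = 2**-32) -> float:
    """Largest q keeping the ~q^2 / 2^n advantage below `advantage`."""
    return math.sqrt(advantage * 2**block_bits)

for n in (64, 128):
    q = max_blocks(n)
    print(f"{n}-bit blocks: ~2^{math.log2(q):.0f} blocks "
          f"(~2^{math.log2(q * n / 8):.0f} bytes per key)")
```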

    More Rounds, Less Security?

    This paper focuses on a surprising class of cryptanalysis results for symmetric-key primitives: when the number of rounds of the primitive is increased, the complexity of the cryptanalysis result decreases. Our primary targets are primitives that consist of identical round functions, such as PBKDF1, the Unix password hashing algorithm, and the Chaskey MAC function. However, some of our results also apply to constructions with non-identical rounds, such as the PRIDE block cipher. First, we construct distinguishers for which the data complexity decreases when the number of rounds is increased. They are based on two well-known observations: iterating a random permutation increases the expected number of fixed points, and iterating a random function decreases the expected number of image points. We explain that these effects also apply to components of cryptographic primitives, such as a round of a block cipher. Second, we introduce a class of key-recovery and preimage-finding techniques that correspond to exhaustive search, but over a smaller part (e.g., one round) of the primitive. As the time complexity of a cryptanalysis result is usually measured by the number of full-round evaluations of the primitive, increasing the number of rounds lowers the time complexity. None of the observations in this paper results in more than a small speed-up over exhaustive search; therefore, for lightweight applications, implementation advantages may outweigh the presence of these observations.
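
    Both observations are easy to reproduce in simulation: iterating a random permutation increases its expected number of fixed points, while iterating a random function shrinks its image. The domain size and trial count below are illustrative.

```python
# Count fixed points of sigma^r for a random permutation sigma, and the
# image size of f^r for a random function f, averaged over many trials.
import random

N, TRIALS = 256, 200

def iterate(f: list[int], r: int) -> list[int]:
    g = list(range(N))
    for _ in range(r):
        g = [f[x] for x in g]
    return g

for r in (1, 2, 4, 8):
    fp = img = 0
    for _ in range(TRIALS):
        perm = random.sample(range(N), N)               # random permutation
        func = [random.randrange(N) for _ in range(N)]  # random function
        pr = iterate(perm, r)
        fp += sum(1 for x in range(N) if pr[x] == x)
        img += len(set(iterate(func, r)))
    print(f"r={r}: avg fixed points {fp / TRIALS:.2f}, "
          f"avg image size {img / TRIALS:.1f}")
```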