
    Robust Combiner for Obfuscators

    Practical software hardening schemes are heuristic and have not been proven secure. One technique to enhance security is robust combiners. An algorithm C is a robust combiner for a specification S, e.g., privacy, if for any two implementations X and Y of a cryptographic scheme, the combined scheme C(X,Y) satisfies S provided either X or Y satisfies S. We present the first robust combiner for software hardening, specifically for obfuscation (Barak et al., 2001). Obfuscators are software hardening techniques employed to protect the execution of programs in remote, hostile environments: they protect the code (and secret data) of the program that is sent to the remote host for execution. Robust combiners are particularly important for software hardening, where there is no standard whose security is established. In addition, robust combiners for software hardening are interesting from a software engineering perspective, since they introduce new techniques of software-only fault tolerance.
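
    A natural candidate combiner for a privacy-type specification is cascade composition: run one candidate obfuscator on the output of the other, so that breaking the combined scheme requires breaking both layers. The sketch below is illustrative only; the names and the choice of cascading are ours, not necessarily the paper's construction.

        # Hedged sketch: a cascade combiner for two obfuscator candidates.
        from typing import Callable

        Program = Callable[[int], int]            # a program, treated as a black box
        Obfuscator = Callable[[Program], Program]

        def cascade_combiner(X: Obfuscator, Y: Obfuscator) -> Obfuscator:
            """C(X, Y): obfuscate with Y, then re-obfuscate with X.

            Privacy intuition: if Y is secure, X(Y(P)) depends only on Y(P),
            which already hides P; if instead X is secure, it hides whatever
            Y leaked.  Both candidates must preserve functionality.
            """
            def combined(P: Program) -> Program:
                return X(Y(P))
            return combined

        # Toy usage with identity "obfuscators" (placeholders, no security):
        identity: Obfuscator = lambda P: P
        C = cascade_combiner(identity, identity)
        assert C(lambda x: x + 1)(41) == 42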

    Robust Combiners for Software Hardening

    All practical software hardening schemes, as well as practical encryption schemes such as AES, lack proofs of security. One technique to enhance security is robust combiners. An algorithm C is a robust combiner for a specification S, e.g., privacy, if for any two implementations X and Y of a cryptographic scheme, the combined scheme C(X,Y) satisfies S provided either X or Y satisfies S. We present the first robust combiners for software hardening, specifically for obfuscation (Barak et al., 2001) and for White-Box Remote Program Execution (WBRPE) (Herzberg et al., 2009). WBRPE and obfuscators are software hardening techniques employed to protect the execution of programs in remote, hostile environments. WBRPE provides a software-only platform allowing secure execution of programs on untrusted remote hosts, ensuring privacy of the program and of its inputs, as well as privacy and integrity of the result of the computation. Obfuscators protect the code (and secret data) of the program that is sent to the remote host for execution. Robust combiners are particularly important for software hardening, where there is no standard whose security is established. In addition, robust combiners for software hardening are interesting from a software engineering perspective, since they introduce new techniques of reductions and code manipulation.
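
    The round trip below sketches only the WBRPE data flow described above (originator hardens a program, the untrusted host runs it, the originator checks the sealed result); the "hardening" here is a no-op placeholder with no security, and all names are ours, not the paper's API.

        def originator_harden(program_src: str) -> bytes:
            # Real WBRPE would white-box-protect the program (and any
            # originator inputs) here; this placeholder just encodes it.
            return program_src.encode()

        def remote_host_execute(package: bytes, host_input: int) -> bytes:
            # The host runs the hardened package on its local input and
            # returns a sealed, integrity-protected result.
            program = eval(package.decode())      # placeholder "loader"
            return repr(program(host_input)).encode()

        def originator_open(sealed: bytes) -> int:
            # Real WBRPE would verify an integrity tag before accepting.
            return int(sealed.decode())

        package = originator_harden("lambda x: x * x")
        sealed = remote_host_execute(package, 7)
        assert originator_open(sealed) == 49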

    Encryptor Combiners: A Unified Approach to Multiparty NIKE, (H)IBE, and Broadcast Encryption

    We define the concept of an encryptor combiner. Roughly, such a combiner takes as input n public keys for a public key encryption scheme, and produces a new combined public key. Anyone knowing a secret key for one of the input public keys can learn the secret key for the combined public key, but an outsider who just knows the input public keys (who can therefore compute the combined public key for himself) cannot decrypt ciphertexts from the combined public key. We actually think of public keys more generally as encryption procedures, which can correspond to, say, encrypting to a particular identity under an IBE scheme or encrypting to a set of attributes under an ABE scheme. We show that encryptor combiners satisfying certain natural properties can give natural constructions of multi-party non-interactive key exchange, low-overhead broadcast encryption, and hierarchical identity-based encryption. We then show how to construct two different encryptor combiners. Our first is built from universal samplers (which can in turn be built from indistinguishability obfuscation) and is sufficient for each application above, in some cases improving on existing obfuscation-based constructions. Our second is built from lattices, and is sufficient for hierarchical identity-based encryption. Thus, encryptor combiners serve as a new abstraction that (1) is a useful tool for designing cryptosystems, (2) unifies constructing hierarchical IBE from vastly different assumptions, and (3) provides a target for instantiating obfuscation applications from better tools.
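
    The contract just described can be captured in a small interface; the sketch below records only that contract, with illustrative names and no construction from the paper.

        from typing import Sequence

        class EncryptorCombiner:
            def combine(self, public_keys: Sequence[bytes]) -> bytes:
                """Derive the combined public key deterministically, so an
                outsider can recompute it from the input public keys."""
                raise NotImplementedError

            def derive_secret(self, index: int, secret_key: bytes,
                              public_keys: Sequence[bytes]) -> bytes:
                """Knowing the secret key for public_keys[index] suffices to
                learn the secret key for the combined public key; without
                any such key, ciphertexts under the combined key stay hidden."""
                raise NotImplementedError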

    Combiners for Functional Encryption, Unconditionally

    Functional encryption (FE) combiners allow one to combine many candidates for a functional encryption scheme, possibly based on different computational assumptions, into another functional encryption candidate with the guarantee that the resulting candidate is secure as long as at least one of the original candidates is secure. The fundamental question in this area is whether FE combiners exist. A series of works (Ananth et al. (CRYPTO '16), Ananth-Jain-Sahai (EUROCRYPT '17), Ananth et al. (TCC '19)) has constructed FE combiners from various assumptions. We give the first unconditional construction of combiners for functional encryption, resolving this question completely. Our construction immediately implies an unconditional universal functional encryption scheme, i.e., an explicit FE scheme that is secure as long as some secure FE scheme exists. Previously such results either relied on algebraic assumptions or required subexponential security assumptions.
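
    As a signature, an FE combiner is a transformation of candidate schemes; the interface sketch below records only that contract (names are illustrative, and no construction from the paper is implied).

        from typing import Sequence

        class FECandidate:
            """One functional-encryption candidate (interface only)."""
            def setup(self): raise NotImplementedError            # -> (mpk, msk)
            def keygen(self, msk, f): raise NotImplementedError   # -> sk_f
            def encrypt(self, mpk, x): raise NotImplementedError  # -> ct
            def decrypt(self, sk_f, ct): raise NotImplementedError  # -> f(x)

        def fe_combiner(candidates: Sequence[FECandidate]) -> FECandidate:
            """Contract: the returned scheme is correct if all candidates are,
            and secure as long as at least one candidate is secure."""
            raise NotImplementedError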

    From FE Combiners to Secure MPC and Back

    Functional encryption (FE) has incredible applications to computing on encrypted data. However, constructing the most general form of this primitive has remained elusive. Although some candidate constructions exist, they rely on nonstandard assumptions, and thus, their security has been questioned. An FE combiner attempts to make use of these candidates while minimizing the trust placed on any individual FE candidate. Informally, an FE combiner takes in a set of FE candidates and outputs a secure FE scheme if at least one of the candidates is secure. Another fundamental area in cryptography is secure multi-party computation (MPC), which has been extensively studied for several decades. In this work, we initiate a formal study of the relationship between FE combiners and MPC. In particular, we show implications in both directions between these primitives. As a consequence of these implications, we obtain the following main results. 1) A two-round semi-honest MPC protocol in the plain model secure against up to (n-1) corruptions with communication complexity proportional only to the depth of the circuit being computed, assuming LWE. Prior two-round protocols that achieved this communication complexity required a common reference string. 2) A functional encryption combiner based on pseudorandom generators (PRGs) in NC^1. Such PRGs can be instantiated from assumptions such as DDH and LWE. Previous constructions of FE combiners were known only from the learning with errors assumption. Using this result, we build a universal construction of functional encryption: an explicit construction of functional encryption based only on the assumptions that functional encryption exists and that PRGs in NC^1 exist.

    Pre-Constrained Encryption

    In all existing encryption systems, the owner of the master secret key has the ability to decrypt all ciphertexts. In this work, we propose a new notion of pre-constrained encryption (PCE) where the owner of the master secret key does not have "full" decryption power. Instead, its decryption power is constrained in a pre-specified manner during the system setup. We present formal definitions and constructions of PCE, and discuss societal applications and implications to some well-studied cryptographic primitives.
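
    A functionality-only sketch of the PCE idea, with hypothetical names and no security: the constraint is fixed at setup and travels with the master secret key, so the key holder can never decrypt outside it.

        def pce_setup(constraint):
            """Fix the decryption constraint once, at setup time."""
            msk = {"constraint": constraint}    # placeholder master key
            mpk = None                          # placeholder public params
            return mpk, msk

        def pce_master_decrypt(msk, payload):
            # The master-key holder recovers only payloads inside the
            # pre-specified set; everything else stays out of reach.
            if msk["constraint"](payload):
                return payload
            raise PermissionError("outside the pre-constrained set")

        mpk, msk = pce_setup(lambda p: p.startswith(b"public:"))
        assert pce_master_decrypt(msk, b"public: hello") == b"public: hello"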

    KEM Combiners

    Key-encapsulation mechanisms (KEMs) are a common stepping stone for constructing public-key encryption. Secure KEMs can be built from diverse assumptions, including ones related to integer factorization, discrete logarithms, error correcting codes, or lattices. In light of the recent NIST call for post-quantum secure PKE, the zoo of KEMs that are believed to be secure continues to grow. Yet opinions are divided on which KEM is the most secure. While using the best candidate might not seem necessary for everyday situations, placing the wrong bet can be devastating should the employed KEM eventually turn out to be vulnerable. We introduce KEM combiners as a way to garner trust from different KEM constructions, rather than relying on a single one: We present efficient black-box constructions that, given any set of 'ingredient' KEMs, yield a new KEM that is (CCA) secure as long as at least one of the ingredient KEMs is. As building blocks our constructions use cryptographic hash functions and blockciphers. Some corresponding security proofs require idealized models for these primitives; others get along with standard assumptions.
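
    One combiner in the style described above derives the combined session key by hashing all ingredient keys together with all ingredient ciphertexts. The sketch below assumes KEM objects whose encaps(pk) returns a (ciphertext, key) pair; every name is illustrative, and the ToyKEM exists only to exercise the combiner.

        import hashlib, os

        class ToyKEM:
            """Placeholder KEM with NO security, only to run the demo."""
            def keygen(self):
                sk = os.urandom(16)
                return sk, sk                   # pk == sk: insecure toy
            def encaps(self, pk):
                c = os.urandom(16)
                return c, hashlib.sha256(pk + c).digest()
            def decaps(self, sk, c):
                return hashlib.sha256(sk + c).digest()

        def combined_encaps(kem1, kem2, pk1, pk2):
            c1, k1 = kem1.encaps(pk1)
            c2, k2 = kem2.encaps(pk2)
            # Hashing the ciphertexts in as well is what supports a CCA
            # argument even when one ingredient KEM is broken.
            return (c1, c2), hashlib.sha256(k1 + k2 + c1 + c2).digest()

        def combined_decaps(kem1, kem2, sk1, sk2, ct):
            c1, c2 = ct
            k1 = kem1.decaps(sk1, c1)
            k2 = kem2.decaps(sk2, c2)
            return hashlib.sha256(k1 + k2 + c1 + c2).digest()

        kemA, kemB = ToyKEM(), ToyKEM()
        pk1, sk1 = kemA.keygen(); pk2, sk2 = kemB.keygen()
        ct, key = combined_encaps(kemA, kemB, pk1, pk2)
        assert combined_decaps(kemA, kemB, sk1, sk2, ct) == key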

    Broadcast Secret-Sharing, Bounds and Applications

    Consider a sender S and a group of n recipients. S holds a secret message m of length l bits, and the goal is to allow S to create a secret sharing of m with privacy threshold t among the recipients by broadcasting a single message c to the recipients. Our goal is to do this with information-theoretic security in a model with a simple form of correlated randomness. Namely, for each subset A of recipients of size q, S may share a random key with all recipients in A. (The keys shared with different subsets A must be independent.) We call this Broadcast Secret-Sharing (BSS) with parameters l, n, t and q. Our main question is: how large must c be, as a function of the parameters? We show that ((n-t)/q) · l is a lower bound, and we show an upper bound of (n(t+1)/(q+t) - t) · l, matching the lower bound whenever t = 0, q = 1, or q = n-t. When q = n-t, the size of c is exactly l, which is clearly minimal. The protocol demonstrating the upper bound in this case requires S to share a key with every subset of size n-t. We show that this overhead cannot be avoided when c has minimal size. We also show that if access is additionally given to an idealized PRG, the lower bound on the ciphertext size becomes ((n-t)/q) · λ + l - negl(λ) (where λ is the length of the input to the PRG), and the upper bound becomes (n(t+1)/(q+t) - t) · λ + l. BSS can be applied directly to secret-key threshold encryption. We can also consider a setting where the correlated randomness is generated using computationally secure and non-interactive key exchange, where we assume that each recipient has an (independently generated) public key for this purpose. In this model, any protocol for non-interactive secret sharing becomes an ad hoc threshold encryption (ATE) scheme, which is a threshold encryption scheme with no trusted setup beyond a PKI. Our upper bounds imply new ATE schemes, and our lower bound becomes a lower bound on the ciphertext size in any ATE scheme that uses a key exchange functionality and no other cryptographic primitives.
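
    A quick numeric check of the stated bounds, with hypothetical helper names: lower = ((n-t)/q) · l and upper = (n(t+1)/(q+t) - t) · l, which coincide at q = 1 and q = n-t.

        def bss_lower(n, t, q, l):
            return (n - t) / q * l

        def bss_upper(n, t, q, l):
            return (n * (t + 1) / (q + t) - t) * l

        n, t, l = 8, 2, 128
        for q in (1, n - t):                    # cases where the bounds match
            assert abs(bss_lower(n, t, q, l) - bss_upper(n, t, q, l)) < 1e-9
        assert bss_upper(n, t, n - t, l) == l   # q = n-t: size exactly l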

    On the Complexity of Compressing Obfuscation

    Indistinguishability obfuscation has become one of the most exciting cryptographic primitives due to its far reaching applications in cryptography and other fields. However, to date, obtaining a plausibly secure construction has been an elusive task, thus motivating the study of seemingly weaker primitives that imply it, with the possibility that they will be easier to construct. In this work, we provide a systematic study of compressing obfuscation, one of the most natural and simple to describe primitives that is known to imply indistinguishability obfuscation when combined with other standard assumptions. A compressing obfuscator is roughly an indistinguishability obfuscator that outputs just a slightly compressed encoding of the truth table. This generalizes notions introduced by Lin et al. (PKC 2016) and Bitansky et al. (TCC 2016) by allowing for a broader regime of parameters. We view compressing obfuscation as an independent cryptographic primitive and show various positive and negative results concerning its power and plausibility of existence, demonstrating significant differences from full-fledged indistinguishability obfuscation. First, we show that as a cryptographic building block, compressing obfuscation is weak. In particular, when combined with one-way functions, it cannot be used (in a black-box way) to achieve public-key encryption, even under (sub-)exponential security assumptions. This is in sharp contrast to indistinguishability obfuscation, which together with one-way functions implies almost all cryptographic primitives. Second, we show that to construct compressing obfuscation with perfect correctness, one only needs to assume its existence with a very weak correctness guarantee and polynomial hardness. Namely, we show a correctness amplification transformation with optimal parameters that relies only on polynomial hardness assumptions. This implies a universal construction assuming only polynomially secure compressing obfuscation with approximate correctness. In the context of indistinguishability obfuscation, we know how to achieve such a result only under sub-exponential security assumptions together with derandomization assumptions. Lastly, we characterize the existence of compressing obfuscation with statistical security. We show that in some range of parameters and for some classes of circuits such an obfuscator exists, whereas it is unlikely to exist with better parameters or for larger classes of circuits. These positive and negative results reveal a deep connection between compressing obfuscation and various concepts in complexity theory and learning theory.
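
    The compression requirement itself is easy to state concretely; the check below picks one illustrative parameter choice (the paper's regime of parameters is broader).

        def is_compressing(obfuscation_size_bits: int, n_inputs: int,
                           epsilon: float = 0.1) -> bool:
            """The truth table of an n-input boolean circuit has 2**n bits;
            a compressing obfuscator must output something shorter, here
            at most 2**((1 - epsilon) * n) bits."""
            return obfuscation_size_bits <= 2 ** ((1 - epsilon) * n_inputs)

        assert is_compressing(2 ** 50, 64)       # 2**50 beats 2**57.6
        assert not is_compressing(2 ** 64, 64)   # no smaller than the table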