
    A Survey on Homomorphic Encryption Schemes: Theory and Implementation

    Legacy encryption systems depend on sharing a key (public or private) among the peers involved in exchanging an encrypted message. However, this approach poses privacy concerns. Especially with popular cloud services, control over the privacy of sensitive data is lost. Even when the keys are not shared, the encrypted material is shared with a third party that does not necessarily need to access its content. Moreover, untrusted servers, providers, and cloud operators can retain identifying elements of users long after users end their relationship with the services. Homomorphic Encryption (HE), a special kind of encryption scheme, can address these concerns, as it allows any third party to operate on the encrypted data without decrypting it first. Although this extremely useful feature of HE has been known for over 30 years, the first plausible and achievable Fully Homomorphic Encryption (FHE) scheme, which allows any computable function to be performed on encrypted data, was introduced by Craig Gentry in 2009. Even though this was a major achievement, implementations so far have demonstrated that FHE still needs significant improvement to be practical on every platform. First, we present the basics of HE and the details of the well-known Partially Homomorphic Encryption (PHE) and Somewhat Homomorphic Encryption (SWHE) schemes, which are important pillars on the way to FHE. Then, the main FHE families, which have become the basis for follow-up FHE schemes, are presented. Furthermore, the implementations and recent improvements in Gentry-type FHE schemes are surveyed. Finally, further research directions are discussed. This survey is intended to give a clear foundation to researchers and practitioners interested in understanding, applying, and extending state-of-the-art HE, PHE, SWHE, and FHE systems.
    Comment: Updated October 6, 2017. This paper is an early draft of the survey being submitted to ACM CSUR and has been uploaded to arXiv for feedback from stakeholders.
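
    To make the "operate on encrypted data without decrypting it" idea concrete, below is a minimal, hedged sketch of a partially homomorphic (additive) scheme in the Paillier style, one of the PHE schemes this survey covers. The primes, names, and structure are illustrative assumptions chosen for brevity; the parameters are toy-sized and insecure.

        # Toy Paillier-style additively homomorphic encryption (illustration only).
        import random
        from math import gcd

        def lcm(a, b):
            return a * b // gcd(a, b)

        def keygen(p=293, q=433):                    # toy primes, NOT secure
            n = p * q
            lam = lcm(p - 1, q - 1)
            g = n + 1                                # standard simple generator choice
            mu = pow(lam, -1, n)                     # valid because g = n + 1
            return (n, g), (lam, mu)

        def encrypt(pk, m):
            n, g = pk
            r = random.randrange(1, n)
            while gcd(r, n) != 1:
                r = random.randrange(1, n)
            return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

        def decrypt(pk, sk, c):
            n, _ = pk
            lam, mu = sk
            L = (pow(c, lam, n * n) - 1) // n        # L(x) = (x - 1) / n
            return (L * mu) % n

        pk, sk = keygen()
        c1, c2 = encrypt(pk, 17), encrypt(pk, 25)
        c_sum = (c1 * c2) % (pk[0] ** 2)             # ciphertext product = plaintext sum
        assert decrypt(pk, sk, c_sum) == 42          # 17 + 25 computed under encryption

    A fully homomorphic scheme extends this from a single operation (here, addition) to arbitrary computable functions, which is exactly the gap Gentry's 2009 construction closed.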

    Sanitization of FHE ciphertexts

    By definition, fully homomorphic encryption (FHE) schemes support homomorphic decryption, and all known FHE constructions are bootstrapped from a Somewhat Homomorphic Encryption (SHE) scheme via this technique. Additionally, when a public key is provided, ciphertexts are also re-randomizable, e.g., by adding to them fresh encryptions of 0. From those two operations we devise an algorithm to sanitize a ciphertext by making its distribution canonical. In particular, the distribution of the ciphertext does not depend on the circuit that led to it via homomorphic evaluation, thus providing circuit privacy in the honest-but-curious model. Unlike the previous approach based on noise flooding, our approach does not significantly degrade the security/efficiency trade-off of the underlying FHE scheme. The technique can be applied to all lattice-based FHE schemes proposed so far without substantially affecting their concrete parameters.
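
    As a hedged illustration of the re-randomization half of this idea, the simplest setting is an additively homomorphic scheme such as the toy Paillier-style sketch above: multiplying a ciphertext by a fresh encryption of 0 yields a ciphertext distributed exactly like a fresh encryption of the same plaintext, regardless of the evaluation that produced it. For lattice-based FHE the noise term rules out such perfect re-randomization, which is why the paper interleaves re-randomization with homomorphic decryption (bootstrapping). The snippet below shows only the re-randomization step, with toy, insecure parameters.

        # Re-randomization in a toy Paillier-style scheme (encryption only).
        import random
        from math import gcd

        p, q = 293, 433                              # toy primes, NOT secure
        n, n2 = p * q, (p * q) ** 2
        g = n + 1

        def enc(m):
            r = random.randrange(1, n)
            while gcd(r, n) != 1:
                r = random.randrange(1, n)
            return (pow(g, m, n2) * pow(r, n, n2)) % n2

        def rerandomize(c):
            return (c * enc(0)) % n2                 # fold in a fresh encryption of 0

        evaluated = (enc(3) * enc(4)) % n2           # result of a homomorphic "circuit" (3 + 4)
        sanitized = rerandomize(evaluated)           # now distributed like a fresh enc(7)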

    Efficient Fully Homomorphic Encryption from (Standard) LWE

    A fully homomorphic encryption (FHE) scheme allows anyone to transform an encryption of a message, m, into an encryption of any (efficient) function of that message, f(m), without knowing the secret key. We present a leveled FHE scheme that is based solely on the (standard) learning with errors (LWE) assumption. (Leveled FHE schemes are initialized with a bound on the maximal evaluation depth. However, this restriction can be removed by assuming “weak circular security.”) Applying known results on LWE, the security of our scheme is based on the worst-case hardness of “short vector problems” on arbitrary lattices. Our construction improves on previous works in two aspects: 1. We show that “somewhat homomorphic” encryption can be based on LWE, using a new relinearization technique. In contrast, all previous schemes relied on complexity assumptions related to ideals in various rings. 2. We deviate from the “squashing paradigm” used in all previous works. We introduce a new dimension-modulus reduction technique, which shortens the ciphertexts and reduces the decryption complexity of our scheme, without introducing additional assumptions. Our scheme has very short ciphertexts, and we therefore use it to construct an asymptotically efficient LWE-based single-server private information retrieval (PIR) protocol. The communication complexity of our protocol (in the public-key model) is k·polylog(k) + log |DB| bits per single-bit query, in order to achieve security against 2^k-time adversaries (based on the best known attacks against our underlying assumptions).
    Key words: cryptology, public-key encryption, fully homomorphic encryption, learning with errors, private information retrieval.
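
    For readers new to LWE, here is a hedged toy sketch of Regev-style symmetric encryption of a single bit and the "somewhat homomorphic" addition it supports; the parameters are illustrative and insecure. Homomorphic multiplication, which would tensor two ciphertexts and then apply the relinearization step described above, is omitted.

        # Toy Regev-style symmetric LWE encryption of a bit (illustration only).
        import random

        q, dim = 2 ** 15, 16                         # toy modulus and dimension, NOT secure

        def keygen():
            return [random.randrange(q) for _ in range(dim)]

        def enc(s, m, noise=4):
            a = [random.randrange(q) for _ in range(dim)]
            e = random.randint(-noise, noise)
            b = (sum(ai * si for ai, si in zip(a, s)) + e + m * (q // 2)) % q
            return a, b

        def dec(s, ct):
            a, b = ct
            d = (b - sum(ai * si for ai, si in zip(a, s))) % q
            return 1 if q // 4 < d < 3 * q // 4 else 0   # round to the nearest multiple of q/2

        def add(ct1, ct2):                           # homomorphic XOR; the noise terms add up
            (a1, b1), (a2, b2) = ct1, ct2
            return [(x + y) % q for x, y in zip(a1, a2)], (b1 + b2) % q

        s = keygen()
        assert dec(s, add(enc(s, 1), enc(s, 1))) == 0    # 1 XOR 1 recovered under encryption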

    Homomorphic Encryption — Theory and Application


    On the Cryptographic Hardness of Local Search

    We show new hardness results for the class of Polynomial Local Search problems (PLS):
    - Hardness of PLS based on a falsifiable assumption on bilinear groups introduced by Kalai, Paneth, and Yang (STOC 2019), and the Exponential Time Hypothesis for randomized algorithms. Previous standard-model constructions relied on non-falsifiable and non-standard assumptions.
    - Hardness of PLS relative to random oracles. The construction is essentially different from previous constructions and, in particular, is unconditionally secure. It also demonstrates the hardness of parallelizing local search.
    The core observation behind the results is that the unique proofs property of incrementally verifiable computations previously used to demonstrate hardness in PLS can be traded for a simple incremental completeness property.

    Generalized Learning Problems and Applications to Non-commutative Cryptography

    We propose a generalization of the learning parity with noise (LPN) and learning with errors (LWE) problems to an abstract class of group-theoretic learning problems that we term learning homomorphisms with noise (LHN). This class of problems contains LPN and LWE as special cases, but is much more general. It allows, for example, instantiations based on non-abelian groups, resulting in a new avenue for the application of combinatorial group theory to the development of cryptographic primitives. We then study a particular instantiation using relatively free groups and construct a symmetric cryptosystem based upon it.
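
    For reference, the following hedged toy sketch writes out the LPN special case in this language: the unknown homomorphism is x -> <x, s> from (Z_2)^n to Z_2, and the noise flips each output bit with probability tau. The parameter choices are illustrative only.

        # Toy LPN sample generator (the abelian special case of LHN).
        import random

        dim, tau = 32, 0.125

        def secret():
            return [random.randrange(2) for _ in range(dim)]

        def sample(s):
            a = [random.randrange(2) for _ in range(dim)]
            e = 1 if random.random() < tau else 0    # Bernoulli(tau) noise
            b = (sum(ai * si for ai, si in zip(a, s)) + e) % 2   # <a, s> + e mod 2
            return a, b

        s = secret()
        samples = [sample(s) for _ in range(100)]    # recovering s from these is the LPN problem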

    Non-interactive Universal Arguments

    In 2002, Barak and Goldreich introduced the notion of a universal argument and constructed an interactive universal argument for non-deterministic computations based on polynomially hard collision-resistant hash functions. Since then, and especially in recent years, there have been tremendous developments in the construction of non-interactive succinct arguments for deterministic computations under standard hardness assumptions. However, the constructed succinct arguments can be proven universal only under sub-exponential assumptions. Assuming polynomially hard fully homomorphic encryption and a widely believed worst-case complexity assumption, we prove a general lifting theorem showing that all existing non-interactive succinct arguments can be made universal. The required complexity assumption is that non-uniformity does not allow arbitrary polynomial speedup. In the setting of uniform adversaries, this extra assumption is not needed.

    Improving Additive and Multiplicative Homomorphic Encryption Schemes Based on Worst-Case Hardness Assumptions

    In CRYPTO 2010, Aguilar et al. proposed a somewhat homomorphic encryption scheme, i.e., an encryption scheme that allows computing a limited number of sums and products over encrypted data, with a security reduction from LWE over general lattices. General lattices (as opposed to ideal lattices) do not have an inherent multiplicative structure, but, using a tensor product, Aguilar et al. managed to obtain a scheme that can compute products with a polylogarithmic number of operands. In this paper we present an alternative construction that can compute products with polynomially many operands while preserving the security reductions of the initial scheme. Unfortunately, despite this improvement, our construction seems to be incompatible with Gentry's seminal transformation for obtaining fully homomorphic encryption schemes. Recently, Brakerski et al. used the tensor product approach introduced by Aguilar et al. in a new way that radically improves the performance of the resulting scheme. Based on this approach, and using two nice optimizations, their scheme is able to evaluate products with exponentially many operands and can be transformed into an efficient fully homomorphic encryption scheme while being based on general lattice problems. However, even though these results outperform the construction presented here, we believe the modifications we suggest for Aguilar et al.'s schemes are of independent interest.
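
    The algebraic fact behind the tensor-product approach mentioned above can be checked directly: for ciphertext vectors c1, c2 and a secret s, the inner product of c1 (x) c2 with s (x) s equals the product of the inner products <c1, s> and <c2, s>, so the product of two decryption values can be read off a (longer) tensored ciphertext. The hedged sketch below verifies only this identity; it is not any of the schemes discussed in the paper.

        # Check <c1 (x) c2, s (x) s> = <c1, s> * <c2, s> over Z_q (illustration only).
        import random

        q, dim = 97, 4

        def tensor(u, v):
            return [ui * vi % q for ui in u for vi in v]

        def dot(u, v):
            return sum(ui * vi for ui, vi in zip(u, v)) % q

        s  = [random.randrange(q) for _ in range(dim)]
        c1 = [random.randrange(q) for _ in range(dim)]
        c2 = [random.randrange(q) for _ in range(dim)]

        assert dot(tensor(c1, c2), tensor(s, s)) == dot(c1, s) * dot(c2, s) % q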

    Noninteractive Zero Knowledge for NP from (Plain) Learning With Errors

    We finally close the long-standing problem of constructing a noninteractive zero-knowledge (NIZK) proof system for any NP language with security based on the plain Learning With Errors (LWE) problem, and thereby on worst-case lattice problems. Our proof system instantiates the framework recently developed by Canetti et al. [EUROCRYPT '18], Holmgren and Lombardi [FOCS '18], and Canetti et al. [STOC '19] for soundly applying the Fiat-Shamir transform using a hash function family that is correlation intractable for a suitable class of relations. Previously, such hash families were based either on "exotic" assumptions (e.g., indistinguishability obfuscation or optimal hardness of certain LWE variants) or, more recently, on the existence of circularly secure fully homomorphic encryption (FHE). However, none of these assumptions are known to be implied by plain LWE or worst-case hardness. Our main technical contribution is a hash family that is correlation intractable for arbitrary size-S circuits, for any polynomially bounded S, based on plain LWE (with small polynomial approximation factors). The construction combines two novel ingredients: a correlation-intractable hash family for log-depth circuits based on LWE (or even the potentially harder Short Integer Solution problem), and a "bootstrapping" transform that uses (leveled) FHE to promote correlation intractability for the FHE decryption circuit to arbitrary (bounded) circuits. Our construction can be instantiated in two possible "modes," yielding a NIZK that is either computationally sound and statistically zero knowledge in the common random string model, or vice versa in the common reference string model.
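
    For intuition about the transform being instantiated, here is a hedged toy sketch of the Fiat-Shamir paradigm applied to a familiar sigma protocol (Schnorr's proof of knowledge of a discrete logarithm, used only as a stand-in): the verifier's random challenge is replaced by a hash of the transcript. The paper's contribution is a correlation-intractable hash family from plain LWE for which this replacement is provably sound; the SHA-256 call below is merely a placeholder for such a hash, and the group parameters are toy-sized and insecure.

        # Fiat-Shamir applied to a toy Schnorr proof of knowledge (illustration only).
        import hashlib
        import random

        p, q, g = 167, 83, 4                         # g generates the order-83 subgroup of Z_167*

        def challenge(*transcript):                  # hash of the transcript replaces the verifier
            data = b"|".join(str(t).encode() for t in transcript)
            return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

        def prove(x):                                # prove knowledge of x with y = g^x mod p
            y = pow(g, x, p)
            r = random.randrange(q)
            a = pow(g, r, p)                         # prover's first message
            c = challenge(g, y, a)                   # derived challenge, no interaction needed
            z = (r + c * x) % q                      # response
            return y, (a, z)

        def verify(y, proof):
            a, z = proof
            c = challenge(g, y, a)
            return pow(g, z, p) == (a * pow(y, c, p)) % p

        y, proof = prove(x=17)
        assert verify(y, proof)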