
    Cryptography based on the Hardness of Decoding

    This thesis provides progress in the fields of lattice- and coding-based cryptography. The first contribution consists of constructions of IND-CCA2 secure public-key cryptosystems from both the McEliece and the low-noise learning parity with noise (LPN) assumptions. The second contribution is a novel instantiation of the lattice-based learning with errors problem which uses uniform errors.

    Low Noise LPN: KDM Secure Public Key Encryption and Sample Amplification

    Cryptographic schemes based on the Learning Parity with Noise (LPN) problem have several very desirable features: low computational overhead, simple implementation and conjectured post-quantum hardness. Choosing the LPN noise parameter sufficiently low allows for public-key cryptography. In this work, we construct the first standard-model public-key encryption scheme with key-dependent message (KDM) security based solely on the low-noise LPN problem. Additionally, we establish a new connection between LPN with a bounded number of samples and LPN with an unbounded number of samples. In essence, we show that if LPN with a small error and a small number of samples is hard, then LPN with a slightly larger error and an unbounded number of samples is also hard. The key technical ingredient in establishing both results is a variant of the LPN problem called the extended LPN problem.
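
    To make the objects above concrete, here is a minimal sketch of how LPN samples are generated; the parameter choices are ours and purely illustrative, with the noise rate $\tau \approx 1/\sqrt{n}$ standing in for the low-noise regime the abstract refers to:

        import numpy as np

        def lpn_samples(n=128, m=256, tau=None, rng=None):
            """Generate m LPN samples (A, b) with b = A*s + e over GF(2).

            tau defaults to 1/sqrt(n), a low-noise choice; all parameters
            here are illustrative, not taken from the paper.
            """
            rng = rng or np.random.default_rng()
            tau = tau if tau is not None else 1.0 / np.sqrt(n)
            s = rng.integers(0, 2, size=n)            # secret vector
            A = rng.integers(0, 2, size=(m, n))       # uniform public matrix
            e = (rng.random(m) < tau).astype(int)     # Bernoulli(tau) errors
            b = (A @ s + e) % 2                       # noisy inner products
            return A, b, s

        # Decisional LPN asks to distinguish (A, b) from (A, uniform).
        A, b, s = lpn_samples()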

    A CCA2 Secure Variant of the McEliece Cryptosystem

    The McEliece public-key encryption scheme has become an interesting alternative to cryptosystems based on number-theoretic problems. Unlike RSA and ElGamal, the McEliece PKC is not known to be broken by a quantum computer. Moreover, even though the McEliece PKC has a relatively big key size, encryption and decryption operations are rather efficient. In spite of all the recent results on coding-theory-based cryptosystems, to date there are no constructions secure against chosen-ciphertext attacks in the standard model - the de facto security notion for public-key cryptosystems. In this work, we show the first construction of a McEliece-based public-key cryptosystem secure against chosen-ciphertext attacks in the standard model. Our construction is inspired by a recently proposed technique by Rosen and Segev.
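
    For intuition about the coding-based principle behind McEliece ("codeword plus low-weight error"), here is a toy sketch in which a 3-fold repetition code stands in for the hidden binary Goppa code of the real scheme; it is illustrative only and, of course, not secure:

        import numpy as np

        k, r = 8, 3        # message bits and toy repetition factor (ours)

        def encrypt(m, rng):
            c = np.repeat(m, r)                # encode: each bit copied r times
            e = np.zeros(k * r, dtype=int)
            e[rng.integers(0, k * r)] = 1      # a single random bit flip
            return (c + e) % 2                 # ciphertext = codeword + error

        def decrypt(c):
            # a majority vote per bit corrects the weight-1 error
            return (c.reshape(k, r).sum(axis=1) > r // 2).astype(int)

        rng = np.random.default_rng(0)
        m = rng.integers(0, 2, size=k)
        assert np.array_equal(decrypt(encrypt(m, rng)), m)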

    Maliciously Circuit-Private FHE from Information-Theoretic Principles

    Fully homomorphic encryption (FHE) allows arbitrary computations on encrypted data. The standard security requirement, IND-CPA security, ensures that the encrypted data remain private. However, it does not guarantee privacy for the computation performed on the encrypted data. Statistical circuit privacy offers a strong privacy guarantee for the computation process, namely that a homomorphically evaluated ciphertext does not leak any information on how the result of the computation was obtained. Malicious statistical circuit privacy requires this to hold even for maliciously generated keys and ciphertexts. Ostrovsky, Paskin-Cherniavsky and Paskin-Cherniavsky (CRYPTO 2014) constructed an FHE scheme achieving malicious statistical circuit privacy. Their construction, however, makes non-black-box use of a specific underlying FHE scheme, resulting in a circuit-private scheme with inherently high overhead. This work presents a conceptually different construction of maliciously circuit-private FHE from simple information-theoretic principles. Furthermore, our construction only makes black-box use of the underlying FHE scheme, opening the possibility of achieving practically efficient schemes. Finally, in contrast to the OPP scheme, in our scheme pre- and post-homomorphic ciphertexts are syntactically the same, enabling new applications in multi-hop settings.
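
    For orientation, one common way to write down statistical circuit privacy (the notation is ours, not the paper's) is

        \[
          \mathsf{Eval}(pk, C, \mathsf{ct}) \;\approx_s\; \mathsf{Sim}\bigl(pk, \mathsf{ct}, C(x)\bigr),
        \]

    where $x$ is the plaintext underlying $\mathsf{ct}$ (extracted, possibly inefficiently, in the malicious case) and the statistical closeness must hold even for adversarially generated $pk$ and $\mathsf{ct}$.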

    On The Black-Box Complexity of Correlation Intractability

    Correlation intractability is an emerging cryptographic paradigm that enabled several recent breakthroughs in establishing soundness of the Fiat-Shamir transform and, consequently, basing non-interactive zero-knowledge proofs and succinct arguments on standard cryptographic assumptions. In a nutshell, a hash family is said to be correlation intractable for a class of relations $\mathcal{R}$ if, for any relation $R\in\mathcal{R}$, it is hard, given a random hash function $h\gets H$, to find an input $z$ such that $(z,h(z))\in R$, namely a correlation. Despite substantial progress in constructing correlation intractable hash functions (CIH), all constructions known to date are based on highly structured hardness assumptions and, further, have complexity scaling with the circuit complexity of the target relation class. In this work, we initiate the study of the barriers to building correlation intractability. Our main result is a lower bound on the complexity of any black-box construction of CIH from collision-resistant hashing (CRH) or one-way permutations (OWP), for any sufficiently expressive relation class. In particular, any such construction for a class of relations with circuit complexity $t$ must make at least $\Omega(t)$ invocations of the underlying building block. We see this as a first step in developing a methodology towards broader lower bounds.
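
    To unpack the definition, the following toy spells out what finding a correlation means for the sparse example relation $R=\{(z,y) : y=0\}$; the hash, the relation and the parameters are ours, chosen small enough that brute force succeeds:

        import hashlib

        def h(z: bytes) -> int:
            # toy 8-bit hash; a real CIH family would be keyed
            return hashlib.sha256(z).digest()[0]

        def find_correlation(max_tries=100_000):
            """Search for z with (z, h(z)) in R, i.e. h(z) == 0."""
            for j in range(max_tries):
                z = j.to_bytes(4, "big")
                if h(z) == 0:                  # correlation found
                    return z
            return None

        z = find_correlation()
        print(z.hex(), h(z))                   # succeeds after ~256 tries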

    Two-Round Oblivious Linear Evaluation from Learning with Errors

    Oblivious Linear Evaluation (OLE) is the arithmetic analogue of the well-known oblivious transfer primitive. It allows a sender, holding an affine function $f(x)=a+bx$ over a finite field or ring, to let a receiver learn $f(w)$ for a $w$ of the receiver's choice. In terms of security, the sender remains oblivious of the receiver's input $w$, whereas the receiver learns nothing beyond $f(w)$ about $f$. In recent years, OLE has emerged as an essential building block for constructing efficient, reusable and maliciously-secure two-party computation. In this work, we present efficient two-round protocols for OLE over large fields based on the Learning with Errors (LWE) assumption, providing a full arithmetic generalization of the oblivious transfer protocol of Peikert, Vaikuntanathan and Waters (CRYPTO 2008). At the technical core of our work is a novel extraction technique which makes it possible to determine whether a non-trivial multiple of some vector is close to a $q$-ary lattice.
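
    Stripped of the protocol, the OLE functionality itself is just the following map; the toy modulus is ours, standing in for the large fields used in the paper:

        q = 2**13 - 1      # toy prime modulus standing in for a large field

        def ole(a, b, w):
            """Ideal OLE: the receiver learns f(w) = a + b*w and nothing
            else about f; the sender learns nothing about w."""
            return (a + b * w) % q

        assert ole(7, 11, 42) == (7 + 11 * 42) % q    # f(x) = 7 + 11x at w = 42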

    A Framework for Statistically Sender Private OT with Optimal Rate

    Statistical sender privacy (SSP) is the strongest achievable security notion for two-message oblivious transfer (OT) in the standard model, providing statistical security against malicious receivers and computational security against semi-honest senders. In this work we provide a novel construction of SSP OT from the Decisional Diffie-Hellman (DDH) and the Learning Parity with Noise (LPN) assumptions achieving (asymptotically) optimal amortized communication complexity, i.e. it achieves rate 1. Concretely, the total communication complexity for $k$ OT instances is $2k(1+o(1))$, which (asymptotically) approaches the information-theoretic lower bound. Previously, it was only known how to realize this primitive using heavy rate-1 FHE techniques [Brakerski et al., Gentry and Halevi, TCC '19]. At the heart of our construction is a primitive called statistical co-PIR, essentially a public-key encryption scheme which statistically erases bits of the message in a few hidden locations. Our scheme achieves nearly optimal ciphertext size and provides statistical security against malicious receivers. Computational security against semi-honest senders holds under the DDH assumption.
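
    Spelling out the rate claim with the numbers from the abstract (notation ours, with the $2k$ bits taken as the information-theoretic baseline mentioned above):

        \[
          \mathrm{comm}(k) = 2k\,(1+o(1))
          \qquad\Longrightarrow\qquad
          \mathrm{rate} = \frac{2k}{\mathrm{comm}(k)} \xrightarrow{\;k\to\infty\;} 1,
        \]

    so the total communication exceeds the lower bound only by lower-order terms.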

    Multiparty Cardinality Testing for Threshold Private Set Intersection

    Threshold Private Set Intersection (PSI) allows multiple parties to compute the intersection of their input sets if and only if the intersection is larger than $n-t$, where $n$ is the size of the sets and $t$ is some threshold. The main appeal of this primitive is that, in contrast to standard PSI, known upper bounds on the communication complexity depend only on the threshold $t$ and not on the sizes of the input sets. Current Threshold PSI protocols consist of two components: a Cardinality Testing phase, where the parties decide if the intersection is larger than some threshold, and a PSI phase, where the intersection is computed. The main source of inefficiency in Threshold PSI is the former. In this work, we present a new Cardinality Testing protocol that allows $N$ parties to check if the intersection of their input sets is larger than $n-t$. The protocol incurs $\tilde{\mathcal{O}}(Nt^2)$ communication complexity. We thus obtain a Threshold PSI scheme for $N$ parties with communication complexity $\tilde{\mathcal{O}}(Nt^2)$.
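
    As a functional summary of what such a protocol computes, with plain sets standing in for the cryptography (names ours):

        def threshold_psi(sets, t):
            """Reveal the intersection only if it has size at least n - t."""
            n = len(sets[0])                   # all input sets have size n
            common = set.intersection(*sets)
            if len(common) >= n - t:           # the Cardinality Testing phase
                return common                  # the PSI phase then reveals it
            return None                        # otherwise nothing is revealed

        parties = [set(range(0, 10)), set(range(1, 11)), set(range(2, 12))]
        assert threshold_psi(parties, t=3) == set(range(2, 10))   # 8 >= 10 - 3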

    Rate-1 Incompressible Encryption from Standard Assumptions

    Incompressible encryption, recently proposed by Guan, Wichs and Zhandry (EUROCRYPT '22), is a novel encryption paradigm geared towards providing strong long-term security guarantees against adversaries with bounded long-term memory. As long as the adversary forgets even a small fraction of a ciphertext, this notion provides strong security for the message encrypted therein, even if, at some point in the future, the entire secret key is exposed. This comes at the price of having potentially very large ciphertexts. Thus, an important efficiency measure for incompressible encryption is the message-to-ciphertext ratio (also called the rate). Guan et al. provided a low-rate instantiation of this notion from standard assumptions and a rate-1 instantiation from indistinguishability obfuscation (iO). In this work, we propose a simple framework for building rate-1 incompressible encryption from standard assumptions. Our construction can be realized from, e.g., the DDH assumption combined with either the DCR or the LWE assumption.
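
    For concreteness, the rate mentioned here is simply (notation ours)

        \[
          \mathrm{rate} = \frac{|m|}{|\mathsf{ct}|},
          \qquad
          \text{rate-1:}\quad |\mathsf{ct}| = |m|\,(1+o(1)),
        \]

    so a rate-1 scheme adds essentially no ciphertext expansion on top of the message, even though the ciphertext itself must be large for incompressibility to bite.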