    Statically Aggregate Verifiable Random Functions and Application to E-Lottery

    Cohen, Goldwasser, and Vaikuntanathan (TCC'15) introduced the concept of aggregate pseudo-random functions (PRFs), which allow efficiently computing the aggregate of PRF values over exponential-sized sets. In this paper, we explore the same aggregation augmentation for verifiable random functions (VRFs), introduced by Micali, Rabin and Vadhan (FOCS'99), as well as its application to e-lottery schemes. We introduce the notion of static aggregate verifiable random functions (Agg-VRFs), which perform aggregation for VRFs in a static setting. Our contributions can be summarized as follows: (1) we define static aggregate VRFs, which allow the efficient aggregation of VRF values and the corresponding proofs over super-polynomially large sets; (2) we present a static Agg-VRF construction over bit-fixing sets with respect to product aggregation, based on the q-decisional Diffie-Hellman exponent assumption; (3) we compare the performance of our static Agg-VRF instantiation to that of a standard (non-aggregate) VRF in terms of the time required for aggregation and verification, which shows that Agg-VRFs considerably lower the verification time for large sets; and (4) by employing Agg-VRFs, we propose an improved e-lottery scheme based on the framework of Chow et al.'s VRF-based e-lottery proposal (ICCSA'05). We evaluate the performance of Chow et al.'s e-lottery scheme and our improved scheme; the latter shows a significant improvement in the efficiency of generating the winning number and of player verification.
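
    Product aggregation over a bit-fixing set is easiest to see on a Naor-Reingold-style function: the exponent of the aggregate factors across the free bit positions, so a product over a super-polynomially large set collapses to O(n) group operations. The toy Python sketch below shows only this value-aggregation identity; the group size is insecure, the names are illustrative, and the paper's Agg-VRF additionally aggregates the correctness proofs, which the sketch omits.

        import secrets

        P = 2**61 - 1          # toy prime modulus (insecure size)
        ORDER = P - 1          # exponent arithmetic modulo the group order
        G = 5                  # toy generator
        N = 8                  # input length in bits
        key = [secrets.randbelow(ORDER - 1) + 1 for _ in range(N)]

        def prf(x_bits):
            """F(k, x) = G ** (prod over i with x_i = 1 of k_i), Naor-Reingold style."""
            e = 1
            for bit, k in zip(x_bits, key):
                if bit:
                    e = (e * k) % ORDER
            return pow(G, e, P)

        def aggregate(pattern):
            """Product of F(k, x) over every x matching pattern (entries 0, 1 or '*').
            The aggregate's exponent factors: each fixed 1-position contributes k_i
            and each free position contributes (1 + k_i), so a set of size
            2^(#free) costs O(n) work instead of 2^(#free) evaluations."""
            e = 1
            for sym, k in zip(pattern, key):
                if sym == 1:
                    e = (e * k) % ORDER
                elif sym == '*':
                    e = (e * (1 + k)) % ORDER
            return pow(G, e, P)

        # Sanity check against naive enumeration on a small bit-fixing set.
        pattern = [1, 0, '*', '*', 1, '*', 0, '*']
        free = [i for i, s in enumerate(pattern) if s == '*']
        naive = 1
        for mask in range(2 ** len(free)):
            x = [s if s != '*' else 0 for s in pattern]
            for j, i in enumerate(free):
                x[i] = (mask >> j) & 1
            naive = (naive * prf(x)) % P
        assert naive == aggregate(pattern)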

    Adaptive-Secure VRFs with Shorter Keys from Static Assumptions

    Verifiable random functions are pseudorandom functions that produce publicly verifiable proofs for their outputs, allowing for efficient checks of the correctness of their computation. In this work, we introduce a new computational hypothesis, the n-Eigen-Value assumption, which can be seen as a relaxation of the U_n-MDDH assumption, and prove its equivalence with the n-Rank assumption. Based on this newly introduced hypothesis, we build the core of a verifiable random function with an exponentially large input space that reaches adaptive security under a static assumption. The final construction achieves shorter public and secret keys than existing schemes with the same properties.
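
    For readers unfamiliar with the primitive, the VRF contract is easiest to see in code: the secret-key holder computes an output together with a proof, and anyone holding the public key can check that the output is the unique correct one. The sketch below is not the paper's n-Eigen-Value construction; it is the classic RSA-FDH-style VUF-to-VRF in the random-oracle model, with toy (insecure) parameters and illustrative names.

        import hashlib

        # Toy RSA modulus from two Mersenne primes (insecure sizes, illustrative only).
        P_, Q_ = 2**89 - 1, 2**107 - 1
        N = P_ * Q_
        E = 65537
        D = pow(E, -1, (P_ - 1) * (Q_ - 1))   # Python 3.8+ modular inverse

        def _fdh(x: bytes) -> int:
            """Full-domain-style hash of the input into Z_N (random-oracle stand-in)."""
            return int.from_bytes(hashlib.sha256(b'in:' + x).digest(), 'big') % N

        def vrf_prove(x: bytes):
            """Secret-key holder: the proof is the unique E-th root of the hashed
            input, and the VRF output is a hash of that proof."""
            proof = pow(_fdh(x), D, N)
            return hashlib.sha256(proof.to_bytes(32, 'big')).digest(), proof

        def vrf_verify(x: bytes, output: bytes, proof: int) -> bool:
            """Anyone holding (N, E): check the proof against the hashed input,
            then recompute the output from the proof."""
            return (pow(proof, E, N) == _fdh(x)
                    and output == hashlib.sha256(proof.to_bytes(32, 'big')).digest())

        out, pi = vrf_prove(b'hello')
        assert vrf_verify(b'hello', out, pi)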

    Constructing Verifiable Random Functions with Large Input Spaces

    We present a family of verifiable random functions which are provably secure for exponentially-large input spaces under a non-interactive complexity assumption. Prior constructions required either an interactive complexity assumption or one that could tolerate a factor 2^n security loss for n-bit inputs. Our construction is practical and inspired by the pseudorandom functions of Naor and Reingold and the verifiable random functions of Lysyanskaya. It is set in a bilinear group, where the Decisional Diffie-Hellman problem is easy to solve, and relies on the ℓ-Decisional Diffie-Hellman Exponent assumption in the standard model, without a common reference string. Our core idea is to apply a simulation technique where the large space of VRF inputs is collapsed into a small (polynomial-size) input in the view of the reduction algorithm. This view, however, is information-theoretically hidden from the attacker. Since the input space is exponentially large, we can first apply a collision-resistant hash function to handle arbitrarily-large inputs.
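
    The closing remark, that a collision-resistant hash lets a fixed-input-length VRF handle inputs of arbitrary size, amounts to the simple hash-then-evaluate composition sketched below. SHA-256 stands in for the CRHF and a Naor-Reingold-style evaluation (output only, proofs omitted) stands in for the VRF core; the group size, names and bit length are all illustrative, not the paper's construction.

        import hashlib
        import secrets

        P = 2**61 - 1                     # toy prime modulus (insecure size)
        G = 5                             # toy generator
        N_BITS = 128                      # fixed input length of the VRF core
        key = [secrets.randbelow(P - 2) + 1 for _ in range(N_BITS)]

        def collapse(msg: bytes) -> list:
            """CRHF step: arbitrary-length message -> N_BITS input bits."""
            digest = hashlib.sha256(msg).digest()
            return [(b >> i) & 1 for b in digest[:N_BITS // 8] for i in range(8)]

        def vrf_core_eval(bits) -> int:
            """Naor-Reingold-style core on fixed-length inputs (proofs omitted)."""
            e = 1
            for bit, k in zip(bits, key):
                if bit:
                    e = (e * k) % (P - 1)
            return pow(G, e, P)

        def vrf_eval(msg: bytes) -> int:
            """Hash-then-evaluate: a collision in SHA-256 is the only way two
            distinct messages can share an output (and proof)."""
            return vrf_core_eval(collapse(msg))

        y = vrf_eval(b'an input of arbitrary length ...')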

    Cryptography based on the Hardness of Decoding

    This thesis provides progress in the fields of lattice-based and coding-based cryptography. The first contribution consists of constructions of IND-CCA2-secure public-key cryptosystems from both the McEliece assumption and the low-noise learning parity with noise (LPN) assumption. The second contribution is a novel instantiation of the lattice-based learning with errors (LWE) problem which uses uniform errors.
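
    As a concrete reference point for the decoding assumption named above, the sketch below generates learning parity with noise (LPN) samples: the adversary sees pairs (a, <a, s> XOR e) with Bernoulli noise e. The dimension and noise rate are illustrative; the low-noise regime of the abstract takes the rate around 1/sqrt(n) rather than a constant.

        import random
        import secrets

        N_DIM = 128   # secret dimension (illustrative)
        TAU = 0.05    # Bernoulli noise rate; low-noise LPN takes TAU ~ 1/sqrt(n)

        secret = [secrets.randbits(1) for _ in range(N_DIM)]

        def lpn_sample():
            """One LPN sample (a, b) with b = <a, s> XOR e, e ~ Bernoulli(TAU).
            Recovering s (or telling b from uniform) is the hardness assumption;
            random.random() is a toy noise source, not a cryptographic one."""
            a = [secrets.randbits(1) for _ in range(N_DIM)]
            b = sum(ai & si for ai, si in zip(a, secret)) & 1   # inner product mod 2
            if random.random() < TAU:
                b ^= 1
            return a, b

        samples = [lpn_sample() for _ in range(10)]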

    Primary-Secondary-Resolver Membership Proof Systems

    We consider Primary-Secondary-Resolver Membership Proof Systems (PSR for short) and show different constructions of that primitive. A PSR system is a 3-party protocol with a primary, a trusted party that commits to a set of members and their values and then generates public and secret keys, enabling secondaries (provers with knowledge of both keys) and resolvers (verifiers who only know the public key) to engage in interactive proof sessions regarding elements in the universe and their values. The motivation for such systems is constructing a secure Domain Name System (DNSSEC) that does not reveal any unnecessary information to its clients. We require our systems to be complete, so honest executions result in correct conclusions by the resolvers; sound, so malicious secondaries cannot cheat resolvers; and zero-knowledge, so resolvers do not learn additional information about elements they did not query explicitly. Providing proofs of membership is easy, as the primary can simply precompute signatures over all the members of the set. Providing proofs of non-membership, i.e. a denial-of-existence mechanism, is trickier and is the main issue in constructing PSR systems. We provide three different strategies to construct a denial-of-existence mechanism. The first uses a set of cryptographic keys for all elements of the universe which are not members, which we implement using hierarchical identity-based encryption and a tree-based signature scheme. The second construction uses cuckoo hashing with a stash, where in order to prove non-membership, a secondary must prove that a search for the element will fail, i.e. that it is not in the tables or the stash of the cuckoo hashing scheme. The third uses a verifiable "random looking" function which the primary evaluates over the set of members, then signs the values lexicographically; secondaries then use those signatures to prove to resolvers that the value of the non-member was not signed by the primary. We implement this function using a weaker variant of verifiable random/unpredictable functions and pseudorandom functions with interactive zero-knowledge proofs. For all three constructions we suggest fairly efficient implementations, of order comparable to other public-key operations such as signatures and encryption. The first approach offers perfect ZK and does not reveal the size of the set in question, the second can be implemented based on very solid cryptographic assumptions and uses the unique structure of cuckoo hashing, while the last technique has the potential to be highly efficient, if one could construct an efficient and secure VRF/VUF or if one is willing to live in the random oracle model.
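
    The third denial-of-existence strategy is the easiest to picture in code: the primary evaluates the "random looking" function over the member set, sorts the resulting values, and signs each lexicographically adjacent pair; a secondary then proves non-membership by exhibiting the signed gap that the queried value falls strictly inside. In the toy sketch below, HMAC stands in for both the verifiable function and the primary's signatures, so nothing is actually publicly verifiable; all names and keys are illustrative.

        import bisect
        import hashlib
        import hmac

        VRF_KEY = b'toy-vrf-key'      # stand-in: HMAC is *not* publicly verifiable
        SIG_KEY = b'toy-signing-key'  # stand-in for the primary's signing key

        def f(name: bytes) -> bytes:
            """'Random looking' function over the universe (VRF/VUF stand-in)."""
            return hmac.new(VRF_KEY, name, hashlib.sha256).digest()

        def sign(msg: bytes) -> bytes:
            """Primary's signature (MAC stand-in, illustrative only)."""
            return hmac.new(SIG_KEY, msg, hashlib.sha256).digest()

        # Primary: sort the members' values, add boundary sentinels, sign every gap.
        members = [b'alice.example', b'bob.example', b'carol.example']
        values = [b'\x00' * 32] + sorted(f(m) for m in members) + [b'\xff' * 32]
        gap_sigs = {(values[i], values[i + 1]): sign(values[i] + values[i + 1])
                    for i in range(len(values) - 1)}

        def prove_absent(name: bytes):
            """Secondary: return the signed gap that f(name) falls strictly inside."""
            v = f(name)
            i = bisect.bisect_left(values, v)
            if values[i] == v:
                raise ValueError('name is a member')
            return (values[i - 1], values[i]), gap_sigs[(values[i - 1], values[i])]

        def resolver_check(name: bytes, gap, sig) -> bool:
            """Resolver: the queried value lies inside a gap the primary vouched
            for.  (With HMAC stand-ins this check needs the secret keys; a real
            PSR system uses publicly verifiable primitives here.)"""
            lo, hi = gap
            return lo < f(name) < hi and hmac.compare_digest(sig, sign(lo + hi))

        gap, sig = prove_absent(b'dave.example')
        assert resolver_check(b'dave.example', gap, sig)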

    Enabling Machine-aided Cryptographic Design

    The design of cryptographic primitives such as digital signatures and public-key encryption is very often a manual process conducted by expert cryptographers. This persists despite the fact that many new generic or semi-generic methods have been proposed for constructing new primitives by transforming existing ones in interesting ways. However, manually applying transformations to existing primitives can be error-prone, ad hoc and tedious. A natural question is whether automating the process of applying cryptographic transformations would yield competitive or better results. In this thesis, we explore a compiler-based approach for automatically performing certain cryptographic designs. Similar approaches have been applied to various types of cryptographic protocol design with compelling results. We extend this approach and show that it can also be effective for automatically applying cryptographic transformations. We first present our extensible architecture that automates a class of cryptographic transformations on primitives. We then propose several techniques that address the aforementioned question, including the Charm cryptographic framework, which enables rapid prototyping of cryptographic primitives from abstract descriptions. We build on this work and show the extent to which transformations can be performed automatically given these descriptions. To illustrate this automation, we present a series of cryptographic tools that demonstrate the effectiveness of our automated approach. Our contributions are as follows:

    - AutoBatch: Batch verification is a transformation that improves signature verification time by efficiently processing many signatures at once. Historically, this manual process has been prone to error and tedious for practitioners. We describe the design of an automated tool that finds efficient batch verification algorithms from abstract descriptions of signature schemes. A toy example of such a batch test appears below.

    - AutoGroup: Cryptographers often prefer to describe their pairing-based constructions using symmetric group notation for simplicity, while they prefer asymmetric groups for implementation due to the efficiency gains. The symmetric-to-asymmetric translation is usually performed through manual analysis of a scheme, and finding an efficient translation that suits applications can be quite challenging. We present an automated tool that uses SMT solvers to find efficient asymmetric translations from abstract descriptions of cryptographic schemes.

    - AutoStrong: Strongly unforgeable signatures are desired in practice for a variety of cryptographic protocols. Several transformations exist in the literature that show how to obtain strongly unforgeable signatures from existentially unforgeable ones. We focus on a particular highly efficient transformation due to Boneh, Shen and Waters that is applicable if the signature satisfies a notion of partitioning. Checking for this property can be challenging and has been less explored in the literature. We present an automated tool that also utilizes SMT solvers to determine when this property is applicable for constructing efficient strongly unforgeable signatures from abstract descriptions.

    We anticipate that these proof-of-concept tools embody the notion that certain cryptographic transformations can be safely and effectively outsourced to machines.
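
    To make the AutoBatch item concrete, the sketch below shows the classic small-exponents batch test that such tools search for, here instantiated by hand for toy RSA-FDH signatures: raising each side of the n individual checks to a random small power and multiplying them replaces n public-key exponentiations with one. This is a generic illustration under toy parameters, not AutoBatch's output, and the soundness bound of the small-exponents test carries caveats in groups of unknown order.

        import hashlib
        import secrets

        # Toy RSA-FDH signer (insecure Mersenne-prime modulus, illustrative only).
        P_, Q_ = 2**89 - 1, 2**107 - 1
        N = P_ * Q_
        E = 65537
        D = pow(E, -1, (P_ - 1) * (Q_ - 1))

        def h(m: bytes) -> int:
            return int.from_bytes(hashlib.sha256(m).digest(), 'big') % N

        def sign(m: bytes) -> int:
            return pow(h(m), D, N)

        def batch_verify(pairs, sec_bits=64):
            """Small-exponents test: instead of n checks sig_i^E == h(m_i), draw
            random d_i and check (prod sig_i^d_i)^E == prod h(m_i)^d_i, leaving a
            single full public-key exponentiation.  A bad batch slips through with
            probability about 2^-sec_bits (modulo unknown-order-group caveats)."""
            deltas = [secrets.randbits(sec_bits) + 1 for _ in pairs]
            lhs = rhs = 1
            for (m, sig), d in zip(pairs, deltas):
                lhs = (lhs * pow(sig, d, N)) % N
                rhs = (rhs * pow(h(m), d, N)) % N
            return pow(lhs, E, N) == rhs

        batch = [(m, sign(m)) for m in (b'tx-1', b'tx-2', b'tx-3')]
        assert batch_verify(batch)
        batch[1] = (batch[1][0], batch[1][1] ^ 1)   # corrupt one signature
        assert not batch_verify(batch)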