    When Can Limited Randomness Be Used in Repeated Games?

    The central result of classical game theory states that every finite normal form game has a Nash equilibrium, provided that players are allowed to use randomized (mixed) strategies. However, in practice, humans are known to be bad at generating random-like sequences, and true random bits may be unavailable. Even if the players have access to enough random bits for a single instance of the game, their randomness might be insufficient if the game is played many times. In this work, we ask whether randomness is necessary for equilibria to exist in finitely repeated games. We show that for a large class of games containing arbitrary two-player zero-sum games, approximate Nash equilibria of the n-stage repeated version of the game exist if and only if both players have Ω(n) random bits. In contrast, we show that there exists a class of games for which no equilibrium exists in pure strategies, yet the n-stage repeated version of the game has an exact Nash equilibrium in which each player uses only a constant number of random bits. When the players are assumed to be computationally bounded, if cryptographic pseudorandom generators (or, equivalently, one-way functions) exist, then the players can base their strategies on "random-like" sequences derived from only a small number of truly random bits. We show that, in contrast, in repeated two-player zero-sum games, if pseudorandom generators do not exist, then Ω(n) random bits remain necessary for equilibria to exist.
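    To make the final point concrete, here is a minimal sketch (our illustration, not code from the paper) of a computationally bounded player stretching a short, truly random seed into n "random-like" action bits for an n-stage repeated matching-pennies game. SHA-256 in counter mode merely stands in for a cryptographic pseudorandom generator, so any security here is heuristic, and all function names are ours.

        # Sketch: derive n pseudorandom action bits from a short seed and use them
        # to play n stages of matching pennies (a two-player zero-sum game).
        import hashlib
        import os

        def prg_bits(seed, n):
            """Expand a short seed into n bits via SHA-256 in counter mode."""
            bits = []
            counter = 0
            while len(bits) < n:
                block = hashlib.sha256(seed + counter.to_bytes(8, "big")).digest()
                bits.extend((byte >> i) & 1 for byte in block for i in range(8))
                counter += 1
            return bits[:n]

        def play_repeated_matching_pennies(n, seed_len=16):
            """Each player derives its n actions from only seed_len truly random bytes."""
            row_actions = prg_bits(os.urandom(seed_len), n)
            col_actions = prg_bits(os.urandom(seed_len), n)
            # Row earns +1 on a match and -1 on a mismatch; column earns the negation.
            return sum(1 if r == c else -1 for r, c in zip(row_actions, col_actions))

        print(play_repeated_matching_pennies(1000))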

    Verifying proofs in constant depth

    In this paper we initiate the study of proof systems where verification of proofs proceeds by NC^0 circuits. We investigate the question of which languages admit proof systems in this very restricted model. Formulated alternatively, we ask which languages can be enumerated by NC^0 functions. Our results show that the answer to this problem is not determined by the complexity of the language. On the one hand, we construct NC^0 proof systems for a variety of languages ranging from regular to NP-complete. On the other hand, we show by combinatorial methods that even easy regular languages such as Exact-OR do not admit NC^0 proof systems. We also present a general construction of proof systems for regular languages with strongly connected NFAs.
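    For illustration, the following sketch (our own toy example, not claimed to be one of the paper's constructions) spells out the "enumeration by NC^0 functions" view: the map sending y to the string of adjacent XORs y_i ⊕ y_{(i+1) mod n} has locality 2, so it is computable by an NC^0 circuit, and its range is exactly the even-parity strings of length n, making it an NC^0 proof system for that language.

        # Toy NC^0 proof system (enumeration) for the even-parity language.
        from itertools import product

        def f(y):
            n = len(y)
            # Each output bit depends on exactly two input bits: locality 2, i.e. NC^0.
            return tuple(y[i] ^ y[(i + 1) % n] for i in range(n))

        def even_parity_words(n):
            return {w for w in product((0, 1), repeat=n) if sum(w) % 2 == 0}

        n = 6
        image = {f(y) for y in product((0, 1), repeat=n)}
        assert image == even_parity_words(n)  # f enumerates exactly the language
        print(f"f enumerates all {len(image)} even-parity words of length {n}")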

    Unifying computational entropies via Kullback-Leibler divergence

    We introduce hardness in relative entropy, a new notion of hardness for search problems which, on the one hand, is satisfied by all one-way functions and, on the other hand, implies both next-block pseudoentropy and inaccessible entropy, two forms of computational entropy used in recent constructions of pseudorandom generators and statistically hiding commitment schemes, respectively. Thus, hardness in relative entropy unifies the latter two notions of computational entropy and sheds light on the apparent "duality" between them. Additionally, it yields a more modular and illuminating proof that one-way functions imply next-block inaccessible entropy, similar in structure to the proof that one-way functions imply next-block pseudoentropy (Vadhan and Zheng, STOC '12).
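    As background for the divergence named in the title: the Kullback-Leibler divergence (relative entropy) between distributions P and Q over a finite set is D(P||Q) = Σ_x P(x) log(P(x)/Q(x)). The snippet below is standard background rather than code from the paper; it computes the quantity for two toy distributions.

        # Kullback-Leibler divergence D(P || Q) in bits, for finite distributions.
        import math

        def kl_divergence(p, q):
            """Assumes the support of p is contained in the support of q."""
            return sum(px * math.log2(px / q[x]) for x, px in p.items() if px > 0)

        uniform = {0: 0.5, 1: 0.5}
        biased = {0: 0.9, 1: 0.1}
        print(kl_divergence(biased, uniform))   # ~0.531 bits
        print(kl_divergence(uniform, uniform))  # 0.0: identical distributions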

    Predictable arguments of knowledge

    We initiate a formal investigation of the power of predictability for argument-of-knowledge systems for NP. Specifically, we consider private-coin argument systems where the answer of the prover can be predicted, given the private randomness of the verifier; we call such protocols Predictable Arguments of Knowledge (PAoK). Our study encompasses a full characterization of PAoK, showing that such arguments can be made extremely laconic, with the prover sending a single bit, and that they can be assumed, without loss of generality, to have only one round (i.e., two messages) of communication. We additionally explore PAoK satisfying additional properties (including zero-knowledge and the possibility of re-using the same challenge across multiple executions with the prover), present several constructions of PAoK relying on different cryptographic tools, and discuss applications to cryptography.
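    The predictability property can be illustrated with a classic Diffie-Hellman-style toy (our own example with toy parameters, not a construction or security claim from the paper): the verifier derives the challenge and the predicted answer together from its private randomness, and accepts iff the prover's single response equals the prediction.

        # Toy predictable argument for knowledge of a discrete log w with x = g^w mod p.
        import secrets

        p = 0xFFFFFFFFFFFFFFC5  # the prime 2^64 - 59 (toy size, not secure)
        g = 5

        def keygen():
            w = secrets.randbelow(p - 2) + 1   # prover's witness
            return w, pow(g, w, p)             # (witness, public statement x)

        def verifier_challenge(x):
            s = secrets.randbelow(p - 2) + 1   # verifier's private randomness
            c = pow(g, s, p)                   # challenge sent to the prover
            b = pow(x, s, p)                   # predicted answer, kept private
            return c, b

        def prover_answer(w, c):
            return pow(c, w, p)                # equals g^(s*w) = x^s = b

        w, x = keygen()
        c, b = verifier_challenge(x)
        print("accepted" if prover_answer(w, c) == b else "rejected")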

    Approximation hardness of Travelling Salesman via weighted amplifiers

    Expander graph constructions and their variants are the main tool used in gap-preserving reductions to prove approximation lower bounds for combinatorial optimisation problems. In this paper we introduce weighted amplifiers and weighted low-occurrence Constraint Satisfaction problems as intermediate steps in the NP-hard gap reductions. Allowing weights in the intermediate problems is rather natural for edge-weighted problems such as Travelling Salesman or Steiner Tree. We demonstrate the technique for Travelling Salesman and use the parametrised weighted amplifiers in the gap reductions to allow more flexibility in fine-tuning their expanding parameters. The purpose of this paper is to point out the effectiveness of these ideas, rather than to optimise the expander's parameters. Nevertheless, we show that already a slight improvement of known expander values modestly improves the current best approximation hardness value for TSP from 123/122 ([9]) to 117/116. This provides a new motivation for the study of expanding properties of random graphs in order to improve approximation lower bounds of TSP and other edge-weighted optimisation problems.
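    For orientation, the quoted thresholds correspond to the following (small) inapproximability gaps; the arithmetic below merely restates the numbers from the abstract.

        # Comparing the previous and the new approximation hardness thresholds for TSP.
        old, new = 123 / 122, 117 / 116
        print(f"old threshold: {old:.5f}  (gap {old - 1:.5%})")  # 1.00820, 0.81967%
        print(f"new threshold: {new:.5f}  (gap {new - 1:.5%})")  # 1.00862, 0.86207%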

    Homomorphic Encryption: From Private-Key to Public-Key

    We show that any private-key encryption scheme that is weakly homomorphic with respect to addition modulo 2 can be transformed into a public-key encryption scheme. The homomorphic feature referred to is a minimalistic one: the length of a homomorphically generated encryption should be independent of the number of ciphertexts from which it was created. We do not require anything else regarding the distribution of homomorphically generated encryptions (in particular, we do not require them to be distributed like real ciphertexts). Our resulting public-key scheme is homomorphic in the following sense: if i+1 repeated applications of homomorphic operations can be applied to the private-key scheme, then i repeated applications can be applied to the public-key scheme.
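    The following toy and deliberately insecure sketch (our illustration, not the paper's scheme) shows what "weakly homomorphic with respect to addition modulo 2" means syntactically, including the compactness requirement: a homomorphically generated ciphertext is a single n-bit vector regardless of how many ciphertexts were combined.

        # Toy private-key scheme that is additively homomorphic mod 2 (XOR). Insecure:
        # the secret key can be recovered from a few ciphertexts by linear algebra.
        import secrets

        n = 16

        def keygen():
            while True:
                s = [secrets.randbelow(2) for _ in range(n)]
                if any(s):
                    return s

        def encrypt(s, m):
            # A ciphertext is any vector c with <c, s> = m (mod 2).
            while True:
                c = [secrets.randbelow(2) for _ in range(n)]
                if sum(ci & si for ci, si in zip(c, s)) % 2 == m:
                    return c

        def decrypt(s, c):
            return sum(ci & si for ci, si in zip(c, s)) % 2

        def add(c1, c2):
            # Coordinate-wise XOR of ciphertexts adds the plaintexts mod 2.
            return [a ^ b for a, b in zip(c1, c2)]

        s = keygen()
        c0, c1 = encrypt(s, 0), encrypt(s, 1)
        assert decrypt(s, add(c0, c1)) == 1  # 0 + 1 mod 2
        assert decrypt(s, add(c1, c1)) == 0  # 1 + 1 mod 2
        print("homomorphic addition modulo 2 works")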

    On the impossibility of entropy reversal, and its application to zero-knowledge proofs

    Zero-knowledge proof systems have been widely studied in cryptography. In the statistical setting, two classes of proof systems studied are Statistical Zero Knowledge (SZK) and Non-Interactive Statistical Zero Knowledge (NISZK), where the difference is that in NISZK only very limited communication is allowed between the verifier and the prover. It is an open problem whether these two classes are in fact equal. In this paper, we rule out efficient black-box reductions between SZK and NISZK. We achieve this by studying algorithms which can reverse the entropy of a function. The problem of estimating the entropy of a circuit is complete for NISZK. Hence, reversing the entropy of a function is equivalent to a black-box reduction of NISZK to its complement, which is known to be equivalent to a black-box reduction of SZK to NISZK [Goldreich et al., CRYPTO 1999]. We show that any such black-box algorithm incurs an exponential loss of parameters, and hence cannot be implemented efficiently.
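    As background for "the entropy of a circuit": the quantity in question is the Shannon entropy of the circuit's output distribution on a uniformly random input. The sketch below (standard background, not code from the paper) computes it by brute-force enumeration for tiny functions; the NISZK-complete problem is to estimate it for a given circuit, where such enumeration is of course infeasible.

        # Shannon entropy (in bits) of f(X) for X uniform over {0,1}^n_inputs.
        from collections import Counter
        from itertools import product
        from math import log2

        def output_entropy(f, n_inputs):
            counts = Counter(f(x) for x in product((0, 1), repeat=n_inputs))
            total = 2 ** n_inputs
            return -sum((c / total) * log2(c / total) for c in counts.values())

        print(output_entropy(lambda x: int(all(x)), 3))  # AND of 3 bits: ~0.544 bits
        print(output_entropy(lambda x: x[0], 3))         # copy one bit: 1.0 bit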

    Adaptive Oblivious Transfer and Generalization

    Oblivious Transfer (OT) protocols were introduced in the seminal paper of Rabin, and allow a user to retrieve a given number of lines (usually one) of a database without revealing which ones to the server. The server is ensured that only this given number of lines can be accessed per interaction, so the other lines are protected, while the user is ensured that the server does not learn the indices of the requested lines. This primitive is of great practical interest, for example in secure multi-party computation, and directly relates to Symmetrically Private Information Retrieval (SPIR). Recent Oblivious Transfer instantiations secure in the UC framework suffer from a drastic drawback: after the first query, there is no improvement in the overall complexity, so each subsequent query has a global complexity of O(|DB|), meaning there is no gain compared to running completely independent queries. In this paper, we propose a new protocol solving this issue, which allows subsequent queries to run with complexity O(log(|DB|)), and we prove its security in the UC framework with adaptive corruptions and reliable erasures. As a second contribution, we show that the techniques we use for Oblivious Transfer can be generalized into a new framework we call Oblivious Language-Based Envelope (OLBE). This is of practical interest, since it seems more and more unrealistic to consider a database with uncontrolled access in access-control scenarios. Our approach generalizes Oblivious Signature-Based Envelope (OSBE) to handle more expressive credentials and requests from the user. Naturally, OLBE encompasses both OT and OSBE, but it also allows achieving Oblivious Transfer with fine-grained access control over each line: for example, a user can access a line if and only if he possesses a certificate granting him access to that line. We show how to instantiate such primitives generically and efficiently, and prove them secure in the Universal Composability framework with adaptive corruptions, assuming reliable erasures. We provide the new UC ideal functionalities when needed, or show that existing ones fit into our new framework. The security of such designs preserves both the secrecy of the database values and of the user credentials. This symmetry allows our new approach to be viewed as a generalization of the notion of Symmetrically PIR.
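    As a reference point for the discussion above, here is a minimal sketch of the 1-out-of-n Oblivious Transfer functionality in the ideal, trusted-party style (our illustration of what such a protocol achieves, not the paper's UC-secure protocol; the class and method names are ours): the receiver obtains exactly the requested line, the sender only learns that a transfer took place, and the remaining lines stay hidden.

        # Ideal-functionality-style sketch of 1-out-of-n Oblivious Transfer.
        class IdealOT:
            def __init__(self, database):
                self._db = database            # held only by the "trusted party"

            def query(self, index):
                # The sender is only notified that a transfer happened, not which line.
                print("sender sees: one line was transferred")
                return self._db[index]

        db = [b"line-0", b"line-1", b"line-2", b"line-3"]
        functionality = IdealOT(db)
        print(functionality.query(2))          # the receiver learns only db[2]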