
    Randomness Extractors -- An Exposition

    Randomness is crucial to computer science, both in theory and in applications. In complexity theory, randomness augments standard models of computation, making them more powerful. In cryptography, randomness is essential for seed generation, and the computational model used is generally probabilistic. However, the ideal randomness that is usually assumed to be available in computer science theory and applications may not be available to real systems. Randomness extractors are objects that turn “weak” randomness into almost “ideal” randomness (pseudorandomness). In this paper, we build the framework needed to work with such objects and present explicit constructions. We discuss a well-known construction of seeded extractors via universal hashing and present a simple argument that extends such results to two-source extractors.
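
    As a minimal sketch of the universal-hashing approach discussed above (not taken from the paper; the function names and toy parameters are illustrative), a seeded extractor can be implemented with Toeplitz hashing, a standard 2-universal family; by the leftover hash lemma the output is close to uniform whenever the output length is sufficiently below the source min-entropy.

        # Sketch of a seeded extractor via universal hashing (Toeplitz hashing).
        # By the leftover hash lemma, if the source has min-entropy k and
        # m <= k - 2*log2(1/eps), the output is eps-close to uniform given the seed.
        import numpy as np

        def toeplitz_extract(x_bits: np.ndarray, seed_bits: np.ndarray, m: int) -> np.ndarray:
            """Extract m nearly uniform bits from the n-bit input x_bits.

            seed_bits has length n + m - 1 and defines an m x n binary Toeplitz
            matrix T (constant along diagonals); the output is T @ x_bits mod 2.
            """
            n = len(x_bits)
            assert len(seed_bits) == n + m - 1
            # Row i of T is seed_bits[i : i + n] reversed, so T[i][j] depends only on i - j.
            T = np.array([seed_bits[i:i + n][::-1] for i in range(m)], dtype=np.uint8)
            acc = T.astype(np.int64) @ x_bits.astype(np.int64)   # integer inner products
            return (acc % 2).astype(np.uint8)                    # reduce mod 2

        # Toy usage: a 256-bit weak source, a fresh uniform seed, 128 output bits.
        rng = np.random.default_rng(0)
        x = rng.integers(0, 2, size=256, dtype=np.uint8)              # stand-in for weak randomness
        seed = rng.integers(0, 2, size=256 + 128 - 1, dtype=np.uint8)
        out = toeplitz_extract(x, seed, m=128)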

    Semidefinite Programs for Randomness Extractors

    Randomness extractors are an important building block for classical and quantum cryptography. However, for many applications it is crucial that the extractors are quantum-proof, i.e., that they work even in the presence of quantum adversaries. In general, quantum-proof extractors are poorly understood, and we would like to argue that, in the same way as Bell inequalities (multi-prover games) and communication complexity, the setting of randomness extractors provides an operationally useful framework for studying the power and limitations of a quantum memory compared to a classical one. We start by recalling how to phrase the extractor property as a quadratic program with linear constraints. We then construct a semidefinite programming (SDP) relaxation for this program that is tight for some extractor constructions. Moreover, we show that this SDP relaxation is even sufficient to certify quantum-proof extractors. This gives a unifying approach to understanding the stability properties of extractors against quantum adversaries. Finally, we analyze the limitations of this SDP relaxation.
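
    As a rough sketch of the starting point (our paraphrase in our own notation, not necessarily the paper's exact program): a function Ext: {0,1}^n x {0,1}^d -> {0,1}^m is a (k, eps)-extractor exactly when the optimal value of the following bilinear program, maximizing over min-entropy-k sources p and distinguishers D, is at most eps.

        \[
        \begin{aligned}
        \max_{p,\,D}\quad & \frac{1}{2^{d}}\sum_{y\in\{0,1\}^{d}}\sum_{x\in\{0,1\}^{n}} p(x)\,D\bigl(\mathrm{Ext}(x,y)\bigr)\;-\;\frac{1}{2^{m}}\sum_{z\in\{0,1\}^{m}} D(z)\\
        \text{s.t.}\quad & 0 \le p(x) \le 2^{-k},\qquad \sum_{x} p(x)=1,\qquad 0 \le D(z) \le 1.
        \end{aligned}
        \]

    The objective is bilinear in (p, D) and all constraints are linear, which is what makes an SDP relaxation natural.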

    Derandomization and Group Testing

    The rapid development of derandomization theory, a fundamental area of theoretical computer science, has recently led to many surprising applications outside its original scope. We will review some recent such developments related to combinatorial group testing. In its most basic setting, the aim of group testing is to identify a set of "positive" individuals in a population of items by taking groups of items and asking whether each group contains a positive. In particular, we will discuss explicit constructions of optimal or nearly optimal group testing schemes using "randomness-conducting" functions. Among such developments are constructions of error-correcting group testing schemes using randomness extractors and condensers, as well as threshold group testing schemes from lossless condensers. Comment: Invited paper in the Proceedings of the 48th Annual Allerton Conference on Communication, Control, and Computing, 2010.
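
    As a minimal, self-contained illustration of the basic setting (not the explicit constructions from the paper; the parameters and names below are ours), the following sketch runs non-adaptive group testing with a random pooling matrix and the standard elimination decoder, which declares an item negative as soon as it appears in a negative test.

        # Non-adaptive group testing sketch: random pools + elimination decoding.
        import numpy as np

        def run_group_testing(n_items, positives, n_tests, p, rng):
            """Return the set of items that cannot be ruled out (candidate positives)."""
            # pools[t, i] == 1 means item i participates in test t.
            pools = (rng.random((n_tests, n_items)) < p).astype(np.uint8)
            truth = np.zeros(n_items, dtype=np.uint8)
            truth[list(positives)] = 1
            # A test is positive iff it contains at least one positive item.
            outcomes = (pools.astype(np.int64) @ truth.astype(np.int64)) > 0
            ruled_out = (pools[~outcomes] > 0).any(axis=0)   # appears in some negative test
            return {i for i in range(n_items) if not ruled_out[i]}

        rng = np.random.default_rng(1)
        n, d = 1000, 5                                 # 1000 items, 5 positives
        positives = set(rng.choice(n, size=d, replace=False).tolist())
        # Roughly O(d log n) random tests with inclusion probability ~ 1/d suffice w.h.p.
        estimate = run_group_testing(n, positives, n_tests=400, p=1.0 / d, rng=rng)
        assert positives.issubset(estimate)            # this decoder never misses a true positive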

    Trevisan's extractor in the presence of quantum side information

    Randomness extraction involves the processing of purely classical information and is therefore usually studied in the framework of classical probability theory. However, such a classical treatment is generally too restrictive for applications where side information about the values taken by classical random variables may be represented by the state of a quantum system. This is particularly relevant in the context of cryptography, where an adversary may make use of quantum devices. Here, we show that the well-known construction paradigm for extractors proposed by Trevisan is sound in the presence of quantum side information. We exploit the modularity of this paradigm to give several concrete extractor constructions, which, e.g., extract all the conditional (smooth) min-entropy of the source using a seed of length poly-logarithmic in the input length, or only require the seed to be weakly random. Comment: 20+10 pages; v2: extract more min-entropy, use weakly random seed; v3: extended introduction, matches published version with sections somewhat reordered.
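
    To make the modular structure visible (a toy instantiation with deliberately wasteful parameters, not the construction analyzed in the paper): output bit i applies a one-bit extractor to the source using the seed positions selected by the i-th set of a design. Below, the one-bit extractor is the inner-product (Hadamard) predicate and the "design" is simply disjoint blocks of the seed, so the seed length is m*n rather than Trevisan's polylogarithmic bound.

        # Toy Trevisan-style skeleton: design sets select seed bits, a one-bit
        # extractor produces each output bit.  Illustration only.
        import numpy as np

        def one_bit_extract(x, r):
            """Hadamard-code one-bit extractor: inner product <x, r> mod 2."""
            return int(np.dot(x.astype(np.int64), r.astype(np.int64)) % 2)

        def trevisan_toy(x, seed, m):
            n = len(x)
            assert len(seed) == m * n
            design = [range(i * n, (i + 1) * n) for i in range(m)]   # disjoint "design" sets
            return np.array([one_bit_extract(x, seed[list(S)]) for S in design], dtype=np.uint8)

        rng = np.random.default_rng(2)
        x = rng.integers(0, 2, size=64, dtype=np.uint8)        # weak source
        seed = rng.integers(0, 2, size=8 * 64, dtype=np.uint8)  # uniform seed
        out = trevisan_toy(x, seed, m=8)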

    Variations on Classical and Quantum Extractors

    Many constructions of randomness extractors are known to work in the presence of quantum side information, but there also exist extractors which do not [Gavinsky {\it et al.}, STOC'07]. Here we find that spectral extractors $\psi$ with a bound on the second largest eigenvalue $\lambda_{2}(\psi^{\dagger}\circ\psi)$ are quantum-proof. We then discuss fully quantum extractors and call constructions that also work in the presence of quantum correlations decoupling. As in the classical case, we show that spectral extractors are decoupling. The drawback of classical and quantum spectral extractors is that they always have a long seed, whereas there exist classical extractors with exponentially smaller seed size. For the quantum case, we show that there exists an extractor with extremely short seed size $d=O(\log(1/\epsilon))$, where $\epsilon>0$ denotes the quality of the randomness. In contrast to the classical case, this is independent of the input size and min-entropy and matches the simple lower bound $d\geq\log(1/\epsilon)$. Comment: 7 pages, slightly enhanced IEEE ISIT submission including all the proofs.

    Two-sources Randomness Extractors for Elliptic Curves

    This paper studies the task of two-source randomness extraction for elliptic curves defined over finite fields $K$, where $K$ can be a prime or a binary field. In fact, we introduce new constructions of functions over elliptic curves which take as input two random points from two different subgroups. In other words, for a given elliptic curve $E$ defined over a finite field $\mathbb{F}_q$ and two random points $P \in \mathcal{P}$ and $Q \in \mathcal{Q}$, where $\mathcal{P}$ and $\mathcal{Q}$ are two subgroups of $E(\mathbb{F}_q)$, our function extracts the least significant bits of the abscissa of the point $P \oplus Q$ when $q$ is a large prime, and the $k$ first $\mathbb{F}_p$-coefficients of the abscissa of the point $P \oplus Q$ when $q = p^n$, where $p$ is a prime greater than 5. We show that the extracted bits are close to uniform. Our construction extends some interesting randomness extractors for elliptic curves, namely those defined in \cite{op} and \cite{ciss1,ciss2}, when $\mathcal{P} = \mathcal{Q}$. The proposed constructions can be used in any cryptographic scheme which requires the extraction of random bits from two sources over elliptic curves, for example in key exchange protocols, the design of strong pseudo-random number generators, etc.
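
    The mechanics of the extraction step can be illustrated with a toy curve (illustrative parameters only, far below cryptographic size, and our own naming): add the two input points on the curve and output the least significant bits of the abscissa of the sum.

        # Toy curve y^2 = x^3 + A*x + B over F_p; points are (x, y) tuples, None = infinity.
        P_MOD, A, B = 97, 2, 3           # illustrative toy parameters

        def ec_add(P, Q):
            """Add two points on the short Weierstrass curve over F_{P_MOD}."""
            if P is None:
                return Q
            if Q is None:
                return P
            (x1, y1), (x2, y2) = P, Q
            if x1 == x2 and (y1 + y2) % P_MOD == 0:
                return None                                       # P + (-P) = point at infinity
            if P == Q:
                s = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
            else:
                s = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
            x3 = (s * s - x1 - x2) % P_MOD
            y3 = (s * (x1 - x3) - y1) % P_MOD
            return (x3, y3)

        def extract_bits(P, Q, k):
            """Output the k least significant bits of the abscissa (x-coordinate) of P + Q."""
            x3, _ = ec_add(P, Q)
            return format(x3 % (1 << k), f"0{k}b")

        P = (3, 6)                       # a point on the toy curve (6^2 = 3^3 + 2*3 + 3 mod 97)
        Q = ec_add(P, P)                 # a second point (2P), standing in for the second source
        print(extract_bits(P, Q, k=4))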

    Linear extractors for extracting randomness from noisy sources

    Linear transformations have many applications in information theory, such as data compression and the design of error-correcting codes. In this paper, we study the power of linear transformations in randomness extraction, namely linear extractors, as another important application. Compared to most existing methods for randomness extraction, linear extractors (especially those constructed with sparse matrices) are computationally fast and can easily be implemented in hardware such as FPGAs, which makes them very attractive for practical use. We mainly focus on simple, efficient and sparse constructions of linear extractors. Specifically, we demonstrate that random matrices can generate random bits very efficiently from a variety of noisy sources, including noisy coin sources, bit-fixing sources, noisy (hidden) Markov sources, as well as their mixtures. In particular, low-density random matrices have almost the same efficiency as high-density random matrices when the input sequence is long, which provides a way to simplify hardware/software implementations. Note that although the matrices are constructed with randomness, they are deterministic (seedless) extractors: once a matrix has been constructed, the same construction can be reused any number of times without any seed. Another way to construct linear extractors is based on generator matrices of primitive BCH codes. This method is more explicit, but less practical due to its computational complexity and dimensional constraints.
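
    A minimal sketch of the idea (our toy parameters, not the constructions analyzed in the paper): draw a sparse random binary matrix once, fix it, and extract by multiplying the noisy input sequence by that matrix over GF(2).

        # Seedless linear extraction sketch: a fixed sparse random binary matrix over GF(2).
        import numpy as np

        def make_sparse_binary_matrix(m, n, density, rng):
            """m x n binary matrix with about density*n ones per row (low density -> cheap hardware)."""
            return (rng.random((m, n)) < density).astype(np.uint8)

        def linear_extract(M, bits):
            """Extract output = M * bits over GF(2)."""
            return (M.astype(np.int64) @ bits.astype(np.int64)) % 2   # cast avoids overflow before mod 2

        rng = np.random.default_rng(3)
        n, m = 4096, 1024
        M = make_sparse_binary_matrix(m, n, density=0.05, rng=rng)    # drawn once, then reused forever

        # Toy noisy source: biased coin flips (each bit is 1 with probability 0.3).
        source = (rng.random(n) < 0.3).astype(np.uint8)
        output = linear_extract(M, source)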

    Samplers and Extractors for Unbounded Functions

    Blasiok (SODA'18) recently introduced the notion of a subgaussian sampler, defined as an averaging sampler for approximating the mean of functions f from {0,1}^m to the real numbers such that f(U_m) has subgaussian tails, and asked for explicit constructions. In this work, we give the first explicit constructions of subgaussian samplers (and in fact averaging samplers for the broader class of subexponential functions) that match the best known constructions of averaging samplers for [0,1]-bounded functions in the regime of parameters where the approximation error epsilon and failure probability delta are subconstant. Our constructions are established via an extension of the standard notion of randomness extractor (Nisan and Zuckerman, JCSS'96) where the error is measured by an arbitrary divergence rather than total variation distance, and a generalization of Zuckerman's equivalence (Random Struct. Alg.'97) between extractors and samplers. We believe that the framework we develop, and specifically the notion of an extractor for the Kullback-Leibler (KL) divergence, are of independent interest. In particular, KL-extractors are stronger than both standard extractors and subgaussian samplers, but we show that they exist with essentially the same parameters (constructively and non-constructively) as standard extractors.
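
    For orientation, the classical extractor-to-sampler direction of Zuckerman's equivalence reads roughly as follows (our paraphrase, with standard parameters): if Ext: {0,1}^n x {0,1}^d -> {0,1}^m is a (k, eps)-extractor, then Samp(x) = (Ext(x, y))_{y in {0,1}^d} is an averaging sampler for [0,1]-bounded f, and the paper's KL-extractors extend this beyond bounded functions.

        \[
        \Pr_{x \sim U_n}\!\left[\;\Bigl|\,2^{-d}\!\!\sum_{y\in\{0,1\}^{d}} f\bigl(\mathrm{Ext}(x,y)\bigr) \;-\; \mathbb{E}\bigl[f(U_m)\bigr]\Bigr| > \varepsilon \right] \;\le\; 2\cdot 2^{k-n},
        \]

    i.e., accuracy eps with failure probability delta at most 2*2^{k-n}: the bad inputs on each side would otherwise form a source of min-entropy k that the bounded function f itself distinguishes from uniform.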

    Quantum to Classical Randomness Extractors

    The goal of randomness extraction is to distill (almost) perfect randomness from a weak source of randomness. When the source yields a classical string X, many extractor constructions are known. Yet, when considering a physical randomness source, X is itself ultimately the result of a measurement on an underlying quantum system. When characterizing the power of a source to supply randomness, it is hence natural to ask how much classical randomness we can extract from a quantum system. To tackle this question, we here take on the study of quantum-to-classical randomness extractors (QC-extractors). We provide constructions of QC-extractors based on measurements in a full set of mutually unbiased bases (MUBs), and certain single-qubit measurements. As the first application, we show that any QC-extractor gives rise to entropic uncertainty relations with respect to quantum side information. Such relations were previously only known for two measurements. As the second application, we resolve the central open question in the noisy-storage model [Wehner et al., PRL 100, 220502 (2008)] by linking security to the quantum capacity of the adversary's storage device. Comment: 6+31 pages, 2 tables, 1 figure, v2: improved converse parameters, typos corrected, new discussion, v3: new reference.
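
    As a small side illustration of the measurement ingredient (our sketch, not the paper's construction): for a single qubit, the eigenbases of the Pauli operators Z, X and Y form a full set of mutually unbiased bases, meaning every vector of one basis has squared overlap exactly 1/2 with every vector of another.

        # Verify mutual unbiasedness of the three single-qubit Pauli eigenbases.
        import numpy as np

        s = 1 / np.sqrt(2)
        bases = {
            "Z": [np.array([1, 0], dtype=complex), np.array([0, 1], dtype=complex)],
            "X": [np.array([s, s], dtype=complex), np.array([s, -s], dtype=complex)],
            "Y": [np.array([s, 1j * s], dtype=complex), np.array([s, -1j * s], dtype=complex)],
        }

        for a in bases:
            for b in bases:
                if a >= b:
                    continue                                   # visit each unordered pair once
                for u in bases[a]:
                    for v in bases[b]:
                        assert np.isclose(abs(np.vdot(u, v)) ** 2, 0.5)
        print("All pairs of distinct bases are mutually unbiased (|<u|v>|^2 = 1/2).")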