    Derandomization and Group Testing

    The rapid development of derandomization theory, a fundamental area of theoretical computer science, has recently led to many surprising applications beyond its original scope. We review some recent such developments related to combinatorial group testing. In its most basic setting, the aim of group testing is to identify a set of "positive" individuals in a population of items by pooling groups of items and asking, for each group, whether it contains a positive. In particular, we discuss explicit constructions of optimal or nearly optimal group testing schemes using "randomness-conducting" functions. Among such developments are constructions of error-correcting group testing schemes using randomness extractors and condensers, as well as threshold group testing schemes from lossless condensers. Comment: Invited paper in Proceedings of the 48th Annual Allerton Conference on Communication, Control, and Computing, 2010.
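
    To make the basic setting concrete, here is a minimal sketch (not the paper's explicit derandomized construction, which uses extractors and condensers) of randomized non-adaptive group testing: a random Bernoulli pooling matrix together with the standard COMP decoder, which rules out any item that appears in a pool testing negative. All parameters below are illustrative.

```python
import random

def run_group_testing(n=200, d=3, m=60, seed=1):
    """Identify d 'positive' items among n using m pooled tests."""
    rng = random.Random(seed)
    positives = set(rng.sample(range(n), d))
    # Each pool contains each item independently with probability ~1/d.
    pools = [[i for i in range(n) if rng.random() < 1.0 / d] for _ in range(m)]
    outcomes = [any(i in positives for i in pool) for pool in pools]
    # COMP decoding: an item appearing in any negative pool cannot be positive.
    candidates = set(range(n))
    for pool, outcome in zip(pools, outcomes):
        if not outcome:
            candidates -= set(pool)
    return positives, candidates

positives, decoded = run_group_testing()
print(sorted(positives), sorted(decoded))  # decoded always contains positives
```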

    Two-sources Randomness Extractors for Elliptic Curves

    This paper studies two-source randomness extractors for elliptic curves defined over a finite field $K$, where $K$ can be a prime field or a binary field. We introduce new constructions of functions over elliptic curves that take as input two random points from two different subgroups. In other words, for a given elliptic curve $E$ defined over a finite field $\mathbb{F}_q$ and two random points $P \in \mathcal{P}$ and $Q \in \mathcal{Q}$, where $\mathcal{P}$ and $\mathcal{Q}$ are two subgroups of $E(\mathbb{F}_q)$, our function extracts the least significant bits of the abscissa of the point $P \oplus Q$ when $q$ is a large prime, and the $k$ first $\mathbb{F}_p$-coefficients of the abscissa of the point $P \oplus Q$ when $q = p^n$, where $p$ is a prime greater than 5. We show that the extracted bits are close to uniform. Our construction extends some interesting randomness extractors for elliptic curves, namely those defined in \cite{op} and \cite{ciss1,ciss2}, when $\mathcal{P} = \mathcal{Q}$. The proposed constructions can be used in any cryptographic scheme that requires extraction of random bits from two sources over elliptic curves, e.g., in key exchange protocols, the design of strong pseudo-random number generators, etc.
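
    For illustration, here is a toy sketch of the extraction step in the prime-field case: given $P$ and $Q$, output the least significant bits of the abscissa of $P \oplus Q$. The curve parameters and bit count below are made up for the demo (far too small for cryptography), the arithmetic is textbook short-Weierstrass addition, and nothing here certifies closeness to uniform; that is what the paper proves for suitable subgroups.

```python
p, a, b = 10007, 2, 3  # toy curve y^2 = x^3 + 2x + 3 over F_10007

def ec_add(P, Q):
    """Textbook short-Weierstrass point addition; None = point at infinity."""
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def extract_lsb(P, Q, k=8):
    """The k least significant bits of the abscissa of P + Q."""
    x3, _ = ec_add(P, Q)
    return x3 & ((1 << k) - 1)

# Demo: find two points on the curve by brute force, then extract.
pts = []
x = 0
while len(pts) < 2:
    x += 1
    rhs = (x ** 3 + a * x + b) % p
    y = next((y for y in range(p) if y * y % p == rhs), None)
    if y is not None:
        pts.append((x, y))
print(extract_lsb(pts[0], pts[1]))
```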

    Extensions to the Method of Multiplicities, with applications to Kakeya Sets and Mergers

    We extend the "method of multiplicities" to obtain the following results, of interest in combinatorics and randomness extraction. (A) We show that every Kakeya set (a set of points that contains a line in every direction) in $\mathbb{F}_q^n$ must have size at least $q^n/2^n$. This bound is tight to within a $2 + o(1)$ factor for every $n$ as $q \to \infty$, compared to previous bounds that were off by factors exponential in $n$. (B) We give improved randomness extractors and "randomness mergers". Mergers are seeded functions that take as input $\Lambda$ (possibly correlated) random variables in $\{0,1\}^N$ and a short random seed, and output a single random variable in $\{0,1\}^N$ that is statistically close to having entropy $(1-\delta) \cdot N$ when one of the $\Lambda$ input variables is distributed uniformly. The seed we require is only $(1/\delta) \cdot \log \Lambda$ bits long, which significantly improves upon previous constructions of mergers. (C) Using our new mergers, we show how to construct randomness extractors that use logarithmic-length seeds while extracting a $1 - o(1)$ fraction of the min-entropy of the source. The "method of multiplicities", as used in prior work, analyzed subsets of vector spaces over finite fields by constructing somewhat low-degree interpolating polynomials that vanish on every point in the subset {\em with high multiplicity}. The typical use of this method involved showing that the interpolating polynomial also vanished on some points outside the subset, and then applying simple bounds on the number of zeroes to complete the analysis. Our augmentation of this technique is that we prove, under appropriate conditions, that the interpolating polynomial vanishes {\em with high multiplicity} outside the set. This novelty leads to significantly tighter analyses. Comment: 26 pages; now includes extractors with sublinear entropy loss.
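
    For orientation, the quantitative engine behind the method of multiplicities is a multiplicity-enhanced Schwartz-Zippel bound of roughly the following shape, where $\operatorname{mult}(P, a)$ denotes the multiplicity with which $P$ vanishes at $a$ (stated here as a reminder of the technique; see the paper for the precise statement and proof):

```latex
% A nonzero polynomial P in F_q[x_1,...,x_n] of total degree at most d
% cannot vanish too often, even counting multiplicities:
\[
  \sum_{a \in \mathbb{F}_q^n} \operatorname{mult}(P, a) \;\le\; d \cdot q^{n-1},
\]
% so P vanishes with multiplicity at least m on at most (d/m) q^{n-1} points.
% Forcing high-multiplicity vanishing outside the interpolated set, not just
% inside it, is the refinement that yields the q^n / 2^n Kakeya bound.
```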

    Trevisan's extractor in the presence of quantum side information

    Randomness extraction involves the processing of purely classical information and is therefore usually studied in the framework of classical probability theory. However, such a classical treatment is generally too restrictive for applications, where side information about the values taken by classical random variables may be represented by the state of a quantum system. This is particularly relevant in the context of cryptography, where an adversary may make use of quantum devices. Here, we show that the well-known construction paradigm for extractors proposed by Trevisan is sound in the presence of quantum side information. We exploit the modularity of this paradigm to give several concrete extractor constructions, which, e.g., extract all the conditional (smooth) min-entropy of the source using a seed of length poly-logarithmic in the input, or require only a weakly random seed. Comment: 20+10 pages; v2: extract more min-entropy, use weakly random seed; v3: extended introduction, matches published version with sections somewhat reordered.
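
    The modular skeleton being analyzed can be sketched in a few lines: output bit $i$ of the extractor is one bit of an error-correcting encoding of the source, at a position selected by the seed bits lying in the $i$-th set of a combinatorial design. In the sketch below, the Hadamard code and a random choice of sets are toy stand-ins for the weak designs and list-decodable codes with proven parameters that real instantiations (and the paper's quantum-proof analysis) require.

```python
import random

def hadamard_bit(x, y):
    """Bit y of the Hadamard encoding of integer x: <x, y> over GF(2)."""
    return bin(x & y).count("1") % 2

def trevisan_sketch(x, seed_bits, num_out, set_size):
    rng = random.Random(0)                     # toy stand-in for a weak design
    design = [rng.sample(range(len(seed_bits)), set_size)
              for _ in range(num_out)]
    out = []
    for S in design:
        y = 0
        for j in S:                            # the seed bits indexed by S
            y = (y << 1) | seed_bits[j]        # select a codeword position
        out.append(hadamard_bit(x, y))
    return out

rng = random.Random(7)
seed = [rng.randrange(2) for _ in range(24)]   # one short seed for all bits
print(trevisan_sketch(0b101101100101, seed, num_out=16, set_size=8))
```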

    Efficiently Extracting Randomness from Imperfect Stochastic Processes

    We study the problem of extracting a prescribed number of random bits by reading the smallest possible number of symbols from non-ideal stochastic processes. The related interval algorithm proposed by Han and Hoshi has asymptotically optimal performance; however, it assumes that the distribution of the input stochastic process is known. The motivation for our work is the fact that, in practice, sources of randomness have inherent correlations and are affected by measurement noise, so it is hard to obtain an accurate estimate of their distribution. This challenge has been addressed by the concepts of seeded and seedless extractors, which can handle general random sources with unknown distributions. However, known seeded and seedless extractors provide extraction efficiencies that are substantially smaller than Shannon's entropy limit. Our main contribution is the design of extractors that have variable input length and fixed output length, are efficient in the consumption of symbols from the source, are capable of generating random bits from general stochastic processes, and approach the information-theoretic upper bound on efficiency. Comment: 2 columns, 16 pages.
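
    Not the paper's construction, but a useful point of comparison: von Neumann's classic trick already achieves variable input length and obliviousness to the (i.i.d.) source distribution, yet its output rate falls well short of the Shannon entropy of the source, which is precisely the efficiency gap the proposed extractors close.

```python
import random

def von_neumann(bits):
    """Pair up flips; '01' -> 0, '10' -> 1; discard '00' and '11'."""
    return [b1 for b1, b2 in zip(bits[::2], bits[1::2]) if b1 != b2]

rng = random.Random(3)
biased = [1 if rng.random() < 0.8 else 0 for _ in range(40)]
print(von_neumann(biased))  # unbiased bits, at a rate far below H(0.8)
```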

    Noise-Resilient Group Testing: Limitations and Constructions

    We study combinatorial group testing schemes for learning $d$-sparse Boolean vectors using highly unreliable disjunctive measurements. We consider an adversarial noise model that only limits the number of false observations, and show that any noise-resilient scheme in this model can only approximately reconstruct the sparse vector. On the positive side, we turn this barrier to our advantage and show that approximate reconstruction (within a satisfactory degree of approximation) allows us to break the information-theoretic lower bound of $\tilde{\Omega}(d^2 \log n)$ that is known for exact reconstruction of $d$-sparse vectors of length $n$ via non-adaptive measurements, by a multiplicative factor $\tilde{\Omega}(d)$. Specifically, we give simple randomized constructions of non-adaptive measurement schemes, with $m = O(d \log n)$ measurements, that allow efficient reconstruction of $d$-sparse vectors up to $O(d)$ false positives even in the presence of $\delta m$ false positives and $O(m/d)$ false negatives within the measurement outcomes, for any constant $\delta < 1$. We show that, information theoretically, none of these parameters can be substantially improved without dramatically affecting the others. Furthermore, we obtain several explicit constructions, in particular one matching the randomized trade-off but using $m = O(d^{1+o(1)} \log n)$ measurements. We also obtain explicit constructions that allow fast reconstruction in time $\mathrm{poly}(m)$, which would be sublinear in $n$ for sufficiently sparse vectors. The main tool used in our constructions is the list-decoding view of randomness condensers and extractors. Comment: Full version; a preliminary summary of this work appears (under the same title) in Proceedings of the 17th International Symposium on Fundamentals of Computation Theory (FCT 2009).
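
    The flavor of the randomized construction can be sketched as follows: Bernoulli pools plus a thresholded variant of COMP decoding, so that a bounded fraction of corrupted outcomes cannot eliminate a true positive. The symmetric flip probability and the threshold below are illustrative simplifications of the paper's asymmetric adversarial noise budget ($\delta m$ false positives, $O(m/d)$ false negatives).

```python
import random

def noisy_group_testing(n=300, d=4, m=160, flip=0.05, seed=2):
    rng = random.Random(seed)
    positives = set(rng.sample(range(n), d))
    pools = [{i for i in range(n) if rng.random() < 1.0 / d} for _ in range(m)]
    # Adversarial noise is modeled here as independent outcome flips.
    outcomes = [bool(pool & positives) ^ (rng.random() < flip) for pool in pools]
    # Thresholded COMP: discard an item only if "many" of its pools read
    # negative, so a few flipped outcomes cannot kill a true positive.
    threshold = 0.1
    decoded = set()
    for i in range(n):
        mine = [o for pool, o in zip(pools, outcomes) if i in pool]
        if mine and sum(1 for o in mine if not o) <= threshold * len(mine):
            decoded.add(i)
    return positives, decoded

positives, decoded = noisy_group_testing()
print(sorted(positives), sorted(decoded))  # decoded ~ positives + few extras
```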

    Linear Transformations for Randomness Extraction

    Information-efficient approaches for extracting randomness from imperfect sources have been extensively studied, but simpler and faster ones are required for high-speed random number generation. In this paper, we focus on linear constructions, namely, applying linear transformations for randomness extraction. We show that linear transformations based on sparse random matrices are asymptotically optimal for extracting randomness from independent sources and bit-fixing sources, and that they are efficient (though not necessarily optimal) for extracting randomness from hidden Markov sources. Further study demonstrates the flexibility of such constructions with respect to source models as well as their excellent information-preserving capabilities. Since linear transformations based on sparse random matrices are computationally fast and easy to implement in hardware such as FPGAs, they are very attractive for high-speed applications. In addition, we explore explicit constructions of transformation matrices. We show that the generator matrices of primitive BCH codes are good choices, but linear transformations based on such matrices require more computational time due to their high densities. Comment: 2 columns, 14 pages.
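
    A minimal sketch of the linear approach: the output is $y = Mx$ over $\mathrm{GF}(2)$, where $M$ is a sparse random binary matrix, so each output bit is the parity (XOR) of a small subset of input bits, which is what makes hardware implementations cheap. The density and dimensions below are illustrative; the paper analyzes which densities suffice for each source model.

```python
import random

def sparse_random_matrix(rows, cols, density=0.05, seed=11):
    rng = random.Random(seed)
    return [[1 if rng.random() < density else 0 for _ in range(cols)]
            for _ in range(rows)]

def extract(M, x):
    """Matrix-vector product over GF(2): each output bit is a sparse parity."""
    return [sum(mij & xj for mij, xj in zip(row, x)) % 2 for row in M]

rng = random.Random(5)
x = [rng.randrange(2) for _ in range(256)]    # bits from an imperfect source
M = sparse_random_matrix(rows=100, cols=256)  # 256 raw bits -> 100 output bits
print("".join(map(str, extract(M, x))))
```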

    Simple extractors via constructions of cryptographic pseudo-random generators

    Trevisan has shown that constructions of pseudo-random generators from hard functions (the Nisan-Wigderson approach) also produce extractors. We show that constructions of pseudo-random generators from one-way permutations (the Blum-Micali-Yao approach) can be used for building extractors as well. Using this new technique we build extractors that use neither designs nor polynomial-based error-correcting codes and that are very simple and efficient. For example, one extractor produces each output bit separately in $O(\log^2 n)$ time. These extractors work for weak sources with min-entropy $\lambda n$, for arbitrary constant $\lambda > 0$, have seed length $O(\log^2 n)$, and their output length is $\approx n^{\lambda/3}$. Comment: 21 pages; an extended abstract will appear in Proc. ICALP 2005; small corrections, some comments and references added.
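
    The Blum-Micali-Yao skeleton being adapted: iterate a one-way permutation and output a hard-core bit at each step. The toy instantiation below uses discrete exponentiation modulo a small prime (5 is a primitive root of 10007, so the iteration map is a permutation), with a simplified output predicate standing in for Blum-Micali's hard-core bit; the parameters are far too small to be one-way and are for illustration only.

```python
def blum_micali(x0, n_bits, p=10007, g=5):
    """Iterate x -> g^x mod p; emit one (simplified) hard-core bit per step."""
    x, out = x0, []
    for _ in range(n_bits):
        x = pow(g, x, p)                          # the (one-way) permutation
        out.append(1 if x > (p - 1) // 2 else 0)  # simplified output predicate
    return out

print(blum_micali(x0=1234, n_bits=16))
```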