Two-Source Condensers with Low Error and Small Entropy Gap via Entropy-Resilient Functions
In their seminal work, Chattopadhyay and Zuckerman (STOC'16) constructed a two-source extractor with error epsilon for n-bit sources having min-entropy polylog(n/epsilon). Unfortunately, the construction's running time is poly(n/epsilon), which means that with polynomial-time constructions, only polynomially small errors are possible. Our main result is a poly(n, log(1/epsilon))-time computable two-source condenser. For any k >= polylog(n/epsilon), our condenser transforms two independent (n,k)-sources into a distribution over m = k - O(log(1/epsilon)) bits that is epsilon-close to having min-entropy m - o(log(1/epsilon)), hence achieving an entropy gap of o(log(1/epsilon)).
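In standard notation, the guarantee reads as follows (a sketch; the names Cond, X, Y, W are generic, not taken from the paper):

\[
H_\infty(X),\, H_\infty(Y) \ge k \;\Longrightarrow\; \mathrm{Cond}(X,Y) \approx_\epsilon W \ \text{for some } W \text{ with } H_\infty(W) \ge m - o(\log(1/\epsilon)),
\]

where \( H_\infty(Z) = \min_z \log_2(1/\Pr[Z=z]) \) is min-entropy, \( m = k - O(\log(1/\epsilon)) \), and \( \approx_\epsilon \) denotes statistical distance at most epsilon.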
The bottleneck for obtaining low error in recent constructions of two-source extractors lies in the use of resilient functions. Informally, this is a function that receives input bits from r players, with the property that the function's output has small bias even if a bounded number of corrupted players feed adversarial inputs after seeing the inputs of the other players. The drawback of using resilient functions is that the error cannot be smaller than ln r/r. This, in turn, forces the running time of the construction to be polynomial in 1/epsilon.
A key component in our construction is a variant of resilient functions which we call entropy-resilient functions. This variant can be seen as playing the above game for several rounds, each round outputting one bit. The goal of the corrupted players is to reduce, with as high probability as they can, the min-entropy accumulated throughout the rounds. We show that while the bias decreases only polynomially with the number of players in a one-round game, the corrupted players' success probability decreases exponentially in the entropy gap they are attempting to incur in a repeated game.
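Schematically, and hiding constants and the precise adversary model (this paraphrase is ours, not the paper's statement):

\[
\text{one round: } \mathrm{bias} \gtrsim \frac{\ln r}{r}, \qquad \text{repeated game: } \Pr[\text{min-entropy gap} \ge g] \le 2^{-\Omega(g)}.
\]

So driving the one-round bias below epsilon forces r, and hence the running time, to be polynomial in 1/epsilon, whereas in the repeated game a target error of epsilon costs only an O(log(1/epsilon))-bit entropy gap.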
Randomness Conductors and Constant-Degree Lossless Expanders [Extended Abstract]
The main concrete result of this paper is the first explicit construction of constant-degree lossless expanders. In these graphs, the expansion factor is almost as large as possible: (1-epsilon)D, where D is the degree and epsilon is an arbitrarily small constant. The best previous explicit constructions gave expansion factor D/2, which is too weak for many applications. The D/2 bound was obtained via the eigenvalue method, and it is known that that method cannot give better bounds.
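For reference, the standard notion (a sketch; the symbols G, S, Gamma, K are generic): a D-left-regular bipartite graph G is a (K, (1-epsilon)D) lossless expander if

\[
|S| \le K \;\Longrightarrow\; |\Gamma(S)| \ge (1-\epsilon) D |S|
\]

for every set S of left vertices, where \( \Gamma(S) \) is the set of neighbors of S. Since \( |\Gamma(S)| \) can never exceed \( D|S| \), expansion (1-epsilon)D means almost none of the D|S| edges leaving S are lost to collisions.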
The main abstract contribution of this paper is the introduction and initial study of randomness conductors, a notion which generalizes extractors, expanders, condensers and other similar objects. In all these functions, a certain guarantee on the input "entropy" is converted to a guarantee on the output "entropy". For historical reasons, specific objects used specific guarantees of different flavors. We show that the flexibility afforded by the conductor definition leads to interesting combinations of these objects, and to better constructions such as those above. The main technical tool in these constructions is a natural generalization to conductors of the zig-zag graph product, previously defined for expanders and extractors.
Better lossless condensers through derandomized curve samplers
Lossless condensers are unbalanced expander graphs with expansion close to optimal. Equivalently, they may be viewed as functions that use a short random seed to map a source on n bits to a source on many fewer bits while preserving all of the min-entropy. It is known how to build lossless condensers when the graphs are slightly unbalanced, as in the work of M. Capalbo et al. (2002). The highly unbalanced case is also important, but the only known construction does not condense the source well. We give explicit constructions of lossless condensers with condensing close to optimal, using near-optimal seed length. Our main technical contribution is a randomness-efficient method for sampling F^D (where F is a field) with low-degree curves. This problem was addressed before in the works of E. Ben-Sasson et al. (2003) and D. Moshkovitz and R. Raz (2006), but those solutions apply only to degree-one curves, i.e., lines. Our technique is new and elegant: we use sub-sampling and obtain our curve samplers by composing a sequence of low-degree manifolds, starting with high-dimension, low-degree manifolds and proceeding through lower and lower dimension manifolds with (moderately) growing degrees, until we finish with dimension-one, low-degree manifolds, i.e., curves. The technique may be of independent interest.
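In functional form, the object being constructed is the following (a sketch in standard notation; the name C and the parameters are generic): C : {0,1}^n x {0,1}^d -> {0,1}^m is an epsilon-lossless condenser for min-entropy k if, for every source X with \( H_\infty(X) \ge k \) and an independent uniform seed Y,

\[
(Y, C(X,Y)) \approx_\epsilon D \ \text{ for some } D \text{ with } H_\infty(D) \ge k + d,
\]

i.e., seed and output together retain all k + d bits of input entropy; "highly unbalanced" means m is much smaller than n.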
Extracting Randomness from Samplable Distributions
The standard notion of a randomness extractor is a procedure which converts any weak source of randomness into an almost uniform distribution. The conversion necessarily uses a small amount of pure randomness, which can be eliminated by complete enumeration in some, but not all, applications.
Here, we consider the problem of deterministically converting a weak source of randomness into an almost uniform distribution. Previously, deterministic extraction procedures were known only for sources satisfying strong independence requirements. In this paper, we look at sources which are samplable, i.e., can be generated by an efficient sampling algorithm. We seek an efficient deterministic procedure that, given a sample from any samplable distribution of sufficiently large min-entropy, gives an almost uniformly distributed output. We explore the conditions under which such deterministic extractors exist.
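The notion in play can be sketched as follows (standard notation; the name Ext is generic): Ext : {0,1}^n -> {0,1}^m is a deterministic epsilon-extractor for a class of sources if

\[
X \text{ in the class},\ H_\infty(X) \ge k \;\Longrightarrow\; \mathrm{Ext}(X) \approx_\epsilon U_m,
\]

where \( U_m \) is uniform on {0,1}^m; here the class consists of distributions generated by samplers of bounded complexity.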
We observe that no deterministic extractor exists if the sampler is allowed to use more computational resources than the extractor. On the other hand, if the extractor is allowed (polynomially) more resources than the sampler, we show that deterministic extraction becomes possible. This is true unconditionally in the nonuniform setting (i.e., when the extractor can be computed by a small circuit), and (necessarily) relies on complexity assumptions in the uniform setting.
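One standard way to see the impossibility (our reconstruction; the abstract does not spell out the argument): by pigeonhole there is an output z with \( |\mathrm{Ext}^{-1}(z)| \ge 2^{n-m} \), so the uniform distribution X on \( \mathrm{Ext}^{-1}(z) \) satisfies

\[
H_\infty(X) \ge n - m, \qquad \mathrm{Ext}(X) \equiv z,
\]

i.e., X meets the entropy requirement while the extractor's output on X is constant. A sampler can generate X by rejection sampling, drawing uniform x until Ext(x) = z, which takes at most \( 2^m \) evaluations of Ext in expectation, hence more resources than the extractor itself.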
One of our uniform constructions is as follows: assuming that there are problems in E = DTIME(2^{O(n)}) that are not solvable by subexponential-size circuits with Sigma_6 gates, there is an efficient extractor that transforms any samplable distribution of length n and min-entropy (1-gamma)n into an output distribution of length (1-O(gamma))n, where gamma is any sufficiently small constant. The running time of the extractor is polynomial in n and the circuit complexity of the sampler. These extractors are based on a connection between deterministic extraction from samplable distributions and hardness against nondeterministic circuits, and on the use of nondeterminism to substantially speed up "list decoding" algorithms for error-correcting codes such as multivariate polynomial codes and Hadamard-like codes.
Quantum-proof randomness extractors via operator space theory
Quantum-proof randomness extractors are an important building block for classical and quantum cryptography, as well as device-independent randomness amplification and expansion. Furthermore, they are also a useful tool in quantum Shannon theory. It is known that some extractor constructions are quantum-proof whereas others are provably not [Gavinsky et al., STOC'07]. We argue that the theory of operator spaces offers a natural framework for studying to what extent extractors are secure against quantum adversaries: we first phrase the definition of extractors as a bounded norm condition between normed spaces, and then show that the presence of quantum adversaries corresponds to a completely bounded norm condition between operator spaces. From this we show that very high min-entropy extractors as well as extractors with small output are always (approximately) quantum-proof. We also study a generalization of extractors called randomness condensers. We phrase the definition of condensers as a bounded norm condition and the definition of quantum-proof condensers as a completely bounded norm condition. Seeing condensers as bipartite graphs, we then find that the bounded norm condition corresponds to an instance of a well-studied combinatorial problem, called bipartite densest subgraph. Furthermore, using the characterization in terms of operator spaces, we can associate to any condenser a Bell inequality (two-player game) such that classical and quantum strategies are in one-to-one correspondence with classical and quantum attacks on the condenser. Hence, we get for every quantum-proof condenser (which includes in particular quantum-proof extractors) a Bell inequality that cannot be violated by quantum mechanics.
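For reference, the security notion being analyzed, in standard quantum side-information notation (a sketch, not quoted from the paper): an extractor Ext with uniform seed Y is quantum-proof with error epsilon if, for every classical-quantum state \( \rho_{XE} \) with conditional min-entropy \( H_{\min}(X|E) \ge k \),

\[
\left\| \rho_{\mathrm{Ext}(X,Y)\,Y E} - \frac{\mathbb{1}}{2^m} \otimes \rho_{Y E} \right\|_1 \le \epsilon,
\]

i.e., the output looks uniform even jointly with the seed and the adversary's quantum register E. The paper recasts this condition as a completely bounded norm, in contrast with the plain bounded norm that captures classical adversaries.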
The Bounded Storage Model in The Presence of a Quantum Adversary
An extractor is a function E that is used to extract randomness. Given an imperfect random source X and a uniform seed Y, the output E(X,Y) is close to uniform. We study properties of such functions in the presence of prior quantum information about X, with a particular focus on cryptographic applications. We prove that certain extractors are suitable for key expansion in the bounded storage model, where the adversary has a limited amount of quantum memory. For extractors with one-bit output we show that the extracted bit is essentially as secure as in the case where the adversary has classical resources. We prove the security of certain constructions that output multiple bits in the bounded storage model.
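The underlying security requirement can be sketched in the same standard notation (ours, not the paper's wording): if the adversary's quantum memory Q holds at most q qubits of information about X, the extracted key is epsilon-secure when

\[
\left\| \rho_{E(X,Y)\,Y Q} - \frac{\mathbb{1}}{2^m} \otimes \rho_{Y Q} \right\|_1 \le \epsilon,
\]

so the key is nearly uniform even given the public seed Y and the bounded quantum storage Q.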