Two-sources Randomness Extractors for Elliptic Curves
This paper studies the task of two-source randomness extraction for elliptic
curves defined over finite fields $\mathbb{F}_q$, where $\mathbb{F}_q$ can be a prime or a binary
field. In fact, we introduce new constructions of functions over elliptic
curves which take as input two random points from two different subgroups. In
other words, for a given elliptic curve $E$ defined over a finite field $\mathbb{F}_q$
and two random points $P \in \mathcal{P}$ and $Q \in \mathcal{Q}$, where $\mathcal{P}$ and $\mathcal{Q}$ are two subgroups of
$E(\mathbb{F}_q)$, our function extracts the $k$ least significant bits of the
abscissa of the point $P \oplus Q$ when $q$ is a large prime, and the $k$ first
coefficients of the abscissa of the point $P \oplus Q$ when $q = p^n$, where $p$ is a prime greater than $2^k$. We show that the extracted bits
are close to uniform.
Our construction extends some interesting randomness extractors for elliptic
curves, namely those defined in \cite{op} and \cite{ciss1,ciss2}, when
$\mathcal{P} = \mathcal{Q} = E(\mathbb{F}_q)$. The proposed constructions can be used in any
cryptographic scheme which requires extraction of random bits from two sources
over elliptic curves, namely in key exchange protocols, the design of strong
pseudo-random number generators, etc.
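The core extraction step above can be illustrated with a toy sketch: add two points on a small elliptic curve over a prime field and output the k least significant bits of the x-coordinate of their sum. The curve y^2 = x^3 + 2x + 3 over p = 97 and the sample points are illustrative choices, not parameters from the paper.

```python
# Toy illustration (not the paper's parameters): extract the k least
# significant bits of the abscissa (x-coordinate) of P + Q on a small
# elliptic curve y^2 = x^3 + a*x + b over F_p.
p = 97          # small illustrative prime field
a, b = 2, 3     # curve coefficients, chosen for the example

def ec_add(P, Q):
    """Affine point addition on E(F_p); None denotes the point at infinity."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                   # P + (-P) = O
    if P == Q:                                        # doubling
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:                                             # chord
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def extract_lsb(P, Q, k):
    """k least significant bits of x(P + Q), as a single integer."""
    R = ec_add(P, Q)
    return R[0] & ((1 << k) - 1)
```

For instance, (0, 10) and (3, 6) both lie on this curve, and `extract_lsb((0, 10), (3, 6), 3)` returns the low 3 bits of the sum's abscissa. The closeness-to-uniform guarantee in the paper of course requires a large prime and random points from the stated subgroups; the sketch only shows the mechanics.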
Samplers and Extractors for Unbounded Functions
Blasiok (SODA'18) recently introduced the notion of a subgaussian sampler, defined as an averaging sampler for approximating the mean of functions f from {0,1}^m to the real numbers such that f(U_m) has subgaussian tails, and asked for explicit constructions. In this work, we give the first explicit constructions of subgaussian samplers (and in fact averaging samplers for the broader class of subexponential functions) that match the best known constructions of averaging samplers for [0,1]-bounded functions in the regime of parameters where the approximation error epsilon and failure probability delta are subconstant. Our constructions are established via an extension of the standard notion of randomness extractor (Nisan and Zuckerman, JCSS'96) where the error is measured by an arbitrary divergence rather than total variation distance, and a generalization of Zuckerman's equivalence (Random Struct. Alg.'97) between extractors and samplers. We believe that the framework we develop, and specifically the notion of an extractor for the Kullback-Leibler (KL) divergence, are of independent interest. In particular, KL-extractors are stronger than both standard extractors and subgaussian samplers, but we show that they exist with essentially the same parameters (constructively and non-constructively) as standard extractors.
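To make the notion of an averaging sampler concrete, here is a classical baseline (not the paper's subgaussian construction): pairwise-independent sample points generated from a short seed, which by Chebyshev's inequality estimate the mean of any [0,1]-bounded f with error epsilon and failure probability delta using O(1/(epsilon^2 * delta)) samples. The Mersenne-prime modulus and the hash family are illustrative choices.

```python
import random

def pairwise_samples(m, t, prime=(1 << 61) - 1):
    """Generate t pairwise-independent points in {0, ..., 2^m - 1} from a
    short seed (a, b), via the hash family x_i = ((a*i + b) mod prime) mod 2^m.
    The final reduction mod 2^m is only near-uniform, which is fine when
    prime >> 2^m (an illustrative simplification)."""
    a = random.randrange(1, prime)
    b = random.randrange(prime)
    return [((a * i + b) % prime) % (1 << m) for i in range(t)]

def estimate_mean(f, m, t):
    """Averaging-sampler estimate of E[f(U_m)] for f: {0,1}^m -> [0,1]."""
    xs = pairwise_samples(m, t)
    return sum(f(x) for x in xs) / t
```

This sampler uses a seed of only ~2 field elements, illustrating the seed/accuracy trade-off the abstract discusses; the paper's contribution is to achieve comparable parameters for the much wider class of subgaussian and subexponential f.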
Quantum to Classical Randomness Extractors
The goal of randomness extraction is to distill (almost) perfect randomness
from a weak source of randomness. When the source yields a classical string X,
many extractor constructions are known. Yet, when considering a physical
randomness source, X is itself ultimately the result of a measurement on an
underlying quantum system. When characterizing the power of a source to supply
randomness it is hence a natural question to ask, how much classical randomness
we can extract from a quantum system. To tackle this question we here take on
the study of quantum-to-classical randomness extractors (QC-extractors). We
provide constructions of QC-extractors based on measurements in a full set of
mutually unbiased bases (MUBs), and certain single qubit measurements. As the
first application, we show that any QC-extractor gives rise to entropic
uncertainty relations with respect to quantum side information. Such relations
were previously only known for two measurements. As the second application, we
resolve the central open question in the noisy-storage model [Wehner et al.,
PRL 100, 220502 (2008)] by linking security to the quantum capacity of the
adversary's storage device.
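The "mutually unbiased" condition underlying the construction can be checked concretely in the simplest case: for a single qubit (d = 2), the eigenbases of the Pauli Z, X, and Y operators form a full set of three MUBs, meaning every pair of vectors from different bases satisfies |<e_i|f_j>|^2 = 1/d. A small numpy verification, purely illustrative:

```python
import numpy as np

# Columns of each matrix are the basis vectors: the eigenbases of the
# Pauli Z, X and Y operators, a full set of 3 MUBs for a qubit (d = 2).
Z = np.eye(2, dtype=complex)
X = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
Y = np.array([[1, 1], [1j, -1j]], dtype=complex) / np.sqrt(2)

def unbiased(B1, B2, d=2):
    """True iff |<e_i|f_j>|^2 == 1/d for every pair of basis vectors."""
    overlaps = np.abs(B1.conj().T @ B2) ** 2
    return np.allclose(overlaps, 1.0 / d)
```

Measuring in a basis unbiased with respect to the state's preparation basis yields a uniformly random outcome, which is the intuition behind extracting classical randomness via MUB measurements; the paper's QC-extractors make this quantitative against quantum side information.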
Efficiently Extracting Randomness from Imperfect Stochastic Processes
We study the problem of extracting a prescribed number of random bits by
reading the smallest possible number of symbols from non-ideal stochastic
processes. The related interval algorithm proposed by Han and Hoshi has
asymptotically optimal performance; however, it assumes that the distribution
of the input stochastic process is known. The motivation for our work is the
fact that, in practice, sources of randomness have inherent correlations and
are affected by measurement noise; that is, it is hard to obtain an accurate
estimation of the distribution. This challenge was addressed by the concepts of
seeded and seedless extractors that can handle general random sources with
unknown distributions. However, known seeded and seedless extractors provide
extraction efficiencies that are substantially smaller than Shannon's entropy
limit. Our main contribution is the design of extractors that have a variable
input-length and a fixed output length, are efficient in the consumption of
symbols from the source, are capable of generating random bits from general
stochastic processes and approach the information theoretic upper bound on
efficiency.
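A classical point of reference for variable-input-length, fixed-output-length extraction (not the paper's construction) is the von Neumann trick: read symbol pairs from a coin of unknown bias and keep one bit per unequal pair. It handles any unknown i.i.d. bias but wastes entropy, which illustrates the efficiency gap the paper targets.

```python
def von_neumann_extract(bits, n_out):
    """Von Neumann extractor: read bit pairs from an i.i.d. source of
    unknown bias until n_out unbiased bits are produced.
    Pair 01 -> output 0, pair 10 -> output 1, pairs 00/11 -> discarded.
    Variable input length, fixed output length."""
    it = iter(bits)
    out = []
    while len(out) < n_out:
        a, b = next(it), next(it)
        if a != b:
            out.append(a)   # P(10) == P(01) for any i.i.d. bias
    return out
```

For a coin with heads probability p it consumes on average 1/(p(1 - p)) input bits per output bit, far above the Shannon limit of 1/h(p); closing (most of) that gap for general stochastic processes is the contribution described above.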
Trevisan's extractor in the presence of quantum side information
Randomness extraction involves the processing of purely classical information
and is therefore usually studied in the framework of classical probability
theory. However, such a classical treatment is generally too restrictive for
applications, where side information about the values taken by classical random
variables may be represented by the state of a quantum system. This is
particularly relevant in the context of cryptography, where an adversary may
make use of quantum devices. Here, we show that the well known construction
paradigm for extractors proposed by Trevisan is sound in the presence of
quantum side information.
We exploit the modularity of this paradigm to give several concrete extractor
constructions, which, e.g., extract all the conditional (smooth) min-entropy of
the source using a seed of length poly-logarithmic in the input, or only
require the seed to be weakly random.