Extensions to the Method of Multiplicities, with applications to Kakeya Sets and Mergers
We extend the "method of multiplicities" to get the following results, of
interest in combinatorics and randomness extraction. (A) We show that every
Kakeya set (a set of points that contains a line in every direction) in
\F_q^n must be of size at least q^n/2^n. This bound is tight to within a
2 + o(1) factor for every n as q \to \infty, compared to previous bounds
that were off by exponential factors in n. (B) We give improved randomness
extractors and "randomness mergers". Mergers are seeded functions that take as
input A (possibly correlated) random variables in \{0,1\}^N and a
short random seed and output a single random variable in \{0,1\}^N that is
statistically close to having entropy (1 - \delta) \cdot N when one of the A
input variables is distributed uniformly. The seed we require is only
(1/\delta) \cdot \log A bits long, which significantly improves upon
previous constructions of mergers. (C) Using our new mergers, we show how to
construct randomness extractors that use logarithmic length seeds while
extracting a 1 - o(1) fraction of the min-entropy of the source.
The "method of multiplicities", as used in prior work, analyzed subsets of
vector spaces over finite fields by constructing somewhat low degree
interpolating polynomials that vanish on every point in the subset {\em with
high multiplicity}. The typical use of this method involved showing that the
interpolating polynomial also vanished on some points outside the subset, and
then using simple bounds on the number of zeroes to complete the analysis. Our
augmentation to this technique is that we prove, under appropriate conditions,
that the interpolating polynomial vanishes {\em with high multiplicity} outside
the set. This novelty leads to significantly tighter analyses.

Comment: 26 pages, now includes extractors with sublinear entropy loss.
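The Kakeya lower bound above can be sanity-checked by brute force in tiny cases. The sketch below (our own illustration; the function names are hypothetical, not from the paper) enumerates subsets of \F_q^2 and finds the smallest set containing a line in every direction, which can then be compared against the q^n/2^n lower bound:

```python
from itertools import combinations, product

def lines(q):
    """All affine lines in F_q^2, grouped by direction: slopes (1, m)
    for m in F_q, plus the vertical direction (0, 1)."""
    dirs = [(1, m) for m in range(q)] + [(0, 1)]
    grouped = []
    for dx, dy in dirs:
        per_dir = set()
        for bx, by in product(range(q), repeat=2):
            per_dir.add(frozenset(((bx + t * dx) % q, (by + t * dy) % q)
                                  for t in range(q)))
        grouped.append(per_dir)
    return grouped

def min_kakeya(q):
    """Brute-force the smallest Kakeya set in F_q^2 (feasible only for tiny q)."""
    pts = list(product(range(q), repeat=2))
    all_lines = lines(q)
    for size in range(1, q * q + 1):
        for cand in combinations(pts, size):
            s = set(cand)
            if all(any(line <= s for line in per_dir) for per_dir in all_lines):
                return size
    return q * q

print(min_kakeya(2))  # -> 3, versus the lower bound q^n/2^n = 1
```

For q = 2 the minimum is 3: for example {(0,0), (1,0), (0,1)} contains a full line in each of the three directions.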
Kakeya sets and the method of multiplicities
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2009. Cataloged from PDF version of thesis. Includes bibliographical references (p. 51-53). We extend the "method of multiplicities" to get the following results, of interest in combinatorics and randomness extraction. 1. We show that every Kakeya set (a set of points that contains a line in every direction) in F_q^n must be of size at least q^n/2^n. This bound is tight to within a 2 + o(1) factor for every n as q \to \infty, compared to previous bounds that were off by exponential factors in n. 2. We give improved randomness extractors and "randomness mergers". Mergers are seeded functions that take as input A (possibly correlated) random variables in {0, 1}^N and a short random seed and output a single random variable in {0, 1}^N that is statistically close to having entropy (1 - \delta) \cdot N when one of the A input variables is distributed uniformly. The seed we require is only (1/\delta) \cdot \log A bits long, which significantly improves upon previous constructions of mergers. 3. Using our new mergers, we show how to construct randomness extractors that use logarithmic length seeds while extracting a 1 - o(1) fraction of the min-entropy of the source. The "method of multiplicities", as used in prior work, analyzed subsets of vector spaces over finite fields by constructing somewhat low degree interpolating polynomials that vanish on every point in the subset with high multiplicity. The typical use of this method involved showing that the interpolating polynomial also vanished on some points outside the subset, and then using simple bounds on the number of zeroes to complete the analysis. Our augmentation to this technique is that we prove, under appropriate conditions, that the interpolating polynomial vanishes with high multiplicity outside the set. This novelty leads to significantly tighter analyses. by Shubhangi Saraf. S.M.
Better short-seed quantum-proof extractors
We construct a strong extractor against quantum storage that works for every
min-entropy k, has logarithmic seed length, and outputs \Omega(k) bits,
provided that the quantum adversary has at most \beta n qubits of memory, for
any \beta < 1/2. The construction works by first condensing the source
(with minimal entropy loss) and then applying an extractor that works well
against quantum adversaries when the source is close to uniform.

We also obtain an improved construction of a strong quantum-proof extractor
in the high min-entropy regime. Specifically, we construct an extractor that
uses a logarithmic seed length and extracts \Omega(n) bits from any source
over \{0,1\}^n, provided that the min-entropy of the source conditioned on the
quantum adversary's state is at least (1/2 + \beta)n, for any \beta < 1/2.

Comment: 14 pages.
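As a toy illustration of seeded extraction in the classical setting (this makes no quantum-proof claims, and the names are our own), consider the inner-product extractor Ext(x, s) = <x, s> mod 2. A standard collision argument shows that, averaged over all 2^n seeds, the squared bias of the output bit is exactly 1/|S| for any flat source S, which is the kind of second-moment computation underlying many extractor analyses:

```python
import random

def avg_sq_bias(S, n):
    """For Ext(x, s) = <x, s> mod 2, average over all 2^n seeds s of the
    squared bias of the output bit when x is uniform on S. The collision
    argument gives exactly 1/|S|: E_s[bias^2] = Pr_{x,x'}[x = x']."""
    total = 0.0
    for s in range(2 ** n):
        bias = sum((-1) ** bin(x & s).count("1") for x in S) / len(S)
        total += bias * bias
    return total / 2 ** n

rng = random.Random(1)
n, k = 8, 5
S = rng.sample(range(2 ** n), 2 ** k)   # a flat source with min-entropy k
print(avg_sq_bias(S, n))  # -> exactly 1/32 = 0.03125
```

The seed here is n bits long, far from logarithmic; the constructions in the abstract above are precisely about doing much better.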
Extracting Mergers and Projections of Partitions
We study the problem of extracting randomness from somewhere-random sources,
and related combinatorial phenomena: partition analogues of Shearer's lemma on
projections.
A somewhere-random source is a tuple of (possibly correlated) random
variables where, for some unknown index i, the i-th variable is guaranteed to
be uniformly distributed. An extracting merger is a seeded device that takes a
somewhere-random source as input and outputs nearly uniform random bits. We
study the seed-length needed for extracting mergers with a constant number of
blocks and constant error. We show:
1. Just like in the case of standard extractors, seedless extracting
mergers with even just one output bit do not exist.
2. Unlike the case of standard extractors, it is possible to have
extracting mergers that output a constant number of bits using only a
constant-length seed. Furthermore, a random choice of merger does not work for
this purpose!
3. Nevertheless, just like in the case of standard extractors, an
extracting merger which gets most of the entropy out (namely, one whose output
length is nearly the full block length) must have a seed whose length grows.
This is the main technical result of our work, and is proved by a
second-moment strengthening of the graph-theoretic approach of Radhakrishnan
and Ta-Shma to extractors.
4. In contrast, seed-length/output-length tradeoffs for condensing mergers
(where the output is only required to have high min-entropy) can be fully
explained by using standard condensers.
Inspired by such considerations, we also formulate a new and basic class of
problems in combinatorics: partition analogues of Shearer's lemma. We show
basic results in this direction; in particular, we prove that in any partition
of the high-dimensional cube into two parts, one of the parts has an
axis-parallel lower-dimensional projection of large area.

Comment: Full version of the paper accepted to the International Conference on
Randomization and Computation (RANDOM) 2023. 28 pages, 2 figures.
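To see why merging needs care even with a seed, consider the naive merger that simply uses its seed to select one block uniformly at random. An exact computation (our own toy calculation, not from the paper) shows that against the worst somewhere-random source, where every block other than the uniform one is stuck at 0, the output stays far from uniform in statistical distance:

```python
from fractions import Fraction

def trivial_merger_tv(t, n):
    """Statistical (TV) distance from uniform for the 'pick a random block'
    merger on a worst-case somewhere-random source with t blocks of n bits:
    one block uniform, the other t-1 blocks stuck at the all-zero string."""
    N = 2 ** n
    u = Fraction(1, N)
    # Output distribution: with prob 1/t the seed hits the uniform block,
    # otherwise the merger copies the constant 0-block.
    p = [Fraction(t - 1, t) + Fraction(1, t) * u if x == 0 else Fraction(1, t) * u
         for x in range(N)]
    return sum(abs(px - u) for px in p) / 2

print(trivial_merger_tv(3, 3))  # -> 7/12
```

So with t = 3 blocks of 3 bits, seeded selection alone leaves the output at distance 7/12 from uniform; a genuine extracting merger has to mix information across blocks rather than pick one.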
The method of multiplicities
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2011.Cataloged from PDF version of thesis.Includes bibliographical references (p. 93-98).Polynomials have played a fundamental role in the construction of objects with interesting combinatorial properties, such as error correcting codes, pseudorandom generators and randomness extractors. Somewhat strikingly, polynomials have also been found to be a powerful tool in the analysis of combinatorial parameters of objects that have some algebraic structure. This method of analysis has found applications in works on list-decoding of error correcting codes, constructions of randomness extractors, and in obtaining strong bounds for the size of Kakeya Sets. Remarkably, all these applications have relied on very simple and elementary properties of polynomials such as the sparsity of the zero sets of low degree polynomials. In this thesis we improve on several of the results mentioned above by a more powerful application of polynomials that takes into account the information contained in the derivatives of the polynomials. We call this technique the method of multiplicities. The derivative polynomials encode information about the high multiplicity zeroes of the original polynomial, and by taking into account this information, we are about to meaningfully reason about the zero sets of polynomials of degree much higher than the underlying field size. This freedom of using high degree polynomials allows us to obtain new and improved constructions of error correcting codes, and qualitatively improved analyses of Kakeya sets and randomness extractors.by Shubhangi Saraf.Ph.D
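The central notion in the thesis, the multiplicity of a zero of a polynomial, is easy to compute in small examples: shift the point of interest to the origin and read off the smallest total degree of a surviving monomial. A minimal pure-Python sketch (helper names are our own; bivariate case only):

```python
from itertools import product
from math import comb

# A bivariate polynomial is a dict {(i, j): coeff} meaning sum c * x^i * y^j.

def shift(poly, a, b):
    """Substitute x -> x + a, y -> y + b via the binomial theorem."""
    out = {}
    for (i, j), c in poly.items():
        for k, l in product(range(i + 1), range(j + 1)):
            term = c * comb(i, k) * comb(j, l) * a ** (i - k) * b ** (j - l)
            out[(k, l)] = out.get((k, l), 0) + term
    return {m: c for m, c in out.items() if c != 0}

def vanishing_multiplicity(poly, point):
    """Multiplicity of the zero of `poly` at `point`: the smallest total
    degree of a monomial that survives after moving `point` to the origin."""
    shifted = shift(poly, *point)
    return min(i + j for (i, j) in shifted) if shifted else float("inf")

# p(x, y) = (x - 1)^2 * y = x^2 y - 2 x y + y
p = {(2, 1): 1, (1, 1): -2, (0, 1): 1}
print(vanishing_multiplicity(p, (1, 0)))  # -> 3
```

The derivative polynomials mentioned in the abstract (Hasse derivatives) encode exactly these low-degree coefficients of the shifted polynomial.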
Linear Hashing with \ell_\infty guarantees and two-sided Kakeya bounds
We show that a randomly chosen linear map over a finite field gives a good
hash function in the \ell_\infty sense. More concretely, consider a set
S \subset \F_q^n and a randomly chosen linear map L : \F_q^n \to \F_q^t, with
q^t taken to be sufficiently smaller than |S|. Let U_S denote a random
variable distributed uniformly on S. Our main theorem shows that, with high
probability over the choice of L, the random variable L(U_S) is close to
uniform in the \ell_\infty norm. In other words, every element in the range
\F_q^t has about the same number of elements in S mapped to it. This
complements the widely-used Leftover Hash Lemma (LHL), which proves the
analogous statement under the statistical, or \ell_1, distance (for a richer
class of functions), as well as prior work on the expected largest 'bucket
size' in linear hash functions [ADMPT99]. Our proof leverages a connection
between linear hashing and the finite field Kakeya problem and extends some of
the tools developed in this area, in particular the polynomial method.
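The statement can be illustrated empirically: draw a random linear map over F_q and histogram how a random set S spreads over the q^t buckets. The sketch below (parameter choices and function names are ours, purely illustrative) computes the bucket counts and a normalized \ell_\infty deviation from uniform:

```python
import random

def linear_hash_demo(q=5, n=4, t=2, sample=200, seed=0):
    """Pick a random set S in F_q^n and a random linear map L: F_q^n -> F_q^t,
    then measure how evenly L spreads S over the q^t buckets."""
    rng = random.Random(seed)
    S = rng.sample(range(q ** n), sample)        # vectors encoded as base-q ints
    L = [[rng.randrange(q) for _ in range(n)] for _ in range(t)]

    def digits(v):                               # decode an int into a vector
        return [(v // q ** i) % q for i in range(n)]

    def apply(v):                                # matrix-vector product mod q
        x = digits(v)
        img = [sum(r * d for r, d in zip(row, x)) % q for row in L]
        return sum(c * q ** i for i, c in enumerate(img))

    counts = [0] * (q ** t)
    for v in S:
        counts[apply(v)] += 1
    ideal = len(S) / (q ** t)
    linf = max(abs(c - ideal) for c in counts) / len(S)
    return counts, linf

counts, linf = linear_hash_demo()
print(sum(counts), linf)
```

For parameters in the theorem's regime (q^t well below |S|), the reported \ell_\infty deviation is small: every bucket receives close to |S|/q^t elements.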
Complexity Theory
Computational Complexity Theory is the mathematical study of the intrinsic power and limitations of computational resources like time, space, or randomness. The current workshop focused on recent developments in various sub-areas including arithmetic complexity, Boolean complexity, communication complexity, cryptography, probabilistic proof systems, pseudorandomness, and quantum computation. Many of the developements are related to diverse mathematical fields such as algebraic geometry, combinatorial number theory, probability theory, quantum mechanics, representation theory, and the theory of error-correcting codes