A Note on the Entropy/Influence Conjecture
The entropy/influence conjecture, raised by Friedgut and Kalai in 1996, seeks
to relate two different measures of concentration of the Fourier coefficients
of a Boolean function. Roughly speaking, it claims that if the Fourier spectrum
is "smeared out", then the Fourier coefficients are concentrated on "high"
levels. In this note we generalize the conjecture to biased product measures on
the discrete cube, and prove a variant of the conjecture for functions with an
extremely low Fourier weight on the "high" levels.
Comment: 12 pages
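For intuition about the two measures of concentration the conjecture relates, both can be computed exactly for small Boolean functions. A minimal sketch, assuming the standard definitions of spectral entropy and total influence over {-1,1}^n (the function and variable names are mine, chosen for illustration):

```python
import itertools
import math

def fourier_coefficients(f, n):
    """All Fourier coefficients f_hat(S) = E_x[f(x) chi_S(x)], x uniform
    on {-1,1}^n, chi_S(x) = prod_{i in S} x_i."""
    points = list(itertools.product((-1, 1), repeat=n))
    subsets = itertools.chain.from_iterable(
        itertools.combinations(range(n), k) for k in range(n + 1))
    return {S: sum(f(x) * math.prod(x[i] for i in S) for x in points) / len(points)
            for S in subsets}

def spectral_entropy(coeffs):
    """H(f) = sum_S f_hat(S)^2 * log2(1 / f_hat(S)^2): large when the
    spectrum is smeared out over many coefficients."""
    return sum(c * c * math.log2(1 / (c * c))
               for c in coeffs.values() if abs(c) > 1e-12)

def total_influence(coeffs):
    """I(f) = sum_S |S| * f_hat(S)^2: large when the Fourier weight
    sits on high levels."""
    return sum(len(S) * c * c for S, c in coeffs.items())

maj3 = lambda x: 1 if sum(x) > 0 else -1  # majority of 3 bits
c = fourier_coefficients(maj3, 3)
print(spectral_entropy(c), total_influence(c))  # 2.0 and 1.5 for Maj3
```

For Maj3 the nonzero coefficients are the three singletons and the full set, each of squared weight 1/4, giving entropy 2.0 and total influence 1.5; the conjecture asserts the first is always at most a universal constant times the second.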
Geometric Influences II: Correlation Inequalities and Noise Sensitivity
In a recent paper, we presented a new definition of influences in product
spaces of continuous distributions, and showed that analogues of the most
fundamental results on discrete influences, such as the KKL theorem, hold for
the new definition in Gaussian space. In this paper we prove Gaussian analogues
of two of the central applications of influences: Talagrand's lower bound on
the correlation of increasing subsets of the discrete cube, and the
Benjamini-Kalai-Schramm (BKS) noise sensitivity theorem. We then use the
Gaussian results to obtain analogues of Talagrand's bound for all discrete
probability spaces and to reestablish analogues of the BKS theorem for biased
two-point product spaces.
Comment: 20 pages
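The notion of noise sensitivity at the heart of the BKS theorem is easy to probe empirically: a function is noise sensitive when a small amount of independent bit-flipping noise decorrelates its value. A minimal Monte Carlo sketch (parameter choices are mine, for illustration) contrasting majority, which is noise stable, with parity, which is maximally noise sensitive:

```python
import math
import random

def noise_sensitivity(f, n, eps, trials=20000, seed=0):
    """Monte Carlo estimate of NS_eps(f) = Pr[f(x) != f(y)], where x is
    uniform on {-1,1}^n and y flips each bit of x independently w.p. eps."""
    rng = random.Random(seed)
    disagreements = 0
    for _ in range(trials):
        x = [rng.choice((-1, 1)) for _ in range(n)]
        y = [-xi if rng.random() < eps else xi for xi in x]
        disagreements += f(x) != f(y)
    return disagreements / trials

majority = lambda x: 1 if sum(x) > 0 else -1  # noise stable
parity = lambda x: math.prod(x)               # maximally noise sensitive

n, eps = 101, 0.05
print("majority:", noise_sensitivity(majority, n, eps))  # near 0.14
print("parity:  ", noise_sensitivity(parity, n, eps))    # near 0.5
```

At eps = 0.05 the parity estimate is already near 1/2 (its value is scrambled almost completely), while the majority estimate stays near the asymptotic value (1 - (2/pi) arcsin(0.9)) / 2 ≈ 0.14.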
A directed isoperimetric inequality with application to Bregman near neighbor lower bounds
Bregman divergences are a class of divergences parametrized by a convex
function; they include well-known distance functions such as squared Euclidean
distance and the Kullback-Leibler divergence. There has been extensive research
on algorithms for problems like clustering and near neighbor search with
respect to Bregman divergences. In all cases, the algorithms depend not just on
the data size and dimensionality, but also on a structure constant that depends
solely on the underlying convex function and can grow without bound,
independently of the data.
In this paper, we provide the first evidence that this dependence on the
structure constant might be intrinsic. We focus on the problem of approximate
near neighbor search for Bregman divergences. We show that in the cell probe
model, any non-adaptive data structure (like locality-sensitive hashing) for
approximate near-neighbor search making a bounded number of probes must use
space that grows with the structure constant. In contrast, the best known
bounds for LSH under standard norms carry no such dependence.
Our new tool is a directed variant of the standard boolean noise operator. We
show that a generalization of the Bonami-Beckner hypercontractivity inequality
exists "in expectation" or upon restriction to certain subsets of the Hamming
cube, and that this is sufficient to prove the desired isoperimetric inequality
that we use in our data structure lower bound.
We also present a structural result reducing the Hamming cube to a Bregman
cube. This structure allows us to obtain lower bounds for problems under
Bregman divergences from their Hamming-cube analogues. In particular, we get a
(weaker) lower bound for approximate near neighbor search with non-adaptive
data structures, and new cell probe lower bounds for a number of other near
neighbor questions in Bregman space.
Comment: 27 pages
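As background on the divergences involved: a Bregman divergence is defined by D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y> for a convex phi. A minimal sketch with two standard choices of phi (the helper names are mine, not the paper's):

```python
import math

def bregman(phi, grad_phi, x, y):
    """D_phi(x, y) = phi(x) - phi(y) - <grad_phi(y), x - y>."""
    g = grad_phi(y)
    return phi(x) - phi(y) - sum(gi * (xi - yi) for gi, xi, yi in zip(g, x, y))

# phi(x) = ||x||^2 yields the squared Euclidean distance.
sq = lambda x: sum(xi * xi for xi in x)
sq_grad = lambda x: [2 * xi for xi in x]

# phi(p) = sum_i p_i log p_i (negative entropy) yields KL(x || y)
# when x and y are probability vectors.
negent = lambda p: sum(pi * math.log(pi) for pi in p)
negent_grad = lambda p: [math.log(pi) + 1 for pi in p]

x, y = [0.2, 0.8], [0.5, 0.5]
print(bregman(sq, sq_grad, x, y))          # 0.18 = ||x - y||^2
print(bregman(negent, negent_grad, x, y))  # KL(x || y), about 0.1927
# Note the asymmetry: in general D(x, y) != D(y, x).
```

The asymmetry and the unbounded curvature of phi are exactly what make Bregman divergences behave unlike metrics, which is the informal reason a structure constant enters the algorithmic bounds.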
Noise stability of functions with low influences: invariance and optimality
In this paper we study functions with low influences on product probability
spaces. The analysis of boolean functions with low influences has become a
central problem in discrete Fourier analysis. It is motivated by fundamental
questions arising from the construction of probabilistically checkable proofs
in theoretical computer science and from problems in the theory of social
choice in economics.
We prove an invariance principle for multilinear polynomials with low
influences and bounded degree; it shows that under mild conditions the
distribution of such polynomials is essentially invariant for all product
spaces. Ours is one of the very few known non-linear invariance principles. It
has the advantage that its proof is simple and that the error bounds are
explicit. We also show that the assumption of bounded degree can be eliminated
if the polynomials are slightly ``smoothed''; this extension is essential for
our applications to ``noise stability''-type problems.
In particular, as applications of the invariance principle we prove two
conjectures: the ``Majority Is Stablest'' conjecture from theoretical computer
science, which was the original motivation for this work, and the ``It Ain't
Over Till It's Over'' conjecture from social choice theory
- …
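The ``Majority Is Stablest'' statement concerns the noise stability Stab_rho(f) = E[f(x) f(y)] over rho-correlated inputs: among functions with low influences, majority asymptotically maximizes it, approaching (2/pi) arcsin(rho). A minimal Monte Carlo sketch (parameters are illustrative, not from the paper):

```python
import math
import random

def stability(f, n, rho, trials=20000, seed=1):
    """Monte Carlo estimate of Stab_rho(f) = E[f(x) f(y)] for
    rho-correlated uniform x, y in {-1,1}^n."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        x = [rng.choice((-1, 1)) for _ in range(n)]
        # y_i agrees with x_i with probability (1 + rho) / 2, independently
        y = [xi if rng.random() < (1 + rho) / 2 else -xi for xi in x]
        total += f(x) * f(y)
    return total / trials

majority = lambda x: 1 if sum(x) > 0 else -1

rho, n = 0.6, 101
print(stability(majority, n, rho))   # close to the limiting value below
print(2 / math.pi * math.asin(rho))  # (2/pi) arcsin(rho), about 0.4097
```

A dictator function f(x) = x_1 would achieve the larger stability rho = 0.6, but it has an influential coordinate; the theorem says no low-influence function can beat majority's (2/pi) arcsin(rho) by more than o(1).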