399 research outputs found
A Cryptographic Proof of Regularity Lemmas: Simpler Unified Proofs and Refined Bounds
In this work we present a short and unified proof of the Strong and Weak Regularity Lemmas, based on the cryptographic technique of \emph{low-complexity approximations}. In short, both problems
reduce to the task of constructively finding an approximation of a certain target function under a class of distinguishers (test functions), where the distinguishers are combinations of simple rectangle indicators.
In our case these approximations can be learned by a simple iterative procedure, which yields a unified and simple proof, achieving, for any graph density and any approximation parameter, a partition size of
\begin{itemize}
\item a tower of 2's whose height depends on the approximation parameter, for a variant of Strong Regularity,
\item a power of 2 whose exponent depends on the approximation parameter, for Weak Regularity.
\end{itemize}
The novelty in our proof is as follows: (a) a simple approach which yields both the strong and the weak variant, and (b)
improvements for sparse graphs.
At an abstract level, our proof can be seen as a refinement and simplification of the ``analytic'' proof given by Lovász and Szegedy.
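The iterative procedure behind low-complexity approximations can be pictured as a boosting-style loop: as long as some rectangle distinguisher correlates noticeably with the residual between the target and the current approximation, add a small multiple of that distinguisher to the approximation. The following is a minimal numerical sketch of that idea; the toy setup (random graph, random rectangles, all parameter choices) is ours, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64
# Target: adjacency matrix of a random graph (the function to approximate).
G = (rng.random((n, n)) < 0.3).astype(float)

# Distinguishers: rectangle indicators 1_S(x) * 1_T(y) for random vertex sets.
def rectangle():
    return np.outer(rng.random(n) < 0.5, rng.random(n) < 0.5).astype(float)

rects = [rectangle() for _ in range(200)]

eps = 0.05   # approximation parameter: maximum tolerated correlation
eta = eps    # step size of the additive update
A = np.zeros((n, n))   # the low-complexity approximation, built iteratively

while True:
    residual = G - A
    corrs = [(residual * R).sum() / n**2 for R in rects]
    best = int(np.argmax(np.abs(corrs)))
    if abs(corrs[best]) < eps:   # no distinguisher sees the residual: done
        break
    # Energy argument: each update shrinks ||G - A||^2 by at least eta^2 * n^2,
    # so the loop terminates after at most ||G||^2 / (eta^2 * n^2) steps.
    A += eta * np.sign(corrs[best]) * rects[best]

# After termination, every rectangle in the class has correlation < eps with G - A.
max_corr = max(abs(((G - A) * R).sum()) / n**2 for R in rects)
```

The energy-decrement comment is exactly the step that bounds the complexity of the learned approximation, and hence the partition size, in such proofs.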
Regular and almost universal hashing: an efficient implementation
Random hashing can provide guarantees regarding the performance of data
structures such as hash tables---even in an adversarial setting. Many existing
families of hash functions are universal: given two data objects, the
probability that they have the same hash value is low given that we pick hash
functions at random. However, universality fails to ensure that all hash
functions are well behaved. We further require regularity: when picking data
objects at random they should have a low probability of having the same hash
value, for any fixed hash function. We present the efficient implementation of
a family of non-cryptographic hash functions (PM+) offering good running times,
good memory usage as well as distinguishing theoretical guarantees: almost
universality and component-wise regularity. On a variety of platforms, our
implementations are comparable to the state of the art in performance. On
recent Intel processors, PM+ achieves a speed of 4.7 bytes per cycle for 32-bit
outputs and 3.3 bytes per cycle for 64-bit outputs. We review vectorization
through SIMD instructions (e.g., AVX2) and optimizations for superscalar
execution.

Comment: accepted for publication in Software: Practice and Experience in
September 201
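PM+ itself is not specified in this abstract. As a stand-in, the following sketch uses the classical multiply-shift family to illustrate, empirically, the two properties the abstract distinguishes: almost-universality (fixed keys, random seed) versus regularity (fixed seed, random keys). The parameter choices and the experiment are ours:

```python
import random

# Multiply-shift hashing (Dietzfelbinger et al.):
# h_a(x) = ((a * x) mod 2^W) >> (W - M), mapping W-bit keys to M-bit values.
W, M = 64, 16
MASK = (1 << W) - 1

def h(a, x):
    return ((a * x) & MASK) >> (W - M)

rng = random.Random(42)
trials = 20000

# Almost-universality: for FIXED distinct keys x, y and a RANDOM odd seed a,
# collisions occur with probability about 2 / 2^M.
x, y = 123456789, 987654321
u_coll = 0
for _ in range(trials):
    a = rng.getrandbits(W) | 1
    u_coll += h(a, x) == h(a, y)

# Regularity (the extra requirement): for a FIXED seed and RANDOM distinct
# keys, collisions should also be rare. Universality alone does not promise
# anything about an individual member of the family.
a = rng.getrandbits(W) | 1
r_coll = 0
for _ in range(trials):
    u, v = rng.getrandbits(W), rng.getrandbits(W)
    r_coll += (u != v) and (h(a, u) == h(a, v))
```

Both empirical collision rates come out near 2^{-16} here; the point of the abstract is that the second guarantee must be proved separately, per fixed hash function.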
Moment-Matching Polynomials
We give a new framework for proving the existence of low-degree, polynomial
approximators for Boolean functions with respect to broad classes of
non-product distributions. Our proofs use techniques related to the classical
moment problem and deviate significantly from known Fourier-based methods,
which require the underlying distribution to have some product structure.
Our main application is the first polynomial-time algorithm for agnostically
learning any function of a constant number of halfspaces with respect to any
log-concave distribution (for any constant accuracy parameter). This result was
not known even for the case of learning the intersection of two halfspaces
without noise. Additionally, we show that in the "smoothed-analysis" setting,
the above results hold with respect to distributions that have sub-exponential
tails, a property satisfied by many natural and well-studied distributions in
machine learning.
Given that our algorithms can be implemented using Support Vector Machines
(SVMs) with a polynomial kernel, these results give a rigorous theoretical
explanation of why many kernel methods work so well in practice.
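A toy version of this pipeline can be sketched directly: draw data from a log-concave distribution (a Gaussian), label it by an intersection of two halfspaces, and fit a low-degree polynomial by least squares, i.e. the explicit-feature analogue of a polynomial kernel. The setup below (dimensions, degree, sample size) is our illustrative choice, not the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4000
X = rng.standard_normal((n, 2))  # Gaussian: a log-concave distribution
# Labels: intersection of two halfspaces (positive quadrant), encoded as +/-1.
y = np.where((X[:, 0] > 0) & (X[:, 1] > 0), 1.0, -1.0)

# Explicit degree-4 polynomial feature map (what a degree-4 polynomial
# kernel computes implicitly).
def phi(X):
    x1, x2 = X[:, 0], X[:, 1]
    feats = [np.ones(len(X))]
    for total in range(1, 5):
        for j in range(total + 1):
            feats.append(x1 ** (total - j) * x2 ** j)
    return np.stack(feats, axis=1)

# Least-squares fit = finding a good low-degree polynomial approximator;
# its sign is the learned classifier.
w, *_ = np.linalg.lstsq(phi(X), y, rcond=None)
acc = float(np.mean(np.sign(phi(X) @ w) == y))
```

The accuracy comfortably beats the 75% baseline of always predicting the negative class, which is the qualitative content of the existence result: a low-degree polynomial approximator suffices under log-concave marginals.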
Credible, Truthful, and Two-Round (Optimal) Auctions via Cryptographic Commitments
We consider the sale of a single item to multiple buyers by a
revenue-maximizing seller. Recent work of Akbarpour and Li formalizes
\emph{credibility} as an auction desideratum, and proves that the only optimal,
credible, strategyproof auction is the ascending price auction with reserves
(Akbarpour and Li, 2019).
In contrast, when buyers' valuations are MHR, we show that the mild
additional assumption of a cryptographically secure commitment scheme suffices
for a simple \emph{two-round} auction which is optimal, strategyproof, and
credible (even when the number of bidders is only known by the auctioneer).
We extend our analysis to the case when buyer valuations are
$\alpha$-strongly regular for any $\alpha > 0$, up to an arbitrarily small loss $\varepsilon$
in credibility. Interestingly, we also prove that this construction cannot be
extended to regular distributions, nor can the $\varepsilon$ be removed with
multiple bidders.
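The role the commitment scheme plays can be illustrated with a bare-bones hash-based commit-reveal skeleton of a two-round sealed-bid auction. This is our simplification for intuition, not the paper's construction: in round one each bidder publishes a commitment to its bid; in round two bids are opened and checked, so no party (including the auctioneer running a shill bidder) can adapt a bid after seeing the others.

```python
import hashlib
import secrets

def commit(bid: int, nonce: bytes) -> bytes:
    # Hash-based commitment (hiding via the random nonce, binding via SHA-256).
    return hashlib.sha256(nonce + bid.to_bytes(8, "big")).digest()

def verify(com: bytes, bid: int, nonce: bytes) -> bool:
    return commit(bid, nonce) == com

# Round 1: bidders publish commitments; bids themselves stay hidden.
bids = {"alice": 120, "bob": 95}
nonces = {name: secrets.token_bytes(32) for name in bids}
board = {name: commit(b, nonces[name]) for name, b in bids.items()}

# Round 2: bidders open their commitments; everyone verifies against round 1.
assert all(verify(board[n], bids[n], nonces[n]) for n in bids)

# Second-price outcome (reserve price omitted for brevity).
winner = max(bids, key=bids.get)
price = sorted(bids.values())[-2]

# Binding: a commitment cannot be reinterpreted as a different bid after the fact.
assert not verify(board["bob"], 121, nonces["bob"])
```

The binding property is what replaces the "ascending" structure: commitments fix everyone's round-one behavior, which is why two rounds can suffice.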
Dense subsets of pseudorandom sets
No abstract available.
Project Presentation: Algorithmic Structuring and Compression of Proofs (ASCOP)
Computer-generated proofs are typically analytic, i.e. they essentially consist only of formulas which are already present in the theorem that is shown. In contrast, mathematical proofs written by humans almost never are: they are highly structured due to the use of lemmas. The ASCOP project aims at developing algorithms and software which structure and abbreviate analytic proofs by computing useful lemmas. These algorithms will be based on recent groundbreaking results establishing a new connection between proof theory and formal language theory. This connection allows the application of efficient algorithms based on formal grammars to structure and compress proofs.