An Atypical Survey of Typical-Case Heuristic Algorithms
Heuristic approaches often do so well that they seem to pretty much always
give the right answer. How close can heuristic algorithms get to always giving
the right answer, without inducing seismic complexity-theoretic consequences?
This article first discusses how a series of results by Berman, Buhrman,
Hartmanis, Homer, Longpr\'{e}, Ogiwara, Sch\"{o}ning, and Watanabe, from the
early 1970s through the early 1990s, explicitly or implicitly limited how well
heuristic algorithms can do on NP-hard problems. In particular, many desirable
levels of heuristic success cannot be obtained unless severe, highly unlikely
complexity class collapses occur. Second, we survey work initiated by Goldreich
and Wigderson, who showed how under plausible assumptions deterministic
heuristics for randomized computation can achieve a very high frequency of
correctness. Finally, we consider formal ways in which theory can help explain
the effectiveness of heuristics that solve NP-hard problems in practice.
Comment: This article is currently scheduled to appear in the December 2012 issue of SIGACT News.
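As a concrete sketch of the kind of heuristic the survey has in mind (this example is ours, not the article's): the textbook greedy-matching heuristic for Vertex Cover, an NP-hard problem, runs in linear time and is guaranteed to return a cover at most twice the optimal size; the surveyed results concern how often such fast methods can be exactly correct.

```python
def greedy_vertex_cover(edges):
    """2-approximate Vertex Cover: take both endpoints of a maximal matching.

    Vertex Cover is NP-hard, yet this polynomial-time heuristic always
    returns a cover of size at most twice the optimum.
    """
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))  # add both endpoints of a matched edge
    return cover

# A 4-cycle: an optimal cover has size 2; the heuristic may return all 4 vertices.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
cover = greedy_vertex_cover(edges)
assert all(u in cover or v in cover for u, v in edges)  # every edge is covered
```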
Counting Steps: A Finitist Approach to Objective Probability in Physics
We propose a new interpretation of objective probability in statistical physics based on physical computational complexity. This notion applies to a single physical system (be it an experimental set-up in the lab, or a subsystem of the universe), and quantifies (1) the difficulty of realizing a physical state given another, (2) the 'distance' (in terms of physical resources) between a physical state and another, and (3) the size of the set of time-complexity functions that are compatible with the physical resources required to reach a physical state from another. This view (a) exorcises 'ignorance' from statistical physics, and (b) underlies a new interpretation of non-relativistic quantum mechanics.
Average-Case Hardness of Proving Tautologies and Theorems
We consolidate two widely believed conjectures about tautologies -- no
optimal proof system exists, and most require superpolynomial size proofs in
any system -- into a p-isomorphism-invariant condition satisfied by all
paddable coNP-complete languages or none. The condition is: for any
Turing machine (TM) M accepting the language, P-uniform input
families requiring superpolynomial time by M exist (equivalent to the first
conjecture) and appear with positive upper density in an enumeration of input
families (implies the second). In that case, no such language is easy on
average (in AvgP) for a distribution applying non-negligible weight
to the hard families.
The hardness of proving tautologies and theorems is likely related. Motivated
by the fact that arithmetic sentences encoding "string x is Kolmogorov
random" are true but unprovable with positive density in a finitely axiomatized
theory T (Calude and J{\"u}rgensen), we conjecture that any
propositional proof system requires superpolynomial size proofs for a dense set
of P-uniform families of tautologies encoding "there is no
T-proof of size at most t showing that string x is Kolmogorov
random". This implies the above condition.
The conjecture suggests that there is no optimal proof system because
undecidable theories help prove tautologies and do so more efficiently as
axioms are added, and that constructing hard tautologies seems difficult
because it is impossible to construct Kolmogorov random strings. Similar
conjectures that computational blind spots are manifestations of
noncomputability would resolve other open problems.
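The role played by Kolmogorov random strings rests on the standard counting argument, which we recall as a sketch (it is not part of the abstract): most strings are incompressible, because there are too few short programs to describe them all. Writing K(x) for the Kolmogorov complexity of x:

```latex
% There are at most 2^{n-c} - 1 programs of length < n - c, so the
% fraction of strings of length n compressible by c bits is below 2^{-c}:
\#\{x \in \{0,1\}^n : K(x) < n - c\} \le 2^{n-c} - 1
\quad\Longrightarrow\quad
\Pr_{x \in \{0,1\}^n}\bigl[K(x) \ge n - c\bigr] > 1 - 2^{-c}.
```

So Kolmogorov random strings appear with positive density, even though no algorithm can exhibit a single one.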
Topology and Order
We will discuss topologies as orders, orders on sets of topologies, and topologies on ordered sets. More specifically, we will discuss Alexandroff topologies as quasiorders, the lattice of topologies on a finite set, and partially ordered topological spaces. Some topological properties of Alexandroff spaces are characterized in terms of their order. Complementation in the lattice of topologies on a set and in the lattice of convex topologies on a partially ordered set will be discussed.
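To make the Alexandroff correspondence concrete (the code and names below are our illustration, not the paper's): the open sets of the Alexandroff topology induced by a quasiorder are exactly its up-sets, a family closed under arbitrary unions and, unlike a general topology, arbitrary intersections.

```python
from itertools import combinations

def up_sets(points, le):
    """All up-sets of a quasiorder `le` on `points`: these are exactly
    the open sets of the associated Alexandroff topology."""
    def is_up_set(s):
        return all(y in s for x in s for y in points if le(x, y))
    opens = []
    for r in range(len(points) + 1):
        for combo in combinations(points, r):
            if is_up_set(set(combo)):
                opens.append(frozenset(combo))
    return opens

# The chain 0 <= 1 <= 2: its up-sets are {}, {2}, {1,2}, {0,1,2}.
points = [0, 1, 2]
opens = set(up_sets(points, lambda x, y: x <= y))

# Alexandroff property: closed under intersections as well as unions.
assert all(a & b in opens for a in opens for b in opens)
assert all(a | b in opens for a in opens for b in opens)
```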
Structure vs. Randomness for Bilinear Maps
We prove that the slice rank of a 3-tensor (a combinatorial notion introduced
by Tao in the context of the cap-set problem), the analytic rank (a
Fourier-theoretic notion introduced by Gowers and Wolf), and the geometric rank
(a recently introduced algebro-geometric notion) are all equivalent up to an
absolute constant. As a corollary, we obtain strong trade-offs on the
arithmetic complexity of a biased bilinear map, and on the separation between
computing a bilinear map exactly and on average. Our result settles open
questions of Haramaty and Shpilka [STOC 2010], and of Lovett [Discrete Anal.,
2019] for 3-tensors.
Comment: Submitted on November 6, 2020 to the 53rd Annual ACM Symposium on Theory of Computing (STOC). Accepted on February 6, 2021.
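A minimal sketch of the structure-vs-randomness phenomenon one level down, for bilinear forms rather than 3-tensors (this example and code are ours, not the paper's): over F_2 the bias of the form x^T M y equals exactly 2^(-rank(M)), so any nontrivial bias forces correspondingly low rank. The paper's result extends such a bias-rank equivalence, up to an absolute constant, to the three tensor ranks above.

```python
import itertools

def rank_f2(M):
    """Rank of a 0/1 matrix over the field F_2, by Gaussian elimination."""
    M = [row[:] for row in M]
    rank = 0
    for c in range(len(M[0])):
        pivot = next((r for r in range(rank, len(M)) if M[r][c]), None)
        if pivot is None:
            continue
        M[rank], M[pivot] = M[pivot], M[rank]
        for r in range(len(M)):
            if r != rank and M[r][c]:
                M[r] = [a ^ b for a, b in zip(M[r], M[rank])]
        rank += 1
    return rank

def bias(M):
    """Bias E[(-1)^(x^T M y)] of the bilinear form x^T M y over F_2,
    computed by brute force over all inputs."""
    n, m = len(M), len(M[0])
    total = 0
    for x in itertools.product((0, 1), repeat=n):
        for y in itertools.product((0, 1), repeat=m):
            form = sum(x[i] * M[i][j] * y[j]
                       for i in range(n) for j in range(m)) % 2
            total += (-1) ** form
    return total / 2 ** (n + m)

M = [[1, 0, 0], [0, 1, 0], [0, 0, 0]]  # rank 2 over F_2
assert bias(M) == 2 ** -rank_f2(M)  # bias = 2^{-rank} = 0.25
```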