An Atypical Survey of Typical-Case Heuristic Algorithms
Heuristic approaches often perform so well that they seem to give the right
answer almost every time. How close can heuristic algorithms come to always
giving the right answer, without inducing seismic complexity-theoretic
consequences?
This article first discusses how a series of results by Berman, Buhrman,
Hartmanis, Homer, Longpr\'{e}, Ogiwara, Sch\"{o}ning, and Watanabe, from the
early 1970s through the early 1990s, explicitly or implicitly limited how well
heuristic algorithms can do on NP-hard problems. In particular, many desirable
levels of heuristic success cannot be obtained unless severe, highly unlikely
complexity class collapses occur. Second, we survey work initiated by Goldreich
and Wigderson, who showed how under plausible assumptions deterministic
heuristics for randomized computation can achieve a very high frequency of
correctness. Finally, we consider formal ways in which theory can help explain
the effectiveness of heuristics that solve NP-hard problems in practice.
Comment: This article is currently scheduled to appear in the December 2012 issue of SIGACT News.
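To make the setting concrete, here is a minimal sketch (our illustration, not an example drawn from the survey) of a typical polynomial-time heuristic for an NP-hard problem: the greedy matching heuristic for Vertex Cover. It always returns a valid cover quickly, and on many instances an optimal one, but it carries no general optimality guarantee; the limitative results surveyed here bound how often such heuristics can be exactly right.

    # A minimal sketch (illustration only, not from the survey): a
    # polynomial-time heuristic for the NP-hard Vertex Cover problem.
    def greedy_vertex_cover(edges):
        """Repeatedly take both endpoints of an uncovered edge (the classic
        maximal-matching heuristic, a 2-approximation)."""
        cover = set()
        for u, v in edges:
            if u not in cover and v not in cover:
                cover.update((u, v))
        return cover

    # On a 4-cycle the optimum has size 2; with this edge order the heuristic
    # returns all 4 vertices, within its factor-2 guarantee but not optimal.
    print(greedy_vertex_cover([(1, 2), (2, 3), (3, 4), (4, 1)]))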
Limits of Preprocessing
We present a first theoretical analysis of the power of polynomial-time
preprocessing for important combinatorial problems from various areas in AI. We
consider problems from Constraint Satisfaction, Global Constraints,
Satisfiability, Nonmonotonic and Bayesian Reasoning. We show that, subject to a
complexity theoretic assumption, none of the considered problems can be reduced
by polynomial-time preprocessing to a problem kernel whose size is polynomial
in a structural problem parameter of the input, such as induced width or
backdoor size. Our results provide a firm theoretical boundary for the
performance of polynomial-time preprocessing algorithms for the considered
problems.
Comment: This is a slightly longer version of a paper that appeared in the proceedings of AAAI 2011.
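For readers unfamiliar with problem kernels, the sketch below (our illustration, not from the paper, and parameterized by the solution size k rather than by the structural parameters studied there) shows Buss's classic kernelization for k-Vertex Cover: polynomial-time reduction rules that shrink any instance to an equivalent one with at most k^2 edges.

    # A minimal sketch (illustration only): Buss's kernelization for
    # k-Vertex Cover, parameterized by solution size k.
    def buss_kernel(edges, k):
        """Return (kernel_edges, remaining_k, forced_vertices), or None if
        no vertex cover of size at most k can exist."""
        edges = {frozenset(e) for e in edges}
        forced = set()
        reduced = True
        while reduced and k >= 0:
            reduced = False
            degree = {}
            for e in edges:
                for v in e:
                    degree[v] = degree.get(v, 0) + 1
            for v, d in degree.items():
                if d > k:
                    # Any cover of size <= k must contain v: otherwise it
                    # would need all of v's more-than-k neighbours.
                    forced.add(v)
                    edges = {e for e in edges if v not in e}
                    k -= 1
                    reduced = True
                    break
        # Once all degrees are <= k, a size-k cover covers at most k*k edges.
        if k < 0 or len(edges) > k * k:
            return None
        return edges, k, forced

    # Example: a star with 5 leaves and k=1; the centre is forced and the
    # kernel is empty.
    print(buss_kernel([(0, i) for i in range(1, 6)], 1))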
Oracles Are Subtle But Not Malicious
Theoretical computer scientists have been debating the role of oracles since
the 1970s. This paper illustrates both that oracles can give us nontrivial
insights about the barrier problems in circuit complexity, and that they need
not prevent us from trying to solve those problems.
First, we give an oracle relative to which PP has linear-sized circuits, by
proving a new lower bound for perceptrons and low-degree threshold
polynomials. This oracle settles a longstanding open question, and generalizes
earlier results due to Beigel and to Buhrman, Fortnow, and Thierauf. More
importantly, it implies the first nonrelativizing separation of "traditional"
complexity classes, as opposed to interactive proof classes such as MIP and
MA-EXP. For Vinodchandran showed, by a nonrelativizing argument, that PP does
not have circuits of size n^k for any fixed k. We present an alternative proof
of this fact, which shows that PP does not even have quantum circuits of size
n^k with quantum advice. To our knowledge, this is the first nontrivial lower
bound on quantum circuit size.
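For orientation, these are the standard definitions of the objects in that lower bound (our gloss, following the usage in this literature, not the paper's own wording):

    % Standard definitions (our gloss; cf. Beigel's work on perceptrons).
    A Boolean function $f\colon\{0,1\}^n\to\{0,1\}$ is a \emph{degree-$d$
    threshold polynomial} (polynomial threshold function) if there is a real
    polynomial $p$ of degree at most $d$ such that
    \[
      f(x) = 1 \iff p(x) > 0 .
    \]
    A \emph{perceptron} here is the depth-2 analogue: a real threshold gate
    applied to AND gates, i.e.
    \[
      f(x) = 1 \iff \sum_i w_i \, T_i(x) > \theta ,
    \]
    where each $T_i$ is a conjunction of literals and $w_i,\theta\in\mathbb{R}$.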
Second, we study a beautiful algorithm of Bshouty et al. for learning Boolean
circuits in ZPP^NP. We show that the NP queries in this algorithm cannot be
parallelized by any relativizing technique, by giving an oracle relative to
which ZPP^||NP and even BPP^||NP have linear-size circuits. On the other hand,
we also show that the NP queries could be parallelized if P=NP. Thus, classes
such as ZPP^||NP inhabit a "twilight zone," where we need to distinguish
between relativizing and black-box techniques. Our results on this subject have
implications for computational learning theory as well as for the circuit
minimization problem.
Comment: 20 pages, 1 figure
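To illustrate what adaptive versus parallel NP queries mean here (a generic textbook illustration, not Bshouty et al.'s algorithm), the sketch below performs the standard self-reducibility search: it finds a satisfying assignment with one sequential oracle query per variable, each depending on all previous answers; parallel ("||") access forbids exactly this dependence.

    # A generic illustration (not Bshouty et al.'s algorithm): the standard
    # self-reducibility search, using *adaptive* queries to an NP oracle
    # (a brute-force SAT oracle stands in here). Each query depends on all
    # previous answers, the dependence that parallel ("||") queries forbid.
    from itertools import product

    def sat_oracle(clauses, n, fixed):
        """Is some assignment to x_1..x_n extending `fixed` satisfying?
        Literal +i / -i means x_i / not x_i; variables are 1-indexed."""
        free = [i for i in range(1, n + 1) if i not in fixed]
        for bits in product([False, True], repeat=len(free)):
            a = {**fixed, **dict(zip(free, bits))}
            if all(any(a[abs(l)] == (l > 0) for l in c) for c in clauses):
                return True
        return False

    def find_assignment(clauses, n):
        """Find a satisfying assignment via adaptive oracle queries
        (one per variable, after an initial satisfiability check)."""
        if not sat_oracle(clauses, n, {}):
            return None
        fixed = {}
        for i in range(1, n + 1):
            # This query depends on every earlier answer through `fixed`.
            fixed[i] = not sat_oracle(clauses, n, {**fixed, i: False})
        return fixed

    # (x1 or x2) and (not x1 or x3): prints {1: False, 2: True, 3: False}.
    print(find_assignment([[1, 2], [-1, 3]], 3))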