On the proof complexity of Paris-Harrington and off-diagonal Ramsey tautologies
We study the proof complexity of Paris-Harrington’s Large Ramsey Theorem for bi-colorings of graphs and
of off-diagonal Ramsey’s Theorem. For Paris-Harrington, we prove a non-trivial conditional lower bound
in Resolution and a non-trivial upper bound in bounded-depth Frege. The lower bound is conditional on a
(very reasonable) hardness assumption for a weak (quasi-polynomial) Pigeonhole principle in RES(2). We
show that under such an assumption, there is no refutation of the Paris-Harrington formulas of size quasipolynomial
in the number of propositional variables. The proof technique for the lower bound extends the
idea of using a combinatorial principle to blow up a counterexample for another combinatorial principle
beyond the threshold of inconsistency. A strong link with the proof complexity of an unbalanced off-diagonal
Ramsey principle is established. This is obtained by adapting some constructions due to Erdős and Mills.
We prove a non-trivial Resolution lower bound for a family of such off-diagonal Ramsey principles.
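The pigeonhole formulas behind the hardness assumption have a standard propositional encoding. As a minimal sketch (the plain PHP^{n+1}_n, not the weak quasi-polynomial variant the abstract refers to), the following generates its clauses:

```python
def php_clauses(n):
    """CNF for the pigeonhole principle PHP^{n+1}_n: n+1 pigeons, n holes.
    Variable v(i, j) means pigeon i sits in hole j; the formula is
    unsatisfiable, so any sound proof system must refute it."""
    def v(i, j):
        return i * n + j + 1  # DIMACS-style positive variable ids

    clauses = []
    for i in range(n + 1):                      # every pigeon gets some hole
        clauses.append([v(i, j) for j in range(n)])
    for j in range(n):                          # no hole holds two pigeons
        for i1 in range(n + 1):
            for i2 in range(i1 + 1, n + 1):
                clauses.append([-v(i1, j), -v(i2, j)])
    return clauses

print(len(php_clauses(3)))  # 4 pigeon clauses + 3*C(4,2) = 22 clauses
```

The formula has (n+1)·n propositional variables, which is the size parameter the lower bounds above are measured against.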
Alternation-Trading Proofs, Linear Programming, and Lower Bounds
A fertile area of recent research has demonstrated concrete polynomial time
lower bounds for solving natural hard problems on restricted computational
models. Among these problems are Satisfiability, Vertex Cover, Hamilton Path,
Mod6-SAT, Majority-of-Majority-SAT, and Tautologies, to name a few. The proofs
of these lower bounds follow a certain proof-by-contradiction strategy that we
call alternation-trading. An important open problem is to determine how
powerful such proofs can possibly be.
We propose a methodology for studying these proofs that makes them amenable
to both formal analysis and automated theorem proving. We prove that the search
for better lower bounds can often be turned into a problem of solving a large
series of linear programming instances. Implementing a small-scale theorem
prover based on this result, we extract new human-readable time lower bounds
for several problems. This framework can also be used to prove concrete
limitations on the current techniques.
Comment: To appear in STACS 2010, 12 pages.
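The core subroutine in such a methodology is deciding whether a system of linear inequalities is feasible. As a toy illustration (this is not the paper's actual encoding of alternation-trading proofs), a self-contained feasibility check via Fourier–Motzkin elimination:

```python
from itertools import product

def feasible(ineqs, nvars):
    """Decide feasibility of a system {a . x <= b} over the rationals by
    Fourier-Motzkin elimination. Each inequality is ([a_1..a_n], b) with
    integer coefficients."""
    for v in range(nvars):
        pos = [(a, b) for a, b in ineqs if a[v] > 0]
        neg = [(a, b) for a, b in ineqs if a[v] < 0]
        rest = [(a, b) for a, b in ineqs if a[v] == 0]
        new = list(rest)
        # Combine each upper bound on x_v with each lower bound so that
        # the coefficient of x_v cancels; multipliers are both positive.
        for (ap, bp), (an, bn) in product(pos, neg):
            coef = [ap[i] * -an[v] + an[i] * ap[v] for i in range(nvars)]
            new.append((coef, bp * -an[v] + bn * ap[v]))
        ineqs = new
    # All variables eliminated: constraints read 0 <= b.
    return all(b >= 0 for a, b in ineqs)

print(feasible([([1], 2), ([-1], -1)], 1))   # 1 <= x <= 2: True
print(feasible([([1], 1), ([-1], -2)], 1))   # x <= 1 and x >= 2: False
```

In practice one would hand each instance to an LP solver; the point here is only that "does a better lower-bound proof of this shape exist?" becomes a yes/no question about linear constraints.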
Compilability of Abduction
Abduction is one of the most important forms of reasoning; it has been
successfully applied to several practical problems such as diagnosis. In this
paper we investigate whether the computational complexity of abduction can be
reduced by an appropriate use of preprocessing. This is motivated by the fact
that part of the data of the problem (namely, the set of all possible
assumptions and the theory relating assumptions and manifestations) are often
known before the rest of the problem. In this paper, we show some complexity
results about abduction when compilation is allowed.
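The abductive task itself is easy to state concretely. A brute-force sketch (the diagnosis-flavored rule set is purely illustrative, not from the paper): given Horn rules relating assumptions to manifestations, find a smallest set of assumptions whose consequences cover the observed manifestations.

```python
from itertools import combinations

def closure(rules, facts):
    """Forward chaining over Horn rules, each a pair (body_set, head)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if head not in facts and body <= facts:
                facts.add(head)
                changed = True
    return facts

def explain(rules, hypotheses, manifestations):
    """Smallest subset of hypotheses entailing all manifestations, or None."""
    hyps = sorted(hypotheses)
    for k in range(len(hyps) + 1):
        for sub in combinations(hyps, k):
            if manifestations <= closure(rules, sub):
                return set(sub)
    return None

# Hypothetical diagnosis theory: flu causes fever and cough; cold causes cough.
rules = [({"flu"}, "fever"), ({"flu"}, "cough"), ({"cold"}, "cough")]
print(explain(rules, {"flu", "cold"}, {"fever", "cough"}))  # {'flu'}
```

The subset enumeration is exponential in the number of assumptions, which is exactly why the abstract asks whether preprocessing (compiling) the fixed part, the rules and the hypothesis set, can lower the online cost.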
The ghosts of forgotten things: A study on size after forgetting
Forgetting is removing variables from a logical formula while preserving the
constraints on the other variables. In spite of being a form of reduction, it
does not always decrease the size of the formula and may sometimes increase it.
This article discusses the implications of such an increase and analyzes the
computational properties of the phenomenon. Given a propositional Horn formula,
a set of variables and a maximum allowed size, deciding whether forgetting the
variables from the formula can be expressed in that size is -hard in
. The same problem for unrestricted propositional formulae is
-hard in . The hardness results employ superredundancy: a
superirredundant clause is in all formulae of minimal size equivalent to a
given one. This concept may be useful outside forgetting.
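For clausal formulas, forgetting a variable can be computed by resolving it away, which also makes the possible size increase visible: n clauses containing x and m containing ¬x can yield up to n·m resolvents. A minimal sketch over CNF, with literals encoded as nonzero integers:

```python
from itertools import product

def forget(cnf, var):
    """Forget `var` from a CNF (a set of frozenset clauses) by resolving
    every clause containing var against every clause containing -var,
    keeping the clauses that mention neither. Literals are nonzero ints;
    -v is the negation of variable v."""
    pos = [c for c in cnf if var in c]
    neg = [c for c in cnf if -var in c]
    rest = [c for c in cnf if var not in c and -var not in c]
    resolvents = []
    for p, q in product(pos, neg):
        r = (p - {var}) | (q - {-var})
        if not any(-lit in r for lit in r):      # drop tautological resolvents
            resolvents.append(frozenset(r))
    return set(rest) | set(resolvents)

# Forgetting x (variable 1) from {(x v a), (x v b), (-x v c)}:
cnf = {frozenset({1, 2}), frozenset({1, 3}), frozenset({-1, 4})}
print(sorted(sorted(c) for c in forget(cnf, 1)))  # [[2, 4], [3, 4]]
```

The result is equivalent to the original formula on the remaining variables; whether it can also be expressed within a given size bound is precisely the decision problem whose hardness the abstract analyzes.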