Polylogarithmic Cuts in Models of V^0
We study initial cuts of models of weak two-sorted Bounded Arithmetics with
respect to the strength of their theories and show that these theories are
stronger than the original one. More explicitly, we will see that
polylogarithmic cuts of models of V^0 are models of VNC^1
by formalizing a proof of Nepomnjascij's Theorem in such cuts. This is a
strengthening of a result by Paris and Wilkie. We can then exploit our result
in Proof Complexity to observe that Frege proof systems can be subexponentially
simulated by bounded-depth Frege proof systems. This result has
recently been obtained by Filmus, Pitassi and Santhanam in a direct proof. As
an interesting observation, we also obtain an average-case separation of
Resolution from AC^0-Frege by applying a recent result with Tzameret.
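For orientation, the notion of cut at issue can be sketched in standard notation (this is the usual definition, an assumption on our part rather than a quotation from the paper): given a model $(M, \mathcal{X}) \models V^0$, its polylogarithmic cut is

$$ M^{\log} \;=\; \{\, a \in M \;:\; a \le |t|^k \ \text{for some}\ t \in M \ \text{and standard}\ k \,\}, $$

where $|t| = \lceil \log_2(t+1) \rceil$ denotes the length of $t$. The claim is that this cut, with the induced second-order part, satisfies the stronger theory.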
The Cook-Reckhow definition
The Cook-Reckhow 1979 paper defined the area of research we now call Proof
Complexity. There were earlier papers which contributed to the subject as we
understand it today, the most significant being Tseitin's 1968 paper, but none
of them introduced general notions that would allow one to make an explicit and
universal link between lengths-of-proofs problems and computational complexity
theory. In this note we shall highlight three particular definitions from the
paper: of proof systems, p-simulations and the pigeonhole principle formula,
and discuss their role in defining the field. We will also mention some related
developments and open problems.
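The pigeonhole principle formula highlighted above has a concrete propositional form that is easy to generate. The following sketch (illustrative Python with hypothetical helper names, not from the paper) builds the standard CNF encoding of the negated principle and confirms by brute force that it is unsatisfiable for a tiny number of holes:

```python
from itertools import combinations, product

def php_clauses(n):
    """CNF encoding of the (negated) pigeonhole principle PHP^{n+1}_n.

    Variable (i, j) means 'pigeon i sits in hole j'. The clauses say every
    pigeon gets a hole and no hole holds two pigeons; the CNF is unsatisfiable.
    """
    pigeons, holes = range(n + 1), range(n)
    clauses = [[((i, j), True) for j in holes] for i in pigeons]   # totality
    for j in holes:
        for i1, i2 in combinations(pigeons, 2):                    # injectivity
            clauses.append([((i1, j), False), ((i2, j), False)])
    return clauses

def satisfiable(n):
    """Brute-force SAT check over all assignments (only viable for tiny n)."""
    variables = [(i, j) for i in range(n + 1) for j in range(n)]
    for bits in product([False, True], repeat=len(variables)):
        assign = dict(zip(variables, bits))
        if all(any(assign[v] == sign for v, sign in cl)
               for cl in php_clauses(n)):
            return True
    return False

print(satisfiable(2))  # prints False: 3 pigeons cannot injectively fit in 2 holes
```

Proof complexity asks how long refutations of exactly such CNFs must be in a given proof system; the Ajtai lower bound discussed in the next abstract concerns this family.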
On the pigeonhole and the modular counting principles over the bounded arithmetic (Theory and Applications of Proof and Computation)
The theorem of Ajtai ([1], improved by [11] and [12]), which shows a superpolynomial lower bound for AC^0-Frege proofs of the pigeonhole principle, was a significant breakthrough in proof complexity and has inspired many other important works on the strength of modular counting principles and the pigeonhole principle. In terms of bounded arithmetic, the theorem implies that the pigeonhole principle is independent of the bounded arithmetic V^0. Along this stream of research, [7] gave the following conjectures and showed some sufficient conditions to prove them:
・V^0 ⊬ UCP[l, d, k] → injPHP[n+1, n].
・For any prime number p other than 2, V^0 ⊬ oddtown_k → Count[p, n].
・For any integer p ≥ 2, V^0 ⊬ FIE_k → Count[p, n].
Here, injPHP[n+1, n] is a formalization of the pigeonhole principle for injections, UCP[l, d, k] is the uniform counting principle defined in [7], Count[p, n] is the modular counting principle mod p, oddtown_k is a formalization of the oddtown theorem, and FIE_k is a formalization of Fisher's inequality. In this article, we give a summary of the work of [7], supplement both its technical parts and its motivations, and propose future perspectives.
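The oddtown theorem formalized as oddtown_k above states that any family of odd-size subsets of an n-element set with pairwise even-size intersections has at most n members. A brute-force sanity check of this bound for tiny n (an illustrative sketch, not code from the paper):

```python
from itertools import combinations

def oddtown_bound_holds(n):
    """Check the oddtown theorem by exhaustive search over a ground set of
    size n: every family of odd-size subsets whose pairwise intersections
    all have even size contains at most n sets."""
    universe = range(n)
    odd_sets = [frozenset(c)
                for r in range(1, n + 1, 2)
                for c in combinations(universe, r)]
    best = 0
    # try subfamilies from largest to smallest (feasible only for tiny n)
    for k in range(len(odd_sets), 0, -1):
        for family in combinations(odd_sets, k):
            if all(len(a & b) % 2 == 0 for a, b in combinations(family, 2)):
                best = k
                break
        if best:
            break
    return best <= n

print(oddtown_bound_holds(3))  # prints True: at most 3 sets over a 3-element universe
```

The standard proof of this bound is linear-algebraic (over GF(2)), which is why its provability over a counting-free theory like V^0 is a delicate question.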
Making proofs without Modus Ponens: An introduction to the combinatorics and complexity of cut elimination
This paper is intended to provide an introduction to cut elimination which is
accessible to a broad mathematical audience. Gentzen's cut elimination theorem
is not as well known as it deserves to be, and it is tied to a lot of
interesting mathematical structure. In particular, we try to indicate some
dynamical and combinatorial aspects of cut elimination, as well as its
connections to complexity theory. We discuss two concrete examples where one
can see the structure of short proofs with cuts, one concerning feasible
numbers and the other concerning "bounded mean oscillation" from real analysis.
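For orientation, Gentzen's cut rule in standard sequent-calculus notation (standard material, not specific to this paper) is

$$\frac{\Gamma \Rightarrow \Delta, A \qquad A, \Sigma \Rightarrow \Pi}{\Gamma, \Sigma \Rightarrow \Delta, \Pi}\ (\text{cut})$$

Modus ponens is essentially a cut on the formula $A$: from $\Rightarrow A$ and $A \Rightarrow B$ one derives $\Rightarrow B$. Cut elimination shows that such detours through an intermediate formula $A$ can always be removed, at a potentially enormous (non-elementary) cost in proof size.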
From proof complexity to circuit complexity via interactive protocols
Folklore in complexity theory suspects that circuit lower bounds against NC^1 or P/poly, currently
out of reach, are a necessary step towards proving strong proof complexity lower bounds for systems like Frege or Extended Frege. Establishing such a connection formally, however, is already daunting, as it would imply the breakthrough separation NEXP ⊄ P/poly, as recently observed by Pich and Santhanam [Pich and Santhanam, 2023].
We show such a connection conditionally for the Implicit Extended Frege proof system (iEF) introduced by Krajíček [Krajíček, 2004], capable of formalizing most of contemporary complexity theory. In particular, we show that if iEF proves efficiently the standard derandomization assumption that a concrete Boolean function is hard on average for subexponential-size circuits, then any superpolynomial lower bound on the length of iEF proofs implies #P ⊄ FP/poly (which would in turn imply, for example, PSPACE ⊄ P/poly). Our proof exploits the formalization inside iEF of the soundness of the sum-check protocol of Lund, Fortnow, Karloff, and Nisan [Lund et al., 1992]. This has consequences for the self-provability of circuit upper bounds in iEF. Interestingly, further improving our result seems to require progress in constructing interactive proof systems with more efficient provers.
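The sum-check protocol whose soundness is formalized in iEF reduces a claimed sum of a low-degree polynomial over the Boolean cube to a single random evaluation, one variable per round. A toy honest-prover implementation (an illustrative sketch; the modulus, the example polynomial, and all helper names are our assumptions, not from the paper):

```python
import random

P = 2**31 - 1  # a prime modulus (toy field size, chosen for illustration)

def g(x, y, z):
    """Toy 3-variate polynomial over F_P, degree <= 2 in each variable."""
    return (x * y + 2 * y * z + x * z * z + 5) % P

def boolean_sum(poly, fixed, nvars):
    """Sum poly over all {0,1} settings of the variables after `fixed`."""
    free = nvars - len(fixed)
    total = 0
    for bits in range(2 ** free):
        args = list(fixed) + [(bits >> i) & 1 for i in range(free)]
        total = (total + poly(*args)) % P
    return total

def interpolate(evals, x):
    """Lagrange-evaluate the univariate polynomial with values `evals`
    at points 0, 1, ..., len(evals)-1, at the field element x."""
    pts = list(range(len(evals)))
    total = 0
    for i, yi in zip(pts, evals):
        num, den = 1, 1
        for j in pts:
            if j != i:
                num = num * (x - j) % P
                den = den * (i - j) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

def sumcheck(poly, nvars, degree):
    """Run sum-check with an honest prover; return (claimed_sum, accepted)."""
    claim = boolean_sum(poly, [], nvars)
    rs, current = [], claim
    for _ in range(nvars):
        # prover's round message: the round polynomial as degree+1 evaluations
        msg = [boolean_sum(poly, rs + [t], nvars) for t in range(degree + 1)]
        if (msg[0] + msg[1]) % P != current:   # consistency with prior claim
            return claim, False
        r = random.randrange(P)                # verifier's random challenge
        rs.append(r)
        current = interpolate(msg, r)
    # final check: one evaluation of the polynomial at a random point
    return claim, poly(*rs) % P == current

print(sumcheck(g, 3, 2))
```

The verifier's work is a few univariate evaluations plus one oracle query, while the honest prover sums over exponentially many points; the abstract's closing remark concerns exactly this prover cost.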