16 research outputs found

    Bounds on BDD-Based Bucket Elimination

    We study BDD-based bucket elimination, an approach to satisfiability testing via variable elimination that has seen several practical implementations in the past. We prove that it solves the standard pigeonhole principle formulas efficiently when different orders are allowed for variable elimination and for the BDD representations, a variant of bucket elimination that was recently introduced. We further show that this upper bound is somewhat brittle: for formulas obtained from the pigeonhole principle by restriction, i.e., by fixing some of the variables, the same approach with the same variable orders has exponential runtime. We also show that the more common implementation of bucket elimination, which uses the same order for variable elimination and for the BDDs, has exponential runtime on the pigeonhole principle with either of the two orders from our upper bound, suggesting that the combination of both orders is the key to efficiency in this setting.
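
    The pigeonhole principle formulas and the elimination procedure discussed above are easy to make concrete. Below is a minimal, purely illustrative Python sketch (not the paper's code; all names are ours): it generates the CNF of PHP with n+1 pigeons and n holes and runs clause-based bucket elimination (Davis-Putnam resolution by variable) as a stand-in for the BDD-based variant, in which each bucket would instead be conjoined into a BDD and the variable quantified away.

```python
from itertools import combinations

def php_clauses(n):
    """CNF of the pigeonhole principle PHP^{n+1}_n: n+1 pigeons, n holes.
    Variable (p, h) is encoded as the positive integer p * n + h + 1 and
    means "pigeon p sits in hole h"."""
    var = lambda p, h: p * n + h + 1
    clauses = [[var(p, h) for h in range(n)] for p in range(n + 1)]  # every pigeon sits somewhere
    for h in range(n):                                               # no hole holds two pigeons
        for p1, p2 in combinations(range(n + 1), 2):
            clauses.append([-var(p1, h), -var(p2, h)])
    return clauses

def bucket_eliminate(clauses, order):
    """Clause-based bucket elimination (Davis-Putnam): eliminate the variables
    in `order` one by one, replacing each variable's bucket with all
    non-tautological resolvents on that variable.  Returns False iff the empty
    clause is derived, i.e. the input CNF is unsatisfiable."""
    clauses = {frozenset(c) for c in clauses}
    for v in order:
        pos = [c for c in clauses if v in c]
        neg = [c for c in clauses if -v in c]
        rest = {c for c in clauses if v not in c and -v not in c}
        for cp in pos:
            for cn in neg:
                resolvent = (cp - {v}) | (cn - {-v})
                if any(-lit in resolvent for lit in resolvent):
                    continue                      # tautology, drop it
                if not resolvent:
                    return False                  # empty clause derived
                rest.add(resolvent)
        clauses = rest
    return True

if __name__ == "__main__":
    n = 3
    cnf = php_clauses(n)                          # 4 pigeons, 3 holes: unsatisfiable
    order = list(range(1, (n + 1) * n + 1))       # one naive elimination order
    print("satisfiable?", bucket_eliminate(cnf, order))   # -> satisfiable? False
```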

    A Generalized Method for Proving Polynomial Calculus Degree Lower Bounds

    We study the problem of obtaining lower bounds for polynomial calculus (PC) and polynomial calculus resolution (PCR) on proof degree, and hence, by [Impagliazzo et al. '99], also on proof size. [Alekhnovich and Razborov '03] established that if the clause-variable incidence graph of a CNF formula F is a good enough expander, then proving that F is unsatisfiable requires high PC/PCR degree. We further develop the techniques in [AR03] to show that if one can "cluster" clauses and variables in a way that "respects the structure" of the formula in a certain sense, then it is sufficient that the incidence graph of this clustered version is an expander. As a corollary, we prove that the functional pigeonhole principle (FPHP) formulas require high PC/PCR degree when restricted to constant-degree expander graphs. This answers an open question in [Razborov '02], and also implies that the standard CNF encoding of the FPHP formulas requires exponential proof size in polynomial calculus resolution. Thus, while Onto-FPHP formulas are easy for polynomial calculus, as shown in [Riis '93], both FPHP and Onto-PHP formulas are hard even when restricted to bounded-degree expanders.
    Comment: Full-length version of paper to appear in Proceedings of the 30th Annual Computational Complexity Conference (CCC '15), June 2015.
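
    As a reminder of the objects involved, polynomial calculus works with the standard translation of CNF clauses into polynomials over {0,1}. The sketch below is illustrative only (the helper name and the use of sympy are our choices, not the paper's): it converts a DIMACS-style clause into the polynomial that vanishes exactly on the clause's satisfying assignments; PC/PCR degree is measured over refutations built from such polynomials together with the Boolean axioms.

```python
from sympy import symbols, expand

def clause_to_poly(clause, variables):
    """Standard polynomial calculus encoding of a CNF clause given in DIMACS
    style (signed integers): a positive literal i contributes the factor
    (1 - x_i), a negative literal -i contributes x_i.  Over {0,1} the product
    is 0 exactly on the assignments that satisfy the clause (1 = true)."""
    poly = 1
    for lit in clause:
        x = variables[abs(lit)]
        poly *= (1 - x) if lit > 0 else x
    return expand(poly)

if __name__ == "__main__":
    x1, x2, x3 = symbols("x1 x2 x3")
    variables = {1: x1, 2: x2, 3: x3}
    # clause (x1 OR NOT x2 OR x3) becomes (1 - x1) * x2 * (1 - x3)
    print(clause_to_poly([1, -2, 3], variables))
    # PC/PCR refutations also use the Boolean axioms x_i^2 - x_i = 0
    print([x**2 - x for x in (x1, x2, x3)])
```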

    Bounded-Depth Frege Complexity of Tseitin Formulas for All Graphs

    We prove that there is a constant K such that Tseitin formulas for an undirected graph G require proofs of size 2^{tw(G)^{Ω(1/d)}} in depth-d Frege systems for d < K log n / log log n, where tw(G) is the treewidth of G and n is the number of vertices. This extends Håstad's recent lower bound for the grid graph to arbitrary graphs. Furthermore, we prove that our bound is tight up to a multiplicative constant in the top exponent: if a Tseitin formula for a graph G has size s, then for all large enough d it has a depth-d Frege proof of size 2^{tw(G)^{O(1/d)}} poly(s). Through this result we settle the question posed by M. Alekhnovich and A. Razborov of showing that the class of Tseitin formulas is quasi-automatizable for resolution.
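
    Tseitin formulas themselves are simple to generate, which makes the object of the theorem concrete: one Boolean variable per edge and, for each vertex, CNF clauses forcing the parity of its incident edge variables to equal the vertex's 0/1 charge. The sketch below is illustrative only; the function name and encoding details are ours, not the paper's.

```python
from itertools import product

def tseitin_cnf(edges, charge):
    """Tseitin formula: one variable per edge (1-based, in list order); for
    every vertex, clauses forcing the XOR of its incident edge variables to
    equal that vertex's 0/1 charge.  The formula is unsatisfiable iff some
    connected component has odd total charge."""
    evar = {e: i + 1 for i, e in enumerate(edges)}
    incident = {}
    for (u, v) in edges:
        incident.setdefault(u, []).append(evar[(u, v)])
        incident.setdefault(v, []).append(evar[(u, v)])
    clauses = []
    for vertex, xs in incident.items():
        c = charge.get(vertex, 0)
        # forbid every assignment to the incident edges with the wrong parity
        for bits in product([0, 1], repeat=len(xs)):
            if sum(bits) % 2 != c:
                clauses.append([x if b == 0 else -x for x, b in zip(xs, bits)])
    return clauses

if __name__ == "__main__":
    # triangle with a single odd-charged vertex: unsatisfiable Tseitin formula
    triangle = [("a", "b"), ("b", "c"), ("a", "c")]
    for clause in tseitin_cnf(triangle, {"a": 1, "b": 0, "c": 0}):
        print(clause)
```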

    Structure and Problem Hardness: Goal Asymmetry and DPLL Proofs in SAT-Based Planning

    In Verification and in (optimal) AI Planning, a successful method is to formulate the application as Boolean satisfiability (SAT) and solve it with state-of-the-art DPLL-based procedures. There is a lack of understanding of why this works so well. Focussing on the Planning context, we identify a form of problem structure concerned with the symmetrical or asymmetrical nature of the cost of achieving the individual planning goals. We quantify this sort of structure with a simple numeric parameter called AsymRatio, ranging between 0 and 1. We run experiments in 10 benchmark domains from the International Planning Competitions since 2000; we show that AsymRatio is a good indicator of SAT solver performance in 8 of these domains. We then examine carefully crafted synthetic planning domains that allow control of the amount of structure, and that are clean enough for a rigorous analysis of the combinatorial search space. The domains are parameterized by size and by the amount of structure. The CNFs we examine are unsatisfiable, encoding one planning step fewer than the length of the optimal plan. We prove upper and lower bounds on the size of the best possible DPLL refutations, under different settings of the amount of structure, as a function of size. We also identify the best possible sets of branching variables (backdoors). With minimum AsymRatio, we prove exponential lower bounds and identify minimal backdoors of size linear in the number of variables. With maximum AsymRatio, we identify logarithmic DPLL refutations (and backdoors), showing a doubly exponential gap between the two structural extreme cases. The reasons for this behavior -- the proof arguments -- illuminate the prototypical patterns of structure causing the empirical behavior observed in the competition benchmarks.
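
    Since the bounds above are stated for DPLL refutations, a bare-bones DPLL procedure is worth keeping in mind: unit propagation plus case-splitting on a variable, with the size of the search tree on an unsatisfiable CNF corresponding to the refutation size. The following sketch is a minimal illustration (no clause learning, naive branching, not tied to any particular planner or encoding); the function names are ours.

```python
def unit_propagate(clauses, assignment):
    """Repeatedly assign the literal of every unit clause.  Returns the
    simplified clause set, or None on a conflict (empty clause)."""
    clauses = [list(c) for c in clauses]
    if any(not c for c in clauses):
        return None
    changed = True
    while changed:
        changed = False
        units = [c[0] for c in clauses if len(c) == 1]
        for lit in units:
            assignment[abs(lit)] = lit > 0
            new = []
            for c in clauses:
                if lit in c:
                    continue                      # clause satisfied
                reduced = [l for l in c if l != -lit]
                if not reduced:
                    return None                   # conflict
                new.append(reduced)
            clauses = new
            changed = True
    return clauses

def dpll(clauses, assignment=None):
    """Minimal DPLL: unit propagation plus branching on the first unassigned
    variable.  Returns a (partial) satisfying assignment or None if UNSAT."""
    assignment = dict(assignment or {})
    clauses = unit_propagate(clauses, assignment)
    if clauses is None:
        return None
    if not clauses:
        return assignment
    v = abs(clauses[0][0])                        # naive branching heuristic
    for phase in (v, -v):
        result = dpll(clauses + [[phase]], assignment)
        if result is not None:
            return result
    return None

if __name__ == "__main__":
    # (x1 or x2) and (not x1 or x2) and (x1 or not x2) and (not x1 or not x2)
    print(dpll([[1, 2], [-1, 2], [1, -2], [-1, -2]]))   # -> None (unsatisfiable)
```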

    Sum-Of-Squares Lower Bounds for the Minimum Circuit Size Problem

    We prove lower bounds for the Minimum Circuit Size Problem (MCSP) in the Sum-of-Squares (SoS) proof system. Our main result is that for every Boolean function f: {0,1}^n → {0,1}, SoS requires degree Ω(s^{1-ε}) to prove that f does not have circuits of size s (for any s > poly(n)). As a corollary we obtain that there are no low-degree SoS proofs of the statement NP ⊄ P/poly. We also show that for any 0 < δ < 1 there are Boolean functions with circuit complexity larger than 2^{n^δ} but SoS requires size 2^{2^{Ω(n^δ)}} to prove this. In addition we prove analogous results on the minimum monotone circuit size for monotone Boolean slice functions. Our approach is quite general. Namely, we show that if a proof system Q has strong enough constraint satisfaction problem lower bounds that depend only on good expansion of the constraint-variable incidence graph and, furthermore, Q is expressive enough that variables can be substituted by local Boolean functions, then the MCSP problem is hard for Q.
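
    For very small parameters, the underlying MCSP question can be decided by brute force, which makes the statement being certified (or, per the result above, shown hard to certify in SoS) concrete. The sketch below is purely illustrative: it fixes one particular circuit model (fan-in-2 AND/OR plus NOT, every gate counted) and checks whether a given truth table has a circuit with at most s gates; the model and function names are our assumptions, not the paper's.

```python
from itertools import product

def has_circuit(tt, n, s):
    """Brute-force MCSP check: does the Boolean function with truth table `tt`
    (a sequence of 2**n bits, in lexicographic order of the inputs) have a
    circuit with at most `s` AND/OR/NOT gates?  Wires are represented by their
    full truth tables; a gate may read the inputs or any earlier gate.
    Exponential in everything -- for illustration only."""
    points = list(product([0, 1], repeat=n))
    inputs = [tuple(p[i] for p in points) for i in range(n)]   # truth table of x_i
    target = tuple(tt)

    def extend(wires, gates_left):
        if target in wires:
            return True
        if gates_left == 0:
            return False
        candidates = set()
        for a in wires:                       # NOT gates
            candidates.add(tuple(1 - v for v in a))
        for a in wires:                       # fan-in-2 AND / OR gates
            for b in wires:
                candidates.add(tuple(x & y for x, y in zip(a, b)))
                candidates.add(tuple(x | y for x, y in zip(a, b)))
        return any(extend(wires + [c], gates_left - 1)
                   for c in candidates if c not in wires)

    return extend(inputs, s)

if __name__ == "__main__":
    # two-input XOR: truth table for inputs 00, 01, 10, 11
    xor2 = (0, 1, 1, 0)
    print(has_circuit(xor2, 2, 3))   # False: 3 such gates do not suffice
    print(has_circuit(xor2, 2, 4))   # True: (x1 OR x2) AND NOT (x1 AND x2)
```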

    Exponential Resolution Lower Bounds for Weak Pigeonhole Principle and Perfect Matching Formulas over Sparse Graphs

    We show exponential lower bounds on resolution proof length for pigeonhole principle (PHP) formulas and perfect matching formulas over highly unbalanced, sparse expander graphs, thus answering the challenge to establish strong lower bounds in the regime between balanced constant-degree expanders as in [Ben-Sasson and Wigderson '01] and highly unbalanced, dense graphs as in [Raz '04] and [Razborov '03, '04]. We obtain our results by revisiting Razborov's pseudo-width method for PHP formulas over dense graphs and extending it to sparse graphs. This further demonstrates the power of the pseudo-width method, and we believe it could also be useful for attacking other longstanding open problems for resolution and other proof systems.
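
    The graph variant of the pigeonhole principle behind these bounds is easy to write down: pigeons may only be mapped to holes adjacent to them in a bipartite graph G, so the formula has one clause per pigeon over its neighbourhood plus the usual at-most-one clauses per hole. The generator below is a small illustrative sketch (our names and encoding, not the paper's).

```python
from itertools import combinations

def graph_php(neighbours, n_holes):
    """Graph pigeonhole formula: `neighbours[p]` lists the holes pigeon p may
    use.  Variable x_{p,h} is encoded as the 1-based integer p * n_holes + h + 1.
    Clauses: every pigeon occupies some neighbouring hole; no hole holds two
    of the pigeons adjacent to it."""
    var = lambda p, h: p * n_holes + h + 1
    clauses = [[var(p, h) for h in hs] for p, hs in enumerate(neighbours)]
    for h in range(n_holes):
        sharing = [p for p, hs in enumerate(neighbours) if h in hs]
        for p1, p2 in combinations(sharing, 2):
            clauses.append([-var(p1, h), -var(p2, h)])
    return clauses

if __name__ == "__main__":
    # 4 pigeons, 3 holes, each pigeon restricted to 2 holes (a sparse instance);
    # unsatisfiable, since there are more pigeons than holes
    neighbours = [[0, 1], [1, 2], [0, 2], [0, 1]]
    for clause in graph_php(neighbours, 3):
        print(clause)
```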

    Resource Bounded Unprovability of Computational Lower Bounds

    This paper introduces new notions of asymptotic proofs, PT-extensions (polynomial-time extensions), PTM-omega-consistency (polynomial-time Turing machine omega-consistency), etc., on formal theories of arithmetic including PA (Peano Arithmetic). This paper shows that P not= NP (more generally, any super-polynomial-time lower bound in PSPACE) is unprovable in a PTM-omega-consistent theory T, where T is a consistent PT-extension of PA. This result gives a unified view of the two existing major negative results on proving P not= NP, Natural Proofs and relativizable proofs, through the two ways of characterizing PTM-omega-consistency. We also show that the PTM-omega-consistency of T cannot be proven in any PTM-omega-consistent theory S, where S is a consistent PT-extension of T.
    Comment: 78 pages.