3 research outputs found

    Stretching demi-bits and nondeterministic-secure pseudorandomness

    We develop the theory of cryptographic nondeterministic-secure pseudorandomness beyond the point reached by Rudich's original work [25], and apply it to draw new consequences in average-case complexity and proof complexity. Specifically, we show the following:

    Demi-bit stretch: Super-bits and demi-bits are variants of cryptographic pseudorandom generators which are secure against nondeterministic statistical tests [25]. They were introduced to rule out certain approaches to proving strong complexity lower bounds beyond the limitations set out by the Natural Proofs barrier of Razborov and Rudich [23]. Whether demi-bits can be stretched at all had been an open problem since their introduction. We answer this question affirmatively: every demi-bit b : {0,1}^n → {0,1}^{n+1} can be stretched into a sublinear-stretch demi-bit b′ : {0,1}^n → {0,1}^{n+n^c}, for every constant 0 < c < 1.

    Average-case hardness: Using work by Santhanam [26], we apply our results to obtain new average-case Kolmogorov complexity results: we show that K^poly[n − O(1)] is zero-error average-case hard against NP/poly machines iff K^poly[n − o(n)] is, where for a function s : N → N, K^poly[s(n)] denotes the language of all strings x ∈ {0,1}^n for which there is a (fixed) polynomial-time Turing machine of description length at most s(n) that outputs x.

    Characterising super-bits by nondeterministic unpredictability: In the deterministic setting, Yao [31] proved that super-polynomial hardness of pseudorandom generators is equivalent to ("next-bit") unpredictability, which roughly means that given any strict prefix of a random string, it is infeasible to predict the next bit. We initiate the study of unpredictability beyond the deterministic setting (in the cryptographic regime) and characterise the nondeterministic hardness of generators from an unpredictability perspective. Specifically, we propose four stronger notions of unpredictability: NP/poly-unpredictability, coNP/poly-unpredictability, ∩-unpredictability and ∪-unpredictability, and show that super-polynomial nondeterministic hardness of generators lies between ∩-unpredictability and ∪-unpredictability.

    Characterising super-bits by nondeterministic hard-core predicates: We introduce a nondeterministic variant of hard-core predicates, called super-core predicates. We show that the existence of a super-bit is equivalent to the existence of a super-core of some non-shrinking function. This serves as an analogue of the equivalence between the existence of a strong pseudorandom generator and the existence of a hard-core of some one-way function [8, 12], and provides a first alternative characterisation of super-bits. We also prove that a certain class of functions, which may have hard-cores, cannot possess any super-core.
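    For intuition about what "stretching" a one-bit generator means, here is a minimal sketch of the classical iterate-and-output construction from the deterministic-security setting: apply the generator to the current state, emit the extra bit, repeat. The paper's point is that this classical argument does not transfer to demi-bits, whose nondeterministic security requires a different proof; the sketch only illustrates the notion. The function toy_g below is a hypothetical placeholder, not a real (let alone secure) generator.

    # Sketch: classical stretch-by-iteration of a one-bit-stretch PRG
    # (deterministic-security setting only; the demi-bit stretch in the
    # paper needs a different argument). toy_g is a placeholder mixing
    # function standing in for G : {0,1}^n -> {0,1}^{n+1}; NOT secure.

    def toy_g(seed: int, n: int) -> tuple[int, int]:
        """Stand-in for G: returns (next n-bit state, one output bit)."""
        x = (seed * 6364136223846793005 + 1442695040888963407) % (1 << n)
        return x, x & 1

    def stretch(seed: int, n: int, m: int) -> list[int]:
        """Iterate the one-bit-stretch generator to emit m output bits:
        (state_{i+1}, b_i) = G(state_i); output b_0 ... b_{m-1}."""
        state, out = seed, []
        for _ in range(m):
            state, bit = toy_g(state, n)
            out.append(bit)
        return out

    print(stretch(seed=0b1011, n=4, m=8))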
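    Yao's next-bit test, which the unpredictability results above generalise to nondeterministic predictors, can be phrased as a simple game; a hedged sketch follows. The predictor here is an arbitrary efficient adversary modelled as a callable; the paper's notions (NP/poly-, coNP/poly-, ∩- and ∪-unpredictability) replace it with nondeterministic machines, which this toy code does not model.

    # Sketch of Yao's next-bit unpredictability game (deterministic
    # setting). A generator is next-bit unpredictable if no efficient
    # predictor, given a strict prefix of the output, guesses the next
    # bit with advantage noticeably better than 1/2.

    import random
    from typing import Callable, Sequence

    def next_bit_advantage(gen: Callable[[int], Sequence[int]],
                           predictor: Callable[[Sequence[int]], int],
                           n: int, i: int, trials: int = 10_000) -> float:
        """Estimate Pr[predictor(prefix of length i) == bit i] - 1/2."""
        wins = 0
        for _ in range(trials):
            out = gen(random.getrandbits(n))
            wins += predictor(out[:i]) == out[i]
        return wins / trials - 0.5

    # A transparently bad "generator" that repeats its seed bits, so a
    # trivial copying predictor achieves the maximal advantage ~1/2.
    bad_gen = lambda seed, n=8: [(seed >> j) & 1 for j in range(n)] * 2
    copy_predictor = lambda prefix: prefix[len(prefix) - 8]
    print(next_bit_advantage(lambda s: bad_gen(s), copy_predictor, n=8, i=8))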

    Iterated lower bound formulas: a diagonalization-based approach to proof complexity

    We propose a diagonalization-based approach to several important questions in proof complexity, and illustrate it in the context of the algebraic proof system IPS and of propositional proof systems more generally.

    We use the approach to give an explicit sequence of CNF formulas {φ_n} such that VNP ≠ VP iff there are no polynomial-size IPS proofs of the formulas φ_n. This provides a natural equivalence between proof complexity lower bounds and standard algebraic complexity lower bounds. Our proof of this fact combines the implication from IPS lower bounds to algebraic complexity lower bounds, due to Grochow and Pitassi, with a diagonalization argument: the formulas φ_n themselves assert the non-existence of short IPS proofs for formulas encoding VNP ≠ VP at a different input length. Our result also has meta-mathematical implications: it gives evidence that proving strong lower bounds for IPS within IPS is difficult.

    For any strong enough propositional proof system R, we define the *iterated R-lower bound formulas*, which inductively assert the non-existence of short R proofs for formulas encoding the same statement at a different input length, and propose them as explicit hard candidates for the proof system R. We observe that this hypothesis holds for Resolution, following recent results of Atserias and Müller and of Garlík, and give evidence in its favour for other proof systems.
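    To make the self-referential shape of the iterated lower bound formulas concrete, here is a purely schematic sketch. Everything in it is hypothetical: the length map n ↦ n/2 and the string NoShortRProof stand in for the propositional encoding of "there is no R-proof of size ≤ poly(m)", which the paper constructs in earnest.

    # Schematic unfolding of the *iterated R-lower bound formulas*:
    # phi_n asserts that phi_m, at a different input length m = f(n),
    # has no short R-proof. This only displays the recursion; it does
    # not encode anything propositionally.

    def phi(n: int) -> str:
        if n <= 1:
            return "BASE"          # some fixed base-case formula
        m = n // 2                 # hypothetical map to another length
        return f"NoShortRProof[size<=poly({m})]( {phi(m)} )"

    print(phi(16))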

    Why are proof complexity lower bounds hard?

    We formalize and study the question of whether there are inherent difficulties to showing lower bounds on propositional proof complexity. We establish the following unconditional result: propositional proof systems cannot efficiently show that truth tables of random Boolean functions lack polynomial-size non-uniform proofs of hardness. Assuming a conjecture of Rudich, propositional proof systems also cannot efficiently show that random k-CNFs of linear density lack polynomial-size non-uniform proofs of unsatisfiability. Since the statements in question assert the average-case hardness of standard NP problems (MCSP and 3-SAT, respectively) against co-nondeterministic circuits for natural distributions, one interpretation of our result is that propositional proof systems are inherently incapable of efficiently proving strong complexity lower bounds in our formalization. Another interpretation is that an analogue of the Razborov-Rudich `natural proofs' barrier holds in proof complexity: under reasonable hardness assumptions, there are natural distributions on hard tautologies for which it is infeasible to show proof complexity lower bounds for strong enough proof systems.

    For the specific case of the Extended Frege (EF) propositional proof system, we show that at least one of the following holds: (1) EF has no efficient proofs of superpolynomial circuit lower bound tautologies for any Boolean function, or (2) there is an explicit family of tautologies of each length such that, under reasonable hardness assumptions, most tautologies are hard but no propositional proof system can efficiently establish hardness for most tautologies in the family. Thus, under reasonable hardness assumptions, either the Circuit Lower Bounds program toward complexity separations cannot be implemented in EF, or there are inherent obstacles to implementing the Cook-Reckhow program for EF.
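    As background for why random truth tables are a natural source of hard tautologies, the classical Shannon counting argument shows that almost every Boolean function on n inputs requires circuits of size roughly 2^n / n. A rough numeric check follows; the circuit-count bound (26s)^{2s} used here is a standard coarse estimate, not a tight one, and the constant 26 is illustrative.

    # Rough Shannon-style counting check: the number of Boolean
    # functions on n inputs, 2^(2^n), dwarfs a coarse upper bound on
    # the number of circuits with s gates, so most functions have no
    # circuit of size s.

    import math

    def log2_num_functions(n: int) -> float:
        return float(2 ** n)           # log2 of 2^(2^n)

    def log2_num_circuits(s: int) -> float:
        return 2 * s * math.log2(26 * s)   # coarse bound (26s)^(2s)

    n = 16
    s = 2 ** n // (2 * n)              # candidate bound ~ 2^n / (2n)
    print(f"log2 #functions on {n} bits: {log2_num_functions(n):.0f}")
    print(f"log2 #circuits of size {s}: {log2_num_circuits(s):.0f}")
    # The second number is below the first, so almost every function
    # on 16 bits has circuit complexity exceeding s = 2048.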