
    Unexpected Power of Random Strings


    Quantum and Probabilistic Computers Rigorously Powerful than Traditional Computers, and Derandomization

    In this paper, we extend the techniques used in our previous work to show that there exists a probabilistic Turing machine running within time $O(n^k)$ for all $k\in\mathbb{N}_1$ accepting a language $L_d$ which differs from every language in $\mathcal{P}$, and then to show that $L_d\in\mathcal{BPP}$, thus separating the complexity classes $\mathcal{P}$ and $\mathcal{BPP}$ (i.e., $\mathcal{P}\subsetneq\mathcal{BPP}$). Since the complexity class of bounded-error quantum polynomial time, $\mathcal{BQP}$, contains the complexity class $\mathcal{BPP}$, i.e., $\mathcal{BPP}\subseteq\mathcal{BQP}$, we thus obtain the result that quantum computers are rigorously more powerful than traditional computers; namely, $\mathcal{P}\subsetneq\mathcal{BQP}$. We further show that (1) $\mathcal{P}\subsetneq\mathcal{RP}$; (2) $\mathcal{P}\subsetneq\text{co-}\mathcal{RP}$; (3) $\mathcal{P}\subsetneq\mathcal{ZPP}$. The result $\mathcal{P}\subsetneq\mathcal{BPP}$ shows that randomness plays an essential role in probabilistic algorithm design. Specifically, we show that: (1) the number of random bits used by any probabilistic algorithm which accepts the language $L_d$ cannot be reduced to $O(\log n)$; (2) there exists no efficient (complexity-theoretic) pseudorandom generator (PRG) $G:\{0,1\}^{O(\log n)}\rightarrow\{0,1\}^n$; (3) there exists no quick hitting-set generator (HSG) $H:\{0,1\}^{k(n)}\rightarrow\{0,1\}^n$ with $k(n)=O(\log n)$.
    Comment: [v3] references added; minor revisions; 31 pages. arXiv admin note: text overlap with arXiv:2110.0621
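    For reference, the two objects ruled out in items (2) and (3) are standardly defined as follows (a sketch of the usual complexity-theoretic definitions; the error bound $1/n$ is chosen for illustration). A PRG $G:\{0,1\}^{O(\log n)}\rightarrow\{0,1\}^n$ must fool every circuit $C$ of size $n$:
    \[
    \Bigl|\Pr_{s\in\{0,1\}^{O(\log n)}}\bigl[C(G(s))=1\bigr]-\Pr_{x\in\{0,1\}^{n}}\bigl[C(x)=1\bigr]\Bigr|\le\frac{1}{n},
    \]
    while a hitting-set generator $H$ need only satisfy the one-sided condition: whenever $\Pr_x[C(x)=1]\ge 1/2$, there is some seed $s$ with $C(H(s))=1$.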

    Pseudorandomness for Approximate Counting and Sampling

    We study computational procedures that use both randomness and nondeterminism. The goal of this paper is to derandomize such procedures under the weakest possible assumptions. Our main technical contribution allows one to “boost” a given hardness assumption: we show that if there is a problem in EXP that cannot be computed by poly-size nondeterministic circuits, then there is one which cannot be computed by poly-size circuits that make non-adaptive NP oracle queries. This in particular shows that the various assumptions used over the last few years by several authors to derandomize Arthur-Merlin games (i.e., show AM = NP) are in fact all equivalent. We also define two new primitives that we regard as the natural pseudorandom objects associated with approximate counting and sampling of NP-witnesses. We use the “boosting” theorem and hashing techniques to construct these primitives using an assumption that is no stronger than that used to derandomize AM. We observe that Cai's proof that S_2^P ⊆ ZPP^NP and the learning algorithm of Bshouty et al. can be seen as reductions to sampling that are not probabilistic. As a consequence, they can be derandomized under an assumption which is weaker than the assumption that was previously known to suffice.
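    In symbols, the “boosting” step described above has the following shape (the circuit-class notation is our shorthand for orientation, not the paper's):
    \[
    \mathrm{EXP} \not\subseteq \mathrm{NSIZE}(\mathrm{poly}) \;\Longrightarrow\; \mathrm{EXP} \not\subseteq \mathrm{SIZE}^{\parallel \mathrm{NP}}(\mathrm{poly}),
    \]
    where $\mathrm{NSIZE}(\mathrm{poly})$ denotes polynomial-size nondeterministic circuits and $\mathrm{SIZE}^{\parallel \mathrm{NP}}(\mathrm{poly})$ denotes polynomial-size deterministic circuits whose NP oracle queries are non-adaptive.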

    Derandomizing Arthur-Merlin Games using Hitting Sets

    We prove that AM (and hence Graph Nonisomorphism) is in NP if for some epsilon > 0, some language in NE ∩ coNE requires nondeterministic circuits of size 2^(epsilon n). This improves recent results of Arvind and Köbler and of Klivans and van Melkebeek, who proved the same conclusion but under stronger hardness assumptions, namely, either the existence of a language in NE ∩ coNE which cannot be approximated by nondeterministic circuits of size less than 2^(epsilon n), or the existence of a language in NE ∩ coNE which requires oracle circuits of size 2^(epsilon n) with oracle gates for SAT (satisfiability). The previous results on derandomizing AM were based on pseudorandom generators. In contrast, our approach is based on a strengthening of Andreev, Clementi and Rolim's hitting set approach to derandomization. As a spin-off, we show that this approach is strong enough to give an easy proof (if the existence of explicit dispersers can be assumed known) of the following implication: for some epsilon > 0, if there is a language in E which requires nondeterministic circuits of size 2^(epsilon n), then P = BPP. This differs from Impagliazzo and Wigderson's theorem “only” by replacing deterministic circuits with nondeterministic ones.
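    To make the hitting-set approach concrete: a hitting set for one-sided-error (RP-type) algorithms is a small set of strings intersecting every dense, efficiently recognizable set of random tapes, so trying each member in turn replaces the random choice. Below is a minimal sketch under that framing; the function names and toy algorithm are ours, not the paper's, and the trivial hitting set used here is the whole cube (real constructions are far smaller).

        # Sketch: derandomizing a one-sided-error algorithm with a hitting set.
        # Model: x in L  => A(x, r) accepts for at least half of all tapes r;
        #        x not in L => A(x, r) rejects for every tape r.
        from typing import Callable, Iterable, Sequence

        def derandomize_rp(A: Callable[[str, Sequence[int]], bool],
                           x: str,
                           hitting_set: Iterable[Sequence[int]]) -> bool:
            """Decide x deterministically: if x is in L, the acceptance set of
            A(x, .) is dense, so the hitting set must contain a witness tape."""
            return any(A(x, r) for r in hitting_set)

        # Toy demo: L = {x : x contains a '1'}; A probes one random position.
        def A(x: str, r: Sequence[int]) -> bool:
            index = int("".join(map(str, r)), 2) % len(x)
            return x[index] == "1"

        H = [[0, 0], [0, 1], [1, 0], [1, 1]]  # trivial hitting set for m = 2 bits
        print(derandomize_rp(A, "0101", H))   # True
        print(derandomize_rp(A, "0000", H))   # False (no false positives)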

    Tighter Connections between Derandomization and Circuit Lower Bounds

    We tighten the connections between circuit lower bounds and derandomization for each of the following three types of derandomization:
    - general derandomization of promiseBPP (connected to Boolean circuits),
    - derandomization of Polynomial Identity Testing (PIT) over fixed finite fields (connected to arithmetic circuit lower bounds over the same field), and
    - derandomization of PIT over the integers (connected to arithmetic circuit lower bounds over the integers).
    We show how to make these connections uniform equivalences, although at the expense of using somewhat less common versions of complexity classes and for a less studied notion of inclusion. Our main results are as follows:
    1. We give the first proof that a non-trivial (nondeterministic subexponential-time) algorithm for PIT over a fixed finite field yields arithmetic circuit lower bounds.
    2. We get a similar result for the case of PIT over the integers, strengthening a result of Jansen and Santhanam [JS12] (by removing the need for advice).
    3. We derive a Boolean circuit lower bound for NEXP ∩ coNEXP from the assumption of sufficiently strong non-deterministic derandomization of promiseBPP (without advice), as well as from the assumed existence of an NP-computable non-empty property of Boolean functions useful for proving superpolynomial circuit lower bounds (in the sense of natural proofs of [RR97]); this strengthens the related results of [IKW02].
    4. Finally, we turn all of these implications into equivalences for appropriately defined promise classes and for a notion of robust inclusion/separation (inspired by [FS11]) that lies between the classical "almost everywhere" and "infinitely often" notions.
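    For context, the coRP algorithm for PIT whose derandomization is at stake in items 1 and 2 is the classical DeMillo-Lipton-Schwartz-Zippel random-evaluation test; a minimal self-contained sketch follows (the circuit interface and parameter names are illustrative, not from the paper).

        # Randomized PIT (DeMillo-Lipton-Schwartz-Zippel): evaluate the circuit
        # at a random point; a nonzero polynomial of total degree <= d over F_p
        # vanishes at a random point with probability at most d / p.
        import random

        def pit(circuit, n_vars: int, degree_bound: int,
                prime: int, trials: int = 20) -> bool:
            """Return True iff the circuit is (probably) the zero polynomial."""
            assert prime > 2 * degree_bound, "need d/p <= 1/2 per trial"
            for _ in range(trials):
                point = [random.randrange(prime) for _ in range(n_vars)]
                if circuit(point) % prime != 0:
                    return False  # a nonzero value certifies nonzeroness
            return True           # zero on all samples: identically zero w.h.p.

        # Example: x^2 - y^2 - (x - y)(x + y) is identically zero.
        zero = lambda v: v[0] ** 2 - v[1] ** 2 - (v[0] - v[1]) * (v[0] + v[1])
        print(pit(zero, n_vars=2, degree_bound=2, prime=101))  # True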

    Instance-Wise Hardness Versus Randomness Tradeoffs for Arthur-Merlin Protocols

    Algebraic Hardness Versus Randomness in Low Characteristic

    We show that lower bounds for explicit constant-variate polynomials over fields of characteristic p > 0 are sufficient to derandomize polynomial identity testing over fields of characteristic p. In this setting, existing work on hardness-randomness tradeoffs for polynomial identity testing requires either the characteristic to be sufficiently large or the notion of hardness to be stronger than the standard syntactic notion of hardness used in algebraic complexity. Our results make no restriction on the characteristic of the field and use standard notions of hardness. We do this by combining the Kabanets-Impagliazzo generator with a white-box procedure to take p-th roots of circuits computing a p-th power over fields of characteristic p. When the number of variables appearing in the circuit is bounded by some constant, this procedure turns out to be efficient, which allows us to bypass difficulties related to factoring circuits in characteristic p. We also combine the Kabanets-Impagliazzo generator with recent "bootstrapping" results in polynomial identity testing to show that a sufficiently hard family of explicit constant-variate polynomials yields a near-complete derandomization of polynomial identity testing. This result holds over fields of both zero and positive characteristic and complements recent work of Guo, Kumar, Saptharishi, and Solomon, who obtained a slightly stronger statement over fields of characteristic zero.
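    The characteristic-p phenomenon behind the root-taking step is the standard Frobenius identity (textbook algebra, stated here for orientation rather than in the paper's notation): for a polynomial $f$ with coefficients in the prime field $\mathbb{F}_p$,
    \[
    f(x_1,\dots,x_n)^p \;=\; f\bigl(x_1^p,\dots,x_n^p\bigr),
    \]
    since $(a+b)^p = a^p + b^p$ in characteristic $p$ and $c^p = c$ for every $c \in \mathbb{F}_p$ by Fermat's little theorem. For instance, over $\mathbb{F}_2$ we have $(x+y)^2 = x^2 + y^2$. A circuit computing a $p$-th power $g = f^p$ thus implicitly determines $f$, and the white-box procedure recovers a circuit for $f$ itself.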

    Pseudodistributions That Beat All Pseudorandom Generators (Extended Abstract)
