    On the Impossibility of Probabilistic Proofs in Relativized Worlds

    Get PDF
    We initiate the systematic study of probabilistic proofs in relativized worlds, where the goal is to understand, for a given oracle, the possibility of "non-trivial" proof systems for deterministic or nondeterministic computations that make queries to the oracle. This question is intimately related to a recent line of work that seeks to improve the efficiency of probabilistic proofs for computations that use functionalities such as cryptographic hash functions and digital signatures, by instantiating them via constructions that are "friendly" to known constructions of probabilistic proofs. Informally, negative results about probabilistic proofs in relativized worlds provide evidence that this line of work is inherent and, conversely, positive results provide a way to bypass it. We prove several impossibility results for probabilistic proofs relative to natural oracles. Our results provide strong evidence that tailoring certain natural functionalities to known probabilistic proofs is inherent.
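    As a rough illustration of the setting (our gloss; the notation T(n), q(n), V^θ below is chosen for exposition and is not taken from the paper): fix an oracle θ and a language L^θ decided by an oracle machine M^θ that runs in time T(n) and makes q(n) queries to θ. A probabilistic proof for this computation is "non-trivial" if a verifier V^θ, given an input x and a candidate proof π, runs in time much less than T(n) and queries θ far fewer than q(n) times, while retaining the usual completeness and soundness guarantees:
    \[
      x \in L^\theta \;\Longrightarrow\; \exists \pi:\ \Pr[V^\theta(x,\pi) = 1] = 1,
      \qquad
      x \notin L^\theta \;\Longrightarrow\; \forall \pi:\ \Pr[V^\theta(x,\pi) = 1] \le \tfrac{1}{2}.
    \]
    The impossibility results then say, roughly, that relative to several natural oracles θ (modelling functionalities such as the hash functions and signatures mentioned above), no such non-trivial proof systems exist.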

    A Relativization Perspective on Meta-Complexity

    Get PDF
    Meta-complexity studies the complexity of computational problems about complexity theory, such as the Minimum Circuit Size Problem (MCSP) and its variants. We show that a relativization barrier applies to many important open questions in meta-complexity. We give relativized worlds where:
    1) MCSP can be solved in deterministic polynomial time, but the search version of MCSP cannot be solved in deterministic polynomial time, even approximately. In contrast, Carmosino, Impagliazzo, Kabanets, and Kolokolova [CCC'16] gave a randomized approximate search-to-decision reduction for MCSP with a relativizing proof.
    2) The complexities of MCSP[2^{n/2}] and MCSP[2^{n/4}] are different, in both worst-case and average-case settings. Thus the complexity of MCSP is not "robust" to the choice of the size function.
    3) Levin's time-bounded Kolmogorov complexity Kt(x) can be approximated to within a factor of (2+ε) in polynomial time, for any ε > 0.
    4) Natural proofs do not exist, and neither do auxiliary-input one-way functions. In contrast, Santhanam [ITCS'20] gave a relativizing proof that the non-existence of natural proofs implies the existence of one-way functions under a conjecture about optimal hitting sets.
    5) DistNP does not reduce to GapMINKT by a family of "robust" reductions. This presents a technical barrier to resolving a question of Hirahara [FOCS'20].
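    For orientation, the two meta-complexity notions named above have standard definitions (stated here as a reference sketch; they are not quoted from the paper). For a size bound s, MCSP[s] consists of the truth tables of Boolean functions with small circuits:
    \[
      \mathrm{MCSP}[s] \;=\; \{\, \mathrm{tt}(f) : f\colon\{0,1\}^n \to \{0,1\} \text{ has a circuit of size at most } s(n) \,\},
    \]
    where tt(f) denotes the 2^n-bit truth table of f; in the unparameterized problem MCSP the size threshold is part of the input, and the search version asks for a (near-)minimum-size circuit rather than a yes/no answer. Levin's time-bounded Kolmogorov complexity of a string x is
    \[
      \mathrm{Kt}(x) \;=\; \min_{p,\,t} \{\, |p| + \lceil \log t \rceil : U(p) \text{ outputs } x \text{ within } t \text{ steps} \,\}
    \]
    for a fixed universal machine U; GapMINKT, mentioned in item 5), is a gap (approximation) version of the related problem of computing time-bounded Kolmogorov complexity.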

    A Variational Tate Conjecture in crystalline cohomology

    Full text link
    Given a smooth, proper family of varieties in characteristic p > 0, and a cycle z on a fibre of the family, we formulate a Variational Tate Conjecture characterising, in terms of the crystalline cycle class of z, whether z extends cohomologically to the entire family. This is a characteristic p analogue of Grothendieck's Variational Hodge Conjecture. We prove the conjecture for divisors, and an infinitesimal variant of the conjecture for cycles of higher codimension. This can be used to reduce the ℓ-adic Tate conjecture for divisors over finite fields to the case of surfaces.
    Comment: The former Section 3.3, containing a sketched treatment of line bundles with Q_p-coefficients, has been removed as an error was found. This affects the validity of none of the main results, but necessitates giving a different proof of the application to the Tate Conjecture. Other minor changes.
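    In standard notation (a schematic paraphrase of the abstract only; the paper's precise formulation is not reproduced here): for a smooth proper family f : X → S over a perfect field k of characteristic p > 0, a fibre X_s, and a codimension-m cycle z on X_s, the crystalline cycle class is an element
    \[
      \mathrm{cl}(z) \;\in\; H^{2m}_{\mathrm{crys}}(X_s/W(k)),
    \]
    where W(k) denotes the Witt vectors of k, and the Variational Tate Conjecture formulated in the paper characterises, in terms of this class, when z extends cohomologically to the whole family X, in analogy with the role of the de Rham cycle class in Grothendieck's Variational Hodge Conjecture.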

    09421 Abstracts Collection -- Algebraic Methods in Computational Complexity

    Get PDF
    From 11.10. to 16.10.2009, the Dagstuhl Seminar 09421 "Algebraic Methods in Computational Complexity" was held in Schloss Dagstuhl - Leibniz Center for Informatics. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The first section describes the seminar topics and goals in general. Links to extended abstracts or full papers are provided, if available.

    Bounded Relativization

    Get PDF
    Relativization is one of the most fundamental concepts in complexity theory, which explains the difficulty of resolving major open problems. In this paper, we propose a weaker notion of relativization called bounded relativization. For a complexity class C, we say that a statement is C-relativizing if the statement holds relative to every oracle O ∈ C. It is easy to see that every result that relativizes also C-relativizes for every complexity class C. On the other hand, we observe that many non-relativizing results, such as IP = PSPACE, are in fact PSPACE-relativizing.
    First, we use the idea of bounded relativization to obtain new lower bound results, including the following nearly maximum circuit lower bound: for every constant ε > 0, BPE^{MCSP}/2^{εn} ⊄ SIZE[2^n/n]. We prove this by PSPACE-relativizing the recent pseudodeterministic pseudorandom generator by Lu, Oliveira, and Santhanam (STOC 2021).
    Next, we study the limitations of PSPACE-relativizing proof techniques, and show that a seemingly minor improvement over the known results using PSPACE-relativizing techniques would imply a breakthrough separation NP ≠ L. For example:
    - Impagliazzo and Wigderson (JCSS 2001) proved that if EXP ≠ BPP, then BPP admits infinitely-often subexponential-time heuristic derandomization. We show that their result is PSPACE-relativizing, and that improving it to worst-case derandomization using PSPACE-relativizing techniques implies NP ≠ L.
    - Oliveira and Santhanam (STOC 2017) recently proved that every dense subset in P admits an infinitely-often subexponential-time pseudodeterministic construction, which we observe is PSPACE-relativizing. Improving this to almost-everywhere (pseudodeterministic) or (infinitely-often) deterministic constructions by PSPACE-relativizing techniques implies NP ≠ L.
    - Santhanam (SICOMP 2009) proved that pr-MA does not have fixed polynomial-size circuits. This lower bound can be shown PSPACE-relativizing, and we show that improving it to an almost-everywhere lower bound using PSPACE-relativizing techniques implies NP ≠ L.
    In fact, we show that if we can use PSPACE-relativizing techniques to obtain the above-mentioned improvements, then PSPACE ≠ EXPH. We obtain our barrier results by constructing suitable oracles computable in EXPH relative to which these improvements are impossible.
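    In symbols, the central definition from the abstract (as stated there, with C a complexity class whose members are viewed as oracle languages): a statement φ is C-relativizing if
    \[
      \varphi^{\mathcal{O}} \text{ holds for every oracle } \mathcal{O} \in \mathcal{C}.
    \]
    Every statement with a relativizing proof is in particular C-relativizing for every class C, while a non-relativizing result such as IP = PSPACE can still be PSPACE-relativizing, which is what makes PSPACE-relativization a meaningful intermediate barrier for the improvements listed above.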

    Computation Environments, An Interactive Semantics for Turing Machines (which P is not equal to NP considering it)

    Full text link
    To scrutinize notions of computation and time complexity, we introduce and formally define an interactive model of computation that we call the computation environment. A computation environment consists of two main parts: i) a universal processor, and ii) a computist who uses the computability power of the universal processor to perform effective procedures. The notion of computation finds its meaning, for the computist, through his interaction with the universal processor. We are interested in those computation environments which can be considered as alternatives to the real computation environment whose computist is the human being. These computation environments must have two properties: 1) being physically plausible, and 2) being sufficiently powerful. Based on Copeland's criteria for effective procedures, we define what a physically plausible computation environment is. We construct two physically plausible and sufficiently powerful computation environments: 1) the Turing computation environment, denoted by E_T, and 2) a persistently evolutionary computation environment, denoted by E_e, which persistently evolves in the course of executing computations. We prove that the equality of the complexity classes P and NP in the computation environment E_e conflicts with the free will of the computist. We provide an axiomatic system T for Turing computability and prove that, ignoring just one of the axioms of T, it would not be possible to derive P = NP from the remaining axioms. We prove that the computist who lives inside the environment E_T can never be confident whether he lives in a static environment or a persistently evolutionary one.
    Comment: 33 pages, interactive computation, P vs NP