
    Immunity and Simplicity for Exact Counting and Other Counting Classes

    Ko [RAIRO 24, 1990] and Bruschi [TCS 102, 1992] showed that in some relativized world, PSPACE (in fact, ParityP) contains a set that is immune to the polynomial hierarchy (PH). In this paper, we study and settle the question of (relativized) separations with immunity for PH and the counting classes PP, C_{=}P, and ParityP in all possible pairwise combinations. Our main result is that there is an oracle A relative to which C_{=}P contains a set that is immune to BPP^{ParityP}. In particular, this C_{=}P^A set is immune to PH^{A} and ParityP^{A}. Strengthening results of Torán [J.ACM 38, 1991] and Green [IPL 37, 1991], we also show that, in suitable relativizations, NP contains a C_{=}P-immune set, and ParityP contains a PP^{PH}-immune set. This implies the existence of a C_{=}P^{B}-simple set for some oracle B, which extends results of Balcázar et al. [SIAM J.Comp. 14, 1985; RAIRO 22, 1988] and provides the first example of a simple set in a class not known to be contained in PH. Our proof technique requires a circuit lower bound for "exact counting" that is derived from Razborov's [Mat. Zametki 41, 1987] lower bound for majority. Comment: 20 pages.
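    For reference, the immunity notion at work here, stated in its standard form (not specific to this paper): an infinite set L is C-immune for a complexity class C if no infinite subset of L belongs to C, i.e.

    \[ L \text{ is } \mathcal{C}\text{-immune} \iff L \text{ is infinite and } \forall S \subseteq L \,(S \in \mathcal{C} \Rightarrow S \text{ is finite}). \]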

    Levelable Sets and the Algebraic Structure of Parameterizations

    Asking which sets are fixed-parameter tractable for a given parameterization constitutes much of the current research in parameterized complexity theory. This approach faces some of the core difficulties in complexity theory. By focusing instead on the parameterizations that make a given set fixed-parameter tractable, we circumvent these difficulties. We isolate parameterizations as independent measures of complexity and study their underlying algebraic structure. Thus we are able to compare parameterizations, which establishes a hierarchy of complexity that is much stronger than that present in typical races for parameterized algorithms. Among other results, we find that no practically fixed-parameter tractable sets have optimal parameterizations.
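    The underlying definition, given here in its standard textbook form (the paper's exact formalization may differ in detail): a parameterization \kappa makes a set A fixed-parameter tractable if membership in A is decidable in time

    \[ f(\kappa(x)) \cdot |x|^{O(1)} \]

    for some computable function f. The paper then compares parameterizations by the sets they render tractable in this sense.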

    Resource Bounded Immunity and Simplicity

    Revisiting the thirty-year-old notions of resource-bounded immunity and simplicity, we investigate the structural characteristics of various immunity notions: strong immunity, almost immunity, and hyperimmunity, as well as their corresponding simplicity notions. We also study limited immunity and simplicity, called k-immunity and feasible k-immunity, and their simplicity notions. Finally, we propose the k-immune hypothesis as a working hypothesis that guarantees the existence of simple sets in NP. Comment: This is a complete version of the conference paper that appeared in the Proceedings of the 3rd IFIP International Conference on Theoretical Computer Science, Kluwer Academic Publishers, pp. 81-95, Toulouse, France, August 23-26, 2004.
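    The corresponding simplicity notion, again in its standard form rather than the paper's exact wording: a set S is C-simple if S itself lies in C while its complement is infinite and contains no infinite subset in C (i.e., is C-immune):

    \[ S \text{ is } \mathcal{C}\text{-simple} \iff S \in \mathcal{C} \text{ and } \overline{S} \text{ is } \mathcal{C}\text{-immune}. \]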

    Sculpting Quantum Speedups

    Given a problem which is intractable for both quantum and classical algorithms, can we find a sub-problem for which quantum algorithms provide an exponential advantage? We refer to this problem as the "sculpting problem." In this work, we give a full characterization of sculptable functions in the query complexity setting. We show that a total function f can be restricted to a promise P such that Q(f|_P)=O(polylog(N)) and R(f|_P)=N^{Omega(1)}, if and only if f has a large number of inputs with large certificate complexity. The proof uses some interesting techniques: for one direction, we introduce new relationships between randomized and quantum query complexity in various settings, and for the other direction, we use a recent result from communication complexity due to Klartag and Regev. We also characterize sculpting for other query complexity measures, such as R(f) vs. R_0(f) and R_0(f) vs. D(f). Along the way, we prove some new relationships for quantum query complexity: for example, a nearly quadratic relationship between Q(f) and D(f) whenever the promise of f is small. This contrasts with the recent super-quadratic query complexity separations, showing that the maximum gap between classical and quantum query complexities is indeed quadratic in various settings - just not for total functions! Lastly, we investigate sculpting in the Turing machine model. We show that if there is any BPP-bi-immune language in BQP, then every language outside BPP can be restricted to a promise which places it in PromiseBQP but not in PromiseBPP. Under a weaker assumption, that some problem in BQP is hard on average for P/poly, we show that every paddable language outside BPP is sculptable in this way. Comment: 30 pages.
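    To make "certificate complexity" concrete, here is a small brute-force sketch (illustrative only; the truth-table encoding of f and all names are ours, not the paper's). The certificate complexity of f is the maximum, over inputs x, of the smallest set of coordinates whose values force f to equal f(x):

    from itertools import combinations, product

    def certificate_complexity(f, n):
        # f: dict mapping n-bit tuples to 0/1 (the truth table of a total function).
        inputs = list(product((0, 1), repeat=n))

        def cert_size(x):
            # Smallest k such that fixing x on some k coordinates forces f(x).
            for k in range(n + 1):
                for idxs in combinations(range(n), k):
                    if all(f[y] == f[x] for y in inputs
                           if all(y[i] == x[i] for i in idxs)):
                        return k

        return max(cert_size(x) for x in inputs)

    # Example: OR on 3 bits. The all-zeros input needs every bit certified,
    # so the certificate complexity equals n.
    n = 3
    OR = {x: int(any(x)) for x in product((0, 1), repeat=n)}
    print(certificate_complexity(OR, n))  # prints 3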

    Simplicity of Completion Time Distributions for Common Complex Biochemical Processes

    Biochemical processes typically involve huge numbers of individual reversible steps, each with its own dynamical rate constants. For example, kinetic proofreading processes rely upon numerous sequential reactions in order to guarantee the precise construction of specific macromolecules. In this work, we study the transient properties of such systems and fully characterize their first passage (completion) time distributions. In particular, we provide explicit expressions for the mean and the variance of the completion time for a kinetic proofreading process and computational analyses for more complicated biochemical systems. We find that, for a wide range of parameters, as the system size grows, the completion time behavior simplifies: it becomes either deterministic or exponentially distributed, with a very narrow transition between the two regimes. In both regimes, the dynamical complexity of the full system is trivial compared to its apparent structural complexity. Similar simplicity is likely to arise in the dynamics of many complex multi-step biochemical processes. In particular, these findings suggest not only that one may not be able to understand individual elementary reactions from macroscopic observations, but also that such understanding may be unnecessary.
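    A minimal simulation sketch of this setting (our own toy model, not the paper's: a linear chain of N reversible steps with uniform forward rate k_f and backward rate k_b, sampled with the Gillespie algorithm):

    import random

    def completion_time(n_steps, k_f=1.0, k_b=0.5):
        # First-passage time from state 0 to state n_steps in a birth-death chain.
        state, t = 0, 0.0
        while state < n_steps:
            fwd = k_f
            back = k_b if state > 0 else 0.0
            total = fwd + back
            t += random.expovariate(total)       # exponential waiting time
            if random.random() < fwd / total:    # pick the next reaction
                state += 1
            else:
                state -= 1
        return t

    samples = [completion_time(20) for _ in range(10_000)]
    mean = sum(samples) / len(samples)
    std = (sum((s - mean) ** 2 for s in samples) / len(samples)) ** 0.5
    print(f"mean = {mean:.2f}, relative width = {std / mean:.2f}")

    With k_f > k_b the relative width shrinks as the chain grows, illustrating the near-deterministic regime; a histogram of the samples makes the deterministic-versus-exponential dichotomy visible.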

    Exponential Time Complexity of Weighted Counting of Independent Sets

    We consider weighted counting of independent sets using a rational weight x: Given a graph with n vertices, count its independent sets such that each set of size k contributes x^k. This is equivalent to computation of the partition function of the lattice gas with hard-core self-repulsion and hard-core pair interaction. We show the following conditional lower bounds: If counting the satisfying assignments of a 3-CNF formula in n variables (#3SAT) needs time 2^{\Omega(n)} (i.e. there is a c>0 such that no algorithm can solve #3SAT in time 2^{cn}), counting the independent sets of size n/3 of an n-vertex graph needs time 2^{\Omega(n)} and weighted counting of independent sets needs time 2^{\Omega(n/log^3 n)} for all rational weights x\neq 0. We have two technical ingredients: The first is a reduction from 3SAT to independent sets that preserves the number of solutions and increases the instance size only by a constant factor. Second, we devise a combination of vertex cloning and path addition. This graph transformation allows us to adapt a recent technique by Dell, Husfeldt, and Wahlen which enables interpolation by a family of reductions, each of which increases the instance size only polylogarithmically. Comment: Introduction revised, differences between versions of counting independent sets stated more precisely, minor improvements. 14 pages.
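    To fix notation, the quantity being counted is the independence polynomial Z(x) = \sum_{S independent} x^{|S|}. A brute-force reference sketch (exponential in n, for checking small instances only; this is not the paper's reduction machinery):

    from fractions import Fraction
    from itertools import combinations

    def independence_polynomial(n, edges, x):
        # Sum of x^|S| over all independent sets S of the graph ([n], edges).
        adj = {frozenset(e) for e in edges}
        total = Fraction(0)
        for k in range(n + 1):
            for S in combinations(range(n), k):
                if all(frozenset(p) not in adj for p in combinations(S, 2)):
                    total += Fraction(x) ** k
        return total

    # Example: the 4-cycle has 1 empty set, 4 singletons and 2 diagonal pairs,
    # so Z(1) counts 7 independent sets.
    print(independence_polynomial(4, [(0, 1), (1, 2), (2, 3), (3, 0)], 1))  # 7

    Fraction is used because the weight x is an arbitrary rational; setting x = 1 recovers plain counting, and the coefficient of x^{n/3} corresponds to the size-n/3 counting problem in the stated lower bound.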

    Generic Computability, Turing Reducibility and Asymptotic Density

    Generic computability has been studied in group theory and we now study it in the context of classical computability theory. A set A of natural numbers is generically computable if there is a partial computable function f whose domain has density 1 and which agrees with the characteristic function of A on its domain. A set A is coarsely computable if there is a computable set C such that the symmetric difference of A and C has density 0. We prove that there is a c.e. set which is generically computable but not coarsely computable and vice versa. We show that every nonzero Turing degree contains a set which is not coarsely computable. We prove that there is a c.e. set of density 1 which has no computable subset of density 1. As a corollary, there is a generically computable set A such that no generic algorithm for A has computable domain. We define a general notion of generic reducibility in the spirit of Turing reducibility and show that there is a natural order-preserving embedding of the Turing degrees into the generic degrees which is not surjective.
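    The density notion used above, stated for reference (standard): the density of a set A of natural numbers is

    \[ \rho(A) = \lim_{n \to \infty} \frac{|A \cap \{0, 1, \dots, n-1\}|}{n}, \]

    when the limit exists; "domain has density 1" and "symmetric difference has density 0" are read with respect to this \rho.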

    An exponential lower bound for Individualization-Refinement algorithms for Graph Isomorphism

    The individualization-refinement paradigm provides a strong toolbox for testing isomorphism of two graphs and indeed, the currently fastest implementations of isomorphism solvers all follow this approach. While these solvers are fast in practice, from a theoretical point of view, no general lower bounds concerning the worst case complexity of these tools are known. In fact, it is an open question whether individualization-refinement algorithms can achieve upper bounds on the running time similar to the more theoretical techniques based on a group theoretic approach. In this work we give a negative answer to this question and construct a family of graphs on which algorithms based on the individualization-refinement paradigm require exponential time. Contrary to a previous construction of Miyazaki, which only applies to a specific implementation within the individualization-refinement framework, our construction is immune to changing the cell selector or adding various heuristic invariants to the algorithm. Furthermore, our graphs also provide exponential lower bounds when the k-dimensional Weisfeiler-Leman algorithm is used to replace the standard color refinement operator, and the arguments even work when the entire automorphism group of the inputs is initially provided to the algorithm. Comment: 21 pages.
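    For readers unfamiliar with the refinement step, here is a compact sketch of the standard color refinement operator (1-dimensional Weisfeiler-Leman) that individualization-refinement solvers iterate; this is an illustrative reimplementation, not any particular solver's code:

    def color_refinement(adj):
        # adj: dict mapping each vertex to its list of neighbours.
        colors = {v: 0 for v in adj}  # start from the uniform coloring
        while True:
            # New signature = old color plus the multiset of neighbour colors.
            sig = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
                   for v in adj}
            palette = {s: i for i, s in enumerate(sorted(set(sig.values())))}
            new = {v: palette[sig[v]] for v in adj}
            if new == colors:  # stable: no color class splits any further
                return colors
            colors = new

    # Example: on the path 0-1-2-3 the endpoints and the middle vertices
    # end up in two different color classes.
    print(color_refinement({0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}))

    Individualization then assigns one vertex a fresh color and refines again, branching over the choice of vertex; the paper's graphs are built so that this branching blows up exponentially.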

    Deciphering infant mortality. Part 1: empirical evidence

    This paper is not (or at least not only) about human infant mortality. In line with reliability theory, "infant" will refer here to the time interval following birth during which the mortality (or failure) rate decreases. This definition provides a systems science perspective in which birth constitutes a sudden transition which falls within the field of application of the "Transient Shock" (TS) conjecture put forward in Richmond et al. (2016c). This conjecture provides predictions about the timing and shape of the death rate peak. (i) It says that there will be a death rate spike whenever external conditions change abruptly and drastically. (ii) It predicts that after a steep rise there will be a much longer hyperbolic relaxation process. These predictions can be tested by considering living organisms for which birth is a multi-step process. Thus, for fish there are three states: egg, yolk-sac phase, young adult. The TS conjecture predicts a mortality spike at the end of the yolk-sac phase, and this timing is indeed confirmed by observation. Secondly, the hyperbolic nature of the relaxation process can be tested using high-accuracy Swiss statistics which give postnatal death rates from one hour after birth up to the age of 10 years. It turns out that, since the 19th century, despite a great overall reduction in infant mortality, the shape of the age-specific death rate has remained basically unchanged. This hyperbolic pattern is not specific to humans. It can also be found in small primates as recorded in the archives of zoological gardens. Our ultimate objective is to set up a chain of cases which starts from simple systems and then moves up step by step to more complex organisms. The cases discussed here can be seen as initial landmarks. Comment: 46 pages, 14 figures, 4 tables.
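    As we read the authors' usage, "hyperbolic relaxation" means that after the post-birth spike the age-specific death rate falls off roughly as a power law with exponent near one,

    \[ \mu(t) \approx \frac{A}{t^{\gamma}}, \qquad \gamma \approx 1, \]

    where t is the age since birth (or since the relevant transition, e.g. the end of the yolk-sac phase).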