
    The Value of Help Bits in Randomized and Average-Case Complexity

    "Help bits" are some limited trusted information about an instance or instances of a computational problem that may reduce the computational complexity of solving that instance or instances. In this paper, we study the value of help bits in the settings of randomized and average-case complexity. Amir, Beigel, and Gasarch (1990) show that for constant kk, if kk instances of a decision problem can be efficiently solved using less than kk bits of help, then the problem is in P/poly. We extend this result to the setting of randomized computation: We show that the decision problem is in P/poly if using β„“\ell help bits, kk instances of the problem can be efficiently solved with probability greater than 2β„“βˆ’k2^{\ell-k}. The same result holds if using less than k(1βˆ’h(Ξ±))k(1 - h(\alpha)) help bits (where h(β‹…)h(\cdot) is the binary entropy function), we can efficiently solve (1βˆ’Ξ±)(1-\alpha) fraction of the instances correctly with non-vanishing probability. We also extend these two results to non-constant but logarithmic kk. In this case however, instead of showing that the problem is in P/poly we show that it satisfies "kk-membership comparability," a notion known to be related to solving kk instances using less than kk bits of help. Next we consider the setting of average-case complexity: Assume that we can solve kk instances of a decision problem using some help bits whose entropy is less than kk when the kk instances are drawn independently from a particular distribution. Then we can efficiently solve an instance drawn from that distribution with probability better than 1/21/2. Finally, we show that in the case where kk is super-logarithmic, assuming kk-membership comparability of a decision problem, one cannot prove that the problem is in P/poly by a "black-box proof.

    Boolean Operations, Joins, and the Extended Low Hierarchy

    We prove that the join of two sets may actually fall into a lower level of the extended low hierarchy than either of the sets. In particular, there exist sets that are not in the second level of the extended low hierarchy, EL_2, yet their join is in EL_2. That is, in terms of extended lowness, the join operator can lower complexity. Since in a strong intuitive sense the join does not lower complexity, our result suggests that the extended low hierarchy is unnatural as a complexity measure. We also study the closure properties of EL_2 and prove that EL_2 is not closed under certain Boolean operations. To this end, we establish the first known (and optimal) EL_2 lower bounds for certain notions generalizing Selman's P-selectivity, which may be regarded as an interesting result in its own right. (Comment: 12 pages)
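    For readers unfamiliar with the join: one standard convention (for sets of natural numbers) is the marked union A ⊕ B = {2x : x ∈ A} ∪ {2x + 1 : x ∈ B}; for sets of strings one tags a 0/1 bit instead. A minimal illustrative sketch:

        # Join (marked union) of two sets of naturals. Membership in the join
        # reduces to membership in A or B according to parity, which is why the
        # join is intuitively expected not to lower complexity.
        def join(A: set, B: set) -> set:
            return {2 * x for x in A} | {2 * x + 1 for x in B}

        print(join({1, 2}, {3}))  # {2, 4, 7}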

    Bernoulli measure on strings, and Thompson-Higman monoids

    The Bernoulli measure on strings is used to define height functions for the dense R- and L-orders of the Thompson-Higman monoids M_{k,1}. The measure can also be used to characterize the D-relation of certain submonoids of M_{k,1}. The computational complexity of computing the Bernoulli measure of certain sets, and in particular of computing the R- and L-height of an element of M_{k,1}, is investigated. (Comment: 27 pages)
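    As a concrete anchor (an illustrative sketch assuming the uniform Bernoulli measure, where each of the k letters is equally likely; not code from the paper): the cylinder set of a finite word w has measure k^(-|w|), and a prefix code is measured by summing its disjoint cylinders.

        from fractions import Fraction

        def cylinder_measure(word: str, k: int) -> Fraction:
            # Uniform Bernoulli measure of the set of infinite strings extending `word`.
            return Fraction(1, k ** len(word))

        def prefix_code_measure(words: set, k: int) -> Fraction:
            # Assumes the words are pairwise prefix-incomparable, so cylinders are disjoint.
            return sum((cylinder_measure(w, k) for w in words), Fraction(0))

        # Binary alphabet (k = 2): {"0", "10"} has measure 1/2 + 1/4 = 3/4.
        print(prefix_code_measure({"0", "10"}, 2))  # 3/4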

    Levelable Sets and the Algebraic Structure of Parameterizations

    Asking which sets are fixed-parameter tractable for a given parameterization constitutes much of the current research in parameterized complexity theory. This approach faces some of the core difficulties in complexity theory. By focussing instead on the parameterizations that make a given set fixed-parameter tractable, we circumvent these difficulties. We isolate parameterizations as independent measures of complexity and study their underlying algebraic structure. Thus we are able to compare parameterizations, which establishes a hierarchy of complexity that is much stronger than that present in typical parameterized algorithm races. Among other results, we find that no practically fixed-parameter tractable sets have optimal parameterizations.
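    For readers outside parameterized complexity, the standard definition being inverted here (background, not a result of this paper) can be stated as follows:

        % A set Q is fixed-parameter tractable with respect to a parameterization
        % \kappa (a computable map from instances to parameters) if, for some
        % computable function f,
        \[
            x \in Q \ \text{ is decidable in time } \ f(\kappa(x)) \cdot |x|^{O(1)}.
        \]
        % Fixing Q and asking which parameterizations \kappa admit such an f is
        % the shift in perspective the abstract describes.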

    Computing Multidimensional Persistence

    The theory of multidimensional persistence captures the topology of a multifiltration -- a multiparameter family of increasing spaces. Multifiltrations arise naturally in the topological analysis of scientific data. In this paper, we give a polynomial time algorithm for computing multidimensional persistence. We recast this computation as a problem within computational algebraic geometry and utilize algorithms from this area to solve it. While the resulting problem is Expspace-complete and the standard algorithms take doubly-exponential time, we exploit the structure inherent within multifiltrations to yield practical algorithms. We implement all algorithms in the paper and provide statistical experiments to demonstrate their feasibility. (Comment: This paper has been withdrawn by the authors. Journal of Computational Geometry, 1(1) 2010, pages 72-100. http://jocg.org/index.php/jocg/article/view/1)
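    For intuition, here is a minimal sketch of the classical one-parameter persistence computation, which the multidimensional theory generalizes (this is background, not the paper's algorithm): reduce the boundary matrix over Z/2 column by column, pairing each feature's birth with its death.

        def reduce_boundary(columns):
            # columns[j]: set of row indices with a 1 in column j of the boundary
            # matrix, columns ordered by filtration value. Reduction is over Z/2.
            low_to_col = {}  # lowest row index -> index of the reduced column owning it
            pairs = []       # (birth simplex, death simplex) persistence pairs
            for j in range(len(columns)):
                col = set(columns[j])
                while col and max(col) in low_to_col:
                    col ^= columns[low_to_col[max(col)]]  # add the earlier column mod 2
                columns[j] = col
                if col:
                    low_to_col[max(col)] = j
                    pairs.append((max(col), j))
            return pairs

        # Filtered triangle: vertices 0,1,2 enter first, then edges 3,4,5, then face 6.
        boundary = [set(), set(), set(), {0, 1}, {1, 2}, {0, 2}, {3, 4, 5}]
        print(reduce_boundary(boundary))  # [(1, 3), (2, 4), (5, 6)]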