The Value of Help Bits in Randomized and Average-Case Complexity

Abstract

"Help bits" are limited trusted information about an instance or instances of a computational problem that may reduce the computational complexity of solving that instance or instances. In this paper, we study the value of help bits in the settings of randomized and average-case complexity. Amir, Beigel, and Gasarch (1990) show that for constant k, if k instances of a decision problem can be efficiently solved using fewer than k bits of help, then the problem is in P/poly. We extend this result to the setting of randomized computation: we show that the decision problem is in P/poly if, using ℓ help bits, k instances of the problem can be efficiently solved with probability greater than 2^(ℓ−k). The same result holds if, using fewer than k(1 − h(α)) help bits (where h(·) is the binary entropy function), we can efficiently solve a (1 − α) fraction of the instances correctly with non-vanishing probability. We also extend these two results to non-constant but logarithmic k. In this case, however, instead of showing that the problem is in P/poly we show that it satisfies "k-membership comparability," a notion known to be related to solving k instances using fewer than k bits of help. Next we consider the setting of average-case complexity: assume that we can solve k instances of a decision problem using some help bits whose entropy is less than k when the k instances are drawn independently from a particular distribution. Then we can efficiently solve an instance drawn from that distribution with probability better than 1/2. Finally, we show that in the case where k is super-logarithmic, assuming k-membership comparability of a decision problem, one cannot prove that the problem is in P/poly by a "black-box" proof.
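To make the entropy bound concrete, the following is a minimal Python sketch of the binary entropy function h(·) and the resulting help-bit budget k(1 − h(α)) from the abstract's second result; the function names are illustrative, not from the paper.

```python
import math

def binary_entropy(alpha: float) -> float:
    """Binary entropy h(alpha) = -alpha*log2(alpha) - (1-alpha)*log2(1-alpha).

    By convention h(0) = h(1) = 0.
    """
    if alpha in (0.0, 1.0):
        return 0.0
    return -alpha * math.log2(alpha) - (1 - alpha) * math.log2(1 - alpha)

def help_bit_budget(k: int, alpha: float) -> float:
    """Threshold k*(1 - h(alpha)): the result applies when fewer than this
    many help bits suffice to solve a (1 - alpha) fraction of k instances
    correctly with non-vanishing probability."""
    return k * (1 - binary_entropy(alpha))
```

Note that at α = 1/2 the entropy h(1/2) = 1, so the budget k(1 − h(α)) collapses to zero: tolerating half the instances wrong leaves no room for help bits, which matches the intuition that random guessing already achieves that error rate.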
