
    Separating decision tree complexity from subcube partition complexity

    The subcube partition model of computation is at least as powerful as decision trees, but no separation between these models was known. We show that there exists a function whose deterministic subcube partition complexity is asymptotically smaller than its randomized decision tree complexity, resolving an open problem of Friedgut, Kahn, and Wigderson (2002). Our lower bound is based on the information-theoretic techniques first introduced to lower bound the randomized decision tree complexity of the recursive majority function. We also show that the public-coin partition bound, the best known lower bound method for randomized decision tree complexity subsuming other general techniques such as block sensitivity, approximate degree, randomized certificate complexity, and the classical adversary bound, also lower bounds randomized subcube partition complexity. This shows that all these lower bound techniques cannot prove optimal lower bounds for randomized decision tree complexity, which answers an open question of Jain and Klauck (2010) and Jain, Lee, and Vishnoi (2014). Comment: 16 pages, 1 figure
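    A brief formal restatement of the comparison at issue (standard definitions assumed; the notation D^{sc}, D, and R for deterministic subcube partition, deterministic decision tree, and randomized decision tree complexity is ours, not the paper's): the leaves of a depth-d decision tree partition {0,1}^n into monochromatic subcubes of co-dimension at most d, so for every Boolean function f

        D^{sc}(f) \leq D(f),

    while the separation above exhibits a function f with D^{sc}(f) = o(R(f)).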

    Lower Bounds on the Oracle Complexity of Nonsmooth Convex Optimization via Information Theory

    We present an information-theoretic approach to lower bound the oracle complexity of nonsmooth black-box convex optimization, unifying previous lower bounding techniques by identifying a combinatorial problem, namely string guessing, as a single source of hardness. As a measure of complexity we use distributional oracle complexity, which subsumes randomized oracle complexity as well as worst-case oracle complexity. We obtain strong lower bounds on distributional oracle complexity for the box [-1,1]^n, as well as for the L^p-ball for p \geq 1 (for both the low-scale and the large-scale regimes), matching worst-case upper bounds, and hence we close the gap between distributional complexity (and, in particular, randomized complexity) and worst-case complexity. Furthermore, the bounds remain essentially the same for high-probability and bounded-error oracle complexity, and even for the combination of the two, i.e., bounded-error high-probability oracle complexity. This considerably extends the applicability of known bounds.
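    The chain of inequalities behind "subsumes" above (notation ours, not the paper's): for any distribution \mu over problem instances,

        comp^{dist}_\mu \leq comp^{rand} \leq comp^{wc},

    so a lower bound on distributional oracle complexity for even a single hard distribution \mu simultaneously lower-bounds randomized and worst-case oracle complexity; matching worst-case upper bounds therefore pin down all three measures at once.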

    Randomized and Quantum Algorithms Yield a Speed-Up for Initial-Value Problems

    Quantum algorithms and complexity have recently been studied not only for discrete problems, but also for some numerical problems. Most attention has been paid so far to the integration problem, for which quantum computers have been shown to yield a speed-up over deterministic and randomized algorithms on a classical computer. In this paper we deal with the randomized and quantum complexity of initial-value problems. For this nonlinear problem, we show that both randomized and quantum algorithms yield a speed-up over deterministic algorithms. Upper bounds on the complexity in the randomized and quantum settings are shown by constructing algorithms with a suitable cost, where the construction is based on integral information. Lower bounds result from the respective bounds for the integration problem. Comment: LaTeX v. 2.09, 13 pages
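    A one-line reminder of why integral information is the natural tool here (the standard integral formulation, not the paper's specific algorithm): the initial-value problem y'(t) = f(t, y(t)), y(a) = \eta on [a, b] is equivalent to the integral equation

        y(t) = \eta + \int_a^t f(s, y(s)) ds,

    so each local step of such an algorithm amounts to approximating integrals, which is where the known randomized and quantum speed-ups for integration enter.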

    Almost Optimal Solution of Initial-Value Problems by Randomized and Quantum Algorithms

    We establish essentially optimal bounds on the complexity of initial-value problems in the randomized and quantum settings. For this purpose we define a sequence of new algorithms whose error/cost properties improve from step to step. These algorithms yield new upper complexity bounds, which differ from known lower bounds by only an arbitrarily small positive parameter in the exponent and a logarithmic factor. In both the randomized and quantum settings, initial-value problems turn out to be essentially as difficult as scalar integration. Comment: 16 pages, minor presentation changes

    The Partition Bound for Classical Communication Complexity and Query Complexity

    We describe new lower bounds for randomized communication complexity and query complexity which we call the partition bounds. They are expressed as the optimum value of linear programs. For communication complexity we show that the partition bound is stronger than both the rectangle/corruption bound and the \gamma_2/generalized discrepancy bounds. In the model of query complexity we show that the partition bound is stronger than the approximate polynomial degree and classical adversary bounds. We also exhibit an example where the partition bound is quadratically larger than the polynomial degree and classical adversary bounds. Comment: 28 pages, ver. 2, added content
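    In asymptotic shorthand (notation ours: prt(f) for the partition bound in the query model, \widetilde{deg}(f) for approximate polynomial degree, CA(f) for the classical adversary bound), the query-complexity statements above read

        prt(f) = \Omega(\widetilde{deg}(f)),    prt(f) = \Omega(CA(f)),

    with an explicit example on which prt(f) grows quadratically faster than both \widetilde{deg}(f) and CA(f).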

    Optimal randomized multilevel algorithms for infinite-dimensional integration on function spaces with ANOVA-type decomposition

    In this paper, we consider the infinite-dimensional integration problem on weighted reproducing kernel Hilbert spaces with norms induced by an underlying function space decomposition of ANOVA type. The weights model the relative importance of different groups of variables. We present new randomized multilevel algorithms to tackle this integration problem and prove upper bounds for their randomized error. Furthermore, we provide in this setting the first non-trivial lower error bounds for general randomized algorithms, which, in particular, may be adaptive or non-linear. These lower bounds show that our multilevel algorithms are optimal. Our analysis refines and extends the analysis provided in [F. J. Hickernell, T. Müller-Gronbach, B. Niu, K. Ritter, J. Complexity 26 (2010), 229-254], and our error bounds improve substantially on the error bounds presented there. As an illustrative example, we discuss the unanchored Sobolev space and employ randomized quasi-Monte Carlo multilevel algorithms based on scrambled polynomial lattice rules. Comment: 31 pages, 0 figures
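    For orientation, the generic multilevel decomposition that randomized multilevel algorithms of this kind build on (standard form, not the paper's specific estimator): writing Q_\ell for a quadrature rule that uses only the variables active at resolution level \ell, one telescopes

        Q_L = Q_0 + \sum_{\ell=1}^{L} (Q_\ell - Q_{\ell-1})

    and estimates each difference Q_\ell - Q_{\ell-1} by independent (quasi-)Monte Carlo sampling, spending many samples on the cheap coarse levels and few on the expensive fine levels.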

    Improved Bounds on the Randomized and Quantum Complexity of Initial-Value Problems

    We deal with the problem, initiated in [8], of finding the randomized and quantum complexity of initial-value problems. We showed in [8] that a speed-up in both settings over the worst-case deterministic complexity is possible. In the present paper we prove, by defining new algorithms, that further improvement in the upper bounds on the randomized and quantum complexity can be achieved. In the Hölder class of right-hand side functions with r continuous bounded partial derivatives, the r-th derivative being a Hölder function with exponent \rho, the \epsilon-complexity is shown to be O((1/\epsilon)^{1/(r+\rho+1/3)}) in the randomized setting, and O((1/\epsilon)^{1/(r+\rho+1/2)}) on a quantum computer (up to logarithmic factors). This improves on the results of [8] for the general problem. The gap remaining between upper and lower bounds on the complexity is further discussed for a special problem: we consider scalar autonomous problems, with the aim of computing the solution at the end point of the interval of integration. For this problem, we close the gap by establishing (essentially) matching upper and lower complexity bounds. We show that the complexity in this case is of order (1/\epsilon)^{1/(r+\rho+1/2)} in the randomized setting, and (1/\epsilon)^{1/(r+\rho+1)} in the quantum setting (again up to logarithmic factors). Comment: 17 pages, extended version (new section added), to appear in the Journal of Complexity
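    Collecting the complexity bounds stated above in one place (all up to logarithmic factors; comp^{ran} and comp^{quant} are our shorthand for the \epsilon-complexity in the randomized and quantum settings):

        general systems:             comp^{ran}(\epsilon) = O((1/\epsilon)^{1/(r+\rho+1/3)}),      comp^{quant}(\epsilon) = O((1/\epsilon)^{1/(r+\rho+1/2)});
        scalar autonomous problems:  comp^{ran}(\epsilon) = \Theta((1/\epsilon)^{1/(r+\rho+1/2)}),  comp^{quant}(\epsilon) = \Theta((1/\epsilon)^{1/(r+\rho+1)}).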