
    Universal Algorithms: Beyond the Simplex

    The bulk of universal algorithms in the online convex optimisation literature are variants of the Hedge (exponential weights) algorithm on the simplex. While these algorithms extend to polytope domains by assigning weights to the vertices, this process is computationally infeasible for many important classes of polytopes where the number $V$ of vertices depends exponentially on the dimension $d$. In this paper we show the Subgradient algorithm is universal, meaning it has $O(\sqrt{N})$ regret in the antagonistic setting and $O(1)$ pseudo-regret in the i.i.d. setting, with two main advantages over Hedge: (1) the update step is more efficient, as the action vectors have length only $d$ rather than $V$; and (2) Subgradient gives better performance if the cost vectors satisfy Euclidean rather than sup-norm bounds. This paper extends the authors' recent results for Subgradient on the simplex. We also prove the same $O(\sqrt{N})$ and $O(1)$ bounds when the domain is the unit ball. To the authors' knowledge, this is the first instance of these bounds on a domain other than a polytope.
    Comment: 1 figure, 40 pages
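The abstract names the Subgradient algorithm but does not spell it out. The following is a minimal sketch of online projected subgradient descent on the probability simplex with linear costs; the fixed step size `eta` and the standard sort-based simplex projection are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of v onto the probability simplex
    (the standard sort-and-threshold algorithm)."""
    u = np.sort(v)[::-1]                      # sort descending
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - 1))[0][-1]
    theta = (css[rho] - 1) / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

def online_subgradient(cost_vectors, eta=0.1):
    """Run online projected subgradient descent over the simplex with
    linear losses <g, x>; return the total incurred cost."""
    d = len(cost_vectors[0])
    x = np.full(d, 1.0 / d)                   # start at the uniform point
    total = 0.0
    for g in cost_vectors:
        total += g @ x                        # incur this round's cost
        x = project_simplex(x - eta * g)      # gradient step + projection
    return total
```

Note the per-round work is $O(d \log d)$ (dominated by the sort in the projection), versus maintaining a weight per vertex as Hedge would.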

    Universal Algorithms for Clustering Problems

    This paper presents universal algorithms for clustering problems, including the widely studied $k$-median, $k$-means, and $k$-center objectives. The input is a metric space containing all potential client locations. The algorithm must select $k$ cluster centers such that they are a good solution for any subset of clients that actually realize. Specifically, we aim for low regret, defined as the maximum over all subsets of the difference between the cost of the algorithm's solution and that of an optimal solution. A universal algorithm's solution $SOL$ for a clustering problem is said to be an $(\alpha, \beta)$-approximation if for all subsets of clients $C'$, it satisfies $SOL(C') \leq \alpha \cdot OPT(C') + \beta \cdot MR$, where $OPT(C')$ is the cost of the optimal solution for clients $C'$ and $MR$ is the minimum regret achievable by any solution. Our main results are universal algorithms for the standard clustering objectives of $k$-median, $k$-means, and $k$-center that achieve $(O(1), O(1))$-approximations. These results are obtained via a novel framework for universal algorithms using linear programming (LP) relaxations. These results generalize to other $\ell_p$-objectives and the setting where some subset of the clients are fixed. We also give hardness results showing that $(\alpha, \beta)$-approximation is NP-hard if $\alpha$ or $\beta$ is at most a certain constant, even for the widely studied special case of Euclidean metric spaces. This shows that in some sense, $(O(1), O(1))$-approximation is the strongest type of guarantee obtainable for universal clustering.
    Comment: Appeared in ICALP 2021, Track A. Fixed mismatch between paper title and arXiv title

    Optimal Lower Bounds for Universal and Differentially Private Steiner Tree and TSP

    Given a metric space on $n$ points, an $\alpha$-approximate universal algorithm for the Steiner tree problem outputs a distribution over rooted spanning trees such that for any subset $X$ of vertices containing the root, the expected cost of the induced subtree is within an $\alpha$ factor of the optimal Steiner tree cost for $X$. An $\alpha$-approximate differentially private algorithm for the Steiner tree problem takes as input a subset $X$ of vertices, and outputs a tree distribution that induces a solution within an $\alpha$ factor of the optimal as before, and satisfies the additional property that for any set $X'$ that differs in a single vertex from $X$, the tree distributions for $X$ and $X'$ are "close" to each other. Universal and differentially private algorithms for TSP are defined similarly. An $\alpha$-approximate universal algorithm for the Steiner tree problem or TSP is also an $\alpha$-approximate differentially private algorithm. It is known that both problems admit $O(\log n)$-approximate universal algorithms, and hence $O(\log n)$-approximate differentially private algorithms as well. We prove an $\Omega(\log n)$ lower bound on the approximation ratio achievable for the universal Steiner tree problem and the universal TSP, matching the known upper bounds. Our lower bound for the Steiner tree problem holds even when the algorithm is allowed to output a more general solution of a distribution on paths to the root.
    Comment: 14 pages
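The induced-subtree cost in the universal Steiner tree definition can be computed directly: for a terminal set $X$, it is the total weight of the union of the root-to-terminal paths in the chosen spanning tree. A minimal sketch, where the parent-array tree encoding is an assumed representation rather than anything specified in the paper:

```python
def induced_subtree_cost(parent, weight, root, X):
    """Cost of the subtree of a rooted spanning tree induced by terminal
    set X, i.e. the union of root-to-terminal paths. parent[v] is v's
    parent; weight[v] is the cost of the edge (v, parent[v])."""
    edges = set()                      # vertices whose parent-edge is used
    for v in X:
        while v != root and v not in edges:
            edges.add(v)               # include edge (v, parent[v])
            v = parent[v]              # walk up; stop at root or a
                                       # previously counted edge
    return sum(weight[v] for v in edges)
```

The expected cost of a universal solution for $X$ is then just this quantity averaged over the output tree distribution, which is the quantity the $\Omega(\log n)$ lower bound constrains.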