
    Adversarial Smoothed Analysis

    The purpose of this note is to extend the results on uniform smoothed analysis of condition numbers from \cite{BuCuLo:07} to the case where the perturbation follows a radially symmetric probability distribution. In particular, we show that the bounds derived in \cite{BuCuLo:07} still hold for distributions whose density has a singularity at the center of the perturbation, which we call {\em adversarial}. Comment: 8 pages
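    As a rough illustration (not taken from the note itself), one way to realize such an adversarial, radially symmetric perturbation is to draw a uniform direction and a radius whose density blows up at zero; the exponent `a` and radius cap below are illustrative choices, not the paper's parameters.

```python
import numpy as np

def adversarial_perturbation(d, a=1.0, radius=1.0, rng=None):
    """Sample from a radially symmetric density on R^d that behaves like
    ||x||^{-a} near the origin (assumes 0 < a < d), supported on a ball.

    Direction: uniform on the unit sphere.
    Radius: density proportional to r^{d-1-a} on (0, radius], sampled by
    inverting the CDF F(r) = (r / radius)^{d-a}.
    """
    rng = rng or np.random.default_rng()
    direction = rng.standard_normal(d)
    direction /= np.linalg.norm(direction)
    u = rng.uniform()
    r = radius * u ** (1.0 / (d - a))   # inverse-CDF sampling of the radius
    return r * direction

# Smoothed instance: an adversarial input plus an adversarial (singular) perturbation.
x_bar = np.ones(3)
x = x_bar + adversarial_perturbation(d=3, a=1.0, radius=0.1)
```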

    Smoothed Efficient Algorithms and Reductions for Network Coordination Games

    Worst-case hardness results for most equilibrium computation problems have raised the need for beyond-worst-case analysis. To this end, we study the smoothed complexity of finding pure Nash equilibria in Network Coordination Games, a PLS-complete problem in the worst case. This is a potential game where the sequential better-response algorithm is known to converge to a pure NE, albeit in exponential time. First, we prove polynomial (resp. quasi-polynomial) smoothed complexity when the underlying game graph is a complete (resp. arbitrary) graph, and every player has constantly many strategies. We note that the complete graph case is reminiscent of perturbing all parameters, a common assumption in most known smoothed analysis results. Second, we define a notion of smoothness-preserving reduction among search problems, and obtain reductions from 2-strategy network coordination games to local-max-cut, and from k-strategy games (with arbitrary k) to local-max-cut up to two flips. The former together with the recent result of [BCC18] gives an alternate $O(n^8)$-time smoothed algorithm for the 2-strategy case. This notion of reduction allows for the extension of smoothed efficient algorithms from one problem to another. For the first set of results, we develop techniques to bound the probability that an (adversarial) better-response sequence makes slow improvements on the potential. Our approach combines and generalizes the local-max-cut approaches of [ER14,ABPW17] to handle the multi-strategy case: it requires a careful definition of the matrix which captures the increase in potential, a tighter union bound on adversarial sequences, and balancing it with good enough rank bounds. We believe that the approach and notions developed herein could be of interest in addressing the smoothed complexity of other potential and/or congestion games.
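    For orientation, the sketch below shows plain sequential better-response dynamics in a network coordination game of the kind studied here; when the edge payoff matrices are randomly perturbed, this is (roughly) the improvement process whose step count a smoothed analysis must bound. The data layout and names are illustrative, not the paper's.

```python
import numpy as np

def better_response_dynamics(edges, payoff, k, n, rng=None, max_rounds=10_000):
    """Sequential better-response in a network coordination game.

    edges:  list of (u, v) pairs with u < v
    payoff: dict mapping (u, v) -> k x k payoff matrix; both endpoints
            receive payoff[(u, v)][s_u, s_v] on that edge, so the sum of
            edge payoffs is an exact potential for the game.
    Returns a strategy profile; if the loop stops because no player can
    improve, the profile is a pure Nash equilibrium.
    """
    rng = rng or np.random.default_rng()
    s = rng.integers(k, size=n)                       # arbitrary starting profile
    nbrs = {i: [] for i in range(n)}
    for (u, v) in edges:
        nbrs[u].append((v, (u, v), 0))                # (neighbor, edge key, role)
        nbrs[v].append((u, (u, v), 1))

    def utility(i, si):
        total = 0.0
        for j, e, role in nbrs[i]:
            a, b = (si, s[j]) if role == 0 else (s[j], si)
            total += payoff[e][a, b]
        return total

    for _ in range(max_rounds):
        improved = False
        for i in range(n):
            best = max(range(k), key=lambda t: utility(i, t))
            if utility(i, best) > utility(i, s[i]) + 1e-12:
                s[i] = best                           # strictly increases the potential
                improved = True
        if not improved:
            break                                     # no player can improve: pure NE
    return s
```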

    Improved Smoothed Analysis of 2-Opt for the Euclidean TSP

    The 2-opt heuristic is a simple local search heuristic for the Travelling Salesperson Problem (TSP). Although it usually performs well in practice, its worst-case running time is poor. Attempts to reconcile this difference have used smoothed analysis, in which adversarial instances are perturbed probabilistically. We are interested in the classical model of smoothed analysis for the Euclidean TSP, in which the perturbations are Gaussian. This model was previously used by Manthey \& Veenstra, who obtained smoothed complexity bounds polynomial in $n$, the dimension $d$, and the perturbation strength $\sigma^{-1}$. However, their analysis only works for $d \geq 4$. The only previous analysis for $d \leq 3$ was performed by Englert, R\"oglin \& V\"ocking, who used a different perturbation model which can be translated to Gaussian perturbations. Their model yields bounds polynomial in $n$ and $\sigma^{-d}$, and super-exponential in $d$. As no direct analysis existed for Gaussian perturbations that yields polynomial bounds for all $d$, we perform this missing analysis. Along the way, we improve all existing smoothed complexity bounds for Euclidean 2-opt. Comment: 31 pages, 3 figures. Accepted for presentation at ISAAC 202
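    The following is a minimal, unoptimized sketch of the setting described above: plain 2-opt local search run on a Gaussian-perturbed point set. The stand-in "adversarial" instance and all names are illustrative, not the paper's.

```python
import numpy as np

def two_opt(points, tour=None, max_rounds=10_000):
    """Plain 2-opt local search for Euclidean TSP: repeatedly replace two
    tour edges by two shorter ones (i.e. reverse a segment) until no
    improving swap exists."""
    n = len(points)
    tour = list(range(n)) if tour is None else list(tour)

    def dist(a, b):
        return float(np.linalg.norm(points[a] - points[b]))

    for _ in range(max_rounds):
        improved = False
        for i in range(n - 1):
            # Skip the pair of edges that shares the node tour[0] when i == 0.
            for j in range(i + 2, n if i > 0 else n - 1):
                a, b = tour[i], tour[i + 1]
                c, d = tour[j], tour[(j + 1) % n]
                if dist(a, c) + dist(b, d) < dist(a, b) + dist(c, d) - 1e-12:
                    tour[i + 1 : j + 1] = reversed(tour[i + 1 : j + 1])
                    improved = True
        if not improved:
            break
    return tour

# Smoothed instance: adversarial points in [0, 1]^d plus Gaussian noise of std sigma.
rng = np.random.default_rng(0)
n, d, sigma = 50, 2, 0.05
adversarial = rng.uniform(size=(n, d))          # stand-in for a worst-case instance
perturbed = adversarial + sigma * rng.standard_normal((n, d))
tour = two_opt(perturbed)
```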

    Smoothed Complexity Theory

    Smoothed analysis is a new way of analyzing algorithms introduced by Spielman and Teng (J. ACM, 2004). Classical methods like worst-case or average-case analysis have accompanying complexity classes, like P and AvgP, respectively. While worst-case or average-case analysis gives us a means to talk about the running time of a particular algorithm, complexity classes allow us to talk about the inherent difficulty of problems. Smoothed analysis is a hybrid of worst-case and average-case analysis and compensates for some of their drawbacks. Despite its success for the analysis of single algorithms and problems, there is no embedding of smoothed analysis into computational complexity theory, which is necessary to classify problems according to their intrinsic difficulty. We propose a framework for smoothed complexity theory, define the relevant classes, and prove some first hardness results (for bounded halting and tiling) and tractability results (for binary optimization problems, graph coloring, and satisfiability). Furthermore, we discuss extensions and shortcomings of our model and relate it to semi-random models. Comment: to be presented at MFCS 201
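    For orientation, the quantity underlying such definitions is the Spielman-Teng-style smoothed measure, written here for Gaussian perturbations as a generic sketch; the classes proposed in the paper are defined over more general perturbation families, so the notation below is not the paper's.

```latex
% Smoothed running time of an algorithm A at input size n and perturbation
% strength \sigma; A has smoothed polynomial running time if this quantity
% is bounded by poly(n, 1/\sigma).
T^{\mathrm{smooth}}_A(n, \sigma) \;=\;
  \max_{\bar{x} \,:\, |\bar{x}| = n}\;
  \mathbb{E}_{g \sim \mathcal{N}(0, \sigma^2 I)}
  \bigl[\, T_A(\bar{x} + g) \,\bigr].
```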