
    An Exponential Lower Bound on the Complexity of Regularization Paths

    For a variety of regularized optimization problems in machine learning, algorithms computing the entire solution path have been developed recently. Most of these problems are quadratic programs parameterized by a single parameter, the Support Vector Machine (SVM) being a prominent example. Solution path algorithms compute not only the solution for one particular value of the regularization parameter but the entire path of solutions, making the selection of an optimal parameter much easier. It has been assumed that these piecewise linear solution paths have only linear complexity, i.e., linearly many bends. We prove that for the support vector machine this complexity can be exponential in the number of training points in the worst case. More strongly, we construct a single instance of n input points in d dimensions for an SVM such that at least \Theta(2^{n/2}) = \Theta(2^d) distinct subsets of support vectors occur as the regularization parameter changes.
    Comment: Journal version, 28 pages, 5 figures
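
    The paper's lower-bound construction is combinatorial, but the phenomenon it analyzes is easy to probe empirically. The sketch below (a minimal illustration assuming scikit-learn and NumPy, with synthetic data rather than the paper's instance) sweeps the SVM regularization parameter C over a grid and counts how many distinct support-vector subsets appear:

        import numpy as np
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        X = rng.normal(size=(20, 2))                # 20 synthetic points in 2 dimensions
        y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)  # labels from a simple linear rule

        seen = set()
        for C in np.logspace(-3, 3, 200):           # grid along the regularization path
            model = SVC(kernel="linear", C=C).fit(X, y)
            seen.add(tuple(model.support_))         # indices of the current support vectors
        print(len(seen), "distinct support-vector subsets on the grid")

    A finite grid can only undercount the breakpoints; exact path-following algorithms must track every change of the support-vector set, which is precisely where the exponential worst case hurts.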

    Parameterized Complexity Analysis of Randomized Search Heuristics

    This chapter compiles a number of results that apply the theory of parameterized algorithmics to the running-time analysis of randomized search heuristics such as evolutionary algorithms. The parameterized approach articulates the running time of algorithms solving combinatorial problems in finer detail than traditional approaches from classical complexity theory. We outline the main results and proof techniques for a collection of randomized search heuristics tasked to solve NP-hard combinatorial optimization problems such as finding a minimum vertex cover in a graph, finding a maximum leaf spanning tree in a graph, and the traveling salesperson problem.
    Comment: This is a preliminary version of a chapter in the book "Theory of Evolutionary Computation: Recent Developments in Discrete Optimization", edited by Benjamin Doerr and Frank Neumann, published by Springer
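
    As a concrete instance of the heuristics analyzed in the chapter, the sketch below (an illustration of the standard (1+1) EA, not code from the chapter) attacks minimum vertex cover: a bit string selects vertices, each uncovered edge carries a penalty larger than any cover size, every bit flips independently with probability 1/n, and the offspring replaces the parent if it is no worse:

        import random

        def one_plus_one_ea_vertex_cover(n, edges, steps=10_000, seed=0):
            """(1+1) EA: flip each bit with prob. 1/n, accept if fitness does not worsen."""
            rng = random.Random(seed)

            def fitness(x):  # penalty of n+1 per uncovered edge dominates the cover size
                uncovered = sum(1 for u, v in edges if not (x[u] or x[v]))
                return uncovered * (n + 1) + sum(x)

            x = [rng.randint(0, 1) for _ in range(n)]
            for _ in range(steps):
                y = [bit ^ (rng.random() < 1 / n) for bit in x]  # standard bit mutation
                if fitness(y) <= fitness(x):
                    x = y
            return x

        # a 5-cycle; any minimum vertex cover has size 3
        edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
        print(one_plus_one_ea_vertex_cover(5, edges))

    Parameterized analyses of exactly this kind of algorithm bound the expected time to reach a cover of size at most k in terms of both the instance size n and the parameter k.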

    Interior-point methods for P∗(κ)-linear complementarity problem based on generalized trigonometric barrier function

    Recently, M. Bouafoa et al. investigated a new kernel function which differs from the self-regular kernel functions in that it has a trigonometric barrier term. In this paper we generalize the analysis presented in that paper to P∗(κ) linear complementarity problems (LCPs). It is shown that the iteration bounds for primal-dual large-update and small-update interior-point methods based on this function are as good as the currently best known iteration bounds for these types of methods. The analysis for LCPs deviates significantly from the analysis for linear optimization. Several new tools and techniques are derived in this paper.
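
    For orientation, kernel-function interior-point methods of this kind share one skeleton, sketched below in LaTeX. The quadratic term drives growth, the barrier term blows up at the boundary, and the proximity measure steers the Newton directions; the concrete trigonometric barrier shown is an illustrative form from this literature, not necessarily the exact function analyzed here:

        % scaled variable and proximity measure for a primal-dual pair (x, s) at barrier parameter mu
        v_i = \sqrt{\frac{x_i s_i}{\mu}}, \qquad \Psi(v) = \sum_{i=1}^{n} \psi(v_i)
        % kernel function = growth term + trigonometric barrier term (illustrative form);
        % the barrier argument tends to pi/2 as t -> 0+, so \psi_B(t) -> infinity at the boundary
        \psi(t) = \frac{t^{2}-1}{2} + \psi_{B}(t), \qquad \psi_{B}(t) \propto \tan\!\left(\frac{\pi(1-t)}{4t+2}\right)

    In this framework the benchmark bounds for P∗(κ) LCPs are typically O((1+2κ)√n (log n) log(n/ε)) for large-update methods and O((1+2κ)√n log(n/ε)) for small-update methods, which is the standard the abstract measures against.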