
    The parameterized complexity of some geometric problems in unbounded dimension

    We study the parameterized complexity of the following fundamental geometric problems with respect to the dimension $d$: i) Given $n$ points in $\mathbb{R}^d$, compute their minimum enclosing cylinder. ii) Given two $n$-point sets in $\mathbb{R}^d$, decide whether they can be separated by two hyperplanes. iii) Given a system of $n$ linear inequalities with $d$ variables, find a maximum-size feasible subsystem. We show that (the decision versions of) all these problems are W[1]-hard when parameterized by the dimension $d$, and hence not solvable in $O(f(d)\,n^c)$ time for any computable function $f$ and constant $c$ (unless FPT = W[1]). Our reductions also give an $n^{\Omega(d)}$-time lower bound (under the Exponential Time Hypothesis).
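
    To make problem (iii) concrete, here is a minimal brute-force sketch in Python (assuming NumPy and SciPy are available) that finds a maximum feasible subsystem by checking subsets with a feasibility LP. It runs in time exponential in $n$ and is purely illustrative; it is not the reduction or any algorithm from the paper.

```python
# Brute-force illustration of the maximum feasible subsystem problem (iii):
# given n inequalities A x <= b in d variables, find a largest subset that
# admits a common solution.  The naive search tries subsets from largest to
# smallest and checks feasibility with a linear program.
from itertools import combinations

import numpy as np
from scipy.optimize import linprog


def max_feasible_subsystem(A: np.ndarray, b: np.ndarray):
    """Return indices of a largest feasible subsystem of A x <= b."""
    n, d = A.shape
    for size in range(n, 0, -1):                       # try large subsets first
        for subset in combinations(range(n), size):
            idx = list(subset)
            # Feasibility LP: minimize 0 subject to the chosen inequalities.
            res = linprog(c=np.zeros(d),
                          A_ub=A[idx], b_ub=b[idx],
                          bounds=[(None, None)] * d,
                          method="highs")
            if res.success:                            # subsystem is feasible
                return idx
    return []


if __name__ == "__main__":
    # Three inequalities in the plane; the first two contradict each other,
    # so a largest feasible subsystem has size 2.
    A = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0]])
    b = np.array([-1.0, -1.0, 5.0])   # x <= -1, -x <= -1 (i.e. x >= 1), y <= 5
    print(max_feasible_subsystem(A, b))
```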

    Software Engineering and Complexity in Effective Algebraic Geometry

    We introduce the notion of a robust parameterized arithmetic circuit for the evaluation of algebraic families of multivariate polynomials. Based on this notion, we present a computation model, adapted to Scientific Computing, which captures all known branching parsimonious symbolic algorithms in effective Algebraic Geometry. We justify this model by arguments from Software Engineering. Finally we exhibit a class of simple elimination problems of effective Algebraic Geometry which require exponential time to be solved by branching parsimonious algorithms of our computation model. Comment: 70 pages. arXiv admin note: substantial text overlap with arXiv:1201.434
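
    As a purely illustrative aside (not the paper's formal model, which additionally imposes robustness conditions on the whole family of circuits), the following toy Python sketch shows what evaluating a parameterized arithmetic circuit for a family such as $t\,x^2 + x$ amounts to; the gate encoding used here is an assumption made only for illustration.

```python
# Toy arithmetic-circuit evaluator.  Gates are listed in topological order;
# "input" gates carry the variables, "param" gates carry the parameters of the
# family, and "add"/"mul" gates refer to earlier gates by index.
from typing import List, Tuple, Union

Gate = Union[
    Tuple[str, str],      # ("input", name) or ("param", name)
    Tuple[str, float],    # ("const", value)
    Tuple[str, int, int], # ("add", i, j) or ("mul", i, j)
]


def evaluate(circuit: List[Gate], assignment: dict) -> float:
    """Evaluate the last gate of the circuit under the given assignment."""
    values: List[float] = []
    for gate in circuit:
        kind = gate[0]
        if kind in ("input", "param"):
            values.append(assignment[gate[1]])
        elif kind == "const":
            values.append(gate[1])
        elif kind == "add":
            values.append(values[gate[1]] + values[gate[2]])
        elif kind == "mul":
            values.append(values[gate[1]] * values[gate[2]])
        else:
            raise ValueError(f"unknown gate kind {kind!r}")
    return values[-1]


# The family t * x^2 + x, parameterized by t.
circuit = [
    ("input", "x"),   # 0: x
    ("param", "t"),   # 1: t
    ("mul", 0, 0),    # 2: x^2
    ("mul", 1, 2),    # 3: t * x^2
    ("add", 3, 0),    # 4: t * x^2 + x
]
print(evaluate(circuit, {"x": 2.0, "t": 3.0}))   # 3*4 + 2 = 14.0
```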

    An Exponential Lower Bound on the Complexity of Regularization Paths

    For a variety of regularized optimization problems in machine learning, algorithms computing the entire solution path have been developed recently. Most of these problems are quadratic programs parameterized by a single parameter, as for example the Support Vector Machine (SVM). Solution path algorithms compute not only the solution for one particular value of the regularization parameter but the entire path of solutions, making the selection of an optimal parameter much easier. It has been assumed that these piecewise linear solution paths have only linear complexity, i.e. linearly many bends. We prove that for the support vector machine this complexity can be exponential in the number of training points in the worst case. More strongly, we construct a single instance of $n$ input points in $d$ dimensions for an SVM such that at least $\Theta(2^{n/2}) = \Theta(2^d)$ many distinct subsets of support vectors occur as the regularization parameter changes. Comment: Journal version, 28 pages, 5 figures
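
    The exponential bound concerns the exact piecewise linear path; one simple way to probe how support-vector sets change in practice is to sweep the regularization parameter on a grid, as in this hedged Python sketch (assuming scikit-learn and NumPy are available; the data and grid are arbitrary choices here, and a grid sweep only approximates the true path).

```python
# Train a linear SVM over a grid of regularization values C and record the
# distinct sets of support vectors encountered.  Exact path-following
# algorithms track the piecewise-linear path directly; this sweep is only a
# rough numerical probe of how the support-vector set changes with C.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))                         # 20 random points in the plane
y = np.where(X[:, 0] > np.median(X[:, 0]), 1, -1)    # two balanced classes

support_sets = set()
for C in np.logspace(-3, 3, 200):                    # sweep the regularization parameter
    clf = SVC(kernel="linear", C=C).fit(X, y)
    support_sets.add(tuple(sorted(clf.support_)))    # indices of the support vectors

print(f"distinct support-vector sets seen on the grid: {len(support_sets)}")
```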