
    Euclidean TSP with few inner points in linear space

    Given a set of $n$ points in the Euclidean plane, such that just $k$ points are strictly inside the convex hull of the whole set, we want to find the shortest tour visiting every point. The fastest known algorithm for the version when $k$ is significantly smaller than $n$, i.e., when there are just a few inner points, works in $O(k^{11\sqrt{k}} k^{1.5} n^{3})$ time [Knauer and Spillner, WG 2006], but also requires space of order $k^{c\sqrt{k}}n^{2}$. The best linear-space algorithm takes $O(k!\,k\,n)$ time [Deineko, Hoffmann, Okamoto, Woeginger, Oper. Res. Lett. 34(1), 106-110]. We construct a linear-space $O(nk^{2}+k^{O(\sqrt{k})})$-time algorithm. The new insight is extending the known divide-and-conquer method based on planar separators with a matching-based argument to shrink the instance in every recursive call. This argument also shows that the problem admits a quadratic bikernel.

    Comment: under submission
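
    As an illustration of the baseline this abstract compares against (not the paper's $k^{O(\sqrt{k})}$ algorithm), the Python sketch below implements an approach in the spirit of the $O(k!\,k\,n)$ algorithm: an optimal tour visits the convex-hull vertices in their cyclic hull order, so one can try every ordering of the $k$ inner points and merge each ordering into the hull sequence with a dynamic program. All names are illustrative.

```python
import itertools
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def convex_hull(pts):
    # Andrew's monotone chain; returns hull vertices in ccw order.
    pts = sorted(set(pts))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    hull = []
    for chain in (pts, pts[::-1]):
        part = []
        for p in chain:
            while len(part) >= 2 and cross(part[-2], part[-1], p) <= 0:
                part.pop()
            part.append(p)
        hull += part[:-1]
    return hull

def merge_dp(hull, inner):
    # Cheapest tour that visits the hull points in the given cyclic order
    # (starting and ending at hull[0]) and the inner points in the given
    # linear order; a classic sequence-merging DP, O(len(hull)*len(inner)).
    m, k, INF = len(hull), len(inner), float('inf')
    # dp[i][j][t]: first i hull and first j inner points visited, currently
    # standing at the last hull point (t=0) or last inner point (t=1).
    dp = [[[INF, INF] for _ in range(k + 1)] for _ in range(m + 1)]
    dp[1][0][0] = 0.0
    for i in range(1, m + 1):
        for j in range(k + 1):
            for t in (0, 1):
                cur = dp[i][j][t]
                if cur == INF:
                    continue
                here = hull[i - 1] if t == 0 else inner[j - 1]
                if i < m:
                    dp[i + 1][j][0] = min(dp[i + 1][j][0],
                                          cur + dist(here, hull[i]))
                if j < k:
                    dp[i][j + 1][1] = min(dp[i][j + 1][1],
                                          cur + dist(here, inner[j]))
    best = dp[m][k][0] + dist(hull[-1], hull[0])
    if k:
        best = min(best, dp[m][k][1] + dist(inner[-1], hull[0]))
    return best

def tsp_few_inner(points):
    hull = convex_hull(points)
    inner = [p for p in points if p not in hull]
    # k! orderings of the inner points, each merged in O(k*n) time.
    return min(merge_dp(hull, perm)
               for perm in itertools.permutations(inner))

# Unit square corners plus two inner points.
print(tsp_few_inner([(0, 0), (4, 0), (4, 4), (0, 4), (2, 1), (2, 3)]))
```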

    Computing approximate PSD factorizations

    We give an algorithm for computing approximate PSD factorizations of nonnegative matrices. The running time of the algorithm is polynomial in the dimensions of the input matrix, but exponential in the PSD rank and the approximation error. The main ingredient is an exact factorization algorithm for the case when the rows and columns of the factors are constrained to lie in a general polyhedron. This strictly generalizes nonnegative matrix factorization, which can be captured by letting this polyhedron be the nonnegative orthant.

    Comment: 10 pages
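
    The last sentence can be made concrete: a PSD factorization of size $r$ writes $M_{ij} = \operatorname{tr}(A_i B_j)$ with $r \times r$ PSD matrices $A_i, B_j$, and any nonnegative factorization $M = UV$ yields one by taking diagonal factors. A small numpy sketch of this standard observation (illustrative, not the paper's algorithm):

```python
import numpy as np

# Any nonnegative factorization M = U @ V yields a PSD factorization of
# the same size: take A_i = diag(U[i, :]) and B_j = diag(V[:, j]), since
# trace(A_i @ B_j) = <U[i, :], V[:, j]> = M[i, j].
rng = np.random.default_rng(0)
r = 3
U = rng.random((4, r))      # nonnegative left factor
V = rng.random((r, 5))      # nonnegative right factor
M = U @ V                   # nonnegative matrix, nonnegative rank <= r

A = [np.diag(U[i]) for i in range(4)]     # diagonal, hence PSD
B = [np.diag(V[:, j]) for j in range(5)]  # diagonal, hence PSD

M_psd = np.array([[np.trace(A[i] @ B[j]) for j in range(5)]
                  for i in range(4)])
assert np.allclose(M, M_psd)  # exact PSD factorization, PSD rank <= r
```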

    Learning with Clustering Structure

    We study supervised learning problems using clustering constraints to impose structure on either features or samples, seeking to help both prediction and interpretation. The problem of clustering features arises naturally in text classification, for instance, to reduce dimensionality by grouping words together and to identify synonyms. The sample clustering problem, on the other hand, applies to multiclass problems where we are allowed to make multiple predictions and the performance of the best answer is recorded. We derive a unified optimization formulation highlighting the common structure of these problems and produce algorithms whose core iteration complexity amounts to a k-means clustering step, which can be approximated efficiently. We extend these results to combine sparsity and clustering constraints, and develop a new projection algorithm onto the set of clustered sparse vectors. We prove convergence of our algorithms on random instances, based on a union-of-subspaces interpretation of the clustering structure. Finally, we test the robustness of our methods on artificial data sets as well as real data extracted from movie reviews.

    Comment: Completely rewritten. New convergence proofs in the clustered and sparse clustered case. New projection algorithm onto sparse clustered vectors
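
    To make the projection step concrete: projecting a vector onto the set of vectors whose entries take at most $k$ distinct values is exactly a one-dimensional k-means problem on the entries. The sketch below uses plain Lloyd iterations as an efficient approximation of the kind the abstract alludes to; it is an assumed minimal illustration, not the paper's projection algorithm for clustered sparse vectors.

```python
import numpy as np

def project_clustered(w, k, iters=50, seed=0):
    """Approximately solve min ||w - z|| over vectors z whose entries
    take at most k distinct values: a 1-D k-means problem on w."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(w, size=k, replace=False)
    for _ in range(iters):
        # Assign each entry to its nearest center (k-means step).
        labels = np.abs(w[:, None] - centers[None, :]).argmin(axis=1)
        # Move each center to the mean of its assigned entries.
        for c in range(k):
            if np.any(labels == c):
                centers[c] = w[labels == c].mean()
    return centers[labels]

w = np.array([0.1, 0.12, 0.9, 0.88, 0.5])
print(project_clustered(w, 2))  # entries snapped to 2 cluster means
```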

    On k-Convex Polygons

    We introduce a notion of $k$-convexity and explore polygons in the plane that have this property. Polygons which are $k$-convex can be triangulated with fast yet simple algorithms. However, recognizing them in general is a 3SUM-hard problem. We give a characterization of $2$-convex polygons, a particularly interesting class, and show how to recognize them in $O(n \log n)$ time. A description of their shape is given as well, which leads to Erdős-Szekeres type results regarding subconfigurations of their vertex sets. Finally, we introduce the concept of generalized geometric permutations, and show that their number can be exponential in the number of $2$-convex objects considered.

    Comment: 23 pages, 19 figures
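
    As background (the abstract does not spell the definition out): in this line of work a polygon is $k$-convex when no line meets its interior in more than $k$ connected components. For a line in general position with respect to a simple polygon, each such component contributes exactly two proper boundary crossings, so counting crossings and halving gives the component count. A minimal sketch of that counting step, with illustrative names:

```python
def components_on_line(poly, p, q):
    # Number of connected components in which the line through p and q
    # meets the interior of the simple polygon `poly` (vertex list),
    # assuming the line avoids all vertices (general position).
    def side(a):
        return (q[0] - p[0]) * (a[1] - p[1]) - (q[1] - p[1]) * (a[0] - p[0])
    crossings = sum(
        1
        for i in range(len(poly))
        if side(poly[i]) * side(poly[(i + 1) % len(poly)]) < 0
    )
    return crossings // 2  # two proper crossings per interior component

# A convex quadrilateral: every line meets it in at most one component.
square = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(components_on_line(square, (-1, 2.5), (5, 2.5)))  # -> 1
```

    A brute-force $k$-convexity test could then evaluate this count over slightly perturbed lines through all vertex pairs and take the maximum, which is far from the paper's $O(n \log n)$ recognition algorithm for the 2-convex case.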