
    The Limitations of Optimization from Samples

    In this paper we consider the following question: can we optimize objective functions from the training data we use to learn them? We formalize this question through a novel framework we call optimization from samples (OPS). In OPS, we are given sampled values of a function drawn from some distribution, and the objective is to optimize the function under some constraint. While there are interesting classes of functions that can be optimized from samples, our main result is an impossibility. We show that there are classes of functions which are statistically learnable and optimizable, but for which no reasonable approximation for optimization from samples is achievable. In particular, our main result shows that there is no constant-factor approximation for maximizing coverage functions under a cardinality constraint using polynomially many samples drawn from any distribution. We also show tight approximation guarantees for maximizing several interesting classes of functions under a cardinality constraint, including unit-demand, additive, and general monotone submodular functions, as well as a constant-factor approximation for monotone submodular functions with bounded curvature.
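
    To make the OPS setting concrete, here is a minimal, self-contained sketch (not the paper's construction or hardness instance; the universe, the coverage sets, the sample distribution, and the budget k below are all illustrative assumptions): a coverage function is evaluated on randomly sampled feasible sets, and a naive strategy that returns the best sampled set is compared with the true optimum found by brute force.

```python
# Minimal sketch of the optimization-from-samples (OPS) setting (not the
# paper's construction): all parameters below are illustrative assumptions.
import itertools
import random

random.seed(0)

universe = range(20)          # elements to be covered
n, k = 10, 3                  # number of sets, cardinality budget

# Each ground set covers a random subset of the universe.
ground_sets = {i: frozenset(random.sample(list(universe), random.randint(1, 8)))
               for i in range(n)}

def coverage(indices):
    """Coverage function: number of universe elements covered by the union."""
    covered = set()
    for i in indices:
        covered |= ground_sets[i]
    return len(covered)

# "Samples": values of the function on sets drawn from some distribution
# (here, uniformly random size-k sets).
samples = [(S, coverage(S))
           for S in (frozenset(random.sample(range(n), k)) for _ in range(50))]

# A naive sample-based strategy: return the best set seen among the samples.
best_sampled_value = max(value for _, value in samples)

# The true optimum, via brute force (only feasible on tiny instances).
best_true_value = max(coverage(S) for S in itertools.combinations(range(n), k))

print("best sampled value:", best_sampled_value)
print("true optimum value:", best_true_value)
```

    The paper's impossibility result concerns exactly this kind of gap: for coverage functions, no strategy that sees only polynomially many samples can guarantee a constant-factor approximation to the optimum.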

    Towards a better approximation for sparsest cut?

    We give a new $(1+\epsilon)$-approximation for the sparsest cut problem on graphs where small sets expand significantly more than the sparsest cut (sets of size $n/r$ expand by a factor $\sqrt{\log n \log r}$ larger, for some small $r$; this condition holds for many natural graph families). We give two different algorithms. One involves Guruswami-Sinop rounding on the level-$r$ Lasserre relaxation. The other is combinatorial and involves a new notion called {\em Small Set Expander Flows} (inspired by the {\em expander flows} of ARV), which we show exist in the input graph. Both algorithms run in time $2^{O(r)}\,\mathrm{poly}(n)$. We also show similar approximation algorithms for graphs of genus $g$ with an analogous local expansion condition. This is the first algorithm we know of that achieves a $(1+\epsilon)$-approximation on such a general family of graphs.

    Packing Returning Secretaries

    We study online secretary problems with returns in combinatorial packing domains with $n$ candidates that arrive sequentially over time in random order. The goal is to accept a feasible packing of candidates of maximum total value. In the first variant, each candidate arrives exactly twice. All $2n$ arrivals occur in random order. We propose a simple 0.5-competitive algorithm that can be combined with arbitrary approximation algorithms for the packing domain, even when the total value of candidates is a subadditive function. For bipartite matching, we obtain an algorithm with competitive ratio at least $0.5721 - o(1)$ for growing $n$, and an algorithm with ratio at least $0.5459$ for all $n \ge 1$. We extend all algorithms and ratios to $k \ge 2$ arrivals per candidate. In the second variant, there is a pool of undecided candidates. In each round, a random candidate from the pool arrives. Upon arrival a candidate can be either decided (accept/reject) or postponed (returned into the pool). We mainly focus on minimizing the expected number of postponements when computing an optimal solution. An expected number of $\Theta(n \log n)$ is always sufficient. For matroids, we show that the expected number can be reduced to $O(r \log(n/r))$, where $r \le n/2$ is the minimum of the ranks of the matroid and its dual. For bipartite matching, we show a bound of $O(r \log n)$, where $r$ is the size of the optimum matching. For general packing, we show a lower bound of $\Omega(n \log \log n)$, even when the size of the optimum is $r = \Theta(\log n)$.

    Comment: 23 pages, 5 figures

    A physicist's approach to number partitioning

    The statistical physics approach to the number partitioning problem (NPP), a classical NP-hard problem, is both simple and rewarding. Very basic notions and methods from statistical mechanics are enough to obtain analytical results for the phase boundary that separates the "easy-to-solve" from the "hard-to-solve" phase of the NPP, as well as for the probability distributions of the optimal and sub-optimal solutions. In addition, it can be shown that solving a number partitioning problem of size $N$ to some extent corresponds to locating the minimum in an unsorted list of $O(2^N)$ numbers. Considering this correspondence, it is not surprising that known heuristics for the partitioning problem are not significantly better than simple random search.

    Comment: 35 pages, to appear in J. Theor. Comp. Science, typo corrected in eq. 1
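
    As a hedged illustration of the correspondence mentioned above (a toy instance with made-up parameters, not the paper's analysis): the optimal discrepancy of a partition of N numbers is the minimum over the O(2^N) signed sums, and a simple greedy heuristic typically lands far from it on random instances.

```python
# Illustrative toy instance (assumed parameters, not the paper's analysis):
# brute force over the O(2^N) signed sums versus a simple greedy heuristic.
import itertools
import random

random.seed(1)
N = 16
a = [random.randint(1, 10**6) for _ in range(N)]

# Brute force: each sign vector s in {+1, -1}^N defines a partition whose
# discrepancy is |sum_i s_i * a_i|; the optimum is the minimum over all 2^N.
optimal = min(abs(sum(s * x for s, x in zip(signs, a)))
              for signs in itertools.product((1, -1), repeat=N))

# Greedy heuristic: place each number (largest first) on the lighter side.
side1 = side2 = 0
for x in sorted(a, reverse=True):
    if side1 <= side2:
        side1 += x
    else:
        side2 += x

print("optimal discrepancy:", optimal)
print("greedy discrepancy:", abs(side1 - side2))
```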

    Statistical mechanics of the vertex-cover problem

    We review recent progress in the study of the vertex-cover problem (VC). VC belongs to the class of NP-complete graph-theoretical problems, which plays a central role in theoretical computer science. On ensembles of random graphs, VC exhibits a coverable-uncoverable phase transition. Very close to this transition, easy-hard transitions in the typical running time of algorithms occur, depending on the solution algorithm. We explain a statistical mechanics approach, which works by mapping VC to a hard-core lattice gas and then applying techniques like the replica trick or the cavity approach. Using these methods, the phase diagram of VC could be obtained exactly for connectivities $c < e$, where VC is replica symmetric. Recently, this result was confirmed using traditional mathematical techniques. For $c > e$, the solution of VC exhibits full replica symmetry breaking. The statistical mechanics approach can also be used to study analytically the typical running time of simple complete and incomplete algorithms for VC. Finally, we describe recent results for VC when studied on other ensembles of finite- and infinite-dimensional graphs.

    Comment: review article, 26 pages, 9 figures, to appear in J. Phys. A: Math. Gen.
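
    A hedged toy experiment (graph size, connectivities, and trial counts are illustrative assumptions, and this is not the replica or cavity calculation): brute-force minimum vertex covers of small Erdős-Rényi random graphs show the cover fraction growing with the mean connectivity c, the quantity whose typical behaviour the statistical mechanics approach predicts.

```python
# Hedged toy experiment (illustrative parameters, not the replica/cavity
# calculation): brute-force minimum vertex covers of small Erdos-Renyi
# random graphs G(n, c/n) for several mean connectivities c.
import itertools
import random

random.seed(2)

def min_vertex_cover_size(n, edges):
    """Smallest number of vertices touching every edge (brute force)."""
    for size in range(n + 1):
        for subset in itertools.combinations(range(n), size):
            chosen = set(subset)
            if all(u in chosen or v in chosen for u, v in edges):
                return size
    return n

n, trials = 12, 20
for c in (0.5, 1.0, 2.0, 4.0):
    p = c / n
    sizes = []
    for _ in range(trials):
        edges = [(u, v) for u in range(n) for v in range(u + 1, n)
                 if random.random() < p]
        sizes.append(min_vertex_cover_size(n, edges))
    print(f"c = {c}: average cover fraction ~ {sum(sizes) / (trials * n):.2f}")
```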

    Artificial immune systems can find arbitrarily good approximations for the NP-hard number partitioning problem

    Typical artificial immune system (AIS) operators such as hypermutations with mutation potential and ageing allow AISs to efficiently overcome local optima from which evolutionary algorithms (EAs) struggle to escape. Such behaviour has been shown for artificial example functions constructed especially to exhibit difficulties that EAs may encounter during the optimisation process. However, no evidence is available indicating that these two operators behave similarly on more realistic problems. In this paper we perform an analysis for the standard NP-hard Partition problem from combinatorial optimisation and rigorously show that hypermutations and ageing allow AISs to efficiently escape from local optima where standard EAs require exponential time. As a result we prove that, while EAs and random local search (RLS) may get trapped on 4/3 approximations, AISs find arbitrarily good approximate solutions of ratio $(1+\epsilon)$ within $n\,\epsilon^{-(2/\epsilon)-1}(1-\epsilon)^{-2} e^{3} 2^{2/\epsilon} + 2n^{3} 2^{2/\epsilon} + 2n^{3}$ function evaluations in expectation. This expectation is polynomial in the problem size and exponential only in $1/\epsilon$.
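
    A loose sketch of the flavour of the operators discussed above (a simplified toy setup; the instance, the acceptance rules, and the simplified "first constructive mutation" loop are assumptions, and this is not the analysed algorithms or the proven bounds): random local search flips one bit per step, while a hypermutation-style step may flip many bits until it finds a strict improvement, which is what lets it move away from local optima.

```python
# Loose sketch (simplified operators and assumed parameters, not the analysed
# algorithms or the proven bounds): RLS versus a hypermutation-style step on
# a Partition instance encoded as a bit string (bit i = side of item i).
import random

random.seed(3)
items = [random.randint(1, 1000) for _ in range(30)]
total = sum(items)

def discrepancy(bits):
    """Absolute difference between the two sides of the partition."""
    side = sum(x for x, b in zip(items, bits) if b)
    return abs(total - 2 * side)

def rls_step(bits):
    # Flip one uniformly random bit; accept if the discrepancy does not worsen.
    cand = bits[:]
    cand[random.randrange(len(bits))] ^= 1
    return cand if discrepancy(cand) <= discrepancy(bits) else bits

def hypermutation_step(bits):
    # Flip distinct random bits one after another (up to n flips) and stop at
    # the first strict improvement; otherwise keep the parent (a simplified
    # "first constructive mutation" rule).
    cand = bits[:]
    for i in random.sample(range(len(bits)), len(bits)):
        cand[i] ^= 1
        if discrepancy(cand) < discrepancy(bits):
            return cand
    return bits

start = [random.randint(0, 1) for _ in range(len(items))]
x, y = start[:], start[:]
for _ in range(5000):
    x = rls_step(x)
    y = hypermutation_step(y)

print("RLS discrepancy:          ", discrepancy(x))
print("hypermutation discrepancy:", discrepancy(y))
```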