19,509 research outputs found

    How many candidates are needed to make elections hard to manipulate?

    In multiagent settings where the agents have different preferences, preference aggregation is a central issue. Voting is a general method for preference aggregation, but seminal results have shown that all general voting protocols are manipulable. One could try to avoid manipulation by using voting protocols where determining a beneficial manipulation is computationally hard. The complexity of manipulating realistic elections where the number of candidates is a small constant was recently studied (Conitzer 2002), but the emphasis was on the question of whether or not a protocol becomes hard to manipulate for some constant number of candidates. That work, in many cases, left open the question: How many candidates are needed to make elections hard to manipulate? This is a crucial question when comparing the relative manipulability of different voting protocols. In this paper we answer that question for the voting protocols of the earlier study: plurality, Borda, STV, Copeland, maximin, regular cup, and randomized cup. We also answer that question for two voting protocols for which no results on the complexity of manipulation have been derived before: veto and plurality with runoff. It turns out that the voting protocols under study become hard to manipulate at 3 candidates, 4 candidates, 7 candidates, or never.
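    Manipulation can be made concrete with a toy example. The sketch below (not the paper's construction; the candidate names, the tiny preference profile, and the helper functions are all hypothetical) brute-forces whether a single voter can benefit by misreporting under the Borda protocol, enumerating all m! possible ballots over m candidates, which is exactly the kind of search that hardness results are meant to rule out at practical scales.

```python
from itertools import permutations

def borda_winner(profile, candidates):
    """Borda count: a candidate in position i of a ballot gets m-1-i points.
    Ties are broken lexicographically, purely for determinism."""
    scores = {c: 0 for c in candidates}
    for ranking in profile:
        for pos, c in enumerate(ranking):
            scores[c] += len(candidates) - 1 - pos
    return max(candidates, key=lambda c: (scores[c], c))

def can_manipulate(others, manipulator_pref, candidates):
    """Can the last voter, by misreporting, elect someone they strictly prefer
    to the winner obtained when they vote truthfully? Enumerates all m! ballots."""
    truthful_winner = borda_winner(others + [manipulator_pref], candidates)
    rank = {c: i for i, c in enumerate(manipulator_pref)}  # lower index = preferred
    for fake in permutations(candidates):
        w = borda_winner(others + [list(fake)], candidates)
        if rank[w] < rank[truthful_winner]:
            return True, list(fake), w
    return False, None, truthful_winner

# hypothetical 3-candidate, 3-voter example: two sincere voters plus the manipulator
candidates = ["a", "b", "c"]
others = [["a", "b", "c"], ["b", "c", "a"]]
print(can_manipulate(others, ["c", "b", "a"], candidates))
```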

    Simplest random K-satisfiability problem

    We study a simple and exactly solvable model for the generation of random satisfiability problems. These consist of γN random boolean constraints which are to be satisfied simultaneously by N logical variables. In statistical-mechanics language, the considered model can be seen as a diluted p-spin model at zero temperature. While such problems become extraordinarily hard to solve by local search methods in a large region of the parameter space, still at least one solution may be superimposed by construction. The statistical properties of the model can be studied exactly by the replica method, and each single instance can be analyzed in polynomial time by a simple global solution method. The geometrical/topological structures responsible for dynamic and static phase transitions, as well as for the onset of computational complexity in local search methods, are thoroughly analyzed. Numerical analysis on very large samples allows for a precise characterization of the critical scaling behaviour. Comment: 14 pages, 5 figures, to appear in Phys. Rev. E (Feb 2001). v2: minor errors and references corrected.
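    A minimal sketch of such a model, under the assumption that the "simplest" problem is random K-XOR-SAT (parity constraints), which matches the diluted p-spin description: γN random parity checks over N boolean variables, one solution planted by construction, and every instance solvable globally in polynomial time by Gaussian elimination over GF(2). The parameter names (N, K, gamma) are illustrative.

```python
# Random K-XOR-SAT with a planted solution, solved exactly over GF(2).
import random

def planted_instance(N, K, gamma, seed=0):
    """gamma*N random parity checks over N variables, each built so that a
    randomly drawn planted assignment satisfies it."""
    rng = random.Random(seed)
    planted = [rng.randint(0, 1) for _ in range(N)]
    clauses = []
    for _ in range(int(gamma * N)):
        vars_ = rng.sample(range(N), K)
        parity = sum(planted[v] for v in vars_) % 2
        clauses.append((vars_, parity))
    return planted, clauses

def solve_gf2(N, clauses):
    """Gauss-Jordan elimination over GF(2). Each equation is an (N+1)-bit
    integer: bit v for variable v, bit N for the parity (right-hand side).
    Returns a satisfying assignment (free variables set to 0) or None."""
    eqs = [sum(1 << v for v in vars_) | (parity << N) for vars_, parity in clauses]
    pivots, r = [], 0
    for col in range(N):
        piv = next((i for i in range(r, len(eqs)) if (eqs[i] >> col) & 1), None)
        if piv is None:
            continue                      # no pivot: this variable stays free
        eqs[r], eqs[piv] = eqs[piv], eqs[r]
        for i in range(len(eqs)):
            if i != r and (eqs[i] >> col) & 1:
                eqs[i] ^= eqs[r]          # clear this column everywhere else
        pivots.append((col, r))
        r += 1
    if any(eq != 0 for eq in eqs[r:]):    # a leftover "0 = 1" row
        return None
    assign = [0] * N
    for col, row in pivots:
        assign[col] = (eqs[row] >> N) & 1
    return assign

planted, clauses = planted_instance(N=200, K=3, gamma=0.8)
sol = solve_gf2(200, clauses)
assert all(sum(sol[v] for v in vs) % 2 == p for vs, p in clauses)
```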

    Statistical mechanics of the vertex-cover problem

    We review recent progress in the study of the vertex-cover problem (VC). VC belongs to the class of NP-complete graph-theoretical problems, which plays a central role in theoretical computer science. On ensembles of random graphs, VC exhibits a coverable-uncoverable phase transition. Very close to this transition, depending on the solution algorithm, easy-hard transitions in the typical running time of the algorithms occur. We explain a statistical-mechanics approach, which works by mapping VC to a hard-core lattice gas and then applying techniques like the replica trick or the cavity approach. Using these methods, the phase diagram of VC could be obtained exactly for connectivities c < e, where VC is replica symmetric. Recently, this result could be confirmed using traditional mathematical techniques. For c > e, the solution of VC exhibits full replica symmetry breaking. The statistical-mechanics approach can also be used to study analytically the typical running time of simple complete and incomplete algorithms for VC. Finally, we describe recent results for VC when studied on other ensembles of finite- and infinite-dimensional graphs. Comment: review article, 26 pages, 9 figures, to appear in J. Phys. A: Math. Gen.
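    As a purely illustrative companion (not necessarily one of the algorithms analysed in the review), the sketch below builds an Erdős-Rényi random graph with mean connectivity c and runs a simple greedy, incomplete heuristic that repeatedly covers the highest-degree vertex; the fraction of vertices it uses can be compared across values of c around the coverable-uncoverable transition. All names and parameters are assumptions of the sketch.

```python
import random

def random_graph(N, c, seed=0):
    """G(N, p) with p = c/N, so the mean degree is roughly c."""
    rng = random.Random(seed)
    p = c / N
    return [(i, j) for i in range(N) for j in range(i + 1, N) if rng.random() < p]

def greedy_vertex_cover(edges):
    """Greedy heuristic: repeatedly add the vertex covering the most uncovered edges."""
    uncovered = set(edges)
    cover = set()
    while uncovered:
        degree = {}
        for u, v in uncovered:
            degree[u] = degree.get(u, 0) + 1
            degree[v] = degree.get(v, 0) + 1
        best = max(degree, key=degree.get)
        cover.add(best)
        uncovered = {e for e in uncovered if best not in e}
    return cover

N, c = 1000, 2.0
edges = random_graph(N, c)
cover = greedy_vertex_cover(edges)
print(len(cover) / N)   # fraction of vertices used; compare across values of c
```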

    Boosting search by rare events

    Randomized search algorithms for hard combinatorial problems exhibit a large variability in performance. We study the different types of rare events which occur in such out-of-equilibrium stochastic processes, and we show how they cooperate in determining the final distribution of running times. As a byproduct of our analysis we show how search algorithms are optimized by random restarts. Comment: 4 pages, 3 eps figures. References updated.
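    The restart mechanism can be shown schematically. The sketch below (not the paper's analysis; the toy random-walk SAT solver and all names are illustrative) wraps a randomized local search in a fixed cutoff and restarts it from a fresh random state, which is the standard way short runs are combined to avoid the rare, extremely long runs of a heavy-tailed run-time distribution.

```python
import random

def walk_sat(clauses, n_vars, max_flips, rng):
    """Toy incomplete SAT search: returns an assignment or None on timeout."""
    assign = [rng.randint(0, 1) for _ in range(n_vars)]
    for _ in range(max_flips):
        unsat = [c for c in clauses
                 if not any(assign[abs(l) - 1] == (l > 0) for l in c)]
        if not unsat:
            return assign
        clause = rng.choice(unsat)
        var = abs(rng.choice(clause)) - 1
        assign[var] ^= 1                 # flip a variable from an unsatisfied clause
    return None

def search_with_restarts(clauses, n_vars, cutoff=10_000, max_restarts=100, seed=0):
    """Fixed-cutoff restarts around the randomized search."""
    rng = random.Random(seed)
    for restart in range(max_restarts):
        result = walk_sat(clauses, n_vars, cutoff, rng)
        if result is not None:
            return restart, result       # number of restarts used, solution found
    return max_restarts, None

# clauses as lists of signed 1-based literals, e.g. (x1 or not x2) -> [1, -2]
demo = [[1, -2], [-1, 2], [2, 3], [-3, 1]]
print(search_with_restarts(demo, n_vars=3))
```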

    Average-case Hardness of RIP Certification

    The restricted isometry property (RIP) for design matrices gives guarantees for optimal recovery in sparse linear models. It is of high interest in compressed sensing and statistical learning. This property is particularly important for computationally efficient recovery methods. As a consequence, even though it is in general NP-hard to check that RIP holds, there have been substantial efforts to find tractable proxies for it. These would allow the construction of RIP matrices and the polynomial-time verification of RIP given an arbitrary matrix. We consider the framework of average-case certifiers, which never wrongly declare that a matrix is RIP, while often being correct for random instances. While there are such certifiers which are tractable in a suboptimal parameter regime, we show that this is a computationally hard task in any better regime. Our results are based on a new, weaker assumption on the problem of detecting dense subgraphs.
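    For context, an exact RIP check is a combinatorial scan, which is what makes certification interesting. The sketch below (parameter names A, k, delta are illustrative, not from the paper) computes the restricted isometry constant of order k by examining all C(p, k) column subsets, which is exponential in k; an average-case certifier avoids this scan but must never declare RIP when it fails.

```python
from itertools import combinations
import numpy as np

def rip_constant(A, k):
    """Exact delta_k: over all k-column submatrices A_S, the largest deviation
    of the squared extreme singular values from 1."""
    p = A.shape[1]
    delta = 0.0
    for support in combinations(range(p), k):   # exponentially many subsets
        s = np.linalg.svd(A[:, list(support)], compute_uv=False)
        delta = max(delta, abs(s[0] ** 2 - 1), abs(1 - s[-1] ** 2))
    return delta

rng = np.random.default_rng(0)
n, p, k = 30, 12, 3
A = rng.normal(size=(n, p)) / np.sqrt(n)        # Gaussian matrices satisfy RIP w.h.p.
print(rip_constant(A, k))                       # small delta => (k, delta)-RIP holds
```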

    Variable neighbourhood search for the minimum labelling Steiner tree problem

    We present a study on heuristic solution approaches to the minimum labelling Steiner tree problem, an NP-hard graph problem related to the minimum labelling spanning tree problem. Given an undirected labelled connected graph, the aim is to find a tree covering a given subset of nodes of the graph whose edges have the smallest number of distinct labels. Such a model may be used to represent many real-world problems in telecommunications and multimodal transportation networks. Several metaheuristics are proposed and evaluated. The approaches are compared to the widely adopted Pilot Method, and it is shown that the Variable Neighbourhood Search metaheuristic is the most effective approach to the problem, obtaining high-quality solutions in short computational running times.
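    A bare-bones version of the solution representation and a generic VNS loop might look as follows (an illustrative sketch, not the authors' implementation; all names are mine and the instance is assumed feasible): a solution is a set of labels, it is feasible when the edges carrying those labels connect all required nodes, shaking in neighbourhood k swaps k labels, and the local search greedily drops redundant labels.

```python
import random

def connects(labels, edges, basic, n_nodes):
    """Union-find check: do the edges carrying these labels connect all basic nodes?"""
    parent = list(range(n_nodes))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for u, v, lab in edges:
        if lab in labels:
            parent[find(u)] = find(v)
    return len({find(b) for b in basic}) == 1

def drop_redundant(labels, edges, basic, n_nodes):
    """Local search: greedily remove labels not needed for feasibility."""
    for lab in sorted(labels):
        if connects(labels - {lab}, edges, basic, n_nodes):
            labels = labels - {lab}
    return labels

def vns(edges, basic, n_nodes, k_max=3, iters=200, seed=0):
    rng = random.Random(seed)
    all_labels = {lab for _, _, lab in edges}
    best = drop_redundant(set(all_labels), edges, basic, n_nodes)
    k = 1
    for _ in range(iters):
        # shaking in neighbourhood k: swap up to k labels, then repair and improve
        cand = set(best)
        cand -= set(rng.sample(sorted(cand), min(k, len(cand))))
        cand |= set(rng.sample(sorted(all_labels), min(k, len(all_labels))))
        while not connects(cand, edges, basic, n_nodes):
            cand.add(rng.choice(sorted(all_labels - cand)))  # assumes feasibility
        cand = drop_redundant(cand, edges, basic, n_nodes)
        if len(cand) < len(best):
            best, k = cand, 1            # improvement: back to the first neighbourhood
        else:
            k = k % k_max + 1            # otherwise cycle to the next neighbourhood
    return best

# tiny hypothetical instance: edges are (u, v, label); nodes 0 and 3 are required
edges = [(0, 1, "a"), (1, 2, "a"), (2, 3, "b"), (0, 2, "c"), (1, 3, "c")]
print(vns(edges, basic={0, 3}, n_nodes=4))
```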