
    LP-based Covering Games with Low Price of Anarchy

    We present a new class of vertex cover and set cover games. The price of anarchy bounds match the best known constant-factor approximation guarantees for the centralized optimization problems, for linear as well as for submodular costs -- in contrast to all previously studied covering games, where the price of anarchy cannot be bounded by a constant (e.g. [6, 7, 11, 5, 2]). In particular, we describe a vertex cover game with a price of anarchy of 2. The rules of the games capture the structure of the linear programming relaxations of the underlying optimization problems, and our bounds are established by analyzing these relaxations. Furthermore, for linear costs we exhibit linear-time best-response dynamics that converge to these almost optimal Nash equilibria. These dynamics mimic the classical greedy approximation algorithm of Bar-Yehuda and Even [3].
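    The abstract above refers to the classical greedy approximation algorithm of Bar-Yehuda and Even, which the best-response dynamics are said to mimic. As a reference point only, below is a minimal centralized sketch of that primal-dual (local-ratio) 2-approximation for weighted vertex cover; the graph encoding, function name and toy instance are illustrative choices, not taken from the paper.

        # A minimal sketch of the Bar-Yehuda--Even greedy for weighted vertex cover.
        # Each edge "pays" the smaller residual cost of its two endpoints; an endpoint
        # whose residual cost drops to zero enters the cover. The resulting cover costs
        # at most twice the optimum.

        def vertex_cover_bar_yehuda_even(edges, weight):
            """edges: iterable of (u, v) pairs; weight: dict vertex -> nonnegative cost."""
            residual = dict(weight)          # remaining (unpaid) cost of each vertex
            cover = set()
            for u, v in edges:
                if u in cover or v in cover:
                    continue                 # edge already covered
                delta = min(residual[u], residual[v])
                residual[u] -= delta
                residual[v] -= delta
                if residual[u] == 0:
                    cover.add(u)
                if residual[v] == 0:
                    cover.add(v)
            return cover

        # Toy usage: the path a-b-c with weights 1, 3, 1; the returned cover {a, c}
        # has weight 2, which here happens to be optimal.
        print(vertex_cover_bar_yehuda_even([("a", "b"), ("b", "c")],
                                           {"a": 1, "b": 3, "c": 1}))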

    Distributed and Parallel Algorithms for Set Cover Problems with Small Neighborhood Covers

    In this paper, we study a class of set cover problems that satisfy a special property which we call the "small neighborhood cover" property. This class encompasses several well-studied problems including vertex cover, interval cover, bag interval cover and tree cover. We design unified distributed and parallel algorithms that can handle any set cover problem falling under the above framework and yield constant factor approximations. These algorithms run in polylogarithmic communication rounds in the distributed setting and are in NC in the parallel setting. Comment: Full version of FSTTCS'13 paper.
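    Interval cover is one of the problems named in this abstract. For intuition about the problem class only, below is a minimal centralized greedy for covering points on a line with a given set of intervals (this simple variant is solved exactly by the greedy); it is not the distributed or parallel algorithm of the paper, and the names and toy data are illustrative.

        # Greedy interval (point) cover: scan points left to right and, for each
        # uncovered point, pick the interval containing it that reaches furthest right.

        def greedy_interval_cover(points, intervals):
            """points: iterable of numbers; intervals: iterable of (left, right) pairs."""
            pts = sorted(set(points))
            ivs = sorted(intervals)              # sorted by left endpoint
            chosen, i = [], 0
            covered_up_to = float("-inf")
            best = None                          # furthest-reaching interval seen so far
            for p in pts:
                if p <= covered_up_to:
                    continue                     # point already covered
                while i < len(ivs) and ivs[i][0] <= p:
                    if best is None or ivs[i][1] > best[1]:
                        best = ivs[i]
                    i += 1
                if best is None or best[1] < p:
                    raise ValueError(f"point {p} cannot be covered")
                chosen.append(best)
                covered_up_to = best[1]
                best = None
            return chosen

        # Toy usage: covers points 1, 4, 9 with two intervals, (0, 2) and (3, 10).
        print(greedy_interval_cover([1, 4, 9], [(0, 2), (3, 6), (3, 10), (8, 12)]))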

    Partial information spreading with application to distributed maximum coverage


    Identifying the parametric occurrence of multiple steady states for some biological networks

    We consider a problem from biological network analysis of determining regions in a parameter space over which there are multiple steady states for positive real values of variables and parameters. We describe multiple approaches to address the problem using tools from Symbolic Computation. We describe how progress was made to achieve semi-algebraic descriptions of the multistationarity regions of parameter space, and compare symbolic results to numerical methods. The biological networks studied are models of the mitogen-activated protein kinase (MAPK) network, which has already attracted considerable effort exploiting special insights into the structure of the corresponding models. Our main example is a model with 11 equations in 11 variables and 19 parameters, 3 of which are of interest for symbolic treatment. The model also imposes positivity conditions on all variables and parameters. We apply combinations of symbolic computation methods designed for mixed equality/inequality systems, specifically virtual substitution, lazy real triangularization and cylindrical algebraic decomposition, as well as a simplification technique adapted from Gaussian elimination and graph theory. We are able to determine multistationarity of our main example over a 2-dimensional parameter space. We also study a second MAPK model and a symbolic grid sampling technique which can locate such regions in 3-dimensional parameter space. Comment: 60 pages, author preprint. Accepted in the Journal of Symbolic Computation.
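    As a toy illustration of the grid-sampling idea only (not the MAPK model, whose equations are not reproduced here), the sketch below counts the positive real steady states of a hypothetical cubic steady-state polynomial over a coarse grid of two parameters using SymPy; the polynomial, parameter names and ranges are invented for illustration.

        # Grid sampling for multistationarity on a toy model: for the hypothetical
        # steady-state polynomial p(x; a, b) = x^3 - 6x^2 + a*x - b, count the distinct
        # positive real roots at each grid point (a, b). Grid points with more than one
        # positive root are candidate multistationarity points. NOT the MAPK model.

        from sympy import symbols, real_roots

        x = symbols("x")

        def positive_steady_states(a, b):
            poly = x**3 - 6*x**2 + a*x - b       # hypothetical steady-state equation
            return sum(1 for r in real_roots(poly) if r.evalf() > 0)

        for a_val in range(1, 13, 3):            # coarse grid over the two parameters
            counts = [positive_steady_states(a_val, b_val) for b_val in range(1, 13, 3)]
            print(f"a={a_val:2d}:", counts)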

    Distributed Weighted Vertex Cover via Maximal Matchings

    In this paper we consider the problem of computing a minimum-weight vertex cover in an n-node, weighted, undirected graph G = (V, E). We present a fully distributed algorithm for computing vertex covers of weight at most twice the optimum, in the case of integer weights. Our algorithm runs in an expected number of O(log n + log Ŵ) communication rounds, where Ŵ is the average vertex weight. The previous best algorithm for this problem requires O(log n (log n + log Ŵ)) rounds and it is not fully distributed. For a maximal matching M in G it is a well-known fact that any vertex cover in G needs to have at least |M| vertices. Our algorithm is based on a generalization of this combinatorial lower bound to the weighted setting.
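    For intuition about the matching lower bound invoked above, here is a minimal centralized sketch of the classical maximal-matching 2-approximation for unweighted vertex cover: both endpoints of every matched edge enter the cover, and since any cover must hit each matched edge, the result is at most twice the optimum. It is a reference point only, not the distributed weighted algorithm of the paper; names and the toy instance are illustrative.

        # Build a maximal matching greedily and output the set of matched endpoints.
        # |cover| = 2|M| and any vertex cover has size >= |M|, giving a factor of 2.

        def vertex_cover_via_maximal_matching(edges):
            matched = set()                      # matched endpoints = the returned cover
            for u, v in edges:
                if u not in matched and v not in matched:
                    matched.add(u)
                    matched.add(v)
            return matched

        # Toy usage: on the 4-cycle a-b-c-d-a this returns all four vertices (size 4),
        # while an optimal cover such as {a, c} has size 2 -- exactly the factor-2 gap.
        print(vertex_cover_via_maximal_matching(
            [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a")]))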

    Distributed weighted vertex cover via maximal matchings

    In this article, we consider the problem of computing a minimum-weight vertex cover in an n-node, weighted, undirected graph G = (V, E). We present a fully distributed algorithm for computing vertex covers of weight at most twice the optimum, in the case of integer weights. Our algorithm runs in an expected number of O(log n + log Ŵ) communication rounds, where Ŵ is the average vertex weight. The previous best algorithm for this problem requires O(log n (log n + log Ŵ)) rounds and it is not fully distributed. For a maximal matching M in G, it is a well-known fact that any vertex cover in G needs to have at least |M| vertices. Our algorithm is based on a generalization of this combinatorial lower bound to the weighted setting.

    Limited-memory approximation algorithms for processing large graphs (the Vertex Cover problem)

    We are interested in an optimization problem on graphs (the Vertex Cover problem) in a very specific context: that of very large data instances. We defined a processing model based on three constraints (relating to the limited amount of memory available compared with the large amount of data to be processed) that combines properties of several existing models. We studied several algorithms adapted to this model. We first analysed, theoretically, the quality of their solutions as well as their complexities. We then carried out an experimental study on large graphs. Overall, the work done during this thesis can provide indicators for choosing the algorithm or algorithms best suited to solving the Vertex Cover problem on large graphs. Choosing an algorithm (an approximation algorithm, moreover) that is both efficient (in terms of solution quality and complexity) and satisfies the constraints of the model under consideration is delicate: the most efficient algorithms are not always the best adapted. In the work we carried out, we came to the conclusion that it is preferable to choose from the outset the algorithm that is best adapted rather than the one that is most efficient.