4 research outputs found

    Further Approximations for Demand Matching: Matroid Constraints and Minor-Closed Graphs

    We pursue a study of the Generalized Demand Matching problem, a common generalization of the b-Matching and Knapsack problems. Here, we are given a graph with vertex capacities, edge profits, and asymmetric demands on the edges. The goal is to find a maximum-profit subset of edges so that the demands of the chosen edges do not violate the vertex capacities. This problem is APX-hard and constant-factor approximations are already known. Our main results fall into two categories. First, using iterated relaxation and various filtering strategies, we show via an efficient rounding algorithm that if an additional matroid structure M is given and we further require the chosen edge set to be independent in M, the natural LP relaxation has an integrality gap of at most 25/3. This can be further improved in various special cases; for example, we improve over the 15-approximation for the previously-studied Coupled Placement problem [Korupolu et al. 2014] by giving a 7-approximation. Using similar techniques, we show that the problem of computing a minimum-cost base in M satisfying the vertex capacities admits a (1,3)-bicriteria approximation: the cost is at most the optimum and the capacities are violated by a factor of at most 3. This improves over the previous (1,4)-approximation in the special case that M is the graphic matroid over the given graph [Fukunaga and Nagamochi, 2009]. Second, we show that Demand Matching admits a polynomial-time approximation scheme in graphs that exclude a fixed minor. If all demands are polynomially-bounded integers, this is somewhat easy using dynamic programming in bounded-treewidth graphs. Our main technical contribution is a sparsification lemma that allows us to scale the demands of some items for use in a more intricate dynamic programming algorithm, followed by randomized rounding to filter our scaled-demand solution to one whose original demands satisfy all constraints.
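    To make the constraint structure concrete, here is a minimal Python sketch, not taken from the paper, of the feasibility check the problem imposes: each chosen edge places an asymmetric demand on both of its endpoints, and the accumulated demand at every vertex must stay within its capacity. The function names and the tuple encoding of edges are illustrative assumptions; the paper's iterated-relaxation and rounding machinery is not reproduced here.

    # Minimal sketch (not from the paper): feasibility and profit of a chosen
    # edge set in Generalized Demand Matching. Each edge (u, v) places an
    # asymmetric demand on u and on v; total demand at a vertex must not
    # exceed its capacity. The edge encoding below is an illustrative choice.

    def is_feasible(edges, capacities):
        """edges: iterable of (u, v, demand_on_u, demand_on_v, profit) tuples.
        capacities: dict mapping each vertex to its capacity b(v)."""
        load = {v: 0.0 for v in capacities}
        for u, v, d_u, d_v, _profit in edges:
            load[u] += d_u
            load[v] += d_v
        return all(load[v] <= capacities[v] for v in capacities)

    def total_profit(edges):
        return sum(profit for *_, profit in edges)

    # Example: two edges sharing vertex 'a', which has capacity 5.
    caps = {"a": 5, "b": 4, "c": 4}
    chosen = [("a", "b", 3, 2, 10.0), ("a", "c", 2, 1, 6.0)]
    print(is_feasible(chosen, caps), total_profit(chosen))  # True 16.0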

    Pruning a Minimum Spanning Tree

    This work employs two techniques to filter random noise from the information provided by minimum spanning trees obtained from the correlation matrices of international stock market indices prior to and during times of crisis. The first technique establishes a threshold above which connections are considered affected by noise, based on the study of random networks with the same probability density distribution as the original data. The second technique judges the strength of a connection by its survival rate, which is the amount of time a connection between two stock market indices endures. The idea is that true connections will survive for longer periods of time, and that random connections will not. That information is then combined with the information obtained from the first technique in order to create a smaller network, where most of the connections are either strong or enduring in time.
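    As a rough illustration of the first stage of this pipeline, the Python sketch below builds a minimum spanning tree from the correlation matrix of index returns and then drops tree edges whose distance lies above a noise threshold. The d = sqrt(2(1 - rho)) distance transform, the networkx-based implementation, and the example threshold value are common conventions assumed here for illustration, not the paper's exact procedure; the survival-rate filtering over time is not reproduced.

    import numpy as np
    import pandas as pd
    import networkx as nx

    def pruned_mst(returns: pd.DataFrame, threshold: float) -> nx.Graph:
        # Correlation matrix of the index return series, mapped to distances
        # (higher correlation -> shorter distance); the transform is a common
        # convention for correlation-based MSTs, assumed here for illustration.
        rho = returns.corr()
        dist = np.sqrt(2.0 * (1.0 - rho))
        g = nx.Graph()
        cols = list(returns.columns)
        for i, a in enumerate(cols):
            for b in cols[i + 1:]:
                g.add_edge(a, b, weight=float(dist.loc[a, b]))
        mst = nx.minimum_spanning_tree(g, weight="weight")
        # Drop MST edges whose distance lies above the noise threshold,
        # keeping only the stronger (shorter-distance) connections.
        noisy = [(a, b) for a, b, d in mst.edges(data=True) if d["weight"] > threshold]
        mst.remove_edges_from(noisy)
        return mst

    # Usage sketch: 'returns' holds daily log-returns, one column per index.
    # tree = pruned_mst(returns, threshold=1.1)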