
    Optimal Query Complexity for Reconstructing Hypergraphs

    In this paper we consider the problem of reconstructing a hidden weighted hypergraph of constant rank using additive queries. We prove the following: Let $G$ be a hidden weighted hypergraph of constant rank with $n$ vertices and $m$ hyperedges. For any $m$ there exists a non-adaptive algorithm that finds the hyperedges of the hypergraph and their weights using $O(\frac{m\log n}{\log m})$ additive queries. This solves the open problem in [S. Choi, J. H. Kim. Optimal Query Complexity Bounds for Finding Graphs. {\em STOC}, 749--758, 2008]. When the weights of the hypergraph are integers that are less than $O(\mathrm{poly}(n^d/m))$, where $d$ is the rank of the hypergraph (and therefore for unweighted hypergraphs), there exists a non-adaptive algorithm that finds the hyperedges and their weights using $O(\frac{m\log\frac{n^d}{m}}{\log m})$ additive queries. By the information-theoretic lower bound, the above query complexities are tight.
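The additive-query model for hypergraphs can be sketched as follows: a query names a set of vertices and receives the total weight of the hyperedges entirely contained in that set. A minimal illustration (the hyperedge set and weights below are hypothetical, not from the paper):

```python
# Hypothetical rank-3 weighted hypergraph: each hyperedge is a frozenset
# of at most 3 vertices, mapped to its (normally hidden) weight.
hyperedges = {
    frozenset({0, 1, 2}): 5,
    frozenset({1, 3}): -2,
    frozenset({2, 3, 4}): 7,
}

def additive_query(S):
    """Return the sum of weights of hyperedges fully contained in S."""
    S = set(S)
    return sum(w for e, w in hyperedges.items() if e <= S)

print(additive_query({0, 1, 2, 3}))  # {0,1,2} and {1,3} qualify: 5 + (-2) = 3
```

A reconstruction algorithm sees only the answers to such queries; non-adaptivity means all query sets are fixed before any answer is observed.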

    Finding Weighted Graphs by Combinatorial Search

    We consider the problem of finding the edges of a hidden weighted graph using a certain type of queries. Let $G$ be a weighted graph with $n$ vertices. In the most general setting, the $n$ vertices are known and no other information about $G$ is given. The problem is finding all edges of $G$ and their weights using additive queries, where, for an additive query, one chooses a set of vertices and asks the sum of the weights of edges with both ends in the set. This model has been extensively used in bioinformatics, including genome sequencing. Extending recent results of Bshouty and Mazzawi, and Choi and Kim, we present a polynomial-time randomized algorithm to find the hidden weighted graph $G$ when the number of edges in $G$ is known to be at most $m \geq 2$ and the weight $w(e)$ of each edge $e$ satisfies $\alpha \leq |w(e)| \leq \beta$ for fixed constants $\alpha, \beta > 0$. The query complexity of the algorithm is $O(\frac{m\log n}{\log m})$, which is optimal up to a constant factor.
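The additive query defined above (sum of the weights of edges with both endpoints in the chosen set) can be sketched directly; the graph and weights here are illustrative only:

```python
def additive_query(edges, S):
    """Sum of w(e) over edges e = (u, v) with both u and v in S.
    `edges` maps vertex pairs to weights (hidden from the algorithm)."""
    S = set(S)
    return sum(w for (u, v), w in edges.items() if u in S and v in S)

# Example hidden weighted graph on vertices {0, 1, 2, 3}.
G = {(0, 1): 2.0, (1, 2): -1.5, (0, 3): 4.0}
print(additive_query(G, {0, 1, 2}))  # (0,1) and (1,2): 2.0 + (-1.5) = 0.5
```

The algorithm's task is to recover the mapping `G` from $O(\frac{m\log n}{\log m})$ such answers, matching the information-theoretic lower bound up to a constant factor.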