
    Fat Polygonal Partitions with Applications to Visualization and Embeddings

    Let $\mathcal{T}$ be a rooted and weighted tree, where the weight of any node is equal to the sum of the weights of its children. The popular Treemap algorithm visualizes such a tree as a hierarchical partition of a square into rectangles, where the area of the rectangle corresponding to any node in $\mathcal{T}$ is equal to the weight of that node. The aspect ratio of the rectangles in such a rectangular partition necessarily depends on the weights and can become arbitrarily high. We introduce a new hierarchical partition scheme, called a polygonal partition, which uses convex polygons rather than just rectangles. We present two methods for constructing polygonal partitions, both having guarantees on the worst-case aspect ratio of the constructed polygons; in particular, both methods guarantee a bound on the aspect ratio that is independent of the weights of the nodes. We also consider rectangular partitions with slack, where the areas of the rectangles may differ slightly from the weights of the corresponding nodes. We show that this makes it possible to obtain partitions with constant aspect ratio. This result generalizes to hyper-rectangular partitions in $\mathbb{R}^d$. We use these partitions with slack for embedding ultrametrics into $d$-dimensional Euclidean space: we give a $\mathrm{polylog}(\Delta)$-approximation algorithm for embedding $n$-point ultrametrics into $\mathbb{R}^d$ with minimum distortion, where $\Delta$ denotes the spread of the metric, i.e., the ratio between the largest and the smallest distance between two points. The previously best-known approximation ratio for this problem was polynomial in $n$. This is the first algorithm for embedding a non-trivial family of weighted-graph metrics into a space of constant dimension that achieves a polylogarithmic approximation ratio.
    Comment: 26 pages
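    To make the rectangular baseline concrete, here is a minimal sketch of the classic slice-and-dice Treemap layout (the rectangular scheme whose aspect ratios the abstract criticizes, not the paper's polygonal partitions). The toy tree, node names, and weights are illustrative assumptions; the point is that a skewed weight forces a thin rectangle with a large aspect ratio.

    def slice_and_dice(node, x, y, w, h, horizontal=True, out=None):
        """Partition rectangle (x, y, w, h) among node's descendants, alternating split axes."""
        if out is None:
            out = []
        out.append((node["name"], x, y, w, h))
        total = sum(c["weight"] for c in node.get("children", [])) or 1
        offset = 0.0
        for child in node.get("children", []):
            frac = child["weight"] / total          # child's area share equals its weight share
            if horizontal:
                slice_and_dice(child, x + offset * w, y, frac * w, h, False, out)
            else:
                slice_and_dice(child, x, y + offset * h, w, frac * h, True, out)
            offset += frac
        return out

    # Hypothetical tree with very uneven weights.
    tree = {"name": "root", "weight": 100, "children": [
        {"name": "a", "weight": 96},
        {"name": "b", "weight": 4},
    ]}
    for name, x, y, w, h in slice_and_dice(tree, 0.0, 0.0, 1.0, 1.0):
        print(name, "area", round(w * h, 3), "aspect ratio", round(max(w / h, h / w), 1))

    On this input the rectangle for "b" already has aspect ratio 25, and skewing the weights further makes it arbitrarily worse, which is exactly the behavior the polygonal and slack-based partitions are designed to avoid.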

    The Unreasonable Success of Local Search: Geometric Optimization

    What is the effectiveness of local search algorithms for geometric problems in the plane? We prove that local search with neighborhoods of magnitude $1/\epsilon^c$ is an approximation scheme for the following problems in the Euclidean plane: TSP with random inputs, Steiner tree with random inputs, facility location (with worst-case inputs), and bicriteria $k$-median (also with worst-case inputs). The randomness assumption is necessary for TSP.
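    For readers unfamiliar with the local-search template the abstract analyzes, the sketch below runs a plain single-swap local search for $k$-median on points in the plane. It is a hedged illustration of the general technique only, not the paper's analyzed variant (whose neighborhoods have size $1/\epsilon^c$), and the random instance is illustrative.

    import itertools
    import random

    def cost(centers, points):
        """k-median objective: total Euclidean distance from each point to its nearest center."""
        return sum(min(((px - cx) ** 2 + (py - cy) ** 2) ** 0.5 for cx, cy in centers)
                   for px, py in points)

    def single_swap_local_search(points, k, seed=0):
        """Repeat any improving swap of one open center for one closed point until none exists."""
        rng = random.Random(seed)
        centers = rng.sample(points, k)
        improved = True
        while improved:
            improved = False
            for out_c, in_c in itertools.product(list(centers), points):
                if in_c in centers:
                    continue
                candidate = [in_c if c == out_c else c for c in centers]
                if cost(candidate, points) < cost(centers, points):
                    centers, improved = candidate, True
                    break
        return centers

    rng = random.Random(1)
    pts = [(rng.random(), rng.random()) for _ in range(40)]
    centers = single_swap_local_search(pts, k=3)
    print(round(cost(centers, pts), 3))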

    New Results for the MAP Problem in Bayesian Networks

    This paper presents new results for the (partial) maximum a posteriori (MAP) problem in Bayesian networks, which is the problem of querying the most probable state configuration of some of the network variables given evidence. First, it is demonstrated that the problem remains hard even in networks with very simple topology, such as binary polytrees and simple trees (including the Naive Bayes structure). These proofs extend previous complexity results for the problem. Inapproximability results are also derived for trees if the number of states per variable is not bounded. Although the problem is shown to be hard and inapproximable even in very simple scenarios, a new exact algorithm is described that is empirically fast in networks of bounded treewidth and bounded number of states per variable. The same algorithm is used as the basis of a Fully Polynomial Time Approximation Scheme for MAP under such assumptions. Approximation schemes were generally thought to be impossible for this problem, but we show otherwise for classes of networks that are important in practice. The algorithms are extensively tested using some well-known networks as well as randomly generated cases to show their effectiveness.
    Comment: A couple of typos were fixed, as well as the notation in part of section 4, which was misleading. Theoretical and empirical results have not changed
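    As a concrete illustration of what a partial MAP query computes, the sketch below brute-forces it on a tiny chain A -> H -> B with binary variables: maximize over the query variable A, sum out the hidden variable H, and condition on evidence for B. The network, variable names, and CPT values are made up for illustration; real instances of the kind the paper targets are far larger.

    # P(A), P(H | A), and P(B | H) as plain dictionaries (illustrative values).
    p_a = {0: 0.6, 1: 0.4}
    p_h_a = {(0, 0): 0.7, (1, 0): 0.3, (0, 1): 0.2, (1, 1): 0.8}   # key: (h, a)
    p_b_h = {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.4, (1, 1): 0.6}   # key: (b, h)

    def partial_map(evidence_b):
        """Return argmax_a sum_h P(A=a) P(H=h | A=a) P(B=evidence_b | H=h) and its score."""
        best = max(
            (sum(p_a[a] * p_h_a[(h, a)] * p_b_h[(evidence_b, h)] for h in (0, 1)), a)
            for a in (0, 1)
        )
        return best[1], best[0]   # (most probable value of A, unnormalized score)

    print(partial_map(evidence_b=1))   # -> (1, 0.2) with the toy CPTs above

    The summation over H is what makes the query "partial" MAP rather than plain MPE, and it is this mix of maximization and marginalization that drives the hardness results the abstract describes.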

    A cost function for similarity-based hierarchical clustering

    The development of algorithms for hierarchical clustering has been hampered by a shortage of precise objective functions. To help address this situation, we introduce a simple cost function on hierarchies over a set of points, given pairwise similarities between those points. We show that this criterion behaves sensibly in canonical instances and that it admits a top-down construction procedure with a provably good approximation ratio.
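    The abstract does not state the cost function itself, so the sketch below uses one natural similarity-based criterion of this kind as an assumption: each split in the hierarchy is charged the similarity it cuts times the size of the cluster being split, so highly similar points should only be separated deep in the tree. Hierarchies are hypothetical nested tuples and the similarity values are illustrative.

    def leaves(tree):
        """Set of leaf labels under a nested-tuple hierarchy."""
        if isinstance(tree, tuple):
            return leaves(tree[0]) | leaves(tree[1])
        return {tree}

    def cost(tree, sim):
        """Sum over splits of (similarity cut by the split) * (size of the cluster being split)."""
        if not isinstance(tree, tuple):
            return 0.0
        left, right = leaves(tree[0]), leaves(tree[1])
        across = sum(sim.get((min(i, j), max(i, j)), 0.0) for i in left for j in right)
        return across * (len(left) + len(right)) + cost(tree[0], sim) + cost(tree[1], sim)

    # Four points where {a, b} and {c, d} are the highly similar pairs.
    sim = {("a", "b"): 1.0, ("c", "d"): 1.0, ("a", "c"): 0.1,
           ("a", "d"): 0.1, ("b", "c"): 0.1, ("b", "d"): 0.1}
    good = (("a", "b"), ("c", "d"))   # separates only weakly similar pairs at the root
    bad = (("a", "c"), ("b", "d"))    # cuts the strongly similar pairs at the root
    print(cost(good, sim), "<", cost(bad, sim))   # 5.6 < 9.2

    Under this criterion the hierarchy that keeps similar points together until the bottom of the tree is the cheaper one, which is the kind of sensible behavior on canonical instances that the abstract refers to.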