
    Metric structures in L_1: Dimension, snowflakes, and average distortion

    We study the metric properties of finite subsets of L_1. The analysis of such metrics is central to a number of important algorithmic problems involving the cut structure of weighted graphs, including the Sparsest Cut Problem, one of the most compelling open problems in the field of approximation algorithms. Additionally, many open questions in geometric non-linear functional analysis involve the properties of finite subsets of L_1.
    Comment: 9 pages, 1 figure. To appear in European Journal of Combinatorics. Preliminary version appeared in LATIN '04.

    Metric embeddings with relaxed guarantees

    We consider the problem of embedding finite metrics with slack: We seek to produce embeddings with small dimension and distortion while allowing a (small) constant fraction of all distances to be arbitrarily distorted. This definition is motivated by recent research in the networking community, which achieved striking empirical success at embedding Internet latencies with low distortion into low-dimensional Euclidean space, provided that some small slack is allowed. Answering an open question of Kleinberg, Slivkins, and Wexler [in Proceedings of the 45th IEEE Symposium on Foundations of Computer Science, 2004], we show that provable guarantees of this type can in fact be achieved in general: Any finite metric space can be embedded, with constant slack and constant distortion, into constant-dimensional Euclidean space. We then show that there exist stronger embeddings into l_1 which exhibit gracefully degrading distortion: There is a single embedding into l_1 that achieves distortion at most O(log 1/ε) on all but at most an ε fraction of distances, simultaneously for all ε > 0. We extend this with distortion O((log 1/ε)^{1/p}) to maps into general l_p, p ≥ 1, for several classes of metrics, including those with bounded doubling dimension and those arising from the shortest-path metric of a graph with an excluded minor. Finally, we show that many of our constructions are tight, and give a general technique to obtain lower bounds for ε-slack embeddings from lower bounds for low-distortion embeddings. © 2009 Society for Industrial and Applied Mathematics.
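    The intuition behind such slack guarantees can be illustrated with a beacon-style Fréchet map, in the spirit of the networking work the abstract cites: each point is sent to its vector of distances from a few randomly sampled beacons, so only pairs that are far from every beacon can be badly distorted. Below is a minimal sketch in Python, assuming a precomputed distance matrix; the function name and parameters are illustrative, not from the paper.

```python
import numpy as np

def beacon_embedding(D, num_beacons=8, seed=0):
    """Map each point to its distances from randomly chosen beacons.

    D is an n x n matrix of pairwise metric distances. Each coordinate
    x -> d(x, b_i) is 1-Lipschitz by the triangle inequality, so the
    map into l_infinity is non-expansive; with enough beacons, all but
    a small fraction of pairs retain low distortion.
    """
    rng = np.random.default_rng(seed)
    n = D.shape[0]
    beacons = rng.choice(n, size=min(num_beacons, n), replace=False)
    return D[:, beacons]  # row x = (d(x, b_1), ..., d(x, b_k))

# toy usage: ten points on a line
pts = np.arange(10.0)
D = np.abs(pts[:, None] - pts[None, :])
print(beacon_embedding(D).shape)  # (10, 8)
```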

    Measured descent: A new embedding method for finite metrics

    We devise a new embedding technique, which we call measured descent, based on decomposing a metric space locally, at varying speeds, according to the density of some probability measure. This provides a refined and unified framework for the two primary methods of constructing Fréchet embeddings for finite metrics, due to [Bourgain, 1985] and [Rao, 1999]. We prove that any n-point metric space (X,d) embeds in Hilbert space with distortion O(√(α_X log n)), where α_X is a geometric estimate on the decomposability of X. As an immediate corollary, we obtain an O(√((log λ_X) log n)) distortion embedding, where λ_X is the doubling constant of X. Since λ_X ≤ n, this result recovers Bourgain's theorem, but when the metric X is, in a sense, "low-dimensional," improved bounds are achieved. Our embeddings are volume-respecting for subsets of arbitrary size. One consequence is the existence of (k, O(log n)) volume-respecting embeddings for all 1 ≤ k ≤ n, which is the best possible, and answers positively a question posed by U. Feige. Our techniques are also used to answer positively a question of Y. Rabinovich, showing that any weighted n-point planar graph embeds in l_∞^{O(log n)} with O(1) distortion. The O(log n) bound on the dimension is optimal, and improves upon the previously known bound of O((log n)^2).
    Comment: 17 pages. No figures. Appeared in FOCS '04. To appear in Geometric & Functional Analysis. This version fixes a subtle error in Section 2.
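    For orientation, the Fréchet-style construction that measured descent refines is easy to sketch: coordinates are distances to random subsets sampled at geometrically varying densities, as in Bourgain's original scheme. The sketch below is that classical construction, not the measured-descent refinement itself; it assumes a distance matrix as input, and the names and parameters are mine.

```python
import numpy as np

def bourgain_embedding(D, seed=0):
    """Bourgain-style Frechet embedding of an n-point metric.

    For each scale t = 1..log2(n), and several repetitions per scale,
    sample a random subset S of density 2^-t and use the coordinate
    d(x, S) = min_{s in S} d(x, s). With O(log^2 n) such coordinates
    this achieves O(log n) distortion (Bourgain 1985).
    """
    rng = np.random.default_rng(seed)
    n = D.shape[0]
    scales = max(1, int(np.log2(n)))
    reps = scales  # O(log n) repetitions per scale
    coords = []
    for t in range(1, scales + 1):
        p = 2.0 ** (-t)
        for _ in range(reps):
            mask = rng.random(n) < p
            if not mask.any():          # ensure S is non-empty
                mask[rng.integers(n)] = True
            coords.append(D[:, mask].min(axis=1))  # d(x, S) for every x
    return np.stack(coords, axis=1)

# toy usage: the uniform metric on 8 points
D = np.ones((8, 8)) - np.eye(8)
print(bourgain_embedding(D).shape)
```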

    Near-Neighbor Preserving Dimension Reduction for Doubling Subsets of l_1

    Randomized dimensionality reduction has been recognized as one of the fundamental techniques in handling high-dimensional data. Starting with the celebrated Johnson-Lindenstrauss Lemma, such reductions have been studied in depth for the Euclidean (l_2) metric, but much less for the Manhattan (l_1) metric. Our primary motivation is the approximate nearest neighbor problem in l_1. We exploit its reduction to the decision-with-witness version, called approximate near neighbor, which incurs a roughly logarithmic overhead. In 2007, Indyk and Naor, in the context of approximate nearest neighbors, introduced the notion of nearest neighbor-preserving embeddings. These are randomized embeddings between two metric spaces with guaranteed bounded distortion only for the distances between a query point and a point set. Such embeddings are known to exist for both the l_2 and l_1 metrics, as well as for doubling subsets of l_2. The case that remained open was that of doubling subsets of l_1. In this paper, we propose a dimension reduction by means of a near neighbor-preserving embedding for doubling subsets of l_1. Our approach is to represent the point set with a carefully chosen covering set, then randomly project the latter. We study two types of covering sets: c-approximate r-nets and randomly shifted grids, and we discuss the tradeoff between them in terms of preprocessing time and target dimension. We employ Cauchy variables; certain concentration bounds we derive should be of independent interest.
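    The use of Cauchy variables rests on their 1-stability: if A has i.i.d. standard Cauchy entries, each coordinate of A(x - y) is distributed as ||x - y||_1 times a standard Cauchy variable, so a median estimator recovers l_1 distances after projection. Below is a minimal sketch of this stable-projection primitive; it omits the paper's covering-set step, and the function names and parameters are assumptions.

```python
import numpy as np

def cauchy_project(X, target_dim=200, seed=0):
    """Project rows of X by an i.i.d. standard-Cauchy matrix.

    By 1-stability, each coordinate of (x - y) @ A is ||x - y||_1
    times a standard Cauchy variable, whose absolute value has
    median 1, so the median of |coordinates| tracks the l_1 distance.
    """
    rng = np.random.default_rng(seed)
    A = rng.standard_cauchy(size=(X.shape[1], target_dim))
    return X @ A

def l1_estimate(u, v):
    """Median-based estimator of the original l_1 distance."""
    return np.median(np.abs(u - v))

# toy usage: the estimate tracks the true l_1 distance
rng = np.random.default_rng(1)
X = rng.random((2, 1000))
Y = cauchy_project(X)
print(np.abs(X[0] - X[1]).sum(), l1_estimate(Y[0], Y[1]))
```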

    Metric Embedding via Shortest Path Decompositions

    We study the problem of embedding shortest-path metrics of weighted graphs into l_p spaces. We introduce a new embedding technique based on low-depth decompositions of a graph via shortest paths. The notion of Shortest Path Decomposition depth is inductively defined: A (weighted) path graph has shortest path decomposition (SPD) depth 1. A general graph has an SPD of depth k if it contains a shortest path whose deletion leads to a graph, each of whose components has SPD depth at most k-1. In this paper we give an O(k^{min{1/p, 1/2}})-distortion embedding for graphs of SPD depth at most k. This result is asymptotically tight for any fixed p > 1, while for p = 1 it is tight up to second-order terms. As a corollary of this result, we show that graphs having pathwidth k embed into l_p with distortion O(k^{min{1/p, 1/2}}). For p = 1, this improves over the best previous bound of Lee and Sidiropoulos, which was exponential in k; moreover, for other values of p it gives the first embeddings whose distortion is independent of the graph size n. Furthermore, we use the fact that planar graphs have SPD depth O(log n) to give a new proof that any planar graph embeds into l_1 with distortion O(√(log n)). Our approach also gives new results for graphs with bounded treewidth, and for graphs excluding a fixed minor.
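    The inductive definition of SPD depth translates directly into a brute-force checker for tiny graphs: try deleting each shortest path, recurse on the remaining components, and take the best outcome. Below is a minimal sketch using networkx, assuming a connected input graph; it runs in exponential time and is purely illustrative, with names of my choosing.

```python
import itertools
import networkx as nx

def is_path_graph(G):
    """True if G is a single vertex or a simple path."""
    if G.number_of_nodes() <= 1:
        return True
    degs = sorted(d for _, d in G.degree())
    return nx.is_connected(G) and degs[:2] == [1, 1] and degs[-1] <= 2

def spd_depth(G):
    """Brute-force SPD depth of a small connected weighted graph.

    A (weighted) path graph has SPD depth 1; a graph has SPD depth k
    if deleting some shortest path leaves components of depth <= k-1.
    """
    if G.number_of_nodes() == 0:
        return 0
    if is_path_graph(G):
        return 1
    # candidate shortest paths: single vertices and all u-v geodesics
    candidates = [[v] for v in G.nodes]
    for u, v in itertools.combinations(G.nodes, 2):
        candidates.extend(nx.all_shortest_paths(G, u, v, weight="weight"))
    best = float("inf")
    for path in candidates:
        H = G.copy()
        H.remove_nodes_from(path)
        sub = max((spd_depth(H.subgraph(c).copy())
                   for c in nx.connected_components(H)), default=0)
        best = min(best, sub + 1)
    return best

# toy usage: a 4-cycle is not a path, but deleting one geodesic leaves one
print(spd_depth(nx.cycle_graph(4)))  # 2
```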