
    Optimal Embeddings of Distance Regular Graphs into Euclidean Spaces

    In this paper we give a lower bound on the least distortion with which a distance-regular graph can be embedded into Euclidean space. We use this bound to determine the least distortion for Hamming graphs, Johnson graphs, and all strongly regular graphs. Our technique involves semidefinite programming and exploits the algebraic structure of the optimization problem, so that finding a lower bound on the least distortion reduces to an analytic question about orthogonal polynomials.
    Comment: 10 pages, (v3) some corrections, accepted in Journal of Combinatorial Theory, Series
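
    As an illustration of the kind of optimization involved, the sketch below (not the paper's program) solves the standard semidefinite program of Linial, London, and Rabinovich for the least distortion $c_2(X)$ of a finite metric space in Euclidean space, evaluated here on the Petersen graph, a strongly regular graph; it assumes the cvxpy and networkx packages.

        # A minimal sketch (not the paper's program): the standard SDP that
        # computes the least Euclidean distortion c_2(X) of a finite metric,
        # evaluated on the Petersen graph, a strongly regular graph.
        import cvxpy as cp
        import networkx as nx
        import numpy as np

        G = nx.petersen_graph()
        n = G.number_of_nodes()
        D = np.asarray(nx.floyd_warshall_numpy(G))  # shortest-path metric

        Q = cp.Variable((n, n), PSD=True)  # Gram matrix of the embedding
        t = cp.Variable()                  # t = c^2, the squared distortion

        constraints = []
        for i in range(n):
            for j in range(i + 1, n):
                dij2 = D[i, j] ** 2
                edist2 = Q[i, i] + Q[j, j] - 2 * Q[i, j]  # squared embedded distance
                constraints += [edist2 >= dij2,           # non-contracting
                                edist2 <= t * dij2]       # expansion at most c^2

        prob = cp.Problem(cp.Minimize(t), constraints)
        prob.solve()
        print("least distortion c_2 =", np.sqrt(t.value))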

    On the optimality of gluing over scales

    We show that for every $\alpha > 0$, there exist $n$-point metric spaces $(X, d)$ in which every "scale" admits a Euclidean embedding with distortion at most $\alpha$, but the whole space requires distortion at least $\Omega(\sqrt{\alpha \log n})$. This shows that the scale-gluing lemma [Lee, SODA 2005] is tight, and disproves a conjecture stated there. The matching upper bound was known to be tight at both endpoints, i.e. when $\alpha = \Theta(1)$ and $\alpha = \Theta(\log n)$, but nowhere in between. More specifically, we exhibit $n$-point spaces with doubling constant $\lambda$ requiring Euclidean distortion $\Omega(\sqrt{\log \lambda \log n})$, which also shows that the technique of "measured descent" [Krauthgamer et al., Geometric and Functional Analysis] is optimal. We extend this to obtain a similar tight result for $L_p$ spaces with $p > 1$.
    Comment: minor revision
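
    For concreteness, the snippet below (an illustration, not taken from the paper) computes the distortion of a given embedding of a finite metric space: the product of the worst expansion and the worst contraction over all pairs of points.

        # Illustrative helper (not from the paper): the distortion of a map
        # f : (X, d) -> R^k is the product of its worst expansion and worst
        # contraction over all pairs of points.
        import numpy as np

        def distortion(D, Y):
            """D: n x n metric; Y: n x k coordinates of the embedded points."""
            n = D.shape[0]
            expansion = contraction = 1.0
            for i in range(n):
                for j in range(i + 1, n):
                    ratio = np.linalg.norm(Y[i] - Y[j]) / D[i, j]
                    expansion = max(expansion, ratio)
                    contraction = max(contraction, 1.0 / ratio)
            return expansion * contraction

        # Example: the 4-cycle embedded in the plane as a unit square has
        # distortion sqrt(2), which is in fact optimal for this space.
        D = np.array([[0, 1, 2, 1], [1, 0, 1, 2], [2, 1, 0, 1], [1, 2, 1, 0]], float)
        Y = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], float)
        print(distortion(D, Y))  # -> 1.414...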

    06481 Abstracts Collection -- Geometric Networks and Metric Space Embeddings

    The Dagstuhl Seminar 06481 "Geometric Networks and Metric Space Embeddings" was held from November 26 to December 1, 2006 in the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. In this paper we describe the seminar topics, compile a list of the open questions posed during the seminar, list all talks, and include abstracts of the presentations. Links to extended abstracts or full papers are provided where available.

    Small Transformers Compute Universal Metric Embeddings

    We study representations of data from an arbitrary metric space $\mathcal{X}$ in the space of univariate Gaussian mixtures equipped with a transport metric (Delon and Desolneux 2020). We derive embedding guarantees for feature maps implemented by small neural networks called "probabilistic transformers". Our guarantees are of memorization type: we prove that a probabilistic transformer of depth about $n \log(n)$ and width about $n^2$ can bi-Hölder embed any $n$-point dataset from $\mathcal{X}$ with low metric distortion, thus avoiding the curse of dimensionality. We further derive probabilistic bi-Lipschitz guarantees, which trade off the amount of distortion against the probability that a randomly chosen pair of points embeds with that distortion. If $\mathcal{X}$'s geometry is sufficiently regular, we obtain stronger bi-Lipschitz guarantees for all points in the dataset. As applications, we derive neural embedding guarantees for datasets from Riemannian manifolds, metric trees, and certain types of combinatorial graphs. When instead embedding into multivariate Gaussian mixtures, we show that probabilistic transformers can compute bi-Hölder embeddings with arbitrarily small distortion.
    Comment: 42 pages, 10 figures, 3 tables
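
    The target space can be made concrete: for univariate Gaussians, $W_2$ has the closed form $\sqrt{(m_1 - m_2)^2 + (s_1 - s_2)^2}$, and the Delon-Desolneux mixture distance $MW_2$ reduces to a small discrete optimal transport problem over mixture components. The sketch below computes it with scipy's linear programming solver; the transformer itself is not shown, and the function name mw2 is our own.

        # Sketch of the target space only (the transformer is not shown):
        # the Delon-Desolneux mixture-Wasserstein distance MW2 between two
        # univariate Gaussian mixtures reduces to a discrete optimal
        # transport problem whose cost between components is the closed-form
        # W2 distance between 1-D Gaussians.
        import numpy as np
        from scipy.optimize import linprog

        def mw2(w1, mu1, s1, w2, mu2, s2):
            """MW2 between sum_i w1[i] N(mu1[i], s1[i]^2) and likewise for 2."""
            m, n = len(w1), len(w2)
            # Squared W2 between Gaussian components: (mu-mu')^2 + (s-s')^2.
            C = (mu1[:, None] - mu2[None, :]) ** 2 + (s1[:, None] - s2[None, :]) ** 2
            # Transport plan P >= 0 with row sums w1 and column sums w2.
            A_eq = np.zeros((m + n, m * n))
            for i in range(m):
                A_eq[i, i * n:(i + 1) * n] = 1.0
            for j in range(n):
                A_eq[m + j, j::n] = 1.0
            b_eq = np.concatenate([w1, w2])
            res = linprog(C.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
            return np.sqrt(res.fun)

        # Two toy mixtures: a symmetric two-component mixture vs. one Gaussian.
        print(mw2(np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([0.3, 0.3]),
                  np.array([1.0]),      np.array([0.0]),       np.array([0.5])))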

    On metric Ramsey-type phenomena

    The main question studied in this article may be viewed as a nonlinear analogue of Dvoretzky's theorem in Banach space theory, or as part of Ramsey theory in combinatorics. Given a finite metric space on $n$ points, we seek its subspace of largest cardinality which can be embedded with a given distortion in Hilbert space. We provide nearly tight upper and lower bounds on the cardinality of this subspace in terms of $n$ and the desired distortion. Our main theorem states that for any $\epsilon > 0$, every $n$-point metric space contains a subset of size at least $n^{1-\epsilon}$ which is embeddable in Hilbert space with $O(\log(1/\epsilon)/\epsilon)$ distortion. The bound on the distortion is tight up to the $\log(1/\epsilon)$ factor. We further include a comprehensive study of various other aspects of this problem.
    Comment: 67 pages, published version
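
    To get a feel for the tradeoff (constants suppressed), the short snippet below tabulates the guaranteed subset size $n^{1-\epsilon}$ against the distortion bound $O(\log(1/\epsilon)/\epsilon)$ for a few values of $\epsilon$.

        # Illustrative arithmetic only: subset size n^{1-eps} versus the
        # O(log(1/eps)/eps) distortion bound, constants suppressed.
        import math

        n = 10**6
        for eps in (0.5, 0.25, 0.1, 0.05):
            size = n ** (1 - eps)
            dist = math.log(1 / eps) / eps
            print(f"eps={eps:<5} subset size >= {size:12.0f}  distortion = O({dist:.2f})")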