
    Distinct Distances in Graph Drawings

    The \emph{distance-number} of a graph $G$ is the minimum number of distinct edge-lengths over all straight-line drawings of $G$ in the plane. This definition generalises many well-known concepts in combinatorial geometry. We consider the distance-number of trees, graphs with no $K_4^-$-minor, complete bipartite graphs, complete graphs, and Cartesian products. Our main results concern the distance-number of graphs with bounded degree. We prove that $n$-vertex graphs with bounded maximum degree and bounded treewidth have distance-number in $\mathcal{O}(\log n)$. For such a logarithmic upper bound to hold, both the degree and the treewidth need to be bounded. In particular, we construct graphs with treewidth 2 and polynomial distance-number. Similarly, we prove that there exist graphs with maximum degree 5 and arbitrarily large distance-number. Moreover, as $\Delta$ increases, the existential lower bound on the distance-number of $\Delta$-regular graphs tends to $\Omega(n^{0.864138})$.
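    For a fixed drawing, the quantity being minimised is a direct finite check. A minimal Python sketch, counting the distinct edge-lengths of one given straight-line drawing (the distance-number itself minimises this count over all drawings, which the sketch does not attempt; the function name and the tolerance parameter are illustrative assumptions, not from the paper):

```python
import math

def distinct_edge_lengths(positions, edges, tol=1e-9):
    """Count distinct edge-lengths of one straight-line drawing.

    positions: dict mapping vertex -> (x, y) coordinates in the plane
    edges: iterable of (u, v) vertex pairs
    tol: lengths within tol of each other are treated as equal
    """
    lengths = sorted(math.dist(positions[u], positions[v]) for u, v in edges)
    count, prev = 0, None
    for ell in lengths:
        if prev is None or ell - prev > tol:  # new length class starts here
            count += 1
        prev = ell
    return count

# Example: a unit square drawn with its four sides plus one diagonal
# realises exactly two distinct edge-lengths, 1 and sqrt(2).
pos = {0: (0, 0), 1: (1, 0), 2: (1, 1), 3: (0, 1)}
E = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
print(distinct_edge_lengths(pos, E))  # -> 2
```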

    Small Transformers Compute Universal Metric Embeddings

    We study representations of data from an arbitrary metric space $\mathcal{X}$ in the space of univariate Gaussian mixtures equipped with a transport metric (Delon and Desolneux 2020). We derive embedding guarantees for feature maps implemented by small neural networks called \emph{probabilistic transformers}. Our guarantees are of memorization type: we prove that a probabilistic transformer of depth about $n\log(n)$ and width about $n^2$ can bi-Hölder embed any $n$-point dataset from $\mathcal{X}$ with low metric distortion, thus avoiding the curse of dimensionality. We further derive probabilistic bi-Lipschitz guarantees, which trade off the amount of distortion against the probability that a randomly chosen pair of points embeds with that distortion. If $\mathcal{X}$'s geometry is sufficiently regular, we obtain stronger, bi-Lipschitz guarantees for all points in the dataset. As applications, we derive neural embedding guarantees for datasets from Riemannian manifolds, metric trees, and certain types of combinatorial graphs. When instead embedding into multivariate Gaussian mixtures, we show that probabilistic transformers can compute bi-Hölder embeddings with arbitrarily small distortion.
    Comment: 42 pages, 10 figures, 3 tables
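    To make the distortion notion concrete: on a finite dataset, the bi-Lipschitz distortion of an embedding $f$ is the spread of the ratio $d_Y(f(x), f(x'))/d_X(x, x')$ over all point pairs. A hedged sketch of that check follows; the paper's probabilistic transformers and the transport metric on Gaussian mixtures are not implemented here, and the function name and toy metrics are assumptions for illustration only:

```python
from itertools import combinations

def bilipschitz_distortion(points, f, d_X, d_Y):
    """Return (L_lower, L_upper, distortion) for the embedding f, where
    L_lower * d_X <= d_Y(f(.), f(.)) <= L_upper * d_X on all pairs of the
    dataset, and distortion = L_upper / L_lower."""
    ratios = [
        d_Y(f(x), f(y)) / d_X(x, y)
        for x, y in combinations(points, 2)
        if d_X(x, y) > 0  # skip coincident points
    ]
    lo, hi = min(ratios), max(ratios)
    return lo, hi, hi / lo

# Toy usage: x -> 2x + 1 on the real line scales all distances by exactly 2,
# so it is bi-Lipschitz with distortion 1.
pts = [0.0, 1.0, 3.0, 7.0]
lo, hi, dist = bilipschitz_distortion(
    pts, lambda x: 2 * x + 1, lambda a, b: abs(a - b), lambda a, b: abs(a - b)
)
print(lo, hi, dist)  # -> 2.0 2.0 1.0
```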