
    Sparse geometric graphs with small dilation

    Given a set S of n points in R^d, and an integer k such that 0 <= k < n, we show that a geometric graph with vertex set S, at most n - 1 + k edges, maximum degree five, and dilation O(n/(k+1)) can be computed in time O(n log n). For any k, we also construct planar n-point sets for which any geometric graph with n - 1 + k edges has dilation Omega(n/(k+1)); a slightly weaker statement holds if the points of S are required to be in convex position.
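    The dilation bounded in this abstract can be computed directly from its definition: the maximum, over all point pairs, of graph shortest-path distance divided by Euclidean distance. Below is a minimal brute-force sketch of that quantity (not the paper's O(n log n) construction; function and variable names are our own):

```python
# Brute-force dilation (stretch factor) of a geometric graph:
# illustrates the quantity the paper bounds, not its algorithm.
from itertools import combinations
from math import dist, inf

def dilation(points, edges):
    """points: list of coordinate tuples; edges: list of index pairs.
    Returns max over point pairs of (graph distance / Euclidean distance)."""
    n = len(points)
    # All-pairs shortest paths (Floyd-Warshall) on Euclidean edge lengths.
    d = [[inf] * n for _ in range(n)]
    for i in range(n):
        d[i][i] = 0.0
    for u, v in edges:
        w = dist(points[u], points[v])
        d[u][v] = min(d[u][v], w)
        d[v][u] = min(d[v][u], w)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return max(d[u][v] / dist(points[u], points[v])
               for u, v in combinations(range(n), 2))

# Example: a 2-edge path; the detour between the endpoints has ratio sqrt(2).
pts = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)]
print(dilation(pts, [(0, 1), (1, 2)]))  # ~1.414
```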

    Characterizing the impact of geometric properties of word embeddings on task performance

    Analysis of word embedding properties to inform their use in downstream NLP tasks has largely been studied by assessing nearest neighbors. However, geometric properties of the continuous feature space contribute directly to the use of embedding features in downstream models, and are largely unexplored. We consider four properties of word embedding geometry, namely: position relative to the origin, distribution of features in the vector space, global pairwise distances, and local pairwise distances. We define a sequence of transformations to generate new embeddings that expose subsets of these properties to downstream models and evaluate change in task performance to understand the contribution of each property to NLP models. We transform publicly available pretrained embeddings from three popular toolkits (word2vec, GloVe, and FastText) and evaluate on a variety of intrinsic tasks, which model linguistic information in the vector space, and extrinsic tasks, which use vectors as input to machine learning models. We find that intrinsic evaluations are highly sensitive to absolute position, while extrinsic tasks rely primarily on local similarity. Our findings suggest that future embedding models and post-processing techniques should focus primarily on similarity to nearby points in vector space. Comment: Appearing in the Third Workshop on Evaluating Vector Space Representations for NLP (RepEval 2019). 7 pages + references.
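    A minimal sketch of the kind of geometry probed here (toy data, not the paper's exact transformation suite): translating every vector preserves pairwise Euclidean distances, so local similarity structure survives, while cosine-style measures tied to position relative to the origin do not:

```python
# Translation changes absolute position but not pairwise distances.
import numpy as np

rng = np.random.default_rng(0)
emb = rng.normal(size=(5, 50))   # 5 toy "word" vectors, 50 dimensions
shifted = emb + 3.0              # translate all vectors away from the origin

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Pairwise Euclidean distances are invariant under translation...
d0 = np.linalg.norm(emb[0] - emb[1])
d1 = np.linalg.norm(shifted[0] - shifted[1])
print(np.isclose(d0, d1))        # True

# ...but cosine similarity (position relative to the origin) is not.
print(cosine(emb[0], emb[1]), cosine(shifted[0], shifted[1]))
```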

    Communication tree problems

    In this paper, we consider random communication requirements and several cost measures for a particular model of tree routing on a complete network. We first show that a random tree does not give any approximation guarantee. We then give approximation algorithms for two random models of requirements.
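    The abstract does not fix a cost measure, so as a sketch under one common assumption: route each requirement pair over its unique tree path and charge demand times hop count. All names here are our own, illustrative only:

```python
# Assumed cost measure: sum over pairs of requirement * tree path length.
from itertools import combinations
from collections import deque

def tree_routing_cost(n, tree_edges, req):
    """n: node count; tree_edges: the n-1 edges of the routing tree;
    req[u][v]: communication requirement between nodes u and v."""
    adj = [[] for _ in range(n)]
    for u, v in tree_edges:
        adj[u].append(v)
        adj[v].append(u)

    def hops(src):                      # BFS hop counts from src in the tree
        d = [-1] * n
        d[src] = 0
        q = deque([src])
        while q:
            x = q.popleft()
            for y in adj[x]:
                if d[y] < 0:
                    d[y] = d[x] + 1
                    q.append(y)
        return d

    return sum(req[u][v] * hops(u)[v] for u, v in combinations(range(n), 2))

# Star vs. path on 4 nodes with unit requirements between all pairs.
unit = [[1] * 4 for _ in range(4)]
print(tree_routing_cost(4, [(0, 1), (0, 2), (0, 3)], unit))  # star: 9
print(tree_routing_cost(4, [(0, 1), (1, 2), (2, 3)], unit))  # path: 10
```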

    On metric Ramsey-type phenomena

    The main question studied in this article may be viewed as a nonlinear analogue of Dvoretzky's theorem in Banach space theory or as part of Ramsey theory in combinatorics. Given a finite metric space on n points, we seek its subspace of largest cardinality which can be embedded with a given distortion in Hilbert space. We provide nearly tight upper and lower bounds on the cardinality of this subspace in terms of n and the desired distortion. Our main theorem states that for any epsilon>0, every n-point metric space contains a subset of size at least n^{1-\epsilon} which is embeddable in Hilbert space with O(\frac{\log(1/\epsilon)}{\epsilon}) distortion. The bound on the distortion is tight up to the log(1/\epsilon) factor. We further include a comprehensive study of various other aspects of this problem. Comment: 67 pages, published version.
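    The distortion in question has a direct definition: for an embedding f of a finite metric space, it is the largest expansion times the largest contraction over all point pairs, equivalently the max-to-min ratio of distance ratios. A minimal sketch of checking it numerically (illustrative only, not the paper's construction):

```python
# Distortion of an embedding of a finite metric space into R^m.
from itertools import combinations
import numpy as np

def distortion(dist_matrix, embedded):
    """dist_matrix[i][j]: original metric; embedded[i]: image point in R^m.
    Distortion = (max ratio) / (min ratio); scale-invariant, always >= 1."""
    ratios = [np.linalg.norm(embedded[i] - embedded[j]) / dist_matrix[i][j]
              for i, j in combinations(range(len(embedded)), 2)]
    return max(ratios) / min(ratios)

# Uniform metric on 3 points embedded as an equilateral triangle: distortion 1.
D = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], float)
pts = np.array([[0, 0], [1, 0], [0.5, np.sqrt(3) / 2]])
print(distortion(D, pts))  # ~1.0
```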