Sparse geometric graphs with small dilation
Given a set S of n points in R^D, and an integer k such that 0 <= k < n, we
show that a geometric graph with vertex set S, at most n - 1 + k edges, maximum
degree five, and dilation O(n / (k+1)) can be computed in time O(n log n). For
any k, we also construct planar n-point sets for which any geometric graph with
n-1+k edges has dilation Omega(n/(k+1)); a slightly weaker statement holds if
the points of S are required to be in convex position.
Computing a Minimum-Dilation Spanning Tree is NP-hard
In a geometric network G = (S, E), the graph distance between two vertices u,
v in S is the length of the shortest path in G connecting u to v. The dilation
of G is the maximum factor by which the graph distance of a pair of vertices
differs from their Euclidean distance. We show that given a set S of n points
with integer coordinates in the plane and a rational dilation delta > 1, it is
NP-hard to determine whether a spanning tree of S with dilation at most delta
exists.
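The dilation measure defined above is easy to evaluate exactly on small inputs. The sketch below (plain Python with all-pairs shortest paths via Floyd-Warshall, not code from either paper) computes the dilation of a path spanning the corners of a unit square:

```python
import math
from itertools import combinations

def dilation(points, edges):
    """Dilation of a geometric graph: the maximum, over all vertex pairs,
    of (shortest-path length in the graph) / (Euclidean distance)."""
    n = len(points)
    INF = float("inf")
    # Initialise all-pairs distances with the Euclidean edge lengths.
    d = [[INF] * n for _ in range(n)]
    for i in range(n):
        d[i][i] = 0.0
    for u, v in edges:
        w = math.dist(points[u], points[v])
        d[u][v] = min(d[u][v], w)
        d[v][u] = min(d[v][u], w)
    # Floyd-Warshall all-pairs shortest paths.
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return max(d[u][v] / math.dist(points[u], points[v])
               for u, v in combinations(range(n), 2))

# A three-edge path through the corners of a unit square: the two corners
# joined only via the full path are at Euclidean distance 1 but graph
# distance 3, so the dilation of this spanning tree is 3.
pts = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
path = [(0, 1), (1, 2), (2, 3)]
print(dilation(pts, path))  # prints 3.0
```

The hardness result above says that deciding whether any spanning tree pushes this quantity below a given rational delta is NP-hard, even though evaluating it for one tree, as here, takes polynomial time.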
Lower bounds on the dilation of plane spanners
(I) We exhibit a set of 23 points in the plane whose dilation improves the previously best lower bound on the worst-case dilation of plane spanners.
(II) For every sufficiently large integer n, there exists an n-element point set S for which we bound the degree 3 dilation of S from below in the domain of plane geometric spanners. In the same domain, we show that for every sufficiently large integer n, there exists an n-element point set S for which we bound the degree 4 dilation of S from below. The previous best lower bound holds for any degree.
(III) For every sufficiently large integer n, there exists an n-element point set S such that the stretch factor of the greedy triangulation of S is bounded below by a constant greater than 1.
Comment: Revised definitions in the introduction; 23 pages, 15 figures; 2 tables
Characterizing the impact of geometric properties of word embeddings on task performance
Analysis of word embedding properties to inform their use in downstream NLP
tasks has largely been studied by assessing nearest neighbors. However,
geometric properties of the continuous feature space contribute directly to the
use of embedding features in downstream models, and are largely unexplored. We
consider four properties of word embedding geometry, namely: position relative
to the origin, distribution of features in the vector space, global pairwise
distances, and local pairwise distances. We define a sequence of
transformations to generate new embeddings that expose subsets of these
properties to downstream models and evaluate change in task performance to
understand the contribution of each property to NLP models. We transform
publicly available pretrained embeddings from three popular toolkits (word2vec,
GloVe, and FastText) and evaluate on a variety of intrinsic tasks, which model
linguistic information in the vector space, and extrinsic tasks, which use
vectors as input to machine learning models. We find that intrinsic evaluations
are highly sensitive to absolute position, while extrinsic tasks rely primarily
on local similarity. Our findings suggest that future embedding models and
post-processing techniques should focus primarily on similarity to nearby
points in vector space.
Comment: Appearing in the Third Workshop on Evaluating Vector Space Representations for NLP (RepEval 2019). 7 pages + references
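The distinction the paper draws between absolute position and local similarity can be illustrated with toy vectors. The sketch below uses hypothetical 2-d "embeddings" (not taken from word2vec, GloVe, or FastText) to show that a common translation, which changes position relative to the origin, alters cosine similarities while leaving every pairwise Euclidean distance intact:

```python
import math

def cosine(u, v):
    """Cosine similarity: sensitive to absolute position in the space."""
    return sum(a * b for a, b in zip(u, v)) / (math.hypot(*u) * math.hypot(*v))

def translate(vecs, offset):
    """Shift every vector by the same offset: changes position relative
    to the origin but preserves all pairwise distances."""
    return [tuple(a + b for a, b in zip(v, offset)) for v in vecs]

# Toy 2-d "embeddings" (hypothetical, for illustration only).
vecs = [(1.0, 0.2), (0.9, 0.3), (0.1, 1.0)]
shifted = translate(vecs, (5.0, 5.0))

# Cosine similarity changes under translation ...
print(cosine(vecs[0], vecs[2]), cosine(shifted[0], shifted[2]))

# ... but pairwise Euclidean distances (local similarity) do not.
for i in range(3):
    for j in range(3):
        assert abs(math.dist(vecs[i], vecs[j])
                   - math.dist(shifted[i], shifted[j])) < 1e-9
```

This is the mechanism behind the paper's finding: a transformation can disrupt intrinsic, position-sensitive evaluations while leaving the local neighborhood structure that extrinsic models rely on unchanged.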
CayleyNets: Graph Convolutional Neural Networks with Complex Rational Spectral Filters
The rise of graph-structured data such as social networks, regulatory
networks, citation graphs, and functional brain networks, in combination with
the resounding success of deep learning in various applications, has sparked
interest in generalizing deep learning models to non-Euclidean domains. In this
paper, we introduce a new spectral domain convolutional architecture for deep
learning on graphs. The core ingredient of our model is a new class of
parametric rational complex functions (Cayley polynomials) that allow efficient
computation of spectral filters on graphs specializing in frequency bands of
interest. Our model generates rich spectral filters that are localized
in space, scales linearly with the size of the input data for
sparsely-connected graphs, and can handle different constructions of Laplacian
operators. Extensive experimental results show the superior performance of our
approach, in comparison to other spectral domain convolutional architectures,
on spectral image classification, community detection, vertex classification
and matrix completion tasks.
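The scalar spectral response of a Cayley filter can be sketched directly from its definition, g(lambda) = c0 + 2 Re sum_{j>=1} c_j ((h*lambda - i)/(h*lambda + i))^j, where h is the learnable spectral zoom. The coefficients and eigenvalues below are illustrative values, not learned parameters:

```python
def cayley_response(lam, h, c):
    """Response of a Cayley filter at Laplacian eigenvalue lam.
    The Cayley transform z = (h*lam - i)/(h*lam + i) maps the real
    eigenvalue axis onto the complex unit circle, so the filter acts
    like a trigonometric polynomial in the angle of z."""
    z = (h * lam - 1j) / (h * lam + 1j)
    return c[0].real + 2 * sum(c[j] * z**j for j in range(1, len(c))).real

# Eigenvalues of the path-graph Laplacian on 3 vertices are 0, 1, 3;
# the coefficients c are arbitrary illustrative complex numbers.
for lam in (0.0, 1.0, 3.0):
    print(round(cayley_response(lam, h=1.0, c=[0.5, 0.25 + 0.1j]), 4))
```

Varying the spectral zoom h dilates the eigenvalue axis before the Cayley transform, which is what lets the filter concentrate on a chosen frequency band.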
Communication tree problems
In this paper, we consider random communication requirements and several cost
measures for a particular model of tree routing on a complete network. First we
show that a random tree does not give any approximation. Then we give
approximation algorithms for this setting under two random models of
requirements.
Oriented Spanners
Given a point set P in the Euclidean plane and a parameter t, we define an oriented t-spanner as an oriented subgraph G of the complete bi-directed graph such that for every pair of points, the shortest cycle in G through those points is at most a factor t longer than the shortest oriented cycle in the complete bi-directed graph. We investigate the problem of computing sparse graphs with small oriented dilation.
As we can show that minimising oriented dilation for a given number of edges is NP-hard in the plane, we first consider one-dimensional point sets. While obtaining a 1-spanner in this setting is straightforward, already for five points such a spanner has no plane embedding with the leftmost and rightmost point on the outer face. This leads to restricting to oriented graphs with a one-page book embedding on the one-dimensional point set. For this case we present a dynamic program to compute the graph of minimum oriented dilation that runs in O(n³) time for n points, and a greedy algorithm that computes a 5-spanner in O(n log n) time.
Expanding these results finally gives us a result for two-dimensional point sets: we prove that for convex point sets the greedy triangulation results in an oriented O(1)-spanner.
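Since the shortest oriented cycle through a pair {u, v} in the complete bi-directed graph is the two-edge cycle u -> v -> u of length 2|uv|, the oriented dilation of a small one-dimensional instance can be brute-forced. The sketch below (plain Python, not code from the paper) does this for three collinear points oriented as a single directed cycle:

```python
from itertools import combinations

def oriented_dilation(points, arcs):
    """Oriented dilation of a digraph on 1-d points: for each pair {u, v},
    the shortest directed cycle through both is d(u,v) + d(v,u) in the
    oriented graph, compared against the optimal 2 * |uv| achieved by the
    two-edge cycle in the complete bi-directed graph."""
    n = len(points)
    INF = float("inf")
    d = [[INF] * n for _ in range(n)]
    for i in range(n):
        d[i][i] = 0.0
    for u, v in arcs:  # directed edge u -> v, weighted by distance
        d[u][v] = min(d[u][v], abs(points[u] - points[v]))
    # Floyd-Warshall on the directed graph.
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return max((d[u][v] + d[v][u]) / (2 * abs(points[u] - points[v]))
               for u, v in combinations(range(n), 2))

# Three points on a line, oriented as the directed cycle 0 -> 1 -> 2 -> 0.
pts = [0.0, 1.0, 2.0]
cycle = [(0, 1), (1, 2), (2, 0)]
print(oriented_dilation(pts, cycle))  # prints 2.0
```

For the adjacent pair {0, 1} the cycle must detour through point 2, giving cycle length 4 against the optimal 2, so the oriented dilation of this orientation is 2.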