1,123 research outputs found
Slimness of graphs
Slimness of a graph measures the local deviation of its metric from a tree
metric. In a graph G = (V, E), a geodesic triangle Δ(x, y, z) with x, y, z ∈ V is the union of three shortest
paths connecting these vertices. A geodesic triangle is
called δ-slim if for any vertex v on any side the
distance from v to the union of the other two sides is at most δ, i.e. each path
is contained in the union of the δ-neighborhoods of the two others. A graph
G is called δ-slim if all geodesic triangles in G are
δ-slim. The smallest value δ for which G is δ-slim is
called the slimness of G. In this paper, using the layering partition
technique, we obtain sharp bounds on the slimness of such families of graphs as (1)
graphs with cluster-diameter Δ of a layering partition of G, (2)
graphs with tree-length λ, (3) graphs with tree-breadth ρ, (4)
k-chordal graphs, AT-free graphs and HHD-free graphs. Additionally, we show
that the slimness of every 4-chordal graph is at most 2 and characterize those
4-chordal graphs for which the slimness of each of their induced subgraphs is at
most 1.
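Under the definition above, slimness can be computed by brute force on very small graphs: enumerate every geodesic triangle (every choice of shortest paths between each pair of a vertex triple) and take the worst distance from a vertex on one side to the union of the other two sides. The sketch below is an illustrative toy implementation of that definition (graph given as an adjacency dict; exponential in the number of shortest paths), not the layering-partition technique the paper uses:

```python
from itertools import combinations, product
from collections import deque

def bfs_dist(adj, s):
    """Unweighted shortest-path distances from s via BFS."""
    dist = {s: 0}
    q = deque([s])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def all_shortest_paths(adj, s, t, dist_s):
    """Enumerate all shortest s-t paths by walking 'downhill' from t."""
    if s == t:
        return [[s]]
    paths = []
    def back(v, suffix):
        if v == s:
            paths.append([s] + suffix)
            return
        for u in adj[v]:
            if dist_s.get(u) == dist_s[v] - 1:
                back(u, [v] + suffix)
    back(t, [])
    return paths

def slimness(adj):
    """Max over all geodesic triangles of the worst side-to-other-sides distance."""
    dist = {v: bfs_dist(adj, v) for v in adj}
    worst = 0
    for x, y, z in combinations(adj, 3):
        for pxy, pyz, pxz in product(
                all_shortest_paths(adj, x, y, dist[x]),
                all_shortest_paths(adj, y, z, dist[y]),
                all_shortest_paths(adj, x, z, dist[x])):
            sides = [set(pxy), set(pyz), set(pxz)]
            for i in range(3):
                others = sides[(i + 1) % 3] | sides[(i + 2) % 3]
                for v in sides[i]:
                    worst = max(worst, min(dist[v][u] for u in others))
    return worst
```

For example, every tree has slimness 0 (each side of a geodesic triangle lies inside the union of the other two), while a 4-cycle has slimness 1.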
Optimally fast incremental Manhattan plane embedding and planar tight span construction
We describe a data structure, a rectangular complex, that can be used to
represent hyperconvex metric spaces that have the same topology (although not
necessarily the same distance function) as subsets of the plane. We show how to
use this data structure to construct the tight span of a metric space given as
an n x n distance matrix, when the tight span is homeomorphic to a subset of
the plane, in time O(n^2), and to add a single point to a planar tight span in
time O(n). As an application of this construction, we show how to test whether
a given finite metric space embeds isometrically into the Manhattan plane in
time O(n^2), and add a single point to the space and re-test whether it has
such an embedding in time O(n). Comment: 39 pages, 15 figures.
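A standard fact behind Manhattan-plane (ℓ1 plane) embedding questions like this one is that the ℓ1 plane and the ℓ∞ plane are isometric via the linear map (x, y) ↦ (x + y, x − y): the ℓ1 distance between two points equals the ℓ∞ distance between their images. The snippet below is a small illustrative check of that identity (the function names are my own, not from the paper):

```python
def to_linf(p):
    # Rotate/scale by 45 degrees: (x, y) -> (x + y, x - y).
    x, y = p
    return (x + y, x - y)

def l1(p, q):
    # Manhattan (taxicab) distance.
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def linf(p, q):
    # Chebyshev (max-coordinate) distance.
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))
```

So an isometric embedding into the Manhattan plane exists exactly when one exists into the ℓ∞ plane, which is often the more convenient model to reason about.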
Flat rank of automorphism groups of buildings
The flat rank of a totally disconnected locally compact group G, denoted
flat-rk(G), is an invariant of the topological group structure of G. It is
defined thanks to a natural distance on the space of compact open subgroups of
G. For a topological Kac-Moody group G with Weyl group W, we derive the
inequalities: alg-rk(W) ≤ flat-rk(G) ≤ rk(|W|_0). Here, alg-rk(W) is the
maximal ℤ-rank of abelian subgroups of W, and rk(|W|_0) is the
maximal dimension of isometrically embedded flats in the CAT(0)-realization
|W|_0. We can prove these inequalities under weaker assumptions. We also show
that for any integer n \geq 1 there is a topologically simple, compactly
generated, locally compact, totally disconnected group G, with flat-rk(G)=n and
which is not linear.
Constructing Buildings and Harmonic Maps
In a continuation of our previous work, we outline a theory which should lead
to the construction of a universal pre-building and versal building with a
φ-harmonic map from a Riemann surface, in the case of two-dimensional
buildings for the group SL(3). This will provide a generalization of the space
of leaves of the foliation defined by a quadratic differential in the classical
theory for SL(2). Our conjectural construction would determine the exponents
for WKB problems, and it can be put into practice on examples. Comment: 61 pages, 24 figures. Comments are welcome.
Geometric deep learning: going beyond Euclidean data
Many scientific fields study data with an underlying structure that is a
non-Euclidean space. Some examples include social networks in computational
social sciences, sensor networks in communications, functional networks in
brain imaging, regulatory networks in genetics, and meshed surfaces in computer
graphics. In many applications, such geometric data are large and complex (in
the case of social networks, on the scale of billions), and are natural targets
for machine learning techniques. In particular, we would like to use deep
neural networks, which have recently proven to be powerful tools for a broad
range of problems from computer vision, natural language processing, and audio
analysis. However, these tools have been most successful on data with an
underlying Euclidean or grid-like structure, and in cases where the invariances
of these structures are built into networks used to model them. Geometric deep
learning is an umbrella term for emerging techniques attempting to generalize
(structured) deep neural models to non-Euclidean domains such as graphs and
manifolds. The purpose of this paper is to overview different examples of
geometric deep learning problems and present available solutions, key
difficulties, applications, and future research directions in this nascent
field.