Robust Large-Margin Learning in Hyperbolic Space
Recently, there has been a surge of interest in representation learning in
hyperbolic spaces, driven by their ability to represent hierarchical data with
significantly fewer dimensions than standard Euclidean spaces. However, the
viability and benefits of hyperbolic spaces for downstream machine learning
tasks have received less attention. In this paper, we present, to our
knowledge, the first theoretical guarantees for learning a classifier in
hyperbolic rather than Euclidean space. Specifically, we consider the problem
of learning a large-margin classifier for data possessing a hierarchical
structure. Our first contribution is a hyperbolic perceptron algorithm, which
provably converges to a separating hyperplane. We then provide an algorithm to
efficiently learn a large-margin hyperplane, relying on the careful injection
of adversarial examples. Finally, we prove that for hierarchical data that
embeds well into hyperbolic space, the low embedding dimension ensures superior
guarantees when learning the classifier directly in hyperbolic space.
Comment: Accepted to NeurIPS 2020
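To make the hyperbolic perceptron concrete, below is a minimal Python sketch, assuming data on the Lorentz hyperboloid and a decision rule given by the sign of the Minkowski inner product between the decision vector and the point. The additive update mirrors the Euclidean perceptron; the normalization details used in the paper's convergence analysis are omitted, so treat this as an illustration rather than the authors' exact algorithm.

import numpy as np

def minkowski_dot(u, v):
    # Lorentzian inner product: -u0*v0 + <u_rest, v_rest>
    return -u[0] * v[0] + np.dot(u[1:], v[1:])

def hyperbolic_perceptron(X, y, epochs=100):
    # X: (n, d+1) points on the Lorentz hyperboloid; y: labels in {-1, +1}.
    # Sketch only: the paper's normalization/projection step is omitted.
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        mistakes = 0
        for xi, yi in zip(X, y):
            if yi * minkowski_dot(w, xi) <= 0:  # misclassified or on the boundary
                w = w + yi * xi                 # same additive update as the Euclidean case
                mistakes += 1
        if mistakes == 0:                       # all training points separated
            break
    return w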
Hyperbolic Space with Hierarchical Margin Boosts Fine-Grained Learning from Coarse Labels
Learning fine-grained embeddings from coarse labels is a challenging task because the supervision lacks label granularity, i.e., the detailed distinctions required for fine-grained tasks. The task becomes even more demanding in few-shot fine-grained recognition, which holds practical significance in various applications. To address these challenges, we propose a novel method that maps visual embeddings into a hyperbolic space and enhances their discriminative ability via hierarchical cosine margins. Specifically, the hyperbolic space offers distinct advantages,
including the ability to capture hierarchical relationships and increased
expressive power, which favors modeling fine-grained objects. Based on the
hyperbolic space, we further enforce larger similarity margins between coarse classes and smaller margins between fine classes, yielding the proposed hierarchical cosine margins. While enforcing similarity margins in regular Euclidean space has become popular for deep embedding learning, applying it in hyperbolic space is non-trivial, and validating its benefit for coarse-to-fine generalization is valuable. Extensive experiments conducted on five benchmark datasets showcase the effectiveness of our proposed method, yielding state-of-the-art results that surpass competing methods.
Comment: Accepted by NeurIPS 2023
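To illustrate the hierarchical cosine margins, here is a minimal CosFace-style sketch in PyTorch with hierarchy-dependent margins: a negative class from a different coarse category is pushed away with the larger margin, while a negative fine class under the same coarse category uses the smaller one. The margin values, the scale factor, and the prototype-based formulation are illustrative assumptions, and the sketch uses plain cosine similarity rather than the paper's hyperbolic embedding.

import torch
import torch.nn.functional as F

def hierarchical_margin_loss(emb, prototypes, fine_labels, fine_to_coarse,
                             m_coarse=0.3, m_fine=0.1, scale=16.0):
    # emb: (B, d) embeddings; prototypes: (C, d) fine-class prototypes;
    # fine_labels: (B,) fine-class indices; fine_to_coarse: (C,) fine-to-coarse map.
    # Margin values and the prototype formulation are illustrative assumptions.
    emb = F.normalize(emb, dim=1)
    protos = F.normalize(prototypes, dim=1)
    logits = emb @ protos.t()                                 # cosine similarities, (B, C)

    coarse_of_anchor = fine_to_coarse[fine_labels]            # (B,)
    same_coarse = fine_to_coarse.unsqueeze(0) == coarse_of_anchor.unsqueeze(1)  # (B, C)
    # Larger margin against classes from a different coarse category,
    # smaller margin against sibling fine classes under the same coarse category.
    margins = torch.where(same_coarse,
                          torch.full_like(logits, m_fine),
                          torch.full_like(logits, m_coarse))
    margins.scatter_(1, fine_labels.unsqueeze(1), 0.0)        # no margin on the target itself

    # Adding the margin to negative logits enforces cos(target) > cos(negative) + margin.
    return F.cross_entropy(scale * (logits + margins), fine_labels)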
Machine learning in space forms: Embeddings, classification, and similarity comparisons
We take a non-Euclidean view of three classical machine learning subjects: low-dimensional embedding, classification, and similarity comparisons.
We first introduce kinetic Euclidean distance matrices to solve kinetic distance geometry problems. In distance geometry problems (DGPs), the task is to find a geometric representation, that is, an embedding, for a collection of entities consistent with pairwise distance (metric) or similarity (nonmetric) measurements. In kinetic DGPs, the twist is that the points are dynamic, and our goal is to localize them by exploiting information about their trajectory class. We show that a semidefinite relaxation can reconstruct trajectories from incomplete, noisy, time-varying distance observations.
We then introduce another distance-geometric object: hyperbolic distance matrices. Recent works have focused on hyperbolic embedding methods for low-distortion embedding of distance measurements associated with hierarchical data. We derive a semidefinite relaxation to estimate missing distance measurements and denoise them. Further, we formalize hyperbolic Procrustes analysis, which uses extraneous information in the form of anchor points to uniquely identify the embedded points.
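As a taste of the semidefinite machinery, here is a minimal sketch of static Euclidean distance-matrix completion with cvxpy: parameterize the positive semidefinite Gram matrix G of centered points, write squared distances as D_ij = G_ii + G_jj - 2*G_ij, and fit the observed entries. This is the textbook static relaxation, not the kinetic or hyperbolic formulations developed in the thesis.

import cvxpy as cp
import numpy as np

def complete_edm(D_obs, mask, n):
    # D_obs: (n, n) observed squared distances (arbitrary where mask is 0).
    # mask:  (n, n) binary matrix marking observed entries.
    G = cp.Variable((n, n), PSD=True)          # Gram matrix of the points
    d = cp.diag(G)
    ones = np.ones((n, 1))
    # Squared distances from the Gram matrix: D_ij = G_ii + G_jj - 2*G_ij.
    D = cp.reshape(d, (n, 1)) @ ones.T + ones @ cp.reshape(d, (1, n)) - 2 * G
    constraints = [cp.sum(G, axis=1) == 0]     # center the points at the origin
    err = cp.multiply(mask, D - D_obs)         # fit only the observed entries
    # The small trace penalty is a common low-rank heuristic, not a requirement.
    prob = cp.Problem(cp.Minimize(cp.norm(err, "fro") + 0.1 * cp.trace(G)), constraints)
    prob.solve()
    return D.value                             # completed squared-distance matrix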
Next, we address the design of learning algorithms in mixed-curvature spaces. Learning algorithms in low-dimensional mixed-curvature spaces have so far been limited to certain non-Euclidean neural networks. Here, we study the problem of learning a linear classifier (a perceptron) in products of Euclidean, spherical, and hyperbolic spaces, i.e., space forms. We introduce a notion of linear separation surfaces in Riemannian manifolds and use a metric that renders distances in different space forms compatible with one another, integrating them into a single classifier.
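A minimal sketch of the product-space idea follows, assuming each point lives in a product of one Euclidean, one spherical, and one hyperbolic factor, and that the classifier score sums a per-factor bilinear form: ordinary dot products on the Euclidean and spherical factors, the Minkowski product on the hyperbolic one. This choice of forms is an assumption for illustration, not the thesis's exact construction.

import numpy as np

def minkowski_dot(u, v):
    # Lorentzian inner product for the hyperbolic factor.
    return -u[0] * v[0] + np.dot(u[1:], v[1:])

def product_space_score(ws, xs):
    # ws/xs: dicts with keys 'euc', 'sph', 'hyp' holding each factor's coordinates.
    # Summing per-factor forms is an illustrative way to integrate the factors.
    return (np.dot(ws["euc"], xs["euc"])
            + np.dot(ws["sph"], xs["sph"])
            + minkowski_dot(ws["hyp"], xs["hyp"]))

def product_perceptron(points, labels, epochs=100):
    # Perceptron updates applied factor-by-factor in the product space.
    ws = {k: np.zeros_like(points[0][k]) for k in ("euc", "sph", "hyp")}
    for _ in range(epochs):
        mistakes = 0
        for xs, y in zip(points, labels):
            if y * product_space_score(ws, xs) <= 0:   # misclassified
                for k in ws:
                    ws[k] = ws[k] + y * xs[k]          # additive update per factor
                mistakes += 1
        if mistakes == 0:
            break
    return ws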
Lastly, we show how similarity comparisons carry information about the underlying space of geometric graphs. We introduce the ordinal spread of a distance list and relate it to the ordinal capacity of the underlying space, a notion that quantifies the space's ability to host extreme patterns in nonmetric measurements. We then use the distribution of random ordinal spread variables as a practical tool to identify the underlying space form.