On the spectrum of hypergraphs
Here we study the spectral properties of an underlying weighted graph of a
non-uniform hypergraph by introducing different connectivity matrices, such as
adjacency, Laplacian and normalized Laplacian matrices. We show that different
structural properties of a hypergraph can be well studied using the spectral
properties of these matrices. Connectivity of a hypergraph is also investigated
by the eigenvalues of these operators. The spectral radii of these matrices are
bounded by the degrees of the hypergraph. The diameter of a hypergraph is also bounded by
the eigenvalues of its connectivity matrices. We characterize different
properties of a regular hypergraph in terms of its spectrum. The strong
(vertex) chromatic number of a hypergraph is bounded by these eigenvalues.
A Cheeger constant on a hypergraph is defined, and we show that it can be
bounded by the smallest nontrivial eigenvalues of the Laplacian and normalized
Laplacian matrices, respectively, of a connected hypergraph. We also show an
approach to studying random walks on a (non-uniform) hypergraph by analyzing
the spectrum of the transition probability operator which is
defined on that hypergraph. Ricci curvature on hypergraphs is introduced in two
different ways. We show that if the Laplace operator, $L$, on a hypergraph
satisfies a curvature-dimension type inequality $CD(m, K)$
with $m > 1$ and $K > 0$, then any non-zero eigenvalue of $L$ can be bounded
below by $mK/(m-1)$. Eigenvalues of a normalized Laplacian operator defined on
a connected hypergraph can be bounded by Ollivier's Ricci curvature of the
hypergraph.
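As a concrete illustration of the setup above, the following sketch builds the adjacency and Laplacian matrices of the underlying weighted graph of a small non-uniform hypergraph and computes the Laplacian spectrum. The weighting convention used here (each hyperedge contributes 1 to the weight of every vertex pair it contains) is an illustrative assumption, not necessarily the paper's exact construction; the multiplicity of the zero eigenvalue then reflects connectivity as discussed.

```python
import numpy as np

def underlying_graph_matrices(n, hyperedges):
    """Adjacency and Laplacian of the underlying weighted graph of a
    (non-uniform) hypergraph on vertices 0..n-1.

    Weighting convention (an assumption for illustration):
    w(u, v) = number of hyperedges containing both u and v.
    """
    A = np.zeros((n, n))
    for edge in hyperedges:
        edge = list(edge)
        for i in range(len(edge)):
            for j in range(i + 1, len(edge)):
                A[edge[i], edge[j]] += 1.0
                A[edge[j], edge[i]] += 1.0
    D = np.diag(A.sum(axis=1))  # weighted degree matrix
    L = D - A                   # combinatorial Laplacian
    return A, L

# A small non-uniform hypergraph: one 3-edge and one 2-edge.
A, L = underlying_graph_matrices(4, [{0, 1, 2}, {2, 3}])
eigenvalues = np.linalg.eigvalsh(L)  # sorted ascending
# The multiplicity of eigenvalue 0 equals the number of connected components.
print(np.round(eigenvalues, 6))
```

Since this hypergraph is connected, exactly one eigenvalue is zero, and the smallest nontrivial eigenvalue is the one that enters Cheeger-type bounds.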
Mini-Workshop: Discrete p-Laplacians: Spectral Theory and Variational Methods in Mathematics and Computer Science
The p-Laplacian operators have a rich analytical theory, and in the last few years they have also offered efficient tools to tackle several tasks in machine learning. During the workshop, mathematicians and theoretical computer scientists working on models based on p-Laplacians on graphs and manifolds presented the latest theoretical developments and shared their knowledge.
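To make the graph p-Laplacian concrete, here is a minimal sketch of the graph p-Dirichlet energy $E_p(f) = \frac{1}{2}\sum_{u,v} w_{uv}\,|f(u)-f(v)|^p$, the variational object underlying the graph p-Laplacian; for $p = 2$ it reduces to the standard quadratic form $f^T L f$. The graph and function values are illustrative.

```python
import numpy as np

def p_dirichlet_energy(W, f, p):
    """Graph p-Dirichlet energy: 0.5 * sum_{u,v} W[u,v] * |f[u]-f[v]|**p.
    For p = 2 this equals the quadratic form f^T L f with L = D - W."""
    diff = np.abs(f[:, None] - f[None, :])
    return 0.5 * np.sum(W * diff ** p)

# Path graph on 3 vertices with unit edge weights.
W = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
f = np.array([0., 1., 3.])

# Edge differences are 1 and 2, so E_2 = 1 + 4 = 5 and E_1 = 1 + 2 = 3.
print(p_dirichlet_energy(W, f, 2))  # 5.0
print(p_dirichlet_energy(W, f, 1))  # 3.0
```

Varying $p$ changes how strongly large gradients are penalised, which is one reason these energies are useful in clustering and semi-supervised learning.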
Consistency of Fractional Graph-Laplacian Regularization in Semi-Supervised Learning with Finite Labels
Laplace learning is a popular machine learning algorithm for finding missing
labels from a small number of labelled feature vectors using the geometry of a
graph. More precisely, Laplace learning is based on minimising a
graph-Dirichlet energy, equivalently a discrete Sobolev $H^1$
semi-norm, constrained to taking the values of known labels on a given subset.
The variational problem is asymptotically ill-posed as the number of unlabeled
feature vectors goes to infinity for finite given labels due to a lack of
regularity in minimisers of the continuum Dirichlet energy in any dimension
higher than one. In particular, continuum minimisers are not continuous. One
solution is to consider higher-order regularisation, which is the analogue of
minimising Sobolev $H^s$ semi-norms. In this paper we consider the
asymptotics of minimising a graph variant of the Sobolev $H^s$
semi-norm with pointwise constraints. We show that, as expected, one needs
$s > d/2$, where $d$ is the dimension of the data manifold. We also show that
there must be an upper bound on the connectivity of the graph; that is, highly
connected graphs lead to degenerate behaviour of the minimiser even when
$s > d/2$.

Comment: 37 pages, 4 figures
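The Laplace learning step described above, minimising the graph-Dirichlet energy subject to pointwise label constraints, amounts to solving a linear system for the harmonic extension on the unlabeled vertices. A minimal sketch; the path graph, weights, and label placement are illustrative choices, not the paper's experimental setup.

```python
import numpy as np

def laplace_learning(W, labeled_idx, labels):
    """Harmonic extension: minimise f^T L f subject to f = labels on labeled_idx.
    Splitting L into labeled (l) / unlabeled (u) blocks gives the minimiser
    f_u = -L_uu^{-1} L_ul f_l, unique when the graph is connected."""
    n = W.shape[0]
    L = np.diag(W.sum(axis=1)) - W
    unlabeled = [i for i in range(n) if i not in labeled_idx]
    L_uu = L[np.ix_(unlabeled, unlabeled)]
    L_ul = L[np.ix_(unlabeled, labeled_idx)]
    f = np.zeros(n)
    f[labeled_idx] = labels
    f[unlabeled] = np.linalg.solve(L_uu, -L_ul @ np.asarray(labels, float))
    return f

# Path graph 0-1-2-3 with unit weights, labeled at the two endpoints.
W = np.zeros((4, 4))
for u, v in [(0, 1), (1, 2), (2, 3)]:
    W[u, v] = W[v, u] = 1.0

f = laplace_learning(W, [0, 3], [0.0, 1.0])
print(np.round(f, 6))  # linear interpolation between the labels
```

On a path the harmonic extension interpolates linearly; the degeneracy studied in the paper concerns the regime where the number of unlabeled points grows while the labels stay finite, in which such minimisers can collapse toward constants near the labels.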