905 research outputs found

    On the spectrum of hypergraphs

    Here we study the spectral properties of an underlying weighted graph of a non-uniform hypergraph by introducing different connectivity matrices, such as the adjacency, Laplacian and normalized Laplacian matrices. We show that various structural properties of a hypergraph can be studied via the spectral properties of these matrices. Connectivity of a hypergraph is also investigated through the eigenvalues of these operators. The spectral radii of these matrices are bounded in terms of the degrees of the hypergraph, and the diameter of a hypergraph is bounded by the eigenvalues of its connectivity matrices. We characterize different properties of a regular hypergraph through its spectrum. The strong (vertex) chromatic number of a hypergraph is bounded by the eigenvalues. A Cheeger constant on a hypergraph is defined, and we show that it can be bounded by the smallest nontrivial eigenvalues of the Laplacian matrix and the normalized Laplacian matrix, respectively, of a connected hypergraph. We also show that a random walk on a (non-uniform) hypergraph can be studied by analyzing the spectrum of a transition probability operator defined on that hypergraph. Ricci curvature on hypergraphs is introduced in two different ways. We show that if the Laplace operator $\Delta$ on a hypergraph satisfies a curvature-dimension type inequality $CD(\mathbf{m}, \mathbf{K})$ with $\mathbf{m}>1$ and $\mathbf{K}>0$, then any non-zero eigenvalue of $-\Delta$ is bounded below by $\frac{\mathbf{m}\mathbf{K}}{\mathbf{m}-1}$. The eigenvalues of a normalized Laplacian operator defined on a connected hypergraph can also be bounded by the Ollivier Ricci curvature of the hypergraph.
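    The connectivity criterion above (a hypergraph is connected iff the second-smallest Laplacian eigenvalue of its underlying graph is positive) can be checked numerically. The Python sketch below builds the underlying weighted graph of a small non-uniform hypergraph and inspects its Laplacian spectra; the weighting rule $w(u,v)=\sum_{e \ni u,v} 1/(|e|-1)$ and the helper name `underlying_laplacians` are illustrative assumptions, not the paper's exact construction.

```python
# Minimal sketch, assuming one common weighting convention for the
# underlying graph of a non-uniform hypergraph (not necessarily the
# convention used in the paper).
import itertools
import numpy as np

def underlying_laplacians(n, hyperedges):
    """Return (adjacency A, Laplacian L, normalized Laplacian N) of the
    weighted graph underlying a hypergraph on vertices 0..n-1."""
    A = np.zeros((n, n))
    for e in hyperedges:
        for u, v in itertools.combinations(e, 2):
            A[u, v] += 1.0 / (len(e) - 1)   # assumed weighting convention
            A[v, u] += 1.0 / (len(e) - 1)
    d = A.sum(axis=1)                        # weighted vertex degrees
    L = np.diag(d) - A                       # combinatorial Laplacian
    with np.errstate(divide="ignore"):
        dinv = np.where(d > 0, 1.0 / np.sqrt(d), 0.0)
    N = np.eye(n) - dinv[:, None] * A * dinv[None, :]  # normalized Laplacian
    return A, L, N

# Example: a non-uniform hypergraph with hyperedges of sizes 3 and 2.
A, L, N = underlying_laplacians(4, [(0, 1, 2), (2, 3)])
lam = np.sort(np.linalg.eigvalsh(L))
# Connected iff the algebraic connectivity (second-smallest eigenvalue) > 0.
print("Laplacian spectrum:", np.round(lam, 4), "connected:", lam[1] > 1e-10)
```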

    Consistency of Fractional Graph-Laplacian Regularization in Semi-Supervised Learning with Finite Labels

    Laplace learning is a popular machine learning algorithm for finding missing labels from a small number of labelled feature vectors using the geometry of a graph. More precisely, Laplace learning is based on minimising a graph Dirichlet energy, equivalently a discrete Sobolev $\mathrm{H}^1$ semi-norm, constrained to take the values of the known labels on a given subset. The variational problem is asymptotically ill-posed as the number of unlabelled feature vectors goes to infinity, for a finite number of given labels, due to a lack of regularity in minimisers of the continuum Dirichlet energy in any dimension higher than one. In particular, continuum minimisers are not continuous. One solution is to consider higher-order regularisation, which is the analogue of minimising Sobolev $\mathrm{H}^s$ semi-norms. In this paper we consider the asymptotics of minimising a graph variant of the Sobolev $\mathrm{H}^s$ semi-norm with pointwise constraints. We show that, as expected, one needs $s>d/2$, where $d$ is the dimension of the data manifold. We also show that there must be an upper bound on the connectivity of the graph; that is, highly connected graphs lead to degenerate behaviour of the minimiser even when $s>d/2$.
    Comment: 37 pages, 4 figures
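    The constrained minimisation of the graph Dirichlet energy $u^\top L u$ reduces to a linear solve on the unlabelled nodes (the harmonic extension of the labels). Below is a minimal Python sketch on a toy path graph; the example graph and the function name `laplace_learning` are illustrative assumptions, not the authors' code, and the fractional $\mathrm{H}^s$ variant studied in the paper would replace $L$ by a fractional power $L^s$.

```python
# Minimal sketch of Laplace learning: minimize u^T L u subject to u taking
# the known label values, which yields a linear system on unlabeled nodes.
import numpy as np

def laplace_learning(W, labeled_idx, labels):
    """Extend real-valued labels from labeled_idx to all nodes of a graph
    with symmetric weight matrix W by minimizing the Dirichlet energy."""
    n = W.shape[0]
    L = np.diag(W.sum(axis=1)) - W           # graph Laplacian
    unlabeled = np.setdiff1d(np.arange(n), labeled_idx)
    u = np.zeros(n)
    u[labeled_idx] = labels
    # Stationarity with u fixed on the labeled set gives the harmonic
    # extension: L[uu] u_u = -L[ul] y_l.
    Luu = L[np.ix_(unlabeled, unlabeled)]
    Lul = L[np.ix_(unlabeled, labeled_idx)]
    u[unlabeled] = np.linalg.solve(Luu, -Lul @ labels)
    return u

# Toy path graph 0-1-2-3-4 with unit weights; label the two endpoints.
W = np.zeros((5, 5))
for i in range(4):
    W[i, i + 1] = W[i + 1, i] = 1.0
u = laplace_learning(W, np.array([0, 4]), np.array([0.0, 1.0]))
print(np.round(u, 3))   # linear interpolation: [0. 0.25 0.5 0.75 1.]
```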