24 research outputs found

    On The Effect of Hyperedge Weights On Hypergraph Learning

    Full text link
    The hypergraph is a powerful representation for several computer vision, machine learning and pattern recognition problems. In the last decade, many researchers have been keen to develop different hypergraph models. In contrast, much less attention has been paid to the design of hyperedge weights. However, many studies on pairwise graphs show that the choice of edge weight can significantly influence the performance of such graph algorithms. We argue that this also applies to hypergraphs. In this paper, we empirically investigate the influence of hyperedge weights on hypergraph learning by proposing three novel hyperedge weights from the perspectives of geometry, multivariate statistical analysis and linear regression. Extensive experiments on the ORL, COIL20, JAFFE, Sheffield, Scene15 and Caltech256 databases verify our hypothesis. As in graph learning, several representative hyperedge weighting schemes can be identified from our experimental studies. Moreover, the experiments also demonstrate that combining such weighting schemes with conventional hypergraph models yields very promising classification and clustering performance in comparison with some recent state-of-the-art algorithms.
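    The abstract names three proposed weights (geometric, multivariate-statistical, and regression-based) without giving their formulas; as a point of reference, the sketch below shows one common baseline hyperedge weighting scheme, a heat-kernel weight averaged over vertex pairs inside a hyperedge. The function name and the sigma parameter are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of a heat-kernel hyperedge weight (illustrative baseline,
# not one of the three weights proposed in the paper).
import numpy as np

def heat_kernel_hyperedge_weight(X, hyperedge, sigma=1.0):
    """X: (n_samples, n_features) data matrix; hyperedge: list of vertex indices."""
    pts = X[hyperedge]                           # feature vectors of the hyperedge's vertices
    diffs = pts[:, None, :] - pts[None, :, :]    # pairwise feature differences
    sq_dists = np.sum(diffs ** 2, axis=-1)       # squared Euclidean distances
    # average heat-kernel similarity over all vertex pairs in the hyperedge
    return np.exp(-sq_dists / (sigma ** 2)).mean()

# usage: w_e = heat_kernel_hyperedge_weight(X, [0, 3, 7], sigma=0.5)
```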

    Hypergraph Learning with Line Expansion

    Full text link
    Previous hypergraph expansions are carried out solely at either the vertex level or the hyperedge level, thereby missing the symmetric nature of data co-occurrence and resulting in information loss. To address this problem, this paper treats vertices and hyperedges equally and proposes a new hypergraph formulation named the \emph{line expansion (LE)} for hypergraph learning. The new expansion bijectively induces a homogeneous structure from the hypergraph by treating vertex-hyperedge pairs as "line nodes". By reducing the hypergraph to a simple graph, the proposed \emph{line expansion} makes existing graph learning algorithms compatible with the higher-order structure and is proven to be a unifying framework for various hypergraph expansions. We evaluate the proposed line expansion on five hypergraph datasets; the results show that our method beats SOTA baselines by a significant margin.
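    A minimal sketch of the construction described above, assuming a line node for every incident (vertex, hyperedge) pair and an edge between two line nodes whenever they share either the vertex or the hyperedge; the function name and the use of networkx are illustrative choices, not the authors' implementation.

```python
# Sketch of the line expansion: incidence pairs become "line nodes",
# connected when they share a vertex or a hyperedge (assumed reading of the abstract).
import itertools
import networkx as nx

def line_expansion(hyperedges):
    """hyperedges: dict mapping hyperedge id -> iterable of vertex ids."""
    line_nodes = [(v, e) for e, verts in hyperedges.items() for v in verts]
    G = nx.Graph()
    G.add_nodes_from(line_nodes)
    for (v1, e1), (v2, e2) in itertools.combinations(line_nodes, 2):
        if v1 == v2 or e1 == e2:     # share a vertex or a hyperedge
            G.add_edge((v1, e1), (v2, e2))
    return G

# usage: G = line_expansion({"e1": [0, 1, 2], "e2": [1, 3]})
```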

    On the spectrum of hypergraphs

    Full text link
    Here we study the spectral properties of an underlying weighted graph of a non-uniform hypergraph by introducing different connectivity matrices, such as the adjacency, Laplacian and normalized Laplacian matrices. We show that different structural properties of a hypergraph can be studied well using the spectral properties of these matrices. Connectivity of a hypergraph is also investigated through the eigenvalues of these operators. Their spectral radii are bounded by the degrees of the hypergraph. The diameter of a hypergraph is also bounded by the eigenvalues of its connectivity matrices. We characterize different properties of a regular hypergraph in terms of its spectrum. The strong (vertex) chromatic number of a hypergraph is bounded by the eigenvalues. A Cheeger constant on a hypergraph is defined, and we show that it can be bounded by the smallest nontrivial eigenvalues of the Laplacian matrix and the normalized Laplacian matrix, respectively, of a connected hypergraph. We also show that a random walk on a (non-uniform) hypergraph can be studied by analyzing the spectrum of the transition probability operator defined on that hypergraph. Ricci curvature on hypergraphs is introduced in two different ways. We show that if the Laplace operator $\Delta$ on a hypergraph satisfies a curvature-dimension type inequality $CD(\mathbf{m}, \mathbf{K})$ with $\mathbf{m} > 1$ and $\mathbf{K} > 0$, then any non-zero eigenvalue of $-\Delta$ can be bounded below by $\frac{\mathbf{m}\mathbf{K}}{\mathbf{m}-1}$. Eigenvalues of a normalized Laplacian operator defined on a connected hypergraph can be bounded by the Ollivier Ricci curvature of the hypergraph.
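    As a rough illustration of the objects studied above, the sketch below builds one possible underlying weighted graph of a hypergraph (weighting each vertex pair by the number of hyperedges containing both, which may differ from the paper's exact construction) and returns the spectrum of its unnormalized Laplacian.

```python
# Sketch: co-membership-weighted underlying graph of a hypergraph and its
# Laplacian spectrum (an assumed construction, not necessarily the paper's).
import numpy as np

def laplacian_spectrum(hyperedges, n_vertices):
    """hyperedges: list of vertex-index lists; returns eigenvalues in ascending order."""
    A = np.zeros((n_vertices, n_vertices))
    for verts in hyperedges:
        for i in verts:
            for j in verts:
                if i != j:
                    A[i, j] += 1.0       # count shared hyperedges as edge weight
    D = np.diag(A.sum(axis=1))           # degree matrix
    L = D - A                            # unnormalized Laplacian
    return np.linalg.eigvalsh(L)         # real eigenvalues of the symmetric matrix

# usage: laplacian_spectrum([[0, 1, 2], [1, 2, 3]], n_vertices=4)
```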

    Hypergraph $p$-Laplacian: A Differential Geometry View

    Full text link
    The graph Laplacian plays key roles in the information processing of relational data and has analogies with the Laplacian in differential geometry. In this paper, we generalize the analogy between the graph Laplacian and differential geometry to the hypergraph setting and propose a novel hypergraph $p$-Laplacian. Unlike existing two-node graph Laplacians, this generalization makes it possible to analyze hypergraphs, where edges are allowed to connect any number of nodes. Moreover, we propose a semi-supervised learning method based on the proposed hypergraph $p$-Laplacian and formalize it as an analogue of the Dirichlet problem, which often appears in physics. We further explore theoretical connections to the normalized hypergraph cut and propose a normalized cut corresponding to the hypergraph $p$-Laplacian. The proposed $p$-Laplacian is shown to outperform standard hypergraph Laplacians in experiments on hypergraph semi-supervised learning and normalized cut settings. Comment: Extended version of our AAAI-18 paper.
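    For orientation, the sketch below computes the p-Dirichlet energy $\sum_{i,j} w_{ij} |f_i - f_j|^p$ on an ordinary weighted graph, i.e. the two-node case that the paper generalizes; the hypergraph $p$-Laplacian itself is defined differently in the paper, so this is only the baseline notion, with illustrative names.

```python
# Sketch of the graph p-Dirichlet energy, the two-node case generalized by the paper.
import numpy as np

def p_dirichlet_energy(W, f, p=2.0):
    """W: (n, n) symmetric edge-weight matrix; f: (n,) vertex function."""
    diffs = np.abs(f[:, None] - f[None, :]) ** p   # |f_i - f_j|^p for all pairs
    return 0.5 * np.sum(W * diffs)                  # 0.5 avoids double-counting edges

# usage: p_dirichlet_energy(np.array([[0.0, 1.0], [1.0, 0.0]]), np.array([0.0, 1.0]), p=3)
```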

    Balanced Coarsening for Multilevel Hypergraph Partitioning via Wasserstein Discrepancy

    Full text link
    We propose a balanced coarsening scheme for multilevel hypergraph partitioning. In addition, an initial partitioning algorithm is designed to improve the quality of k-way hypergraph partitioning. By assigning vertex weights through the LPT algorithm, we generate a prior hypergraph under a relaxed balance constraint. With the prior hypergraph, we define the Wasserstein discrepancy to coordinate the optimal transport of the coarsening process, and the optimal transport matrix is solved by the Sinkhorn algorithm. Our coarsening scheme fully takes into account the minimization of the connectivity metric (the objective function). For the initial partitioning stage, we define a normalized cut function induced by the Fiedler vector, which is theoretically proven to be concave. A three-point algorithm is thereby designed to find the best cut under the balance constraint.
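    The abstract states that the optimal transport matrix is solved by the Sinkhorn algorithm; the sketch below shows the standard Sinkhorn iterations for entropy-regularized optimal transport. The cost matrix, marginals, and regularization strength reg are illustrative placeholders, not the paper's Wasserstein-discrepancy setup.

```python
# Standard Sinkhorn iterations for entropy-regularized optimal transport
# (generic sketch; not the paper's coarsening-specific formulation).
import numpy as np

def sinkhorn(cost, a, b, reg=0.1, n_iters=200):
    """cost: (n, m) cost matrix; a: (n,) and b: (m,) marginal distributions."""
    K = np.exp(-cost / reg)                  # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)                    # rescale columns to match marginal b
        u = a / (K @ v)                      # rescale rows to match marginal a
    return u[:, None] * K * v[None, :]       # transport plan diag(u) K diag(v)

# usage: P = sinkhorn(np.random.rand(4, 3), np.ones(4) / 4, np.ones(3) / 3)
```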

    View-aligned hypergraph learning for Alzheimer’s disease diagnosis with incomplete multi-modality data

    Get PDF
    Effectively utilizing incomplete multi-modality data for the diagnosis of Alzheimer's disease (AD) and its prodrome (i.e., mild cognitive impairment, MCI) remains an active area of research. Several multi-view learning methods have recently been developed for AD/MCI diagnosis using incomplete multi-modality data, with each view corresponding to a specific modality or a combination of several modalities. However, existing methods usually ignore the underlying coherence among views, which may lead to sub-optimal learning performance. In this paper, we propose a view-aligned hypergraph learning (VAHL) method to explicitly model the coherence among views. Specifically, we first divide the original data into several views based on the availability of different modalities and then construct a hypergraph in each view space based on sparse representation. A view-aligned hypergraph classification (VAHC) model is then proposed, using a view-aligned regularizer to capture coherence among views. We further assemble the class probability scores generated by VAHC via a multi-view label fusion method to make the final classification decision. We evaluate our method on the baseline ADNI-1 database with 807 subjects and three modalities (i.e., MRI, PET, and CSF). Experimental results demonstrate that our method outperforms state-of-the-art methods that use incomplete multi-modality data for AD/MCI diagnosis.
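    The hypergraph construction in each view is said to be based on sparse representation; a common way to do this, sketched below and not necessarily the paper's exact procedure, is to reconstruct each sample from the remaining samples with an l1-penalized regression and form a hyperedge from the sample plus its nonzero-coefficient neighbors. The alpha parameter and function name are illustrative.

```python
# Sketch of sparse-representation-based hyperedge construction within one view
# (a common construction; assumed here, not the paper's exact settings).
from sklearn.linear_model import Lasso
import numpy as np

def sparse_hyperedges(X, alpha=0.1):
    """X: (n_samples, n_features) view-specific data; returns one hyperedge per sample."""
    n = X.shape[0]
    hyperedges = []
    for i in range(n):
        others = np.delete(np.arange(n), i)
        lasso = Lasso(alpha=alpha, max_iter=5000)
        lasso.fit(X[others].T, X[i])                 # reconstruct sample i from the others
        support = others[np.abs(lasso.coef_) > 1e-6] # samples with nonzero coefficients
        hyperedges.append([i, *support])             # the sample plus its sparse neighbors
    return hyperedges
```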