
    Laplacian Features for Learning with Hyperbolic Space

    Due to its geometric properties, hyperbolic space can support high-fidelity embeddings of tree- and graph-structured data. As a result, various hyperbolic networks have been developed that outperform Euclidean networks on many tasks: e.g., hyperbolic graph convolutional networks (GCNs) can outperform vanilla GCNs on some graph learning tasks. However, most existing hyperbolic networks are complicated, computationally expensive, and numerically unstable -- and they cannot scale to large graphs due to these shortcomings. With more and more hyperbolic networks proposed, it is becoming less and less clear what key component is necessary to make the model perform well. In this paper, we propose HyLa, a simple and minimal approach to using hyperbolic space in networks: HyLa maps once from a hyperbolic-space embedding to Euclidean space via the eigenfunctions of the Laplacian operator in the hyperbolic space. We evaluate HyLa on graph learning tasks including node classification and text classification, where HyLa can be used together with any graph neural network. When used with a linear model, HyLa shows significant improvements over hyperbolic networks and other baselines.
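The mapping the abstract describes can be made concrete with a small sketch. The snippet below is an illustration, not the authors' implementation: it assumes a Poincaré-disk embedding and builds random real-valued features from Laplacian eigenfunctions of the form exp(((d-1)/2 + iλ)·B(x, ω)), where B(x, ω) = log((1-|x|²)/|x-ω|²) is the Busemann-type kernel toward a boundary point ω; the sampling distributions for ω, λ, and the phases, and all function names, are assumed choices.

```python
import numpy as np

def poincare_busemann(x, omega):
    """B(x, omega) = log((1 - |x|^2) / |x - omega|^2) for x inside the
    Poincare ball and omega on its boundary (unit sphere)."""
    num = 1.0 - np.sum(x * x, axis=-1)
    den = np.sum((x - omega) ** 2, axis=-1)
    return np.log(num / den)

def laplacian_features(X, n_features=64, dim=2, scale=1.0, seed=0):
    """Illustrative sketch: random eigenfunction features for points X in the
    Poincare ball. Each feature is exp((dim-1)/2 * B) * cos(lam * B + phase);
    the sampling of omegas, lams and phases is an assumption, not necessarily
    the paper's exact scheme."""
    rng = np.random.default_rng(seed)
    omegas = rng.normal(size=(n_features, dim))
    omegas /= np.linalg.norm(omegas, axis=1, keepdims=True)   # boundary points
    lams = rng.normal(scale=scale, size=n_features)           # frequencies
    phases = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    feats = np.empty((X.shape[0], n_features))
    for j in range(n_features):
        B = poincare_busemann(X, omegas[j])
        feats[:, j] = np.exp(0.5 * (dim - 1) * B) * np.cos(lams[j] * B + phases[j])
    return feats

# toy usage: 5 points inside the unit disk -> 64-dimensional Euclidean features
X = 0.5 * np.random.default_rng(1).normal(size=(5, 2))
X /= np.maximum(1.0, np.linalg.norm(X, axis=1, keepdims=True) / 0.9)  # stay in the ball
print(laplacian_features(X).shape)  # (5, 64)
```

The resulting Euclidean features can then be fed to a linear model or any graph neural network, which is the usage pattern the abstract describes.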

    Embedding Node Structural Role Identity into Hyperbolic Space

    Recently, there has been interest in embedding networks in hyperbolic space, since hyperbolic space has been shown to capture graph/network structure well: it naturally reflects some properties of complex networks. However, work on network embedding in hyperbolic space has focused on microscopic node embedding. In this work, we are the first to present a framework to embed the structural roles of nodes into hyperbolic space. Our framework extends struc2vec, a well-known structural-role-preserving embedding method, by moving it to a hyperboloid model. We evaluated our method on four real-world networks and one synthetic network. Our results show that hyperbolic space is more effective than Euclidean space in learning latent representations for the structural role of nodes. Comment: In Proceedings of the 29th ACM International Conference on Information and Knowledge Management (CIKM '20), October 19-23, 2020, Virtual Event, Ireland.
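As a hedged sketch of the geometric primitives a hyperboloid-model embedding needs (the Lorentzian inner product, geodesic distance, an exponential map for tangent-space updates, and a lift from Euclidean coordinates onto the hyperboloid), consider the following; it is not the authors' code, and all names are illustrative.

```python
import numpy as np

def minkowski_dot(u, v):
    """Lorentzian inner product <u, v>_L = -u0*v0 + u1*v1 + ... + ud*vd."""
    return -u[..., 0] * v[..., 0] + np.sum(u[..., 1:] * v[..., 1:], axis=-1)

def lorentz_distance(u, v):
    """Geodesic distance on the hyperboloid: arccosh(-<u, v>_L)."""
    prod = np.clip(-minkowski_dot(u, v), 1.0, None)  # clip for numerical safety
    return np.arccosh(prod)

def expmap(x, v):
    """Exponential map at x for a tangent vector v (assumes <x, v>_L = 0)."""
    n = np.sqrt(np.clip(minkowski_dot(v, v), 1e-12, None))
    return np.cosh(n) * x + np.sinh(n) * v / n

def to_hyperboloid(y):
    """Lift a Euclidean vector y onto {x : <x, x>_L = -1, x0 > 0}."""
    x0 = np.sqrt(1.0 + np.sum(y * y, axis=-1, keepdims=True))
    return np.concatenate([x0, y], axis=-1)

# toy usage: distance between two lifted 2-D points
a = to_hyperboloid(np.array([0.3, -0.1]))
b = to_hyperboloid(np.array([-0.5, 0.2]))
print(lorentz_distance(a, b))
```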

    Scale-free network clustering in hyperbolic and other random graphs

    Random graphs with power-law degrees can model scale-free networks as sparse topologies with strong degree heterogeneity. Mathematical analysis of such random graphs has proved successful in explaining scale-free network properties such as resilience, navigability and small distances. We introduce a variational principle to explain how vertices tend to cluster in triangles as a function of their degrees. We apply the variational principle to the hyperbolic model, which is rapidly gaining popularity as a model for scale-free networks with latent geometries and clustering. We show that clustering in the hyperbolic model is non-vanishing and self-averaging, so that a single random graph sample is a good representation in the large-network limit. We also demonstrate the variational principle for some classical random graphs, including the preferential attachment model and the configuration model.
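For readers unfamiliar with the hyperbolic model the abstract refers to, the sketch below samples a standard hyperbolic random graph (radii with sinh-shaped density on a disk of radius R, uniform angles, edges between pairs at hyperbolic distance below R) and measures local clustering as a function of degree. It is an illustration under common parameter conventions, not the paper's variational analysis; the parameter values are assumptions.

```python
import numpy as np

def hyperbolic_rgg(n=500, R=None, alpha=0.75, seed=0):
    """Illustrative hyperbolic random graph: radii with density ~ sinh(alpha*r)
    on [0, R], uniform angles, edge iff hyperbolic distance < R."""
    rng = np.random.default_rng(seed)
    if R is None:
        R = 2.0 * np.log(n)  # a typical disk radius for sparse graphs (assumed)
    # inverse-CDF sampling: F(r) = (cosh(alpha*r) - 1) / (cosh(alpha*R) - 1)
    u = rng.uniform(size=n)
    r = np.arccosh(1.0 + u * (np.cosh(alpha * R) - 1.0)) / alpha
    theta = rng.uniform(0.0, 2.0 * np.pi, size=n)
    dtheta = np.pi - np.abs(np.pi - np.abs(theta[:, None] - theta[None, :]))
    # hyperbolic law of cosines: cosh d = cosh r1 cosh r2 - sinh r1 sinh r2 cos(dtheta)
    ch = (np.cosh(r[:, None]) * np.cosh(r[None, :])
          - np.sinh(r[:, None]) * np.sinh(r[None, :]) * np.cos(dtheta))
    d = np.arccosh(np.clip(ch, 1.0, None))
    A = (d < R).astype(int)
    np.fill_diagonal(A, 0)
    return A

def clustering_by_degree(A):
    """Mean local clustering coefficient for each degree value."""
    k = A.sum(axis=1)
    tri = np.diag(A @ A @ A) / 2.0            # triangles through each vertex
    with np.errstate(divide="ignore", invalid="ignore"):
        c = np.where(k > 1, 2.0 * tri / (k * (k - 1)), 0.0)
    return {int(deg): float(c[k == deg].mean()) for deg in np.unique(k) if deg > 1}

A = hyperbolic_rgg()
print(dict(list(clustering_by_degree(A).items())[:5]))  # first few degree classes
```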

    The Skeleton of Hyperbolic Graphs for Greedy Navigation

    Random geometric (hyperbolic) graphs are important modeling tools in analyzing real-world complex networks. Greedy navigation (routing) is one of the most promising information forwarding mechanisms in complex networks. This paper deals with the greedy navigability of complex graphs generated by using a metric (hyperbolic) space. Greedy navigability means that every source-destination pair in the graph can communicate in such a way that every node passes the information towards the neighboring node which is "closest" to the destination in terms of node coordinates in the metric space. A set of compulsory links in greedy navigable graphs, called the Greedy Skeleton, is identified. Because the two-dimensional hyperbolic plane (H2, also known as the two-dimensional Bolyai-Lobachevsky space [2]) turned out to be extremely useful in modelling and generating realistic networks, we deal with the statistical properties of the Greedy Skeleton when the metric space is H2. Some numerical studies and simulation results supporting the analytical formulae are also presented. The significance of the results lies in the fact that every (either artificial or natural) network formation process aiming at greedy navigability must contain this Greedy Skeleton. Furthermore, this could be an important step towards a formal argument for the very high greedy navigability of some models observed so far only experimentally, and also towards analyzing equilibria of greedy network navigation games on H2.
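A minimal sketch of greedy navigation over H2 coordinates may help make "greedy navigability" concrete: each node forwards to whichever neighbor is hyperbolically closest to the destination, and routing fails when no neighbor is strictly closer. The coordinate format, graph representation, and toy example below are assumptions for illustration, not the paper's setup.

```python
import math

def hyp_dist(p, q):
    """Hyperbolic distance between p=(r1, t1) and q=(r2, t2) in native polar coordinates."""
    r1, t1 = p
    r2, t2 = q
    dt = math.pi - abs(math.pi - abs(t1 - t2))   # angular difference in [0, pi]
    ch = math.cosh(r1) * math.cosh(r2) - math.sinh(r1) * math.sinh(r2) * math.cos(dt)
    return math.acosh(max(ch, 1.0))

def greedy_route(adj, coords, src, dst, max_hops=100):
    """Greedy navigation: repeatedly forward to the neighbor closest to dst.
    Returns the path, or None if stuck at a local minimum (no closer neighbor)."""
    path, cur = [src], src
    for _ in range(max_hops):
        if cur == dst:
            return path
        # neighbor that minimizes hyperbolic distance to the destination
        nxt = min(adj[cur], key=lambda v: hyp_dist(coords[v], coords[dst]), default=None)
        if nxt is None or hyp_dist(coords[nxt], coords[dst]) >= hyp_dist(coords[cur], coords[dst]):
            return None   # greedy routing fails: dead end / local minimum
        path.append(nxt)
        cur = nxt
    return None

# toy usage: a 4-node path graph with hand-placed (r, theta) coordinates
coords = {0: (3.0, 0.1), 1: (2.0, 0.4), 2: (1.5, 1.0), 3: (3.2, 1.5)}
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(greedy_route(adj, coords, 0, 3))   # [0, 1, 2, 3]
```

In this reading, the Greedy Skeleton is the set of edges that such a forwarding rule is forced to use for some source-destination pair.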

    Hyperbolic Geometry of Complex Networks

    We develop a geometric framework to study the structure and function of complex networks. We assume that hyperbolic geometry underlies these networks, and we show that with this assumption, heterogeneous degree distributions and strong clustering in complex networks emerge naturally as simple reflections of the negative curvature and metric property of the underlying hyperbolic geometry. Conversely, we show that if a network has some metric structure, and if the network degree distribution is heterogeneous, then the network has an effective hyperbolic geometry underneath. We then establish a mapping between our geometric framework and statistical mechanics of complex networks. This mapping interprets edges in a network as non-interacting fermions whose energies are hyperbolic distances between nodes, while the auxiliary fields coupled to edges are linear functions of these energies or distances. The geometric network ensemble subsumes the standard configuration model and classical random graphs as two limiting cases with degenerate geometric structures. Finally, we show that targeted transport processes without global topology knowledge, made possible by our geometric framework, are maximally efficient, according to all efficiency measures, in networks with strongest heterogeneity and clustering, and that this efficiency is remarkably robust with respect to even catastrophic disturbances and damages to the network structure.
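The fermionic interpretation in the abstract corresponds to a Fermi-Dirac connection probability in the hyperbolic distance. The snippet below sketches it under a common parameterization, with the distance x playing the role of an energy, the disk radius R acting as a chemical potential and T as a temperature; the 1/(2T) scaling is an assumed convention that may differ from the paper's exact one.

```python
import numpy as np

def fermi_dirac_p(x, R, T):
    """Illustrative connection probability of a geometric ensemble: a
    Fermi-Dirac function of the hyperbolic distance x, with 'chemical
    potential' R and temperature T (parameterization assumed)."""
    return 1.0 / (1.0 + np.exp((x - R) / (2.0 * T)))

# at x = R the probability is exactly 1/2; far beyond R it decays exponentially
for x in (5.0, 10.0, 12.0, 15.0):
    print(x, round(float(fermi_dirac_p(x, R=10.0, T=0.5)), 4))
```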