Theoretically Expressive and Edge-aware Graph Learning
We propose a new Graph Neural Network that combines recent advancements in
the field. We give theoretical contributions by proving that the model is
strictly more general than the Graph Isomorphism Network and the Gated Graph
Neural Network, as it can approximate the same functions and deal with
arbitrary edge values. Then, we show how information from a single node can
flow through the graph unchanged.
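The abstract above combines GIN-style aggregation with arbitrary edge values. A minimal sketch of that idea, assuming a sum aggregator whose messages are scaled by per-edge weights (the function name and weighting scheme are illustrative, not the paper's architecture; the per-layer MLP that GIN applies after aggregation is omitted):

```python
import numpy as np

def gin_edge_layer(h, edges, edge_w, eps=0.0):
    # GIN-style injective sum aggregation, with each message scaled by its
    # edge value (hypothetical sketch; GIN's follow-up MLP is omitted).
    agg = np.zeros_like(h)
    for (src, dst), w in zip(edges, edge_w):
        agg[dst] += w * h[src]      # edge value modulates the message
    return (1 + eps) * h + agg      # (1 + eps) * self + neighbor sum

# toy graph: 3 nodes with one-hot features, two weighted directed edges
h = np.eye(3)
edges = [(0, 1), (1, 2)]
edge_w = [2.0, 0.5]
out = gin_edge_layer(h, edges, edge_w)
```

With `eps = 0` and one-hot inputs, node 1's output is its own feature plus twice node 0's, illustrating how an edge value of 1 would let a single node's information pass through unchanged.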
Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph
Heterogeneous graph learning has drawn significant attention in recent
years, due to the success of graph neural networks (GNNs) and the broad
applications of heterogeneous information networks. Various heterogeneous graph
neural networks have been proposed to generalize GNNs for processing the
heterogeneous graphs. Unfortunately, these approaches model the heterogeneity
via various complicated modules. This paper proposes a simple yet efficient
framework that equips homogeneous GNNs to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based
Graph Neural Networks (RE-GNNs), which employ only one parameter per relation
to embed the importance of edge-type relations and self-loop connections. To
optimize these relation embeddings and the other parameters simultaneously, a
gradient scaling factor is proposed to constrain the embeddings to converge to
suitable values. Besides, we theoretically demonstrate that our RE-GNNs have
more expressive power than the meta-path based heterogeneous GNNs. Extensive
experiments on the node classification tasks validate the effectiveness of our
proposed method.
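The core RE-GNN idea above, one scalar parameter per relation plus one for the self-loop, can be sketched as follows (function and argument names are illustrative assumptions, not the paper's API; the gradient scaling factor is not shown):

```python
import numpy as np

def re_gnn_layer(h, edges_by_rel, rel_embed, self_embed):
    # Homogeneous GNN aggregation made relation-aware by a single learnable
    # scalar per edge type (sketch of the RE-GNN idea, names hypothetical).
    out = self_embed * h                  # self-loop scaled by its embedding
    for rel, edges in edges_by_rel.items():
        w = rel_embed[rel]                # one parameter per relation
        for src, dst in edges:
            out[dst] += w * h[src]        # message scaled by relation weight
    return out

# toy heterogeneous graph: 2 nodes, one "cites" edge
h = np.array([[1.0, 0.0], [0.0, 1.0]])
edges_by_rel = {"cites": [(0, 1)]}
rel_embed = {"cites": 0.5}
out = re_gnn_layer(h, edges_by_rel, rel_embed, self_embed=1.0)
```

In training, `rel_embed` and `self_embed` would be learned jointly with the GNN weights; the abstract's gradient scaling factor addresses exactly that joint optimization.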
Sampled in Pairs and Driven by Text: A New Graph Embedding Framework
In graphs with rich texts, incorporating textual information with structural
information helps construct expressive graph embeddings. Among various graph
embedding models, random walk (RW)-based methods form one of the most popular
and successful families. However, they face two issues when applied to graphs
with rich texts: (i) sampling efficiency: starting from the training objective
of RW-based models (e.g., DeepWalk and node2vec), we show that they are likely
to generate large amounts of redundant training samples, owing to three main
drawbacks; and (ii) text utilization: these models have
difficulty in dealing with zero-shot scenarios where graph embedding models
have to infer graph structures directly from texts. To solve these problems, we
propose a novel framework, namely Text-driven Graph Embedding with Pairs
Sampling (TGE-PS). TGE-PS uses Pairs Sampling (PS) to improve the sampling
strategy of RW, reducing training samples by ~99% while preserving competitive
performance. TGE-PS uses Text-driven Graph Embedding (TGE), an
inductive graph embedding approach, to generate node embeddings from texts.
Since each node contains rich texts, TGE is able to generate high-quality
embeddings and provide reasonable predictions on the existence of links to
unseen nodes. We evaluate TGE-PS on several real-world datasets, and experimental
results demonstrate that TGE-PS produces state-of-the-art results on both
traditional and zero-shot link prediction tasks.
Comment: Accepted by WWW 2019 (The World Wide Web Conference, ACM, 2019).
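The contrast above, drawing training pairs directly rather than enumerating windows over long random walks, can be sketched as below (the function and sampling scheme are illustrative assumptions; the paper's actual Pairs Sampling strategy is more elaborate):

```python
import random

def pair_sample(adj, num_pairs, seed=0):
    # Illustrative pairs sampling: draw (node, context) training pairs
    # directly, instead of sliding a window over generated random walks.
    rng = random.Random(seed)
    nodes = [n for n in adj if adj[n]]    # nodes with at least one neighbor
    pairs = []
    for _ in range(num_pairs):
        u = rng.choice(nodes)
        v = rng.choice(adj[u])            # positive context from u's neighborhood
        pairs.append((u, v))
    return pairs

# toy graph as an adjacency dict
adj = {0: [1, 2], 1: [0], 2: [0]}
pairs = pair_sample(adj, 4)
```

Because each draw yields exactly one training pair, the sampler's budget is controlled directly, which is the mechanism behind the claimed reduction in redundant samples.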
A Survey on Graph Representation Learning Methods
Graph representation learning has been a very active research area in recent
years. The goal of graph representation learning is to generate graph
representation vectors that capture the structure and features of large graphs
accurately. This is especially important because the quality of the graph
representation vectors will affect the performance of these vectors in
downstream tasks such as node classification, link prediction and anomaly
detection. Many techniques have been proposed for generating effective graph
representation vectors. Two of the most prevalent categories of graph
representation learning are graph embedding methods that do not use graph
neural networks (GNNs), which we denote as non-GNN based methods, and
GNN-based methods. Non-GNN graph embedding methods are based on
techniques such as random walks, temporal point processes and neural network
learning methods. GNN-based methods, on the other hand, are the application of
deep learning on graph data. In this survey, we provide an overview of these
two categories and cover the current state-of-the-art methods for both static
and dynamic graphs. Finally, we explore some open and ongoing research
directions for future work.
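Of the non-GNN techniques the survey lists, random-walk methods are the simplest to illustrate. A minimal DeepWalk-style walk generator, assuming the resulting walk corpus is then fed to a skip-gram model to learn node embeddings (a sketch, not any surveyed system's code):

```python
import random

def random_walks(adj, walk_len, walks_per_node, seed=0):
    # Generate fixed-length random walks from every node; each walk acts
    # as a "sentence" for downstream skip-gram embedding training.
    rng = random.Random(seed)
    walks = []
    for start in adj:
        for _ in range(walks_per_node):
            walk = [start]
            while len(walk) < walk_len and adj[walk[-1]]:
                walk.append(rng.choice(adj[walk[-1]]))  # step to a neighbor
            walks.append(walk)
    return walks

# toy path graph 0 - 1 - 2
adj = {0: [1], 1: [0, 2], 2: [1]}
walks = random_walks(adj, walk_len=4, walks_per_node=2)
```

node2vec follows the same shape but biases the neighbor choice with return and in-out parameters to interpolate between breadth-first and depth-first exploration.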