Quaternion Graph Neural Networks
Recently, graph neural networks (GNNs) have become a principal research direction
for learning low-dimensional continuous embeddings of nodes and graphs to predict
node and graph labels, respectively. However, Euclidean embeddings suffer high
distortion when GNNs are used to model complex graphs such as social networks.
Furthermore, existing GNNs are not parameter-efficient: the number of model
parameters grows quickly as the number of hidden layers increases. Therefore, we move
beyond the Euclidean space to a hyper-complex vector space to improve graph
representation quality and reduce the number of model parameters. To this end,
we propose quaternion graph neural networks (QGNN) to generalize GCNs within
the Quaternion space to learn quaternion embeddings for nodes and graphs. The
Quaternion space, a hyper-complex vector space, provides highly meaningful
computations through the Hamilton product compared to the Euclidean and complex
vector spaces. As a result, our QGNN can reduce the model size by up to four times
and learn better graph representations. Experimental results show
that the proposed QGNN produces state-of-the-art accuracies on a range of
well-known benchmark datasets for three downstream tasks, including graph
classification, semi-supervised node classification, and text (node)
classification. Our code is available at: https://github.com/daiquocnguyen/QGNN
Comment: The extended abstract has been accepted to the NeurIPS 2020 Workshop on
Differential Geometry meets Deep Learning (DiffGeo4DL). The code in PyTorch
and TensorFlow is available at: https://github.com/daiquocnguyen/QGNN
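The Hamilton product mentioned in the abstract is the standard multiplication of quaternions. A minimal sketch of it is below; this illustrates the operation itself, not the paper's QGNN layer.

```python
import numpy as np

def hamilton_product(p, q):
    """Hamilton product of two quaternions, each given as (r, x, y, z)
    for r + x*i + y*j + z*k. Note the product is non-commutative."""
    r1, x1, y1, z1 = p
    r2, x2, y2, z2 = q
    return np.array([
        r1 * r2 - x1 * x2 - y1 * y2 - z1 * z2,  # real part
        r1 * x2 + x1 * r2 + y1 * z2 - z1 * y2,  # i component
        r1 * y2 - x1 * z2 + y1 * r2 + z1 * x2,  # j component
        r1 * z2 + x1 * y2 - y1 * x2 + z1 * r2,  # k component
    ])

# Sanity check of the quaternion identity i * j = k:
i = (0.0, 1.0, 0.0, 0.0)
j = (0.0, 0.0, 1.0, 0.0)
print(hamilton_product(i, j))  # → [0. 0. 0. 1.], i.e. k
```

The parameter saving the abstract reports is intuitive from this operation: a quaternion "weight" uses 4 real parameters yet mixes all 4 components of its input, whereas an unconstrained real linear map between the same 4-dimensional spaces needs 16, hence the up-to-four-times reduction.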
Two-view Graph Neural Networks for Knowledge Graph Completion
We present an effective GNN-based knowledge graph embedding model, named WGE,
to capture entity- and relation-focused graph structures. In particular, given
the knowledge graph, WGE builds a single undirected entity-focused graph that
views entities as nodes. In addition, WGE also constructs another single
undirected graph from relation-focused constraints, which views entities and
relations as nodes. WGE then employs a GNN-based architecture to learn better
vector representations of entities and relations from these two entity-
and relation-focused graphs. WGE feeds the learned entity and relation
representations into a weighted score function to return the triple scores for
knowledge graph completion. Experimental results show that WGE outperforms
competitive baselines, obtaining state-of-the-art performances on seven
benchmark datasets for knowledge graph completion.
Comment: 13 pages; 3 tables; 3 figures
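The two-view construction described in the abstract can be sketched as follows. The exact edge conventions (undirected edges between co-occurring entities; a relation node linked to its head and tail) are assumptions for illustration, not the paper's precise definitions.

```python
def build_two_views(triples):
    """Build the two graph views from (head, relation, tail) triples.

    Returns:
      entity_view:   undirected edges between entities that co-occur in a triple
                     (entities as nodes).
      relation_view: undirected edges linking each relation node to its head
                     and tail entities (entities AND relations as nodes).
    """
    entity_view = set()
    relation_view = set()
    for h, r, t in triples:
        # Entity-focused view: connect the two entities of the triple.
        entity_view.add(frozenset((h, t)))
        # Relation-focused view: tag relation nodes so they cannot collide
        # with entity names, and connect the relation to both entities.
        rel_node = ("rel", r)
        relation_view.add(frozenset((h, rel_node)))
        relation_view.add(frozenset((rel_node, t)))
    return entity_view, relation_view

# Example: one triple yields one entity-view edge and two relation-view edges.
e_view, r_view = build_two_views([("barack_obama", "born_in", "hawaii")])
print(len(e_view), len(r_view))  # → 1 2
```

Frozensets are used as undirected edges so that (h, t) and (t, h) count once; a GNN library would consume these edge sets after indexing the nodes.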