On Filter Size in Graph Convolutional Networks
Recently, many researchers have been focusing on the definition of neural
networks for graphs. The basic component for many of these approaches remains
the graph convolution idea proposed almost a decade ago. In this paper, we
extend this basic component, following an intuition derived from the well-known
convolutional filters over multi-dimensional tensors. In particular, we derive
a simple, efficient and effective way to introduce a hyper-parameter on graph
convolutions that influences the filter size, i.e. its receptive field over the
considered graph. We show with experimental results on real-world graph
datasets that the proposed graph convolutional filter improves the predictive
performance of Deep Graph Convolutional Networks.

Comment: arXiv admin note: text overlap with arXiv:1811.0693
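To make the "filter size" idea concrete, here is a minimal sketch of a graph convolution with a receptive-field hyper-parameter k, realised as k applications of a normalised propagation matrix. This is an illustration of the general concept, not the paper's exact formulation; the function name and the propagation scheme are assumptions.

```python
import numpy as np

def k_hop_graph_conv(A, X, W, k=2):
    """Illustrative graph convolution whose receptive field spans
    k hops: propagate node features k times over a normalised
    adjacency matrix, then apply a shared weight matrix.
    (A sketch of a 'filter size' hyper-parameter, not the paper's
    exact operator.)"""
    n = A.shape[0]
    # Add self-loops and symmetrically normalise, as in standard GCNs.
    A_hat = A + np.eye(n)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    S = D_inv_sqrt @ A_hat @ D_inv_sqrt
    # Larger k -> information from nodes up to k hops away.
    H = X
    for _ in range(k):
        H = S @ H
    return H @ W

# Toy example: a path graph 0-1-2-3 with one-hot node features.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.eye(4)
W = np.random.randn(4, 2)
out = k_hop_graph_conv(A, X, W, k=2)
print(out.shape)  # (4, 2)
```

With k=1 this reduces to the familiar one-hop graph convolution; increasing k widens the filter without adding layers.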
Graph Kernels Exploiting Weisfeiler-Lehman Graph Isomorphism Test Extensions
In this paper we present a novel graph kernel framework inspired by the Weisfeiler-Lehman (WL) isomorphism tests. Any WL test comprises a relabelling phase of the nodes based on test-specific information extracted from the graph, for example the set of neighbours of a node. We define a novel relabelling scheme and derive two kernels of the framework from it. The novel kernels are very fast to compute and achieve state-of-the-art results on five real-world datasets.
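The relabelling phase at the heart of any WL test can be sketched as follows: each node's new label compresses its current label together with the multiset of its neighbours' labels. This is the classic 1-WL relabelling, shown for illustration; the paper's own relabelling is a novel variant not reproduced here.

```python
def wl_relabel(adj, labels, iterations=2):
    """Classic 1-WL relabelling loop (illustrative sketch, not the
    paper's novel relabelling): each node's new label is a
    compression of (own label, sorted multiset of neighbour labels)."""
    for _ in range(iterations):
        # Build each node's signature from test-specific information:
        # here, its label plus the multiset of its neighbours' labels.
        signatures = {
            v: (labels[v], tuple(sorted(labels[u] for u in adj[v])))
            for v in adj
        }
        # Compress each distinct signature to a fresh integer label.
        compress = {}
        new_labels = {}
        for v, sig in signatures.items():
            if sig not in compress:
                compress[sig] = len(compress)
            new_labels[v] = compress[sig]
        labels = new_labels
    return labels

# Toy graph: triangle 0-1-2 with a pendant node 3 attached to 2.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
labels = {v: 0 for v in adj}
print(wl_relabel(adj, labels))  # {0: 0, 1: 0, 2: 1, 3: 2}
```

A WL-based kernel then typically compares two graphs by counting, per iteration, how many refined labels they share; the speed of the framework's kernels comes from this relabel-and-count structure.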