Graph neural networks (GNNs) for temporal graphs have recently attracted
increasing attention, where a common assumption is that the class set of
nodes is closed. In real-world scenarios, however, the class set is often
open and grows dynamically over time.
This brings two major challenges to existing dynamic GNN methods: (i) how to
propagate appropriate information in an open temporal graph, where nodes of
new classes are often linked to nodes of old classes. Such links create a
sharp conflict: typical GNNs tend to make the embeddings of connected nodes
similar, whereas we expect the embeddings of two interacting nodes from
different classes to remain distinguishable. (ii) How to avoid catastrophic
forgetting of old classes when learning the new classes that appear in
temporal graphs. In this paper,
we propose a general and principled learning approach for open temporal graphs,
called OTGNet, with the goal of addressing the above two challenges. We assume
that the knowledge of a node can be disentangled into a class-relevant and a
class-agnostic component, and accordingly explore a new message passing
mechanism that extends the information bottleneck principle to propagate only
class-agnostic knowledge between nodes of different classes, avoiding the
aggregation of conflicting information. Moreover, we devise a strategy to select both important and
diverse triad sub-graph structures for effective class-incremental learning.
Extensive experiments on three real-world datasets from different domains
demonstrate the superiority of our method compared to the baselines.

Comment: ICLR 2023 Oral
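
To illustrate the idea of propagating only class-agnostic knowledge across class boundaries, here is a minimal PyTorch sketch. It is not the paper's actual implementation: the linear encoders for the class-relevant and class-agnostic parts, the mean aggregation, and the feature dimensions are all illustrative assumptions.

```python
# Minimal sketch of class-aware message passing (illustrative assumptions only):
# each node embedding is split by two learned encoders into a class-relevant
# part and a class-agnostic part; only the class-agnostic part is sent along
# edges whose endpoints belong to different classes.
import torch
import torch.nn as nn


class ClassAwareMessagePassing(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.rel_enc = nn.Linear(dim, dim)   # class-relevant encoder (assumed)
        self.agn_enc = nn.Linear(dim, dim)   # class-agnostic encoder (assumed)
        self.update = nn.Linear(2 * dim, dim)

    def forward(self, x, edge_index, labels):
        # x: [N, dim] node embeddings; edge_index: [2, E]; labels: [N]
        src, dst = edge_index
        rel, agn = self.rel_enc(x), self.agn_enc(x)
        same_class = (labels[src] == labels[dst]).float().unsqueeze(-1)
        # Same-class edges carry the full message (relevant + agnostic);
        # cross-class edges carry only the class-agnostic message.
        msg = same_class * (rel[src] + agn[src]) + (1 - same_class) * agn[src]
        agg = torch.zeros_like(x).index_add_(0, dst, msg)
        deg = torch.zeros(x.size(0), 1).index_add_(
            0, dst, torch.ones(src.size(0), 1)).clamp(min=1)
        return torch.relu(self.update(torch.cat([x, agg / deg], dim=-1)))


# Tiny usage example: a 4-node graph where node 3 belongs to a new class,
# so messages to and from node 3 keep only the class-agnostic component.
x = torch.randn(4, 8)
edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 0]])
labels = torch.tensor([0, 0, 0, 1])
out = ClassAwareMessagePassing(dim=8)(x, edge_index, labels)
print(out.shape)  # torch.Size([4, 8])
```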