Dynamic Label Graph Matching for Unsupervised Video Re-identification
© 2017 IEEE. Label estimation is an important component in an unsupervised person re-identification (re-ID) system. This paper focuses on cross-camera label estimation, which can subsequently be used in feature learning to learn robust re-ID models. Specifically, we propose to construct a graph for the samples in each camera, and a graph matching scheme is then introduced for cross-camera labeling association. Because labels directly output by existing graph matching methods may be noisy and inaccurate due to significant cross-camera variations, this paper proposes a dynamic graph matching (DGM) method. DGM iteratively updates the image graph and the label estimation process by learning a better feature space with the intermediate estimated labels. DGM is advantageous in two aspects: 1) the accuracy of the estimated labels improves significantly over the iterations; 2) DGM is robust to noisy initial training data. Extensive experiments conducted on three benchmarks, including the large-scale MARS dataset, show that DGM yields performance competitive with fully supervised baselines and outperforms competing unsupervised learning methods.
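The alternating scheme described in the abstract can be sketched as follows. This is a toy illustration, not the paper's algorithm: `match_labels` stands in for the graph matching step with a simple nearest-neighbour assignment, and the "learn a better feature space" step is replaced by pulling matched pairs toward shared centroids. All names are ours.

```python
import numpy as np

def match_labels(feats_a, feats_b):
    """Stand-in for cross-camera graph matching: assign each sample in
    camera A to its nearest neighbour in camera B."""
    dists = np.linalg.norm(feats_a[:, None, :] - feats_b[None, :, :], axis=-1)
    return dists.argmin(axis=1)  # index into camera B for each A sample

def dynamic_graph_matching(feats_a, feats_b, n_iters=5):
    """Toy sketch of DGM's alternation: estimate labels, refine the
    feature space with those labels, and repeat."""
    a = feats_a.astype(float).copy()
    b = feats_b.astype(float).copy()
    labels = match_labels(a, b)
    for _ in range(n_iters):
        labels = match_labels(a, b)
        # Toy stand-in for metric learning: pull matched pairs toward
        # their shared centroid so the next matching round is easier.
        centroids = (a + b[labels]) / 2.0
        a = 0.5 * a + 0.5 * centroids
        b[labels] = 0.5 * b[labels] + 0.5 * centroids
    return labels
```

With well-separated identities and mild noise, the estimated labels stabilise after a few iterations.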
Combating Bilateral Edge Noise for Robust Link Prediction
Although link prediction on graphs has achieved great success with the
development of graph neural networks (GNNs), the potential robustness under the
edge noise is still less investigated. To close this gap, we first conduct an
empirical study to disclose that the edge noise bilaterally perturbs both input
topology and target label, yielding severe performance degradation and
representation collapse. To address this dilemma, we propose an
information-theory-guided principle, Robust Graph Information Bottleneck
(RGIB), to extract reliable supervision signals and avoid representation
collapse. Different from the basic information bottleneck, RGIB further
decouples and balances the mutual dependence among graph topology, target
labels, and representation, building new learning objectives for robust
representation against the bilateral noise. Two instantiations, RGIB-SSL and
RGIB-REP, are explored to leverage the merits of different methodologies, i.e.,
self-supervised learning and data reparameterization, for implicit and explicit
data denoising, respectively. Extensive experiments on six datasets and three
GNNs with diverse noisy scenarios verify the effectiveness of our RGIB
instantiations. The code is publicly available at:
https://github.com/tmlr-group/RGIB.

Comment: Accepted by NeurIPS 202
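A minimal sketch of what an RGIB-SSL-style objective might look like, assuming a supervised link-prediction loss plus an alignment term between embeddings of two stochastically augmented graph views. The actual RGIB objective balances mutual-information terms among topology, labels, and representation and is more involved; every function and parameter name here is hypothetical.

```python
import numpy as np

def rgib_ssl_loss(z1, z2, logits, labels, lam=0.5):
    """Hypothetical RGIB-SSL-flavoured objective: supervised binary
    cross-entropy on predicted links, plus a self-supervised alignment
    term that asks the same node to embed similarly under two augmented
    views, implicitly regularising against edge noise."""
    # supervised binary cross-entropy on link logits
    p = 1.0 / (1.0 + np.exp(-logits))
    sup = -np.mean(labels * np.log(p + 1e-9) + (1 - labels) * np.log(1 - p + 1e-9))
    # alignment: squared distance between the two views' node embeddings
    align = np.mean(np.sum((z1 - z2) ** 2, axis=1))
    return sup + lam * align
```

Identical views incur only the supervised term; perturbed views add an alignment penalty.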
Time-aware Graph Structure Learning via Sequence Prediction on Temporal Graphs
Temporal Graph Learning, which aims to model the time-evolving nature of
graphs, has gained increasing attention and achieved remarkable performance
recently. However, in reality, graph structures are often incomplete and noisy,
which hinders temporal graph networks (TGNs) from learning informative
representations. Graph contrastive learning uses data augmentation to generate
plausible variations of existing data and learn robust representations.
However, rule-based augmentation approaches may be suboptimal as they lack
learnability and fail to leverage rich information from downstream tasks. To
address these issues, we propose a Time-aware Graph Structure Learning (TGSL)
approach via sequence prediction on temporal graphs, which learns better graph
structures for downstream tasks through adding potential temporal edges. In
particular, it predicts time-aware context embedding based on previously
observed interactions and uses the Gumbel-Top-K to select the closest candidate
edges to this context embedding. Additionally, several candidate sampling
strategies are proposed to ensure both efficiency and diversity. Furthermore,
we jointly learn the graph structure and TGNs in an end-to-end manner and
perform inference on the refined graph. Extensive experiments on temporal link
prediction benchmarks demonstrate that TGSL yields significant gains for the
popular TGNs such as TGAT and GraphMixer, and it outperforms other contrastive
learning methods on temporal graphs. We will release the code in the future.

Comment: 10 pages, 4 figures, 5 tables
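The Gumbel-Top-K trick mentioned in the abstract is a standard technique for sampling k items without replacement in proportion to their scores: add i.i.d. Gumbel noise to the log-scores and keep the top-k indices. A minimal sketch (the helper name is ours, not TGSL's):

```python
import numpy as np

def gumbel_top_k(scores, k, rng):
    """Sample k distinct indices with probability proportional to
    `scores` (all positive) via the Gumbel-Top-K trick: perturb the
    log-scores with Gumbel(0, 1) noise and take the k largest."""
    gumbel = -np.log(-np.log(rng.uniform(size=scores.shape)))
    return np.argsort(-(np.log(scores) + gumbel))[:k]
```

In TGSL's setting the scores would come from similarity between candidate edges and the predicted time-aware context embedding; setting k to the full candidate count yields a random permutation.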
Robust Knowledge Adaptation for Dynamic Graph Neural Networks
Graph-structured data are often dynamic in nature, e.g., through the
addition of links and nodes, in many real-world applications. Recent years have
witnessed increasing attention paid to dynamic graph neural networks for
modelling such graph data, where almost all the existing approaches assume that
when a new link is built, the embeddings of the neighbor nodes should be
updated by learning the temporal dynamics to propagate new information.
However, such approaches suffer from the limitation that if the node introduced
by a new connection contains noisy information, propagating its knowledge to
other nodes is unreliable and can even lead to the collapse of the model. In
this paper, we propose AdaNet: a robust knowledge Adaptation framework via
reinforcement learning for dynamic graph neural Networks. In contrast to
previous approaches, which immediately update the embeddings of the neighbor
nodes once a new link is added, AdaNet attempts to adaptively determine which nodes
should be updated because of the new link. Since the decision to update the
embedding of one neighbor node greatly affects the other neighbor nodes, we
formulate node-update selection as a sequential decision problem and address
it via reinforcement
learning. By this means, we can adaptively propagate knowledge to other nodes
for learning robust node embedding representations. To the best of our
knowledge, our approach constitutes the first attempt to explore robust
knowledge adaptation via reinforcement learning for dynamic graph neural
networks. Extensive experiments on three benchmark datasets demonstrate that
AdaNet achieves state-of-the-art performance. In addition, we perform
experiments that add different degrees of noise to the datasets,
quantitatively and qualitatively illustrating the robustness of AdaNet.

Comment: 14 pages, 6 figures
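AdaNet's selective propagation can be illustrated with a toy sketch. The paper learns the selection policy with reinforcement learning; here a fixed threshold on a hypothetical per-neighbour noise score stands in for that learned policy, and all names are ours.

```python
import numpy as np

def select_and_propagate(new_emb, neighbor_embs, noise_scores, threshold=0.5):
    """Toy stand-in for AdaNet's idea: when a new link arrives, propagate
    its embedding only to neighbours the policy judges reliable, leaving
    the rest untouched. `noise_scores` is a hypothetical reliability
    signal; the real policy is learned via reinforcement learning."""
    updated = neighbor_embs.astype(float).copy()
    selected = []
    for i, score in enumerate(noise_scores):
        if score < threshold:  # policy says: safe to update this neighbour
            updated[i] = 0.5 * updated[i] + 0.5 * new_emb
            selected.append(i)
    return updated, selected
```

Neighbours flagged as noisy keep their old embeddings, so a corrupted new node cannot pollute the rest of the graph.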