Distill2Vec: Dynamic Graph Representation Learning with Knowledge Distillation
Dynamic graph representation learning strategies rely on different neural
architectures to capture graph evolution over time. However, these
architectures require a large number of parameters to train and suffer from
high online inference latency, that is, many model parameters must be updated
when new data arrive online. In this study, we propose Distill2Vec, a knowledge
distillation strategy that trains a compact model with a small number of
trainable parameters, so as to reduce online inference latency while
maintaining high model accuracy. We design a distillation loss
function based on Kullback-Leibler divergence to transfer the acquired
knowledge from a teacher model trained on offline data to a small-size student
model for online data. Our experiments with publicly available datasets show
the superiority of our proposed model over several state-of-the-art approaches
with relative gains of up to 5% on the link prediction task. In addition, we
demonstrate the effectiveness of our knowledge distillation strategy in terms
of the number of required parameters, where Distill2Vec achieves a compression
ratio of up to 7:100 compared with baseline approaches. For reproducibility,
our implementation is publicly available at
https://stefanosantaris.github.io/Distill2Vec
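The abstract describes the distillation objective only at a high level. A minimal sketch of a Kullback-Leibler distillation loss on temperature-softened class distributions might look as follows; the function names, the temperature value, and the direction of the divergence are illustrative assumptions, not details taken from the paper:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) between softened output distributions.

    The student is trained to minimize this term, pulling its
    predictions toward the teacher's softened predictions.
    """
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
```

The loss is zero when the student exactly matches the teacher and grows as their softened distributions diverge.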
A Survey on Dynamic Network Embedding
Real-world networks are composed of diverse interacting and evolving
entities, yet most existing research characterizes them as static networks,
without considering the evolution occurring in dynamic networks. Recently,
significant progress has been made in tracking the properties of dynamic
networks, exploiting changes in entities and links to devise network embedding
techniques. Compared to the widely studied static network embedding methods,
dynamic network embedding endeavors to encode nodes as low-dimensional dense
representations that effectively preserve both the network structure and the
temporal dynamics, which benefits a variety of downstream machine learning
tasks. In this paper, we conduct a systematic survey of dynamic network
embedding. Specifically, we describe the basic concepts of dynamic network
embedding and propose a novel taxonomy of existing techniques, including
matrix factorization based, Skip-Gram based, autoencoder based, neural network
based, and other embedding methods. Additionally, we summarize the commonly
used datasets and the wide variety of downstream tasks that dynamic network
embedding can benefit. Finally, we discuss several challenges that existing
algorithms face and outline possible directions for future research, such as
dynamic embedding models, large-scale dynamic networks, heterogeneous dynamic
networks, dynamic attributed networks, task-oriented dynamic network embedding,
and more embedding spaces.
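Skip-Gram based dynamic embedding methods, one branch of the taxonomy above, typically feed node sequences from temporal walks into a word2vec-style model. A minimal sketch of generating a time-respecting random walk over timestamped edges follows; the data layout and function names are illustrative assumptions, not from the survey:

```python
import random
from collections import defaultdict

def temporal_walk(edges, start, length, rng=random):
    """Generate a time-respecting walk: each step traverses an edge
    whose timestamp is >= the timestamp of the previous step.

    edges: list of (u, v, t) tuples for an undirected dynamic graph.
    """
    # Adjacency list: node -> list of (neighbor, timestamp)
    adj = defaultdict(list)
    for u, v, t in edges:
        adj[u].append((v, t))
        adj[v].append((u, t))

    walk, current, min_t = [start], start, float("-inf")
    for _ in range(length - 1):
        # Only edges that do not go backward in time are eligible
        candidates = [(n, t) for n, t in adj[current] if t >= min_t]
        if not candidates:
            break  # walk is stuck; return what we have
        current, min_t = rng.choice(candidates)
        walk.append(current)
    return walk
```

The resulting walks play the role of sentences in Skip-Gram training, so that co-occurring nodes in time-ordered walks end up with similar embeddings.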
A Survey on Embedding Dynamic Graphs
Embedding static graphs in low-dimensional vector spaces plays a key role in
network analytics and inference, supporting applications like node
classification, link prediction, and graph visualization. However, many
real-world networks exhibit dynamic behavior, including topological evolution,
feature evolution, and diffusion. Therefore, several methods for embedding
dynamic graphs have been proposed to learn network representations over time,
facing novel challenges such as modeling the time domain, the temporal
features to be captured, and the temporal granularity to be embedded. In this
survey, we
overview dynamic graph embedding, discussing its fundamentals and the recent
advances developed so far. We introduce the formal definition of dynamic graph
embedding, focusing on the problem setting and introducing a novel taxonomy for
dynamic graph embedding input and output. We further explore the different
dynamic behaviors that embeddings may encompass, classifying them by
topological evolution, feature evolution, and processes on networks.
Afterward, we describe
existing techniques and propose a taxonomy for dynamic graph embedding
techniques based on algorithmic approaches, from matrix and tensor
factorization to deep learning, random walks, and temporal point processes. We
also elucidate main applications, including dynamic link prediction, anomaly
detection, and diffusion prediction, and we further state some promising
research directions in the area.
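Dynamic link prediction, one of the applications named above, typically reduces to scoring candidate node pairs with the learned embeddings. A minimal sketch of such a readout follows; this sigmoid inner-product scorer is a hypothetical example, not tied to any specific method in the survey:

```python
import math

def link_score(z_u, z_v):
    """Score a candidate edge by the inner product of two node
    embeddings, squashed to (0, 1) with a sigmoid. The embeddings
    themselves could come from any technique in the taxonomy
    (factorization, random walks, deep models, point processes)."""
    dot = sum(a * b for a, b in zip(z_u, z_v))
    return 1.0 / (1.0 + math.exp(-dot))
```

Pairs whose embeddings point in similar directions score close to 1 (likely future link), while dissimilar pairs score close to 0.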