Redundancy-Free Self-Supervised Relational Learning for Graph Clustering
Graph clustering, which learns the node representations for effective cluster
assignments, is a fundamental yet challenging task in data analysis and has
received considerable attention accompanied by graph neural networks in recent
years. However, most existing methods overlook the inherent relational
information among the non-independent and non-identically distributed nodes in
a graph. Due to the lack of exploration of relational attributes, the semantic
information of the graph-structured data fails to be fully exploited which
leads to poor clustering performance. In this paper, we propose a novel
self-supervised deep graph clustering method named Relational Redundancy-Free
Graph Clustering (RFGC) to tackle the problem. It extracts the attribute-
and structure-level relational information from both global and local views
based on an autoencoder and a graph autoencoder. To obtain effective
representations of the semantic information, we preserve the consistent
relation among augmented nodes, whereas the redundant relation is further
reduced for learning discriminative embeddings. In addition, a simple yet valid
strategy is utilized to alleviate the over-smoothing issue. Extensive
experiments are performed on widely used benchmark datasets to validate the
superiority of our RFGC over state-of-the-art baselines. Our codes are
available at https://github.com/yisiyu95/R2FGC.
Comment: Accepted by IEEE Transactions on Neural Networks and Learning Systems (TNNLS 2024).
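The consistency-vs-redundancy objective described in the abstract can be illustrated with a small sketch: relation matrices are computed for two augmented views of the node embeddings, their aligned (diagonal) entries are pulled toward agreement while off-diagonal cross-correlations are suppressed. This is an illustrative, Barlow-Twins-style formulation, not the paper's actual loss; all function names and the weighting factor are hypothetical.

```python
import numpy as np

def relation_matrix(z):
    """Cosine-similarity relation matrix among node embeddings (rows of z)."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    return z @ z.T

def redundancy_free_loss(z1, z2, redundancy_weight=0.1):
    """Illustrative objective: keep relations consistent across two augmented
    views (diagonal of the cross-correlation near 1) while reducing redundant
    relations (off-diagonal entries pushed toward 0)."""
    r1, r2 = relation_matrix(z1), relation_matrix(z2)
    cross = (r1 @ r2.T) / r1.shape[0]          # cross-view relation correlation
    diag = np.diag(cross)
    consistency = np.sum((diag - 1.0) ** 2)    # preserve consistent relations
    redundancy = np.sum((cross - np.diag(diag)) ** 2)  # suppress redundant ones
    return consistency + redundancy_weight * redundancy
```

In practice a loss of this shape would be minimized jointly with the autoencoder and graph-autoencoder reconstruction terms; here it only serves to make the "preserve consistent / reduce redundant relations" trade-off concrete.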
End-to-end Learning for Graph Decomposition
We propose a novel end-to-end trainable framework for the graph decomposition
problem. The minimum cost multicut problem is first converted to an
unconstrained binary cubic formulation where cycle consistency constraints are
incorporated into the objective function. The new optimization problem can be
viewed as a Conditional Random Field (CRF) in which the random variables are
associated with the binary edge labels of the initial graph and the hard
constraints are introduced in the CRF as high-order potentials. The parameters
of a standard Neural Network and the fully differentiable CRF are optimized in
an end-to-end manner. Furthermore, our method utilizes the cycle constraints as
meta-supervisory signals during the learning of the deep feature
representations by taking the dependencies between the output random variables
into account. We present analyses of the end-to-end learned representations,
showing the impact of the joint training, on the task of clustering images of
MNIST. We also validate the effectiveness of our approach both for the feature
learning and the final clustering on the challenging task of real-world
multi-person pose estimation.
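The cycle-consistency constraint at the heart of this formulation can be made concrete for the simplest cycles, triangles: a feasible multicut labeling may not cut exactly one edge of any cycle, since the two endpoints of that edge would still be connected through the rest of the cycle. A minimal sketch of such a high-order penalty (function name and data layout are hypothetical, not from the paper):

```python
def cycle_penalty(edge_label, triangles):
    """Count triangle cycles that violate multicut cycle consistency.

    edge_label: dict mapping frozenset({u, v}) -> 0 (join) or 1 (cut).
    triangles:  iterable of node triples (a, b, c) forming 3-cycles.
    A triangle with exactly one cut edge is infeasible: its endpoints
    remain connected via the other two edges.
    """
    penalty = 0
    for a, b, c in triangles:
        cuts = (edge_label[frozenset((a, b))]
                + edge_label[frozenset((b, c))]
                + edge_label[frozenset((a, c))])
        if cuts == 1:
            penalty += 1
    return penalty
```

In the paper's CRF view, terms of this kind appear as high-order potentials over the binary edge variables, so that violating cycles is penalized during end-to-end training rather than enforced as hard constraints at inference time.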
Deepened Graph Auto-Encoders Help Stabilize and Enhance Link Prediction
Graph neural networks have been used for a variety of learning tasks, such as
link prediction, node classification, and node clustering. Among them, link
prediction is a relatively under-studied graph learning task, with current
state-of-the-art models based on one- or two-layer shallow graph
auto-encoder (GAE) architectures. In this paper, we focus on addressing a
limitation of current methods for link prediction, which can only use shallow
GAEs and variational GAEs, and creating effective methods to deepen
(variational) GAE architectures to achieve stable and competitive performance.
Our proposed methods innovatively incorporate standard auto-encoders (AEs) into
the architectures of GAEs, where standard AEs are leveraged to learn essential,
low-dimensional representations via seamlessly integrating the adjacency
information and node features, while GAEs further build multi-scaled
low-dimensional representations via residual connections to learn a compact
overall embedding for link prediction. Empirically, extensive experiments on
various benchmarking datasets verify the effectiveness of our methods and
demonstrate the competitive performance of our deepened graph models for link
prediction. Theoretically, we prove that our deep extensions inclusively
express multiple polynomial filters with different orders.
Comment: 10 pages.
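The described architecture, an AE branch that compresses fused adjacency and feature information, followed by stacked graph-convolution layers with residual connections and an inner-product decoder for links, can be sketched in a forward pass. This is a simplified NumPy illustration under assumed shapes, not the authors' implementation; all names are hypothetical.

```python
import numpy as np

def normalize_adj(A):
    """Symmetric GCN-style normalization: D^{-1/2} (A + I) D^{-1/2}."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def deep_gae_embed(A, X, ae_weight, gcn_weights):
    """Sketch of a deepened GAE encoder.

    An AE-style layer first compresses the fused input A_hat @ X into a
    low-dimensional code; stacked GCN layers with residual connections then
    refine it into a multi-scale embedding without degrading with depth.
    """
    A_n = normalize_adj(A)
    z = np.tanh(A_n @ X @ ae_weight)        # AE branch: fuse structure + features
    for W in gcn_weights:
        z = np.tanh(A_n @ z @ W) + z        # residual connection stabilizes depth
    return z

def decode_links(z):
    """Inner-product decoder: sigmoid(z z^T) as link probabilities."""
    return 1.0 / (1.0 + np.exp(-z @ z.T))
```

The residual connections are the key device here: they let the stacked layers express a sum of polynomial filters of different orders, which matches the theoretical claim in the abstract, while keeping training stable as depth grows.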