Graph contrastive learning (GCL) learns augmentation-invariant representations from raw data and reduces dependence on labeled data. In recommendation systems, GCL models offer a potential remedy for insufficient supervision signals: they augment the user-item interaction graph and optimize an InfoNCE loss to learn user and item representations. However, existing GCL-based recommendation models suffer from dimensional collapse, which leads to sub-optimal recommendation performance. To tackle this problem, we propose a Wasserstein distance-based Graph Contrastive Learning model, named WGCL. Specifically, we integrate a Wasserstein loss into contrastive learning-based recommendation models to align the distribution of user/item representations with an isotropic Gaussian distribution, which makes the actual representation distribution more uniform and thereby alleviates dimensional collapse. Concretely, the Wasserstein loss measures the discrepancy between the actual distribution of entity representations and the desired distribution by computing the covariance of the representations learned from the augmented views. As a result, the Wasserstein distance metric not only makes the representations more uniformly distributed on the hypersphere but also better preserves the original semantic information of entities. Extensive experiments on three widely used datasets demonstrate that WGCL outperforms traditional recommendation models. Our code is released at https://github.com/Sodapease/WGC
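The covariance-based alignment described above can be sketched as follows. This is a minimal illustration, not the authors' released implementation: the function name is hypothetical, and it assumes the standard closed-form 2-Wasserstein distance between a Gaussian fitted to the learned representations and the target isotropic Gaussian N(0, σ²I), which depends on the representations only through their mean and covariance.

```python
import numpy as np

def wasserstein_to_isotropic(z: np.ndarray, sigma: float = 1.0) -> float:
    """Squared 2-Wasserstein distance between the Gaussian fitted to the
    representations z (shape n x d) and an isotropic Gaussian N(0, sigma^2 I).

    For Gaussians N(mu, C) and N(0, sigma^2 I) the closed form is:
        ||mu||^2 + Tr(C) + d * sigma^2 - 2 * sigma * Tr(C^{1/2}).
    """
    n, d = z.shape
    mu = z.mean(axis=0)
    zc = z - mu
    cov = zc.T @ zc / (n - 1)          # empirical covariance of representations
    # Tr(C^{1/2}) via the eigenvalues of the symmetric covariance matrix,
    # clipped at zero to guard against tiny negative values from round-off.
    eigvals = np.linalg.eigvalsh(cov)
    tr_sqrt = np.sum(np.sqrt(np.clip(eigvals, 0.0, None)))
    return float(mu @ mu + np.trace(cov) + d * sigma**2 - 2.0 * sigma * tr_sqrt)
```

Minimizing such a term drives the representation covariance toward sigma² I (all dimensions carry equal variance), which is one way to penalize dimensional collapse, where variance concentrates in a few dimensions.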