We consider a contrastive learning approach to knowledge graph embedding
(KGE) via InfoNCE. For KGE, efficient learning relies on augmenting the
training data with negative triples. However, most KGE works overlook the bias
introduced by false negative triples, i.e., factual triples that are missing
from the knowledge graph but are sampled as negatives. We argue that generating
high-quality (i.e., hard) negative triples tends to increase the number of
false negative triples. To mitigate this effect, we propose the Hardness and
Structure-aware (\textbf{HaSa}) contrastive KGE method, which alleviates the
impact of false negative triples while generating hard negative triples.
Experiments show
that HaSa improves the performance of InfoNCE-based KGE approaches, achieving
state-of-the-art results on several metrics for the WN18RR dataset and
competitive results on the FB15k-237 dataset, compared with both classic and
pre-trained LM-based KGE methods.