We introduce a new approach for training named-entity-pair embeddings to improve relation extraction performance in the biomedical domain. These embeddings are trained in
an unsupervised manner based on the principles of distributional
semantics. We show that adding them to neural network architectures
improves F-scores. Our best-performing neural model, which
combines entity-pair embeddings with a pre-trained BERT encoder, achieves an F-score of 77.19 on the CHEMPROT (chemical-protein) relation extraction corpus, setting a new state-of-the-art result for the task.