Deep learning in computational biochemistry has traditionally focused on
neural representations of molecular graphs; however, recent advances in language
models highlight how much scientific knowledge is encoded in text. To bridge
these two modalities, we investigate how molecular property information can be
transferred from natural language to graph representations. We study property
prediction performance gains after using contrastive learning to align neural
graph representations with representations of textual descriptions of molecular
characteristics. We implement neural relevance scoring strategies to improve
text retrieval, introduce a novel chemically valid molecular graph augmentation
strategy inspired by organic reactions, and demonstrate improved performance on
downstream MoleculeNet property classification tasks. We achieve a +4.26% AUROC
gain versus models pre-trained on the graph modality alone, and a +1.54% gain
compared to the recently proposed MoMu model (Su et al., 2022), which was
contrastively trained on paired molecular graphs and text.