Learning about diagnostic features and related clinical information from
dental radiographs is important for dental research. However, the lack of
expert-annotated data and convenient search tools poses challenges. Our primary
objective is to design a search tool that matches a user's text query to
relevant images for oral-health research. The proposed framework, Contrastive
LAnguage Image REtrieval Search for dental research (Dental CLAIRES), uses
periapical radiographs and associated clinical details, such as periodontal
diagnoses and demographic information, to retrieve the best-matched images for
a given text query. We
applied a contrastive representation learning method to find the images
described by a user's text, maximizing the similarity scores of positive
(true) pairs and minimizing those of negative (random) pairs. Our model
achieved a hit@3 ratio of 96% and a Mean Reciprocal Rank (MRR) of 0.82. We also
designed a graphical user interface that allows researchers to verify the
model's performance interactively.

Comment: 10 pages, 7 figures, 4 tables
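The contrastive objective and the retrieval metrics mentioned above can be sketched as follows. This is a minimal NumPy illustration, not the Dental CLAIRES implementation: the encoders, batch construction, and temperature value are unspecified in the abstract, so the function names and the `temperature` default below are assumptions.

```python
import numpy as np

def contrastive_loss(img_emb, txt_emb, temperature=0.07):
    """Symmetric InfoNCE-style contrastive loss over a batch of paired embeddings.

    Diagonal entries of the image-text similarity matrix are the positive
    (true) pairs; off-diagonal entries serve as negative (random) pairs.
    The temperature value here is a common choice, not one from the paper.
    """
    # L2-normalize so dot products are cosine similarities
    img = img_emb / np.linalg.norm(img_emb, axis=1, keepdims=True)
    txt = txt_emb / np.linalg.norm(txt_emb, axis=1, keepdims=True)
    logits = img @ txt.T / temperature        # (batch, batch) similarity scores
    labels = np.arange(len(logits))           # positives lie on the diagonal

    def xent(l):
        # cross-entropy of each row against its diagonal (positive) entry
        l = l - l.max(axis=1, keepdims=True)  # numerical stability
        logprob = l - np.log(np.exp(l).sum(axis=1, keepdims=True))
        return -logprob[labels, labels].mean()

    # average the image-to-text and text-to-image directions
    return (xent(logits) + xent(logits.T)) / 2

def retrieval_metrics(sim, k=3):
    """hit@k and Mean Reciprocal Rank, assuming row i's true match is column i."""
    diag = sim[np.arange(len(sim)), np.arange(len(sim))]
    # rank = 1 + number of candidates scored strictly higher than the true match
    ranks = (sim > diag[:, None]).sum(axis=1) + 1
    hit_at_k = (ranks <= k).mean()
    mrr = (1.0 / ranks).mean()
    return hit_at_k, mrr
```

With perfectly aligned embeddings the loss approaches zero, and a similarity matrix whose diagonal dominates yields hit@3 and MRR of 1.0, which is the direction of the reported 96% hit@3 and 0.82 MRR.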