Beyond questions: leveraging ColBERT for keyphrase search

Abstract

While question-like queries are gaining popularity, keyphrase search remains the cornerstone of web search and of specialised domains such as academic and professional search. However, current dense retrieval models often fail on keyphrase-like queries, primarily because they are mostly trained on question-like ones. This paper introduces a novel model that employs the ColBERT architecture to enhance document ranking for keyphrase queries. To that end, given the lack of large keyphrase-based retrieval datasets, we first explore how Large Language Models can convert question-like queries into keyphrase format. Then, using those keyphrases, we train a keyphrase-based ColBERT ranker (ColBERTKP_QD) to improve ranking performance on keyphrase queries. Furthermore, to make the model more flexible, allowing the use of either the question or the keyphrase encoder depending on the query type, we investigate the feasibility of training only a keyphrase query encoder while keeping the document encoder weights frozen (ColBERTKP_Q). We assess our proposals' ranking performance using both automatically generated and manually annotated keyphrases. Our results reveal the potential of the late interaction architecture in the keyphrase search scenario. This study's code and generated resources are available at https://github.com/JorgeGabin/ColBERTKP
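The late interaction architecture the abstract refers to scores a query against a document by matching each query token embedding to its best-matching document token embedding (the MaxSim operator) and summing the maxima. A minimal sketch of that scoring step, using toy 2-dimensional embeddings rather than real ColBERT output (names and values here are illustrative, not from the paper's code):

```python
# Illustrative sketch of ColBERT-style late interaction (MaxSim) scoring.
# Not the authors' implementation; real ColBERT uses 128-d BERT-derived
# token embeddings and cosine/dot similarity over normalised vectors.

def dot(u, v):
    # Plain dot product between two equal-length vectors.
    return sum(a * b for a, b in zip(u, v))

def late_interaction_score(query_vecs, doc_vecs):
    """For each query token embedding, take the maximum similarity
    against all document token embeddings, then sum those maxima."""
    return sum(max(dot(q, d) for d in doc_vecs) for q in query_vecs)

# Toy embeddings: a two-token keyphrase query and a three-token document.
query = [[1.0, 0.0], [0.0, 1.0]]
doc = [[0.9, 0.1], [0.2, 0.8], [0.5, 0.5]]
print(round(late_interaction_score(query, doc), 6))  # 0.9 + 0.8 -> 1.7
```

Because each query token is matched independently, short keyphrase queries can still score documents at token granularity, which is one reason the architecture is attractive for keyphrase search.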

    This paper was published in Enlighten.
