An Uncertainty-aware Query Selection Model for Evaluation of IR Systems

By Mehdi Hosseini, Ingemar J. Cox, Nataša Milić-frayling, Milad Shokouhi and Emine Yilmaz

Abstract

We propose a mathematical framework for query selection as a mechanism for reducing the cost of constructing information retrieval test collections. In particular, our mathematical formulation explicitly models the uncertainty in retrieval effectiveness metrics that is introduced by the absence of relevance judgments. Since the optimization problem is computationally intractable, we devise an adaptive query selection algorithm, referred to as Adaptive, that provides an approximate solution. Adaptive selects queries iteratively and assumes that no relevance judgments are available for the query under consideration. Once a query is selected, the associated relevance assessments are acquired and then used to aid the selection of subsequent queries. We demonstrate the effectiveness of the algorithm on two TREC test collections, as well as on a test collection of an online search engine with 1,000 queries. Our experimental results show that the queries chosen by Adaptive produce a reliable performance ranking of systems. This ranking correlates better with the actual system ranking than the rankings produced by queries selected using the baseline methods considered.
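The iterative selection loop described in the abstract can be sketched as follows. This is an illustration only: the paper's actual utility criterion models uncertainty in the effectiveness metrics, which is not specified here, so `estimate_utility` and the toy prior below are hypothetical placeholders.

```python
def adaptive_select(queries, judge, estimate_utility, budget):
    """Greedy adaptive query selection (toy sketch).

    judge(q)                    -> relevance judgments for q, acquired only
                                   AFTER q is selected
    estimate_utility(q, judged) -> predicted value of selecting q next,
                                   computed without q's own judgments
    """
    judged = {}                  # query -> acquired judgments
    remaining = list(queries)
    for _ in range(min(budget, len(remaining))):
        # Choose the candidate whose estimated utility, given the queries
        # judged so far, is highest.
        best = max(remaining, key=lambda q: estimate_utility(q, judged))
        remaining.remove(best)
        judged[best] = judge(best)   # acquire judgments after selection
    return list(judged)


# Toy usage: a hypothetical prior "informativeness" score, discounted as
# more queries are judged (standing in for the paper's uncertainty model).
prior = {"q1": 0.9, "q2": 0.5, "q3": 0.7}

def toy_utility(q, judged):
    return prior[q] - 0.3 * len(judged)

selected = adaptive_select(
    ["q1", "q2", "q3"],
    judge=lambda q: {q: 1},      # stand-in for acquiring assessments
    estimate_utility=toy_utility,
    budget=2,
)
```

The key property mirrored from the abstract is that a candidate's own judgments are never consulted before it is selected; only the judgments of previously selected queries inform each choice.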

Topics: Measurements, Algorithms, Theory and Performance
Keywords: Information Retrieval, Test Collection, Query Selection
Year: 2013
OAI identifier: oai:CiteSeerX.psu:10.1.1.374.1343
Provided by: CiteSeerX