Information retrieval and feedback in XML are rather new fields for researchers; natural questions arise, such as: how good are the feedback algorithms in XML IR? Can they be evaluated with standard evaluation tools? Even though some evaluation methods have been proposed in the literature, it is still not clear which of them are applicable in the context of XML IR, and which metrics they can be combined with to assess the quality of XML retrieval algorithms that use feedback. We propose a solution for fairly evaluating the performance of XML search engines that use feedback to improve query results. Compared to previous approaches, we aim at removing the effect of the results for which the system has knowledge about their relevance, and at measuring the improvement on unseen relevant elements. We implemented our proposed evaluation methodologies by extending
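The core idea of removing already-judged results before scoring can be sketched as a residual-collection style measure. The following is a minimal illustration, not the paper's actual methodology; the function name, element identifiers, and the choice of precision@k are all hypothetical:

```python
def residual_precision_at_k(ranking, relevant, seen, k):
    """Precision@k computed on the residual ranking: elements whose
    relevance the system already knows (the 'seen' set, e.g. those fed
    back during relevance feedback) are removed, so the score reflects
    only the improvement on unseen relevant elements."""
    # Drop elements the system already has relevance knowledge about.
    residual = [e for e in ranking if e not in seen]
    top_k = residual[:k]
    # Only unseen relevant elements count as hits.
    unseen_relevant = relevant - seen
    hits = sum(1 for e in top_k if e in unseen_relevant)
    return hits / k if k else 0.0

# Hypothetical example: element e1 was shown to the user and fed back,
# so it is excluded; the residual top-3 is [e2, e3, e4], of which only
# e3 is an unseen relevant element.
ranking = ["e1", "e2", "e3", "e4", "e5"]
relevant = {"e1", "e3", "e5"}
seen = {"e1"}
print(residual_precision_at_k(ranking, relevant, seen, k=3))  # 1/3
```

Without the residual step, the feedback run would be credited for re-ranking `e1` highly even though its relevance was already known, which is exactly the bias the proposed evaluation aims to remove.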