
By Silvana Solomon, Prof. Dr.-Ing. Gerhard Weikum, Dr. Ralf Schenkel, and Prof. Dr. Christoph Koch


Information retrieval and feedback in XML are rather new fields for researchers, and natural questions arise: how good are the feedback algorithms in XML IR? Can they be evaluated with standard evaluation tools? Even though some evaluation methods have been proposed in the literature, it is still not clear which of them are applicable in the context of XML IR, and which metrics they can be combined with to assess the quality of XML retrieval algorithms that use feedback. We propose a solution for fairly evaluating the performance of XML search engines that use feedback to improve query results. Compared to previous approaches, we aim at removing the effect of the results for which the system already has knowledge about their relevance, and at measuring the improvement on unseen relevant elements. We implemented our proposed evaluation methodologies by extending
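The core idea in the abstract — score only results whose relevance the system has not already been told about — resembles residual-collection evaluation. The sketch below is a hypothetical illustration, not the authors' implementation: the function names, the set-based representation of judgments, and the choice of precision as the metric are all assumptions for the sake of the example.

```python
def residual_precision(ranked_results, relevant, seen_in_feedback):
    """Precision over the residual collection: elements already shown
    to the system during feedback (seen_in_feedback) are removed from
    the ranked list before scoring, so only improvement on unseen
    relevant elements is measured. (Illustrative sketch only.)"""
    residual = [e for e in ranked_results if e not in seen_in_feedback]
    if not residual:
        return 0.0
    hits = sum(1 for e in residual if e in relevant)
    return hits / len(residual)

# Hypothetical example: "a" was judged during feedback, so it is
# excluded; of the remaining elements b, c, d only c is relevant.
score = residual_precision(
    ranked_results=["a", "b", "c", "d"],
    relevant={"a", "c"},
    seen_in_feedback={"a"},
)
```

Without the residual step, the re-ranked list would be credited for simply promoting "a", an element whose relevance the engine was explicitly given, which is exactly the bias the paper sets out to remove.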

Year: 2007
OAI identifier: oai:CiteSeerX.psu:
Provided by: CiteSeerX
