2 research outputs found

    Query Performance Analyser - a tool for bridging information retrieval research and instruction

    Information retrieval experiments usually measure the average effectiveness of the IR methods developed. The analysis of individual queries is neglected, although test results may contain individual topics for which the general findings do not hold. The paper argues that, for the real user of an IR system, the study of variation in results is even more important than averages. The Interactive Query Performance Analyser (QPA) for information retrieval systems is a tool for analysing and comparing the performance of individual queries. Built on top of a standard test collection, it gives an instant visualisation of the performance achieved in a given search topic by any user-generated query. In addition to experimental IR research, QPA can be used in user training to demonstrate the characteristics of, and compare the differences between, IR systems and searching strategies. Experiences in applying the tool both in IR experiments and in IR instruction are reported, and the need for bridging research and instruction is underlined.
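    The following is a minimal sketch, not the QPA tool itself, of the per-query analysis the abstract argues for: computing effectiveness topic by topic rather than only on average, so that topics where the general findings do not hold become visible. All topic identifiers, result lists, and relevance judgements below are invented toy data.

```python
def average_precision(ranked_docs, relevant):
    """Average precision of one ranked result list against a set of relevant docs."""
    hits, precision_sum = 0, 0.0
    for rank, doc in enumerate(ranked_docs, start=1):
        if doc in relevant:
            hits += 1
            precision_sum += hits / rank
    return precision_sum / len(relevant) if relevant else 0.0

# Hypothetical per-topic runs: topic id -> (ranked result list, relevant set).
runs = {
    "topic-1": (["d3", "d7", "d1", "d9"], {"d3", "d1"}),
    "topic-2": (["d2", "d5", "d8", "d4"], {"d4"}),
    "topic-3": (["d6", "d8", "d5", "d7"], {"d9"}),  # a complete failure
}

per_topic = {t: average_precision(docs, rel) for t, (docs, rel) in runs.items()}
mean_ap = sum(per_topic.values()) / len(per_topic)

for topic, ap in per_topic.items():
    print(f"{topic}: AP = {ap:.2f}")
print(f"mean AP = {mean_ap:.2f}  (the average alone hides the failing topic)")
```

    Printing the per-topic scores alongside the mean makes the point of the abstract concrete: a respectable average can coexist with topics on which the query fails entirely.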

    A Novel Method For The Evaluation Of Boolean Query Effectiveness Across A Wide Operational Range

    Traditional methods for the system-oriented evaluation of Boolean IR systems suffer from validity and reliability problems. Laboratory-based research neglects the searcher and studies suboptimal queries. Research on operational systems fails to distinguish between searcher performance and system performance. Neither approach is capable of measuring performance at standard points of operation (e.g. across the recall range R0.0-R1.0).
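    As a minimal sketch of what "standard points of operation" means here, the code below computes interpolated precision at the eleven standard recall levels R0.0-R1.0, assuming the common TREC-style interpolation in which precision at level R is the maximum precision observed at any recall greater than or equal to R. The ranked list and relevance judgements are invented for illustration.

```python
def interpolated_precision_at_recall_levels(ranked_docs, relevant, levels=None):
    """Interpolated precision at standard recall levels (R0.0 .. R1.0)."""
    if levels is None:
        levels = [r / 10 for r in range(11)]
    hits, points = 0, []  # (recall, precision) pairs at each relevant hit
    for rank, doc in enumerate(ranked_docs, start=1):
        if doc in relevant:
            hits += 1
            points.append((hits / len(relevant), hits / rank))
    # Interpolate: take the maximum precision at any recall >= the level.
    return {
        level: max((p for r, p in points if r >= level), default=0.0)
        for level in levels
    }

ranked = ["d1", "d4", "d3", "d7", "d2", "d9"]   # hypothetical ranked output
relevant = {"d1", "d3", "d2"}                   # hypothetical relevance judgements
for level, prec in interpolated_precision_at_recall_levels(ranked, relevant).items():
    print(f"R{level:.1f}: precision {prec:.2f}")
```

    A curve of this kind, rather than a single averaged figure, is what allows performance to be compared across the whole operational range.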