    How do Users Perceive Information: Analyzing user feedback while annotating textual units

    ABSTRACT: We describe an initial study of how participants perceive information when they categorise highlighted textual units within a document marked for a given information need. Our investigation explores how users examine different parts of a document and classify textual units within retrieved documents on 4 levels of relevance and importance. We compare how users classify different textual units within a document, and report the mean and variance across different users and topics. Further, we analyse and categorise the reasons provided by users when rating textual units within retrieved documents. This research yields some interesting observations regarding why some parts of a document are regarded as more relevant than others (e.g. they provide contextual or background information) and which kinds of information appear effective for satisfying end users in a search task (e.g. showing examples, providing facts). This work is part of our ongoing investigation into the generation of effective surrogates and document summaries based on search topics and user interactions with information.

    Promoting user engagement and learning in search tasks by effective document representation

    Much research in information retrieval (IR) focuses on optimisation of the rank of relevant retrieval results for single-shot ad hoc IR tasks. Relatively little research has been carried out on supporting and promoting user engagement within search tasks. We seek to improve user experience through enhanced document snippets presented during the search process to promote user engagement with retrieved information. The primary role of document snippets within search has traditionally been to indicate the potential relevance of retrieved items to the user’s information need. Beyond the relevance of an item, it is generally not possible to infer the contents of individual ranked results just by reading the current snippets. We hypothesise that the creation of richer document snippets and summaries, and effective presentation of this information to users, will promote effective search and greater user engagement, and support emerging areas such as learning through search.

    We generate document summaries for a given query by extracting the top relevant sentences from retrieved documents. Creation of these summaries goes beyond existing snippet creation methods by comparing content between documents to take novelty into account when selecting content for inclusion in individual document summaries. Further, we investigate the readability of the generated summaries, with the overall goal of generating snippets which not only help a user to identify document relevance, but are also designed to increase the user’s understanding and knowledge of a topic gained while inspecting the snippets.

    We perform a task-based user study to record users’ interactions, search behaviour and feedback, evaluating the effectiveness of our snippets using qualitative and quantitative measures. In our user study, we found that the richer snippets generated in this work improved the user experience and topical knowledge, and helped users to learn about the topic effectively.
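    The abstract describes selecting sentences for a snippet by balancing query relevance against novelty with respect to content already chosen. The sketch below illustrates one common way to realise that trade-off, maximal marginal relevance (MMR) over bag-of-words cosine similarity; the function name `mmr_snippet`, the term-frequency representation, and the λ weighting are illustrative assumptions, not the thesis's actual method.

    ```python
    import re
    from collections import Counter
    from math import sqrt

    def _vec(text):
        # Bag-of-words term-frequency vector over lowercased word tokens.
        return Counter(re.findall(r"[a-z]+", text.lower()))

    def _cosine(a, b):
        # Cosine similarity between two sparse term-frequency vectors.
        num = sum(a[t] * b[t] for t in set(a) & set(b))
        den = (sqrt(sum(v * v for v in a.values()))
               * sqrt(sum(v * v for v in b.values())))
        return num / den if den else 0.0

    def mmr_snippet(query, sentences, k=2, lam=0.7):
        """Pick k sentences, trading query relevance (weight lam)
        against redundancy with sentences already selected."""
        q = _vec(query)
        vecs = [_vec(s) for s in sentences]
        chosen, candidates = [], list(range(len(sentences)))
        while candidates and len(chosen) < k:
            def score(i):
                rel = _cosine(q, vecs[i])
                # Redundancy = max similarity to any already-chosen sentence.
                red = max((_cosine(vecs[i], vecs[j]) for j in chosen),
                          default=0.0)
                return lam * rel - (1 - lam) * red
            best = max(candidates, key=score)
            chosen.append(best)
            candidates.remove(best)
        return [sentences[i] for i in chosen]
    ```

    With λ = 1 this reduces to pure relevance ranking; lowering λ increasingly penalises sentences that repeat material already in the snippet, which is the novelty behaviour the abstract motivates. A production system would typically swap the term-frequency vectors for TF-IDF or embedding similarity.
    
    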