
    Large-scale interactive exploratory visual search

    Large-scale visual search has been one of the challenging issues in the era of big data. It demands techniques that are not only highly effective and efficient but also allow users to conveniently express their information needs and refine their intent. In this thesis, we focus on developing an exploratory framework for large-scale visual search. We also develop a number of enabling techniques, including compact visual content representation for scalable search, near-duplicate video shot detection, and action-based event detection. We propose a novel scheme for extremely low bit rate visual search, which sends compressed visual words consisting of a vocabulary tree histogram and descriptor orientations rather than raw descriptors. Compact representation of video data is achieved by identifying keyframes of a video, which can also help users comprehend visual content efficiently. We propose a novel Bag-of-Importance model for static video summarization. Near-duplicate detection is one of the key issues for large-scale visual search, since there exist a large number of nearly identical images and videos. We propose an improved near-duplicate video shot detection approach for more effective shot representation. Event detection has been one of the solutions for bridging the semantic gap in visual search. We particularly focus on human-action-centred event detection, and propose an enhanced sparse coding scheme to model human actions. Our proposed approach is able to significantly reduce computational cost while achieving recognition accuracy highly comparable to state-of-the-art methods. Finally, we propose an integrated solution addressing the prime challenges raised by large-scale interactive visual search. The proposed system is also one of the first attempts at exploratory visual search, and provides users with more robust results to satisfy their exploratory needs.
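    The compact representation described above rests on the bag-of-visual-words idea: local descriptors are quantized against a vocabulary of visual words, and only the resulting histogram (far smaller than the descriptors themselves) is transmitted or indexed. The following is a minimal illustrative sketch of that general quantization step, not the thesis's actual scheme; the flat toy vocabulary, dimensions, and function names are assumptions for illustration (the thesis uses a hierarchical vocabulary tree plus compressed descriptor orientations).

    ```python
    import numpy as np

    def quantize(descriptors, vocabulary):
        """Assign each local descriptor to its nearest visual word.

        descriptors: (n, d) array of local image features.
        vocabulary:  (k, d) array of visual-word centroids (flat toy
                     vocabulary; a real system would use a vocabulary tree).
        """
        # Pairwise squared distances, shape (n, k); fine at toy sizes.
        dists = ((descriptors[:, None, :] - vocabulary[None, :, :]) ** 2).sum(axis=2)
        return dists.argmin(axis=1)

    def bow_histogram(descriptors, vocabulary):
        """Normalized bag-of-visual-words histogram: this compact vector
        stands in for the raw descriptors at search time."""
        words = quantize(descriptors, vocabulary)
        hist = np.bincount(words, minlength=len(vocabulary)).astype(float)
        return hist / hist.sum()

    # Toy data: 8 visual words, 100 sixteen-dimensional descriptors.
    rng = np.random.default_rng(0)
    vocab = rng.normal(size=(8, 16))
    desc = rng.normal(size=(100, 16))
    h = bow_histogram(desc, vocab)   # 8-bin histogram, sums to 1
    ```

    The histogram's size depends only on the vocabulary, not on the image, which is what makes the representation scalable and transmissible at a low bit rate.
    
    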

    Exploratory Product Image Search With Circle-to-Search Interaction

    © 2014 IEEE. Exploratory search is emerging as a new form of information-seeking activity in the research community, which generally combines browsing and searching content to help users gain additional knowledge and form accurate queries, thereby assisting users with their seeking and investigation activities. However, there have been few attempts at integrated exploratory search solutions in which image browsing is incorporated into the exploring loop. In this paper, we investigate the challenges of understanding users' search interests from the product images being browsed and inferring their actual search intentions. We propose a novel interactive image exploring system that allows users to seamlessly switch between browse and search processes and naturally complete visual exploratory search tasks in an effective and efficient way. This system enables users to specify their visual search interests by circling any visual object in a web page's product images; the system then automatically infers the user's underlying intent by analyzing both the browsing context and the same or similar product images retrieved by large-scale image search technology. Users can then use the recommended queries to complete intent-specific exploratory tasks. The proposed solution is one of the first attempts to understand users' interests in a visual exploratory product search task by integrating browse and search activities. We evaluated the system on five million product images. The evaluation study demonstrates that, compared with conventional image search methods, the proposed system provides accurate intent-driven search results and fast response to exploratory search demands, and provides users with robust results to satisfy their exploratory experience.
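    At its core, the circle-to-search interaction crops the user-circled region from the page, describes it with a visual feature, and ranks the product database by similarity to that region. The following is a minimal sketch of that pipeline under stated assumptions: a coarse color histogram stands in for the paper's actual features, the bounding-box crop stands in for the free-form circle, and all names and sizes are invented for illustration.

    ```python
    import numpy as np

    def color_hist(img, bins=8):
        """Coarse per-channel color histogram as a cheap region descriptor
        (a placeholder for the paper's actual visual features)."""
        h = [np.histogram(img[..., c], bins=bins, range=(0, 256))[0]
             for c in range(3)]
        h = np.concatenate(h).astype(float)
        return h / h.sum()

    def circle_to_query(page_img, box):
        """Crop the user-circled region (here a box: y0, y1, x0, x1)
        and turn it into a query descriptor."""
        y0, y1, x0, x1 = box
        return color_hist(page_img[y0:y1, x0:x1])

    def rank_products(query_hist, db_hists):
        """Rank database images by histogram intersection
        (higher intersection = more similar)."""
        sims = np.minimum(db_hists, query_hist).sum(axis=1)
        return np.argsort(-sims)

    # Toy data: one browsed page image and five database product images.
    rng = np.random.default_rng(1)
    page = rng.integers(0, 256, size=(200, 200, 3))
    db = np.stack([color_hist(rng.integers(0, 256, size=(64, 64, 3)))
                   for _ in range(5)])
    q = circle_to_query(page, (50, 120, 60, 130))
    order = rank_products(q, db)   # indices of the 5 products, best first
    ```

    A production system would replace the color histogram with learned features and add the context analysis the paper describes; the sketch only shows the crop-describe-rank skeleton of the interaction.
    
    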